The funniest part of the RAM pricing thing is it fucks up AI stuff too. You need a phone to talk to these models. You need a computer. They increased the default RAM on the base MacBook Air in part to handle local models! You need some local computing to get to the cloud!
Running models on-phone is the bit that interests me most honestly - the silicon in modern phones is absurdly capable and nobody's RAM gouging you there yet.
The phone-as-server angle is underexplored though - modern phones have surprisingly capable NPUs and enough RAM to run small models, yet everyone fixates on desktop specs.
They'll figure out how to rent that to us too
HP already has a laptop subscription where you pay like $1,000/year for a laptop.
"Buy a new copilot enabled PC with AI accelerator"
Okay, I can't run Win11 without at least 16G of RAM and a 256G SSD before you slap in Copilot crap.

Putting the copilot before the horse over here.
local models? no way man you're gonna *pay* for this shit
You'll still do that, but it's often more efficient to split the load: run things locally and make the cloud call only when you need something more complex. At least that's my understanding of how Apple is planning this.
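A minimal sketch of that split-load idea: answer simple requests with a small on-device model and fall back to a cloud model only for complex ones. Everything here (the complexity heuristic, the function names) is illustrative and assumed, not Apple's actual API.

```python
def estimate_complexity(prompt: str) -> float:
    """Crude proxy for request difficulty: longer, multi-step prompts score higher."""
    words = prompt.split()
    multi_step = sum(prompt.lower().count(k) for k in ("then", "compare", "summarize"))
    return len(words) / 50 + multi_step

def run_local(prompt: str) -> str:
    # Stand-in for a small on-device model (cheap, private, no network).
    return f"[on-device] {prompt}"

def run_cloud(prompt: str) -> str:
    # Stand-in for a heavier cloud model, used only when needed.
    return f"[cloud] {prompt}"

def route(prompt: str, threshold: float = 1.0) -> str:
    """Send simple prompts to the local model, complex ones to the cloud."""
    if estimate_complexity(prompt) < threshold:
        return run_local(prompt)
    return run_cloud(prompt)
```

The point is just that the router lives on the device, so the default path costs nothing and the cloud is the exception, not the rule.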