 On-device generative machine learning models are coming to Macs.

Apple managed to squeeze a large language model into a MacBook Pro M2. By storing the model weights in flash memory and loading them on demand, they can run models up to 2x larger than the available DRAM, with inference up to 25x faster, made possible on M chips.

https://arxiv.org/pdf/2312.11514.pdf 
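The core idea in the paper, keeping weights on flash and pulling in only the pieces a forward pass actually needs instead of loading the whole model into DRAM, can be illustrated with a toy sketch. This is a hedged simplification (the file name, function names, and the fixed list of "active" rows are all hypothetical; the real paper adds windowing and row-column bundling on top of this):

```python
import numpy as np

# Toy sketch of on-demand weight loading: weights live on flash (disk),
# and np.memmap pages them into memory lazily, so only the rows we
# actually touch get read.

def save_layer_weights(path, shape, seed=0):
    # Write a stand-in weight matrix to disk, as if it were one model layer.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(shape).astype(np.float32)
    w.tofile(path)

def forward_on_demand(path, shape, x, active_rows):
    # Map the file without reading it all; only pages backing the
    # selected rows (the "active" neurons) are loaded from flash.
    w = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
    return w[active_rows] @ x

shape = (4096, 1024)
save_layer_weights("layer0.bin", shape)
x = np.ones(1024, dtype=np.float32)
# Pretend a sparsity predictor decided only these rows matter this step.
y = forward_on_demand("layer0.bin", shape, x, [0, 5, 42])
print(y.shape)  # (3,)
```

The design point is that DRAM only has to hold the working set, not the whole model, which is why models larger than memory become runnable.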
 I had this feeling they were biding their time until they could do it at a level they found acceptable. Hopefully they won’t self-nerf it too much. 
 I think so too. "Do it at a level they find acceptable," and when the time comes, there are already millions of devices out there with the hardware; they just need to push a software update. 
 One thing I’ll say about Apple, they don’t like janky features. Siri has been a bit of a black eye for years now. I don’t think they want a repeat. 

I agree with you. I think it’s been the plan all along. 
 Huge. Great use of the unified memory architecture. 
 These new SoCs aren't only CPU and GPU: AMX is Apple's undocumented matrix-multiplication coprocessor, a.k.a. accelerator. A new horizon.