 Not sure about that. 

My M3 sweats hard on Llama 3 70B.
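For context, a rough back-of-envelope sketch of why a 70B model strains a laptop: weight memory alone scales with parameter count times bits per weight (this ignores KV cache and runtime overhead, and the quantization names are just common examples):

```python
# Rough weight-only memory estimate for a 70B-parameter model
# at a few common precisions (KV cache and overhead not included).
PARAMS = 70e9  # 70 billion parameters

def weight_gb(bits_per_param: float) -> float:
    # bits -> bytes (/8), bytes -> GB (/1e9)
    return PARAMS * bits_per_param / 8 / 1e9

for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{name}: ~{weight_gb(bits):.0f} GB")
# fp16: ~140 GB, 8-bit: ~70 GB, 4-bit: ~35 GB
```

Even 4-bit quantized, the weights alone sit around 35 GB, which is why unified-memory Macs run it but not comfortably.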

I think we're about three years from practical self-hosting: models will get smaller and more task-specific, and processors will get better.

It's a bit daft that the model I use for coding also knows ancient Greek and Sumerian, for example.
 In all fairness, I'm not really going to do anything AI-related on this computer. If I do that, I'll build a dedicated server for it. But when I'm looking at high-end laptops, they all have the same AI-ready marketing shtick. 
 Exactly, it's bullshit.

They're "AI ready" in the sense that they ship with Chrome to connect to ChatGPT.

I'm only interested in self-hosted AI. It's gonna be huge in three years.

OpenAI = ABC News
Self Hosted = Library of Alexandria