 Can it run on the phone directly? If the phone already has the thread loaded, then it could easily generate that summary. I tested Gemini on Android and it works, but it gets quite expensive and it's not private at all. 

Future is bright. 
 Yes, the model I tested was phone-sized! There are some “smallish” models (~2 GB) whose parameters can, in theory, be sampled within the memory available on phones. I noticed summarization quality was quite good even on very small models. Notedeck is desktop though, so I will likely try there first. 
 Nice! I wonder how well phones will handle these models. Android pushes apps to stay under 512 MB. But maybe they can offer a single, larger model that is pre-loaded at all times for all apps, similar to how they ship translation models these days. 
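As a back-of-the-envelope sketch of why the 512 MB app budget is tight (the 3B-parameter / 4-bit figures below are hypothetical examples, not numbers from this thread):

```python
# Rough memory arithmetic for on-device models (illustrative numbers only).
# Compares weight memory for a quantized model against Android's typical
# per-app budget (~512 MB) mentioned above.

def model_weight_bytes(n_params: float, bits_per_weight: int) -> float:
    """Approximate weight memory; ignores KV cache and activations."""
    return n_params * bits_per_weight / 8

GiB = 1024 ** 3
MiB = 1024 ** 2

# Hypothetical 3B-parameter model quantized to 4 bits per weight:
weights = model_weight_bytes(3e9, 4)
print(f"weights: {weights / GiB:.2f} GiB")
print(f"fits in a 512 MiB app heap? {weights < 512 * MiB}")
```

Even with aggressive quantization the weights alone blow well past a 512 MB per-app limit, which is why a single system-provided model shared across apps (like the translation models) is appealing.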

But yeah, I think testing on desktop will be the way to go for now. Phone-side libraries will change so much that I'm not sure it's worth attempting something like that on phones right now.  
 This is the future, fren. Damus will be my all-around web buddy if a local LLM pre-chews some of the hard-to-swallow bits for me. I'll even buy the new Pixel (Calyx) and figure out how to disable BLE just for this. Make sure you leave room for custom LLM endpoints though, since some of us run a private remote LLM to the phone and would like to continue. Just a thought. Great news though. Keep grinding.
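One way the custom-endpoint idea could look in practice: many self-hosted servers (llama.cpp's server, Ollama, vLLM) expose an OpenAI-compatible chat-completions shape, so a client that only lets the user swap the base URL covers local, private-remote, and hosted backends with one code path. A minimal sketch, assuming an OpenAI-style endpoint (the config names here are made up for illustration):

```python
# Hypothetical sketch: a user-configurable LLM endpoint for thread summaries.
# The base_url is user-supplied, so "localhost" and "my private box on the
# LAN" go through the exact same request builder.
from dataclasses import dataclass

@dataclass
class LlmConfig:
    base_url: str          # e.g. a private remote server the user runs
    model: str
    api_key: str = ""      # often unused for self-hosted endpoints

def build_summary_request(cfg: LlmConfig, thread_text: str) -> tuple[str, dict]:
    """Return (url, JSON body) for an OpenAI-style /chat/completions call."""
    url = cfg.base_url.rstrip("/") + "/v1/chat/completions"
    body = {
        "model": cfg.model,
        "messages": [
            {"role": "system",
             "content": "Summarize the following thread in a few sentences."},
            {"role": "user", "content": thread_text},
        ],
    }
    return url, body

# Usage: same builder, different backends, just by changing base_url.
cfg = LlmConfig(base_url="http://localhost:8080", model="local-model")
url, body = build_summary_request(cfg, "example thread text")
```

Keeping the endpoint a plain URL setting, rather than baking in one backend, is what lets people who run a private remote LLM keep doing so.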