 nostr:npub1j46f9va5qdrmamcwmncew7g8mwekgma386dhws6k04fsnlkqpcpsj23gm7  nostr:npub1yl37a83lkqwwzdd6qj7clagncs543pvs2uwkzjyzmzkgwhnu6dhq6ey6aq 

the OSS LLM world is thriving! And lots of people are working on getting these models running locally without melting the machine :)

Check out https://bootcamp.uxdesign.cc/a-complete-guide-to-running-local-llm-models-3225e4913620
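
Just as a rough sketch (not from that guide, purely an illustration assuming Python with the transformers library installed, and the model name is only an example of a small open model), "running a local LLM" can boil down to something like:

from transformers import AutoModelForCausalLM, AutoTokenizer

# example small open model; any similar checkpoint works
model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Explain what a local LLM is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))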

I don’t quite know what the term would be other than “local LLM” - however, for ML models with very specific purposes running on little boards, look up TinyML. Thriving community there as well.