I run roughly 90% of my work on local LLMs, with a personal vector cache stored locally on an encrypted drive. LLMs are moving to the edge. Check out Dolphin for uncensored models.
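
Rough idea of what a personal vector cache on an encrypted mount can look like. This is only an illustrative sketch (sentence-transformers + numpy, made-up paths and model name), not my exact stack:

```python
# Illustrative sketch, not a specific recommendation.
# Assumes /mnt/secure is an encrypted volume (e.g. LUKS) already mounted,
# and that sentence-transformers and numpy are installed.
import os
import numpy as np
from sentence_transformers import SentenceTransformer

CACHE_DIR = "/mnt/secure/vector_cache"            # lives on the encrypted drive
MODEL = SentenceTransformer("all-MiniLM-L6-v2")   # small embedding model, runs locally

def build_cache(docs: list[str]) -> None:
    """Embed documents locally and persist vectors + text to the encrypted drive."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    vecs = MODEL.encode(docs, normalize_embeddings=True)
    np.save(f"{CACHE_DIR}/vectors.npy", vecs)
    with open(f"{CACHE_DIR}/docs.txt", "w") as f:
        f.write("\n".join(d.replace("\n", " ") for d in docs))

def query(text: str, k: int = 3) -> list[str]:
    """Return the k most similar cached documents by cosine similarity."""
    vecs = np.load(f"{CACHE_DIR}/vectors.npy")
    with open(f"{CACHE_DIR}/docs.txt") as f:
        docs = f.read().splitlines()
    q = MODEL.encode([text], normalize_embeddings=True)[0]
    top = np.argsort(vecs @ q)[::-1][:k]          # normalized vectors, so dot = cosine
    return [docs[i] for i in top]
```

Retrieved chunks then get stuffed into the local model's prompt; nothing ever leaves the machine.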