Oddbean
 Awesome, dude. Any code-specific models coming soon? 
 I'm running a few — any specific one you're looking for? 
 Heard positive things about https://ollama.ai/library/phind-codellama 
 Is there any advantage in having you run that for this use case, instead of running it locally? 
 I don't have enough RAM, and even when I try similar models sized for my limited RAM, they're slow
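For anyone following the thread: a minimal sketch of trying the model mentioned above with the Ollama CLI. The quantized-variant tag is an assumption — check the tags actually listed on the library page before pulling.

```shell
# Pull the model discussed above (default tag).
ollama pull phind-codellama

# Smaller quantized tags generally need less RAM; the exact tag name
# here is an assumption — see https://ollama.ai/library/phind-codellama
# for the tags that actually exist.
ollama run phind-codellama "Write a binary search in Python"
```

On a RAM-constrained machine, picking a more aggressively quantized tag (when the library offers one) is usually the difference between usable and unusably slow.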