I have a PC with a 6GB card and 16GB of RAM.

I mean, for the life of me I can't seem to find the right model to use. Llama seems great; imo it actually runs better than ChatGPT, but that's probably because Meta has so much NLP data from the Facebook/Meta platforms...

Is there a way to run Llama locally without needing AWS, Azure, or an RTX 4090?
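For context, here's the back-of-the-envelope math I'm going by (the ~20% overhead factor is just my guess, not an official number):

```python
# Rough VRAM estimate for running a quantized model locally.
# Assumption: weights dominate memory use; "overhead" is a guessed
# fudge factor for KV cache and runtime buffers.
def vram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    bytes_needed = params_billion * 1e9 * (bits_per_weight / 8) * overhead
    return bytes_needed / 1e9

# A 7B model quantized to 4 bits per weight:
print(round(vram_gb(7, 4), 1))   # ~4.2 GB, should squeeze onto a 6GB card
# Same model at full fp16:
print(round(vram_gb(7, 16), 1))  # ~16.8 GB, no chance on my hardware
```

So if that math holds, a 4-bit quantized 7B model should fit, but anything unquantized won't.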