What hardware are you using to run that model? I've been testing models locally with Ollama; the Llama 3.1 70B model works well on my machine.