Seriously, if you are this concerned about how you are using an LLM, then just run one locally. Llama 3 is available in a bunch of different sizes for whatever machine you have. Even the small ones are decent.
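
For example, here's a rough sketch of running a local Llama 3 with llama-cpp-python, assuming you've already downloaded a GGUF quantization (the filename below is just a placeholder for whichever one you grab):

```python
# Minimal sketch: local chat with a Llama 3 GGUF model via llama-cpp-python.
# Nothing leaves your machine; the model file and prompt are illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path="./Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",  # placeholder path to your local GGUF file
    n_ctx=4096,  # context window size
)

resp = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize why local inference helps privacy."}],
)
print(resp["choices"][0]["message"]["content"])
```

Tools like Ollama or LM Studio wrap the same idea with even less setup if you'd rather not touch Python at all.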