Our AI model is fully open source, but we make sure it's so huge that nobody but us can afford to run it. Then we run it for you and collect all the data through the closed-source front-end UI you must use to access it.
 What do you mean by can’t afford to run it? 
You would need a data center full of premium Nvidia GPUs to run a 107B model.
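As a rough back-of-the-envelope check of why that's true (a sketch, assuming FP16 weights at 2 bytes per parameter and counting weights only, ignoring KV cache and activation overhead):

```python
def est_weights_gib(params_billion, bytes_per_param=2.0):
    """Estimate memory for model weights alone, in GiB.

    params_billion: parameter count in billions (e.g. 70 for a 70B model)
    bytes_per_param: 2.0 for FP16/BF16, 0.5 for 4-bit quantization
    """
    return params_billion * 1e9 * bytes_per_param / 2**30

for size in (7, 70, 107, 405):
    print(f"{size}B @ fp16: ~{est_weights_gib(size):.0f} GiB")
# 7B lands around 13 GiB (fits one consumer GPU),
# 107B around 199 GiB, 405B around 754 GiB (multi-GPU territory)
```

This also shows how a 70B model fits on a Mac: at 4-bit quantization (0.5 bytes per parameter) the weights drop to roughly 33 GiB, which fits in the unified memory of a higher-spec machine.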

7B or 8x7B models can run on a consumer PC. 
 I run 70B on my Mac … now curious to try 405B 
 Give updates if you do, curious to see how it goes