Amazon Web Services CEO Matt Garman estimates that training a large language model (LLM) two to three generations from now will require as much power as a large city, on the order of 1 to 5 GW for an individual model.
 https://yakihonne.s3.ap-east-1.amazonaws.com/ad6a909b8dfd6e278f94881d83dbd5ad5f9260c7502175059b29042e589fb93c/files/1718635375605-YAKIHONNES3.jpg


[ https://www.tomshardware.com/tech-industry/artificial-intelligence/aws-ceo-estimates-large-city-scale-power-consumption-of-future-ai-model-training-tasks-an-individual-model-may-require-somewhere-between-one-to-5gw-of-power ] 
This was really helpful in our tech world.
Your post raises some important questions, and I'm interested in seeing how the conversation unfolds.