 There's a funny feedback loop over in the AI community on Twitter.

People who are excited about AI follow a lot of what the leading tech moguls in the industry say. So, relatively minor comments from them elicit major reactions and discussions from the community. They'll be like, "What does <insert tech mogul> mean when he says he's nostalgic about the current era? Something big must be about to change!" Or if they say something ominous, the community fanboys will be like, "OMG they must be reaching dangerous points of general AI!"

But then of course the people who make the comments are aware of these reactions, so it becomes a known lever for them to pull to keep hype going. So clearly, some of them make those types of opaque comments knowing that each one leads to a discussion frenzy, which further incentivizes the loop to continue. 
 ThatsBait.gif 
 Have noticed the same. It’s nauseating. 
 So basically they are training AI 
 "People are programmable animals" 
 Anything to keep the pump going, for sure. 
 What do you think of AI Lyn, and have you heard @TheGuySwann's episode of AI Unchained where he cloned your voice for educational purposes? 
 This is true for every VC funded topic. They need the hype to make money so the early investors can cash out and make a return for their LPs. The rest of the sheep then invest and keep the cycle going.  
 I get my AI news fix from the /lmg/ general at https://boards.4chan.org/g/catalog#s=lmg. 

Unironically.

There is no better place to get the latest news about models, new implementations, leaked stuff, etc. I mean… it was there that the first llama torrents appeared, and now we have a supposed leak of a 70b mistral weight. 
 nostr only 💪 
 Pretty sure we are in an AI bear market. Everyone's still certain there's something massive around the corner, and I'm not so sure. I think we have hit a gradual plateau in LLMs that is slowing down progress a lot in making them "better," but the real moves over the next 2 years will be in implementing and combining models into workflows and into the OS, and the other major developments will be in finding, organizing, and formatting extraordinarily massive datasets for training new models.