It’s a problem with the transformer architecture. Transformers are basically feed-forward networks where the “neurons” all face the same way: input layer -> inner layers -> output layer, with no feedback loops. They’ll need a new design for more complex reasoning and eventual consciousness. Maybe liquid neural networks? Not sure what it’s going to be.
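
To make the “all facing the same way” point concrete, here’s a toy sketch (plain Python, made-up layer sizes, ReLU activations — illustrative only, not an actual transformer): every layer reads only the previous layer’s output, so information flows strictly input -> inner -> output.

```python
import random

random.seed(0)

def dense(v, weights, bias):
    # Each unit reads only the previous layer's output: no feedback edges.
    return [max(sum(w * x for w, x in zip(row, v)) + b, 0.0)  # ReLU
            for row, b in zip(weights, bias)]

def make_layer(n_in, n_out):
    # Random weights, zero bias -- just for illustration.
    return ([[random.gauss(0, 1) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

# Made-up sizes: input(4) -> inner(8) -> inner(8) -> output(2)
sizes = [4, 8, 8, 2]
layers = [make_layer(a, b) for a, b in zip(sizes, sizes[1:])]

v = [random.gauss(0, 1) for _ in range(4)]
for weights, bias in layers:   # one pass, one direction, layer by layer
    v = dense(v, weights, bias)
print(len(v))  # 2
```

The key property: there is no edge that carries a layer’s output back to an earlier layer, so the whole computation is a single fixed-depth pass.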