It’s a problem with the transformer-based architecture. Transformers are basically feed-forward networks where the “neurons” all face the same way: information flows input layer -> inner layers -> output layer, with a fixed depth of computation per token. A fundamentally different design is probably needed for more complex reasoning and eventual consciousness. Maybe liquid neural networks? Not sure what it’s going to be.
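To make the “one-way” point concrete, here’s a toy sketch (PyTorch; every name here is illustrative, and positional encodings are omitted for brevity) of a decoder-only transformer. Note that each block is a pure function of its input and the depth is fixed: every token gets the same amount of computation, no matter how hard the problem is.

```python
# Toy sketch: a decoder-only transformer is a fixed-depth feed-forward stack.
# Information flows strictly embed -> block 1 -> ... -> block N -> head;
# no recurrent state persists between layers or steps.
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # The causal mask makes attention one-directional over the sequence,
        # but the layer itself is still a pure feed-forward function.
        T = x.size(1)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + a
        return x + self.mlp(self.ln2(x))

class TinyTransformer(nn.Module):
    def __init__(self, vocab=256, d_model=64, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        self.blocks = nn.ModuleList(Block(d_model, n_heads) for _ in range(n_layers))
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens):
        x = self.embed(tokens)
        for block in self.blocks:  # fixed depth: same compute per token,
            x = block(x)           # regardless of problem difficulty
        return self.head(x)

logits = TinyTransformer()(torch.randint(0, 256, (1, 8)))  # shape (1, 8, 256)
```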
Many experts in the field believe the Transformer architecture may be hitting a wall. It’s also poorly suited to decentralized training, since training large transformers depends on tight, high-bandwidth synchronization between accelerators.
We definitely need a breakthrough—something better.