Yeah, I'm doubtful you can make an argument about consciousness or 'thinking'; both terms are too loaded and usually poorly defined. But if you formalize meaning as interaction, things snap together easily. Simple systems, LLMs included, can only interact with what is immediately present (zero distance). Complex systems can interact with, and find meaning in, objects at nonzero distances. Any interaction that isn't desired has to be changed, on one side or the other. In organic systems that change is learning/evolution; ML/AI is only a computational approximation of it, because axiomatized systems are built from enumerable rules for the sole context of their construction, the opposite of a complex system.
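
To make the framing concrete, here's a toy sketch (purely illustrative; `Agent`, `reach`, `preference`, `adapt`, and the numbers are all invented for this example, not anyone's actual model): agents have an interaction range, and an undesired interaction forces a change on one side or the other.

```python
# Toy model of "meaning as interaction"; all names and numbers are illustrative.
from dataclasses import dataclass
import random

@dataclass
class Agent:
    position: float
    reach: float        # max distance at which this agent can interact
    preference: float   # what the agent "wants" an interaction to look like

    def can_interact(self, other: "Agent") -> bool:
        # A "simple" system has reach ~0: only immediate contact counts.
        # A "complex" system has nonzero reach.
        return abs(self.position - other.position) <= self.reach

def mismatch(a: Agent, b: Agent) -> float:
    # The interaction's outcome is just the gap between preferences.
    return abs(a.preference - b.preference)

def adapt(a: Agent, b: Agent, tolerance: float = 0.1, rate: float = 0.5) -> None:
    # An undesired interaction (mismatch above tolerance) forces a change
    # at one side or the other; which side changes is arbitrary here.
    if mismatch(a, b) > tolerance:
        learner = random.choice((a, b))
        target = b if learner is a else a
        learner.preference += rate * (target.preference - learner.preference)

simple = Agent(position=0.0, reach=0.0, preference=0.2)    # zero-distance only
complex_ = Agent(position=1.0, reach=5.0, preference=0.9)  # nonzero distance

print(simple.can_interact(complex_))   # False: beyond a simple system's reach
print(complex_.can_interact(simple))   # True: within the complex system's reach

for _ in range(20):
    if complex_.can_interact(simple):
        adapt(complex_, simple)        # repeated undesired interactions drive change
print(round(simple.preference, 2), round(complex_.preference, 2))
```

The point of the toy: the "learning" here is just the change that removes an undesired interaction, and it can land on either party, which is all the original claim needs.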