Because an LLM is just a series of finely tuned knobs, like 24 billion of them. It takes in a prompt and spits out an output by passing it through those 24B parameters; the only randomness is in how the next token gets sampled from the probabilities the model produces. At no point does it have the capacity to act freely. It won't even spit out a response until it's prompted to. It cannot take over society or escape a physical host, because it cannot want to do anything. But we can choose to replace human judgment with AI for important decisions. We must not make that choice.
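
To make that concrete, here's a toy sketch (plain Python, illustrative only, not any real model) of where the randomness actually lives: a weighted draw over next-token probabilities that the parameters compute. The model never acts unprompted; it only maps inputs to a distribution.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Turn raw scores (what the parameters compute) into a probability distribution.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(logits, temperature=1.0, rng=None):
    # This is the only place randomness enters: a weighted draw over tokens.
    rng = rng or random.Random()
    probs = softmax(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Toy stand-in for what billions of parameters would produce for a given prompt.
logits = [2.0, 1.0, 0.1]
token = sample_next_token(logits)
```

Nothing here initiates anything: until you call `sample_next_token` with a prompt's logits, no output exists at all.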