@58fbd252 @9709757e "Hallucination" has become a term of art,. Personally, I'm fine with it as long as we recognize -- as the term obscures -- that chat AI *always* hallucinates, for the reason you say. It's just that more often than not, its hallucinations turn out to be true.
@953ba58b @9709757e I know it’s a term of art, and I hate it, exactly for the reason you say: it obscures the fact that LLMs are always detached from reality, by design. The term suggests that this is abnormal or exceptional behavior.