nostr:npub1juyh2l587qygmuupdjqr6wj300k6g7m0utnn9jtxr2wdf25qh90s7svh5e nostr:npub1j5a6tz6a2ttelsdzdl3lnpqq2q8hyfluz0c7kpgntt548q7f5snsrhp848 “Hallucinating” is the wrong way to think about it. LLMs generate text; they have no access to facts or reality. They produce output based on the corpus of text they were trained on. Often the result is plausible, and sometimes it corresponds to the truth, but that’s only a matter of probabilities. It can never be relied on.
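
To make the point concrete, here’s a toy sketch (not any real model’s code, and the probabilities are invented) of what next-token generation amounts to: sampling from a distribution learned from training text. Nothing in the loop consults reality.

```python
import random

# Toy next-token distributions, standing in for what a trained LLM
# encodes in its weights. These numbers are made up for illustration.
next_token_probs = {
    "the capital of france is": {"paris": 0.90, "lyon": 0.06, "london": 0.04},
    "the capital of australia is": {"sydney": 0.55, "canberra": 0.40, "melbourne": 0.05},
}

def generate(prompt: str) -> str:
    """Sample the next token from the learned distribution.

    The model has no notion of truth: it only knows which
    continuations were probable in its training corpus.
    """
    probs = next_token_probs[prompt.lower()]
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights)[0]

# "sydney" comes out more often than the correct "canberra",
# because plausibility in the corpus is all the model optimizes for.
print(generate("The capital of Australia is"))
```

Run it a few times: the output is usually plausible, sometimes true, and the sampler has no way to tell the difference.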