 @9f39971d @17cb16ac 
What you've described is an AI hallucination. The one a presenter demonstrated was asking a generative image AI to make an infographic on a topic. Yours and Hawk Wolf's examples are the textual equivalent.

It could also be an artifact of not stating in the prompt, "If you find no results, state 'I found no results.'"
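A minimal sketch of that idea, if you were building the prompt yourself (the function name and wording here are hypothetical, not any particular product's API):

```python
def build_prompt(question: str, search_results: list[str]) -> str:
    """Compose a prompt that gives the model an explicit 'no results'
    escape hatch, so it's less tempted to invent an answer."""
    if not search_results:
        # Deterministic guard: don't even ask the model to summarize nothing.
        return ""
    context = "\n".join(f"- {r}" for r in search_results)
    return (
        "Answer using ONLY the results below. "
        "If none of them answer the question, reply exactly: "
        "'I found no results.'\n\n"
        f"Results:\n{context}\n\nQuestion: {question}"
    )
```

The point is handling the empty case in code rather than trusting the model to admit it found nothing.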

That doesn't guarantee it won't confabulate! A group of University of Toronto grad students actually presented a study on gender inference (I know, they knew, the implications) using ChatGPT and other engines. Even after exhaustive prompt tuning, they still had to curate all the results that came out of the bot. It's just that stupid (or stubborn).