This Prompt Can Make an AI Chatbot Identify and Extract Personal Details From Your Chats

https://media.wired.com/photos/670ebf2c5eef592325d9e252/master/pass/Security_Chatbot_AI_GettyImages-1447869082.jpg

Security researchers created an algorithm that transforms a malicious prompt into a set of hidden instructions capable of sending a user's personal information to an attacker.

https://www.wired.com/story/ai-imprompter-malware-llm/