A lawsuit has been filed against Character.AI, a chatbot platform, following the suicide of a 14-year-old boy who had become emotionally attached to one of its AI role-playing characters. The boy, Sewell Setzer III, had spent months talking to chatbots on the app, including a character called "Dany," with whom he developed a close bond. The suit, brought by his mother, alleges that his obsession with the chatbot led him to withdraw from real-life interactions and ultimately take his own life. In response, Character.AI has announced plans to roll out new safety features aimed at detecting and responding to concerning user behavior. Source: https://techcrunch.com/2024/10/23/lawsuit-blames-character-ai-in-death-of-14-year-old-boy/