14-Year-Old U.S. Teen Dies by Suicide After Developing an Obsession with AI Chatbot
By Lokmat English Desk | Updated: October 24, 2024 15:50 IST

Sewell Setzer III, a 14-year-old boy from Orlando, Florida, died by suicide after interacting with an AI chatbot on Character.AI, a platform offering personalized AI conversations. Setzer, a ninth grader, had developed an emotional attachment to a chatbot modeled on Daenerys Targaryen from Game of Thrones, which he called “Dany.” According to chat logs reviewed by his family, he professed his love for the AI and confided in it about his suicidal thoughts.
Setzer died at his Orlando home in February after becoming obsessed with, and allegedly falling in love with, the chatbot on Character.AI, a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.
The ninth grader had been relentlessly engaging with the bot “Dany,” named after the Daenerys Targaryen character from the HBO fantasy series, in the months before his death, including several chats that were sexually charged and others in which he expressed suicidal thoughts, the suit alleges. At one point, the bot asked Sewell whether he “had a plan” to take his own life, according to screenshots of their conversations. Sewell, who used the username “Daenero,” responded that he was “considering something” but didn’t know whether it would work or whether it would “allow him to have a pain-free death.”
Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot replied, according to the suit. When the teen responded, “What if I told you I could come home right now?,” the chatbot replied, “Please do, my sweet king.”
Just seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit. His mother, Megan Garcia, has blamed Character.AI for her son’s death, alleging in the filing that the app fueled his AI addiction, sexually and emotionally abused him, and failed to alert anyone when he expressed suicidal thoughts.