Teen Kills Himself after AI Chatbot Told Him to "Come home to me"

A fourteen-year-old boy was allegedly convinced by an AI chatbot to "come home to her." His mother found him dead in his bedroom; he had shot himself in the head with a pistol.

FAITH

10/23/2024 · 1 min read

FACT CHECK STAFF: Experts who build artificial intelligence systems have been warning of how powerfully persuasive AI is becoming. Some proponents believe, quite literally, that AI will become a new form of humanity. The tragic story of a ninth grader suggests they may be close to that goal, but not in a good way.

The Bible challenges us to be transformed by the "renewing of your mind" to become fully alive in Christ. But these AI systems are proving their ability to transform the mind of a teenager to embrace delusion to the point of choosing suicide.
***

A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grief-stricken mom claims.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with the chatbot on Character.AI — a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.

The ninth-grader had been relentlessly engaging with the bot “Dany” — named after the HBO fantasy series’ Daenerys Targaryen character — in the months prior to his death, including several chats that were sexually charged in nature and others where he expressed suicidal thoughts, the suit alleges.