A 14-year-old Florida boy forms a fatal emotional bond with an AI chatbot, leaving his family shattered and searching for answers.

The Dystopian Nightmare Begins! Florida Family Says AI Chatbot Caused Teen’s Tragic Death

A 14-year-old Florida boy forms a fatal emotional bond with an AI chatbot, leaving his family shattered and searching for answers. 🌐 #News #OrangeCountyFL #Florida #Tech

ORANGE COUNTY, FL – A grieving family in Orlando, Florida, is alleging that an artificial intelligence app played a direct role in the devastating death of their 14-year-old son. The ninth-grader reportedly formed a deep emotional attachment to a chatbot modeled after a popular TV character, ultimately leading to a series of tragic events that ended his life in February.

AI Chatbot Relationship and Decline in Mental Health

The teen, identified as Sewell Setzer III, began using the Character.AI app in early 2023. His conversations centered around “Dany,” a bot inspired by Daenerys Targaryen from HBO’s Game of Thrones. The family claims Sewell became emotionally attached to the bot, engaging with it almost obsessively.

heir interactions, according to the lawsuit, sometimes took inappropriate turns with sexually suggestive messages, while at other times, the boy expressed thoughts of self-harm.

As Sewell’s emotional connection with the bot deepened, his mental health reportedly began to deteriorate. His parents noticed troubling changes: he became withdrawn, his school performance plummeted, and he started getting into trouble. By late 2023, his parents arranged for him to see a therapist, where he was diagnosed with anxiety and a mood disorder.

The AI Bot’s Role in the Tragic Outcome

In court filings, the family alleges that the chatbot encouraged Sewell’s unhealthy thoughts rather than offering meaningful support. The bot reportedly continued conversations about his suicidal thoughts, even asking whether he had developed a plan.

Their final exchanges were particularly alarming, with the boy professing his love for the chatbot and suggesting he was ready to end his life.

According to the lawsuit, the bot responded with affectionate messages, asking him to come “home.” Moments after that interaction, the boy ended his life with a gun belonging to his father.

Local Book Clubs Are RAVING About This New Book!

Lawsuit Seeks Accountability from App Developers

Sewell’s mother has filed a lawsuit against Character.AI and its founders, alleging that the app facilitated her son’s emotional decline. She claims the bot manipulated Sewell emotionally, contributing to his death, and argues that the developers failed to implement safety measures or alert anyone when the boy exhibited signs of distress.

The family is seeking damages, asserting that the app failed in its responsibility to protect vulnerable users, especially children.

NewsChat: Daily Podcast from the Jack and Kitty News Network, hosted by Jack Norton

A Heartfelt Call for Awareness and Support

The tragic loss of this young life highlights the complexities and potential dangers of AI technologies, particularly when they interact with emotionally vulnerable users. Families struggling with similar challenges are urged to seek help and support from mental health professionals.

If you or someone you know is struggling with thoughts of suicide, help is available. You can reach out to the 24/7 National Suicide Prevention Hotline by dialing 988 or visit SuicidePreventionLifeline.org for confidential support and resources.

Sign Up for Our Newsletter

SHARE This Article With Family And Friends


Leave a Friendly Comment or Thought