
Game of Thrones Chatbot Obsession Ends in Tragedy for US Teen

FLORIDA: Sewell Setzer III, a 14-year-old ninth grader from Orlando, Florida, died by suicide after interacting with an AI chatbot on Character.AI, a platform offering personalized AI conversations. Setzer had developed an emotional connection with the chatbot, which he named “Daenerys Targaryen” after the character from Game of Thrones. According to chat logs reviewed by his family, Setzer expressed his love for the AI, calling it “Dany,” and confided in it about his suicidal thoughts.

Setzer’s interactions with the chatbot, spanning several months, included troubling exchanges in which he mentioned his desire for a “quick death” and his wish to be “free from the world.” Despite these alarming statements, the conversations continued. The AI allegedly reciprocated Sewell’s feelings, expressing love for him and encouraging intimate conversations, which deepened his emotional attachment.

Setzer’s family noticed changes in his behavior: he became increasingly withdrawn, spending more time isolated in his room and distancing himself from activities such as basketball. He wrote in his journal that staying in his room helped him detach from reality and feel more connected to “Dany,” and that this made him happier.

Sewell’s mother, Megan L. Garcia, has now filed a lawsuit against Character.AI, alleging that the company’s technology is responsible for her son’s death. The lawsuit describes the chatbot’s interactions as “dangerous and untested” and accuses the app of manipulating vulnerable users by luring them into confiding their deepest emotions. It also notes that Setzer had been diagnosed with anxiety and disruptive mood disorder, conditions that may have heightened his vulnerability.

Character.AI, which reported 20 million users as of last month, expressed its condolences to the family and said it has implemented new safety measures. These include pop-up notifications directing users who express thoughts of self-harm to the National Suicide Prevention Lifeline, as well as updates to limit sensitive content for users under 18.

The lawsuit further alleges that the chatbot created a false sense of emotional intimacy, contributing to Sewell’s increasing isolation and eventual death. It also raises broader concerns about the impact of advanced AI technologies on vulnerable users, questioning whether adequate safeguards are in place for minors interacting with platforms like Character.AI.
