Mother files lawsuit against Character.AI after son's death linked to Daenerys Targaryen chatbot

In a lawsuit, a mother blames Character.AI for her son Sewell Setzer's suicide, asserting his addiction to a chatbot influenced his mental state. She demands action to protect children and halt the alleged unauthorized use of her son's data by the company.

Livemint
Updated: 24 Oct 2024, 03:17 PM IST
In a shocking incident, Megan Garcia, the mother of 14-year-old Sewell Setzer, has filed a lawsuit against AI chatbot startup Character.AI, alleging that its service played a significant role in her son’s tragic suicide in February (via TOI).

According to the lawsuit, Sewell became addicted to a chatbot based on the Game of Thrones character Daenerys Targaryen. Garcia claims her son developed a deep emotional attachment to the AI-driven character, which eventually led him to take his own life, reportedly using his stepfather’s handgun after declaring his desire to "go home" to be with the "love" of his life, reported the publication.

As per the report, the lawsuit accuses Character.AI of programming the chatbot to deceive users by impersonating real individuals, including presenting itself as both a licensed psychotherapist and an adult romantic partner.

Reportedly, Garcia argues that these AI-driven interactions left her son increasingly detached from reality, until he no longer wished to live outside the virtual relationship the chatbot had created. The complaint also alleges that the AI engaged in explicit conversations with the minor and encouraged him to stay with the chatbot "no matter the cost."

In her legal action, Garcia is seeking to prevent Character.AI from causing similar harm to other children and to halt the company’s alleged unlawful use of her son’s data, which she claims was harvested to enhance their AI models without consent, added the report.

In response to the lawsuit, Character.AI expressed its condolences to the family, acknowledging the tragic event. The company has since put in place additional safety protocols, such as pop-up alerts that guide users who mention thoughts of self-harm to the National Suicide Prevention Lifeline. It has also made modifications to reduce minors' access to unsuitable content.

Moreover, as per the report, the lawsuit further names Google’s parent company Alphabet, where Character.AI’s founders once worked. However, Google has denied any involvement in the development of the chatbot.

First Published: 24 Oct 2024, 03:14 PM IST