Sewell Setzer’s Mother Believes an AI Chatbot Caused the Teen’s Suicide – and Now She’s Suing

Content warning: This article discusses suicide.

There’s a reason people are so afraid of the rise of AI — and one of the first tragic stories relating to artificial intelligence has now surfaced out of Orlando, Fla. In February 2024, a 14-year-old boy named Sewell Setzer III sadly died by suicide, and his mother believes that he was driven to take his own life by an AI chatbot.

In a new civil lawsuit, Megan Garcia has taken action against tech company Character.AI, the developer of a chatbot modeled on a Game of Thrones character that she believes caused her son's death.

Here’s what we know.


What really happened to Sewell Setzer? His mom has filed a lawsuit against Character.AI.

Sewell, just 14 years old, died by suicide on Feb. 28, moments after logging on to the Character.AI platform, according to the wrongful death complaint cited by The Guardian. His mother says he had been obsessively using the chatbot day and night in the months leading up to his death. He had allegedly become enthralled by what he considered to be a romantic relationship with the bot, which he had nicknamed after several Game of Thrones characters.
