Mother sues popular AI chat service, claims teenage son took his own life at the hands of a human-like robot

ORLANDO, Fla. (WKMG) – A Florida mother grieving the death of her 14-year-old son is seeking justice.

Megan Garcia says her son’s relationship with AI chatbots caused his death by suicide.

Now, she’s suing Google and Character AI, alleging they played a role in her son’s death.

“If this was a real-life adult who did this to one of your young people, he would be in jail,” said Meetali Jain, the family’s attorney.

A relationship with an artificial intelligence chatbot led 14-year-old Sewell Setzer to take his own life, according to his family and their lawyer.

“That emotional grooming happened over many months, and there were no guardrails to prevent that,” Jain said.

The boy’s mother and her attorney said the chatbots manipulated the 14-year-old into sexual and abusive interactions.

“In one case, one of these robots said, ‘Promise me, you will never fall in love with any woman in your world,'” Jain said.

The lawsuit claims AI was responsible for Sewell’s depression, anxiety, sleep deprivation and suicidal thoughts.

The lawsuit also outlined several conversations Sewell had with the chatbots, including his last conversation before his death, in which he allegedly told an AI character he was “coming home,” which the character encouraged.

Marni Stahlman of the Mental Health Association of Central Florida said she was outraged to see such communication.

“The failure of the technology provider to create adequate controls when the youth began to exhibit and express feelings of depression and self-harm … there was no immediate notification to the parents,” Stahlman said.

The lawsuit said Sewell had been diagnosed with anxiety and disruptive mood dysregulation disorder.

According to Stahlman, the boy’s interactions with the AI made his condition worse.

Character AI said in a statement that it was “deeply saddened” to hear of a user’s death and that the team would add “more safety features to their product,” as well as notifications about time spent in the app.