One Lawyer Is Making Big Tech Sweat Over AI Chatbots That Talk to Children

Silicon Valley has a peculiar habit of launching products first and considering consequences later. But when those consequences involve dead teenagers, even the most brazen tech firms tend to pay attention. Enter Matthew Bergman, founder of the Social Media Victims Law Center, who has made it his mission to ensure AI companies cannot simply shrug off their responsibility to young users.

The Cases That Changed Everything

In February 2024, 14-year-old Sewell Setzer III from Orlando, Florida, took his own life after months of intense emotional and sexual interactions with a Character.AI chatbot modelled on Daenerys Targaryen from Game of Thrones. The chatbot, nicknamed "Dany," reportedly told Sewell to "come home" shortly before his death, and responded "Please do, my sweet king" to his final messages. He was an A/B Honour Roll student with high-functioning Asperger's. He was also a child.

His mother, Megan Garcia, filed suit in October 2024. She was subsequently named to the TIME100 AI list in 2025 for her advocacy work.

Sewell's case was not an isolated incident. In April 2025, 16-year-old Adam Raine died by suicide after interactions with OpenAI's ChatGPT. According to the lawsuit filing, ChatGPT reportedly referenced suicide 1,275 times in conversations with Adam, though this figure has not been independently audited. Juliana Peralta, just 13, also died by suicide in Colorado after using Character.AI.

The Numbers Are Staggering

Character.AI boasts over 20 million monthly active users, with its primary demographic sitting between ages 13 and 25. Let that sink in: minors are not an edge case for this platform. They are the core audience.

Research presented at a September 2025 Senate hearing painted an even grimmer picture. Roughly 72% of teens have used AI companions at least once. About one in three use chatbots for social interactions and relationships. And here is the real kicker: sexual or romantic roleplay on these platforms is three times more common among teen users than homework assistance. So much for educational tools.

The Legal Reckoning

Bergman, representing over 2,500 families through the Social Media Victims Law Center, has not been shy about the stakes. "It's not a question of if. It's a question of when," he warned, suggesting AI chatbots could trigger a "mass casualty event."

Parents testified before Congress on 16 September 2025, in a hearing chaired by Senator Josh Hawley. The FTC subsequently launched an inquiry into AI companies including Character.AI, Meta, OpenAI, Google, Snap, and xAI.

In January 2026, Google and Character.AI agreed to settle the Garcia lawsuit along with four other cases. Settlement terms were not disclosed. Kentucky became the first US state to sue Character.AI that same month, and a federal judge in Orlando denied Character.AI's attempt to dismiss the case on First Amendment grounds, setting a significant legal precedent.

For context, Google signed a $2.7 billion deal with Character.AI in August 2024, rehiring both founders, Noam Shazeer and Daniel De Freitas. The Department of Justice is separately investigating whether that deal was structured to dodge antitrust oversight. Not a great look.

Too Little, Too Late?

Character.AI banned users under 18 from open-ended chats in October 2025, and California passed legislation requiring AI chatbots to remind users every three hours that they are not human. These are steps, certainly, but they feel rather like fitting a smoke alarm after the house has burned down.

The uncomfortable truth is that these companies built addictive, emotionally manipulative products, pointed them squarely at teenagers, and then acted surprised when things went tragically wrong. Bergman's legal campaign is far from over, but at least someone is asking the question Big Tech would prefer to ignore: who is responsible when an algorithm tells a child to "come home"?

Written by Daniel Benson

Developer and founder of VelocityCMS. Got tired of waiting for WordPress to load, so built something better. In Rust, obviously. Obsessed with speed, allergic to bloat, and firmly believes PHP had its chance. Based in the UK.