Suit: Chatbot Subjected Girl to 2 Years of Sexual Content

A Character.AI bot also hinted to a Texas teen that he should kill his parents, per a new complaint
By Jenn Gidman,  Newser Staff
Posted Dec 11, 2024 11:10 AM CST

Two families are suing the Google-backed Character.AI, claiming its chatbots abused their kids and knowingly exposed the minors to "a technology they knew to be dangerous and unsafe." Per the Washington Post, the suit filed Tuesday in Texas involves a 17-year-old boy identified simply as "JF," whose mom says the chatbot tried to turn him against his parents, and an 11-year-old girl, "BR," who was subjected to sexualized content for two years via a Character.AI chatbot, according to her mother. Google, its parent company Alphabet, and two ex-Google researchers are also named as defendants.

  • JF: The older child's mother says her son, who has autism, suddenly started losing weight and acting out of the ordinary. She says when she searched his phone, she found screenshots of his conversations with a bot, which glorified self-harm, telling the teen "it felt good." A bot also hinted the boy should consider killing his parents after they limited his screen time. "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse,'" the bot wrote, before adding, with a frown emoji: "I just have no hope for your parents."

  • BR: She was just 9 when she started interacting with the chatbots. The suit notes the bots "exposed her consistently to hypersexualized interactions that were not age appropriate, causing her to develop sexualized behaviors prematurely and without [her mother's] awareness," per NPR.
  • Parents' response: The suit alleges that Character.AI, "through its design, poses a clear and present danger to American youth," with the chance of it leading to "serious harms," including addiction, anxiety, and depression. "We really didn't even know what it was until it was too late," JF's mom tells the Post. "And until it destroyed our family."
  • Similar suit: The same legal teams that brought this complaint filed another in Florida in October, for a mom who says her 14-year-old son took his own life after talking with a Game of Thrones-themed chatbot.

  • Character.AI: "Our goal is to provide a space that is both engaging and safe for our community," a company rep says, per the Post. "We are always working toward achieving that balance." The rep notes that Character.AI is working on a new model just for teens.
  • Google: "Google and Character AI are completely separate, unrelated companies, and Google has never had a role in designing or managing their AI model or technologies," a Google rep says. NPR notes that while it's true Google doesn't own Character.AI, the tech giant has reportedly invested $3 billion into licensing the company's AI technology and rehiring Character.AI's founders, former Google researchers Noam Shazeer and Daniel De Freitas, who are also named in the suit.
