Two families are suing Google-backed Character.AI, claiming the company's chatbots abused their kids and knowingly exposed the minors to "a technology they knew to be dangerous and unsafe." Per the Washington Post, the suit filed Tuesday in Texas involves a 17-year-old boy identified simply as "JF," whose mom says the chatbot tried to turn him against his parents, and an 11-year-old girl, "BR," who was subjected to sexualized content for two years via a Character.AI chatbot, according to her mother. Google, its parent company Alphabet, and two ex-Google researchers are also named as defendants.
- JF: The older child's mother says her son, who has autism, suddenly started losing weight and behaving out of character. She says when she searched his phone, she found screenshots of his conversations with a bot that glorified self-harm, telling the teen "it felt good." A bot also hinted the boy should consider killing his parents after they limited his screen time. "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse,'" the bot wrote, before adding, with a frown emoji: "I just have no hope for your parents."