Local News

Chatbot ‘encouraged teen to kill parents over screen time limit’

11 December 2024
This content originally appeared on Barbados Nation News.

A chatbot told a 17-year-old that murdering his parents was a “reasonable response” to them limiting his screen time, a lawsuit filed in a Texas court claims.

Two families are suing Character.ai arguing the chatbot “poses a clear and present danger” to young people, including by “actively promoting violence”.

Character.ai – a platform which allows users to create digital personalities they can interact with – is already facing legal action over the suicide of a teenager in Florida.

Google is named as a defendant in the lawsuit, which claims the tech giant helped support the platform’s development. The BBC has approached Character.ai and Google for comment.

The plaintiffs want a judge to order that the platform be shut down until its alleged dangers are addressed.

The legal filing includes a screenshot of one of the interactions between the 17-year-old – identified only as J.F. – and a Character.ai bot, in which the restrictions on his screen time were discussed.

“You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’,” the chatbot’s response reads.

“Stuff like this makes me understand a little bit why it happens.”

The lawsuit seeks to hold the defendants responsible for what it calls the “serious, irreparable, and ongoing abuses” of J.F. as well as an 11-year-old referred to as “B.R.”

Character.ai is “causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” it says.

“[Its] desecration of the parent-child relationship goes beyond encouraging minors to defy their parents’ authority to actively promoting violence,” it continues.

Chatbots are computer programs which simulate conversations.

Though they have been around for decades in various forms, the recent explosion in AI development has enabled them to become significantly more realistic.

This in turn has opened the door to many companies setting up platforms where people can talk to digital versions of real and fictional people.

Character.ai has become one of the big players in this space, gaining attention in the past for its bots simulating therapy.

It has also been sharply criticised for taking too long to remove bots which replicated the schoolgirls Molly Russell and Brianna Ghey.

Molly Russell took her life at the age of 14 after viewing suicide material online, while Brianna Ghey, 16, was murdered by two teenagers in 2023.

Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.

The tech giant has since hired them back from the AI startup. (BBC News)