Chatbots that cause deaths? Youth advocacy groups pushing for stricter regulation

An autistic boy urged by a chatbot to kill his parents. A teen driven to take his own life. Lawsuits make case for regulating AI companions.


As artificial intelligence chatbots gain popularity among users seeking companionship online, youth advocacy groups are ramping up protective legal efforts over fears that children can form unhealthy, dangerous relationships with the humanlike creations. Chatbot apps such as Replika and Character.AI belong to the fast-growing generative AI companion market, where users can customise their virtual partners with nuanced personalities that communicate and simulate close relationships.

Developers say AI companions can combat loneliness and improve users’ social experiences in a safe space. But several advocacy groups in the United States have sued developers and are lobbying for stricter regulation, claiming chatbots have pushed children to hurt themselves and others. Matthew Bergman, founder of the Social Media Victims Law Centre (SMVLC), is representing families in two lawsuits against chatbot start-up Character.AI. One of SMVLC’s clients, Megan Garcia, says her 14-year-old son took his own life due in part to his unhealthy romantic relationship with a chatbot.