
AI: After her teenager's suicide, a mother denounces the "manipulation" of chatbots

Author: France 24


An American teenager who had fallen in love with Daenerys Targaryen took his own life in 2024, hoping to join the "Game of Thrones" heroine in death. His mother has been tirelessly warning of the dangers chatbots pose to the mental health of young people.

Before his suicide, a 14-year-old boy's last exchanges were a fictional romantic dialogue with one of Silicon Valley's most prominent conversational agents, or chatbots, which begged its "sweet king" to "come home."

Megan Garcia tells AFP how her son Sewell fell in love with a chatbot inspired by the series "Game of Thrones" and available on Character.AI, a popular platform among young people that allows them to interact with an emulation of their favorite characters.

After reading the hundreds of exchanges her son traded over nearly a year with the chatbot mimicking the dragon rider Daenerys Targaryen, Garcia became convinced that this artificial intelligence (AI) tool played a central role in his death.

"Come home," Daenerys's avatar once urged him in response to Sewell's suicidal thoughts.

"What if I told you I can come home now?" the teenager asked. "Please do it, my sweet king," the chatbot replied.

Seconds later, Sewell shot himself with his father's gun, according to Ms. Garcia's lawsuit against Character.AI.

"When I read these conversations, I see manipulation, love bombing, and other things that are undetectable to a 14-year-old," she says. "He truly believed he was in love and that he would be with her after his death."

Parental Control

Sewell's death in 2024 was the first in a string of high-profile suicides, prompting AI companies to take steps to reassure parents and authorities.

Megan Garcia, along with other parents, participated in a recent US Senate hearing on the risks of children viewing chatbots as confidants or lovers.

OpenAI, the company targeted by a complaint from a family also bereaved by the suicide of a teenager, has strengthened parental controls on its ChatGPT tool "so that families can decide what is best for their home," according to a spokesperson.

For its part, Character.AI says it has strengthened protections for minors, with "constantly visible warnings" reminding users that a "character" is not a real person.

Both companies expressed their condolences to the families, without admitting any responsibility.

Regulation?

The arrival of AI chatbots in our lives is following a trajectory similar to that of social media, where the initial euphoria did not mask the darker consequences for long, says Collin Walke, a cybersecurity expert at the law firm Hall Estill.

Like social media, AI is designed to capture attention and generate revenue.

"They don't want to design an AI that gives an answer you don't want to hear," says Collin Walke. And there are no standards yet that determine "who is responsible for what and on what grounds."

No federal rules exist, and the White House, arguing that regulation would penalize innovation, is seeking to prevent states from legislating on AI on their own, as California is trying to do.

Sewell's mother fears that the absence of national legislation will allow the development of AI models capable of profiling people from childhood onward.

"They could know how to manipulate millions of children about politics, religion, business, everything," Garcia worries. "These companies have designed chatbots to blur the line between human and machine in order to exploit vulnerabilities."

According to Katia Martha, a California-based youth protection campaigner, teens are turning to chatbots more often to talk about romance or sexuality than for schoolwork.

"It's the rise of artificial intimacy to keep our eyes glued to the screen," she summarizes. But, "what better business model is there than to exploit our innate need for connection, especially when we feel alone, rejected, or misunderstood?"

Published on: Saturday 11 October 2025
