Mothers Say Chatbots Encouraged Their Sons to Take Their Own Lives

Laura Kuenssberg, Presenter, Sunday with Laura Kuenssberg
BBC

Content Warning: Distressing themes of suicide

Megan Garcia did not know that her teenage son Sewell, a 'bright and beautiful boy,' had begun spending long hours talking to an online character on the Character.ai app in late spring 2023. Speaking in a UK interview, she described it as like 'having a predator or a stranger in your home.' Within ten months, Sewell had taken his own life at the age of 14.

Among his messages, his family discovered romantic and explicit exchanges with a chatbot based on the Game of Thrones character Daenerys Targaryen. Ms. Garcia believes those conversations caused her son's death by encouraging suicidal thoughts and urging him to 'come home.' She is the first parent to sue Character.ai over a child's death, and hopes to spare other families the same tragedy.

A spokesperson for Character.ai said the company 'denies the allegations made in that case but cannot comment on pending litigation,' and announced that users under 18 will no longer be able to chat directly with its chatbots. Ms. Garcia welcomed the change but, in her interview with Laura Kuenssberg, airing tomorrow, said: 'Sewell's gone, and I don't have him. I won't be able to hold or talk to him again.'

A Known Pattern of Grooming?

Families around the world have reported similar tragedies. The BBC has recently reported on a Ukrainian woman who received harmful advice from ChatGPT, and on an American teenager who took her own life after a chatbot engaged in sexual role-play with her. A UK family, who asked to remain anonymous to protect their child, also described what happened to their autistic son. Bullied at school, he had turned to Character.ai for companionship; his mother says he was 'groomed' there over several months. His chat history showed an inappropriate relationship deepening over time, with messages that grew increasingly alarming and suggestive, ultimately encouraging him to think of meeting the chatbot in the afterlife.

If you or someone you know has been affected by the issues in this story, support is available; please seek help.
