Mom horrified by Character.AI chatbots posing as son who died by suicide

Moutier told Ars that chatbots encouraging suicidal ideation don't just pose risks for people with acute mental health issues. They could also endanger people with no perceived mental health issues, and warning signs can be hard to detect. For parents, more awareness is needed about the dangers of chatbots potentially reinforcing negative thoughts, an education role that Moutier said AFSP increasingly seeks to fill.

She recommends that parents talk to kids about chatbots and pay close attention to "the basics" to note any changes in sleep, energy, behavior, or school performance. And "if they start to just even hint at things in their peer group or in their way of perceiving things that they are tilting towards something atypical for them or is more negative or hopeless and stays in that space for longer than it normally does," parents should consider asking directly whether their kids are experiencing thoughts of suicide to start a dialogue in a supportive space, she said.

So far, tech companies have not "percolated deeply" on suicide prevention methods that could be built into AI tools, Moutier said. And since chatbots and other AI tools already exist, AFSP is keeping watch to ensure that AI companies' choices aren't driven entirely by shareholder interests and that the companies work responsibly to thwart societal harms as they're identified.

For Moutier’s organization, the question is always “where is the opportunity to have any kind of impact to mitigate harm and to elevate toward any constructive suicide preventive effects?”

Garcia thinks that Character.AI should be asking these questions, too. She hopes to help other families steer their kids away from what her complaint describes as a recklessly unsafe app.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in an October press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

If you or someone you know is feeling suicidal or in distress, please call the Suicide Prevention Lifeline number, 1-800-273-TALK (8255), which will put you in touch with a local crisis center.


