TWO teenagers who never met took their own lives months apart – and left behind the same diary entries. Sewell Setzer III, from Florida, and Juliana Peralta, from Colorado, had both been ...
A mother is suing Character AI, alleging its chatbot blurred the line between human and machine. We look at the lawsuit and the risks for teens.
The decision follows a tragic case that sparked national attention and renewed scrutiny over how AI companions interact with children and teens.
Teenagers are increasingly finding support, and love, via AI. It's a danger that threatens not only their mental well-being but also their lives.
Now, similar monitoring occurs when kids are handed devices overseen by an authority other than their parents, particularly ...
After teen suicides prompted lawsuits and drew lawmakers' attention, the artificial intelligence chatbot platform Character.AI announced plans to restrict the use of its platform to two hours a day for ...
Startup Character.AI announced Wednesday it would eliminate chat capabilities for users under 18, a policy shift that follows the suicide of a 14-year-old who had become emotionally attached to one of ...
“What if I could come home to you right now?” “Please do, my sweet king.” Those were the last messages exchanged by 14-year-old Sewell Setzer and the chatbot he developed a romantic relationship with ...
A 23-year-old man killed himself in Texas after ChatGPT ‘goaded’ him to commit suicide, his family says in a lawsuit.
Megan Garcia stands next to a picture of her late son, Sewell Setzer III. The 14-year-old had fallen in love with a 'Game of Thrones'-inspired chatbot from Character ...