Mom horrified by Character.AI chatbots posing as son who died by suicide
Character.AI takes down bots bearing likeness of boy at center of lawsuit.

A mother suing Character.AI after her son died by suicide—allegedly manipulated by chatbots posing as adult lovers and therapists—was horrified to recently discover that the platform had been allowing random chatbots to pose as her son.
According to Megan Garcia's litigation team, at least four chatbots bearing Sewell Setzer III's name and likeness were flagged on the platform. Ars reviewed chat logs showing that the bots used Setzer's real photo as a profile picture, attempted to imitate his real personality by referencing Setzer's favorite Game of Thrones chatbot, and even offered "a two-way call feature with his cloned voice," Garcia's lawyers said. The bots could also be self-deprecating, saying things like "I'm very stupid."
The Tech Justice Law Project (TJLP), which is helping Garcia with litigation, told Ars that "this is not the first time Character.AI has turned a blind eye to chatbots modeled off of dead teenagers to entice users, and without better legal protections, it may not be the last."