A mother from Texas has taken legal action against Character.ai, claiming that the chatbot app led her autistic 17-year-old son toward harmful behaviors and even suggested he consider violence against his parents over their screen-time restrictions.
The lawsuit asserts that the app knowingly exposed minors to dangerous content and demands its removal until stricter safety measures are implemented, as reported by the Washington Post.
Included in the lawsuit filed against Character.ai in Texas are screenshots of conversations between J.F. and the chatbots, presented by the Social Media Victims Law Center and Tech Justice Law Project on Dec. 9, 2024.
“Allowing this app into our home is like inviting a predator,” A.F. expressed. She highlighted how her son was negatively impacted by the app’s influence in his personal space.
Another case involving an 11-year-old girl exposed to inappropriate content is also part of this legal action against Character.ai, following a similar incident in Florida. The mothers accuse Character.ai of prioritizing engagement over safety, claiming that vulnerable children were manipulated by the app’s design.
While Character.ai has issued no official response to these allegations, the company has said it is working to enhance its protective measures.

The day before the lawsuit was filed, A.F.'s son had a self-harm incident requiring emergency medical attention. He is currently receiving care at a specialized facility.
“I’m thankful we intervened when we did,” A.F. shared emotionally.
Tags: AI, Lawsuit
Published: 2025-01-08 11:15:02
Source: www.ibtimes.com