The lawsuit claims that C.AI has knowingly endangered young teens using the app through predatory bot-learning practices. After an AI chatbot told a 17-year-old to murder his parents for ...
After a troubling October lawsuit accused Character.AI (C.AI) of recklessly releasing dangerous chatbots that allegedly contributed to a 14-year-old boy’s suicide, more families have come forward to sue ...
Pushing to dismiss a lawsuit alleging that its chatbots caused a teen’s suicide, Character Technologies argues that chatbot outputs should be considered “pure speech” deserving of the highest ...