
An artificial intelligence company has asked a federal judge to dismiss a wrongful death lawsuit filed by a Central Florida mother, who claims her 14-year-old son died by suicide after becoming "addicted" to a chatbot application the company developed.
The lawsuit alleges that the teen's interactions with the AI chatbot contributed to the deterioration of his mental health, ultimately leading to his death. In its motion to dismiss, the AI company argues that it is not legally responsible for the incident.
The case raises questions about the accountability of technology firms for mental health outcomes associated with their products, particularly in the rapidly evolving field of AI-driven applications. The court has not yet ruled on whether the lawsuit will proceed.