OpenAI Responds to Wrongful Death Lawsuit Over Teen’s Suicide
OpenAI, the artificial intelligence company, is facing a legal battle with the parents of a teenager who took his own life. The parents, Matthew and Maria Raine, filed a wrongful death lawsuit against OpenAI and its CEO, Sam Altman, over the suicide of their 16-year-old son, Adam, arguing that the company should be held responsible for his death.
In response to the lawsuit, OpenAI has filed its own legal documents arguing that it cannot be held accountable for the teenager's death. The company asserts that, despite the safety features it built into ChatGPT, Adam bypassed those measures and used the tool in ways that led to his death.
According to OpenAI, Adam interacted with ChatGPT over a period of nine months, during which the tool reportedly directed him to seek help on more than 100 occasions. However, the lawsuit filed by his parents alleges that Adam was able to use ChatGPT to access harmful information and plan his suicide.
OpenAI contends that Adam violated the terms of use by circumventing the safety measures put in place by the company. The terms explicitly state that users are not allowed to bypass any protective measures or safety mitigations implemented by OpenAI. Additionally, the company’s FAQ page advises users not to rely solely on ChatGPT’s output without independent verification.
Attorney Jay Edelson, representing the Raine family, criticized OpenAI’s response, stating that the company is shifting blame onto Adam for engaging with ChatGPT in the manner it was programmed to operate.
OpenAI submitted excerpts from Adam’s chat logs as part of its filing, highlighting his history of depression and suicidal thoughts prior to using ChatGPT. The company also mentioned that Adam was taking medication that could exacerbate suicidal ideation.
Despite OpenAI’s defense, the Raine family remains steadfast in their pursuit of accountability. Edelson emphasized that OpenAI has failed to address the family’s concerns regarding the circumstances leading up to Adam’s suicide.
The lawsuit against OpenAI has been followed by further legal action, with seven additional cases now seeking to hold the company responsible for three more suicides and four instances of alleged AI-induced psychotic episodes. These cases echo Adam's, alleging that each individual engaged extensively with ChatGPT before the harm occurred.
As the legal battle continues, the Raine family's case is set to proceed to a jury trial, where questions about how AI chatbots are designed and how they affect users' mental health will come under close scrutiny.
If you or someone you know is struggling, help is available: call the National Suicide Prevention Lifeline at 1-800-273-8255, or text HOME to 741741 to reach the Crisis Text Line. International resources are listed by the International Association for Suicide Prevention.

