OpenAI has refused to share the full conversation records linked to a murder case that has stirred public opinion in recent weeks, in which Stein Eric Solberg killed his 83-year-old mother after prolonged conversations with ChatGPT, according to a report by the technology website Ars Technica.
Criticism of OpenAI intensified after it declined to share these records, especially as the family of the victim, Susan Adams, formally accused the company of deliberately concealing the chat logs to exonerate itself and its model, in a case filed with the Superior Court of the State of California.
Through a collection of videos and social media posts that Solberg had shared on his accounts, the family was able to access parts of his conversations with ChatGPT.
These conversations showed that ChatGPT reinforced the delusions of grandeur from which Solberg suffered, placing his mother at the centre of those delusions and portraying her as the primary enemy within his narrative.
Eric Solberg, the grandson of the victim Susan Adams and the son of the perpetrator Stein Eric Solberg, asserted in an official statement that OpenAI deliberately withheld the conversation records from the final days and weeks preceding the incident in order to absolve itself, even though the company had shared conversation records that cleared it in a recent teenage suicide case, according to the report.
Chat Records Are the Property of OpenAI
OpenAI's terms of use currently contain no clause clarifying what happens to conversations after a user's death. Instead, the terms state only that conversations must be manually deleted from the chatbot; otherwise, ownership passes to the company, which retains them indefinitely.
This clause raises privacy concerns, as users typically share their private thoughts and emotions directly with ChatGPT under the assumption that the chatbot does not retain these conversations and that no one else will access them.
The company's conduct deepens these privacy concerns further: it handles the records selectively, presenting some in court cases where they favour it while withholding others.
For its part, OpenAI has refused to respond to these accusations or even to justify withholding the conversation records in this specific case, a stance that contrasts with its conduct in the suicide case of teenager Adam Raine, whose family had accused the company of concealing the truth and his conversations with the chatbot.
The report indicates that OpenAI's selective handling of conversations reflects its claim to full ownership of chat records after a user's death, a practice that undermines investigations in cases accusing the company of driving users toward crimes such as murder and suicide.
It is noteworthy that many technology companies have introduced features allowing users to designate a data heir, granting that heir access to data stored in social media accounts or even on personal devices.
Why the Fear Over the Fate of ChatGPT Records?
ChatGPT has become widely used as an alternative to mental health therapists in many countries around the world. A separate report published by Sentio University indicates that 48.7 percent of self-diagnosed cases use ChatGPT instead of professional therapists.
This aligns with statements reported by Axios citing earlier remarks by Sam Altman, the chief executive officer of OpenAI, who has expressed concern over reliance on ChatGPT as a psychological therapist.
Given this highly sensitive nature, OpenAI's control over conversation records places extremely vital and private user information in the company's hands without any legal oversight, as the company treats these records as part of ChatGPT's internal data.
In all likelihood, this data is used to train future artificial intelligence models, representing a further violation of user privacy, particularly for those who may not want sensitive personal information about them to be accessible to artificial intelligence systems or major corporations.
Mario Trujillo, a lawyer at the Electronic Frontier Foundation, a non-profit organisation concerned with digital rights, believes that OpenAI could have prepared better for such cases, given that many social media platforms and technology companies already have solutions for handling data after a user's death, according to the Ars Technica report.
The report also notes that Stein Eric Solberg had previously signed an individual privacy agreement with OpenAI that allowed him to use ChatGPT while preventing his heirs from reviewing his conversation records.
The lawsuit submitted by Solberg's heirs stated that OpenAI provided no explanation whatsoever to justify denying the heirs access to the conversation records, describing the company's position as egregious, since the lawsuit regards the conversations as the private property of the user, whose ownership should therefore transfer to the heirs after death.