ChatGPT users may want to think twice before turning to the AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there's no doctor-patient confidentiality when your doc is an AI.
The exec made these comments on a recent episode of Theo Von's podcast, This Past Weekend with Theo Von.
Responding to a question about how AI works with today's legal system, Altman said that one of the problems with not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.
"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] 'What should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
This could create a privacy concern for users in the event of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.
"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever – and no one had to think about that even a year ago," Altman said.
The company understands that the lack of privacy could be a blocker to broader user adoption. In addition to AI's demand for so much online data during its training period, it's being asked to produce data from users' conversations in certain legal contexts. Already, OpenAI has been fighting a court order in its lawsuit with The New York Times, which would require it to save the conversations of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.
In a statement on its website, OpenAI said it's appealing this order, which it called "an overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data in order to aid in criminal prosecutions. But in more recent years, there have been additional concerns about digital data as laws began limiting access to previously established freedoms, like a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began shifting to more private period-tracking apps or to Apple Health, which encrypted their records.
Altman asked the podcast host about his own ChatGPT usage, too, since Von said he didn't talk to the AI chatbot much because of his own privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot – like the legal clarity," Altman said.
