ChatGPT users should think twice before using the AI for therapy or sensitive emotional support.
OpenAI CEO Sam Altman says the industry hasn’t yet solved how to protect privacy for those intimate conversations.
Altman shared these thoughts on a recent episode of Theo Von’s podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI fits into today’s legal system, Altman noted one major gap.
“People talk about the most personal shit in their lives to ChatGPT,” Altman said. “People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”

That gap creates a real privacy risk in legal proceedings. Altman warned OpenAI could be forced to hand over those chat records if a court demanded them.
“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago,” Altman said.
OpenAI recognizes that this lack of privacy may slow broader adoption.
Beyond the industry’s appetite for vast amounts of online data to train models, courts have already sought user chats as evidence in litigation.
OpenAI is currently fighting a court order tied to its lawsuit with The New York Times. The order would require preserving chats from hundreds of millions of ChatGPT users worldwide, excluding those from ChatGPT Enterprise customers.

OpenAI calls that demand “an overreach” and is appealing. The company posted a response on its site: https://openai.com/index/response-to-nyt-data-demands/
If a court can override OpenAI’s privacy choices, the company could face further discovery requests or law-enforcement demands.
Tech companies already face subpoenas for user data in criminal probes. Privacy worries have intensified as legal changes rolled back previously protected rights.
For example, after the Supreme Court overturned Roe v. Wade, some people moved their cycle data to more private period-tracking apps or to encrypted options like Apple Health.
Altman also asked Von about his own ChatGPT use. Von said he rarely used the chatbot because of privacy worries.
“I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity,” Altman said.
The privacy question remains a central barrier to trusting AI for personal, therapeutic conversations.