Updates 4 You

Sam Altman Warns ChatGPT Users About Privacy Risks in Therapy-Like Conversations

Computer Geek Official

ChatGPT users may want to think twice before relying on the AI app for therapy or emotional support. OpenAI CEO Sam Altman has pointed out that the AI industry still lacks effective ways to safeguard user privacy in these deeply personal conversations. Unlike interactions with human professionals, there is no doctor-patient confidentiality when your therapist is an AI.




This Past Weekend w/ Theo Von #599

Altman shared these thoughts during a recent appearance on Theo Von's podcast, This Past Weekend w/ Theo Von. When asked about AI's place in today's legal system, Altman highlighted a critical gap: there is currently no established legal or policy structure to ensure confidentiality for interactions between users and AI platforms.


“People share the most intimate stuff in their lives with ChatGPT,” Altman explained. “Many—especially younger users—treat it like a therapist or a life coach, bringing up relationship challenges and asking, ‘what should I do?’ But if you were speaking with a real therapist, doctor, or lawyer, your conversation would be protected by law. That kind of legal privilege doesn’t yet exist for exchanges with ChatGPT.”


Altman noted that this legal vacuum presents a real privacy concern. If a lawsuit were to arise, OpenAI could be compelled to hand over user conversations. “I think that’s really messed up. There ought to be a comparable level of privacy for what you tell an AI as there is with a human professional — no one had seriously considered this even a year ago,” he remarked.

OpenAI acknowledges that concerns over privacy might deter wider adoption. Beyond the immense volume of online data used during AI training, OpenAI is increasingly being asked to supply user chat logs in legal proceedings. For example, in its legal battle with The New York Times, a court order demands that the company retain conversations from millions of ChatGPT users worldwide, though ChatGPT Enterprise customers are exempt.


OpenAI, in an official statement, labeled this order as “an overreach” and said it’s actively appealing the decision. The concern is that if courts can override the company’s internal data privacy practices, it might trigger broader demands for access—especially from law enforcement or during legal discovery. In today’s tech environment, companies are frequently served with subpoenas for user data in criminal investigations. But the stakes have risen further as digital privacy intersects with evolving laws and diminishing civil liberties.


For example, following the Supreme Court's reversal of Roe v. Wade, users began favoring more secure period-tracking apps or switched to Apple Health, which offers encrypted data storage. During the podcast, Altman also turned the question back to Von, inquiring about his own ChatGPT usage. Von admitted to being cautious because of his privacy concerns. Altman responded, "That makes total sense... wanting privacy clarity before heavily using something like ChatGPT — especially clarity on the legal side."

