OpenAI CEO Sam Altman has cautioned that conversations with ChatGPT, especially those involving sensitive or emotional topics, are not protected under legal confidentiality, raising serious privacy concerns as millions rely on the AI tool for support.
In a recent episode of the popular YouTube podcast This Past Weekend with Theo Von, Altman addressed the legal limitations surrounding AI interactions, particularly as a growing number of users, especially younger people, turn to ChatGPT as a virtual therapist or life coach.
Altman noted a troubling trend: people are turning to ChatGPT for deeply personal issues, relationship stressors, mental health support and emotional advice. Unlike licensed professionals such as doctors, lawyers, or therapists, who are bound by legal confidentiality, AI chatbots currently operate without any comparable confidentiality protections.
“People talk about the most personal sh*t in their lives to ChatGPT,” Altman noted during the interview. “Young people especially use it as a therapist or a life coach. And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege. We haven’t figured that out yet for ChatGPT.”
Altman warned that in the absence of such frameworks, anything shared with ChatGPT can potentially be requested and disclosed in legal proceedings. “If you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” the Indian Express quoted Altman as saying.
This revelation poses a major concern for users relying on ChatGPT for personal support. Unlike end-to-end encrypted platforms such as WhatsApp or Signal, ChatGPT conversations are accessible to OpenAI, which can use them to improve its models and detect misuse.
OpenAI states that it deletes ChatGPT conversations from free-tier users within 30 days. However, conversations may still be retained longer for security or legal purposes. Enterprise users are reportedly exempt from this data retention policy.
The issue becomes even more relevant as OpenAI faces a lawsuit from The New York Times, under which the company is required to preserve user conversations. The legal battle further underscores the risks of entrusting AI platforms with emotional or private matters in the absence of clear privacy laws.
Altman’s comments have reignited discussions on the need for robust privacy regulations in AI usage. With millions worldwide increasingly depending on chatbots for emotional and mental support, the lack of legal protection for their interactions with AI leaves a concerning gap in digital privacy.