
Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist

ChatGPT users may want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there's no doctor-patient confidentiality when your doc is an AI.

The exec made these comments on a recent episode of Theo Von's podcast, released last weekend.

https://www.youtube.com/watch?v=AYN8VKW6VXA

In response to a question about how AI works with the current legal system, Altman said that one of the problems of not yet having a legal or policy framework for AI is that there's no legal confidentiality for users' conversations.

"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it, young people especially, as a therapist, a life coach; having these relationship problems and [asking] 'what should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."

This could create a privacy concern for users in the event of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.

"I think that's very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever, and no one had to think about that even a year ago," Altman said.


The company understands that this lack of privacy could be a blocker to broader user adoption. Beyond AI's appetite for so much online data during training, companies are also being asked to produce user data in some legal contexts. Already, OpenAI has been fighting a court order in its lawsuit with The New York Times that would require it to save the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.


In a statement on its website, OpenAI said it is appealing this order, which it called "an overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data to aid in criminal prosecutions. But in more recent years, there have been additional concerns about digital data, as laws began limiting access to previously established freedoms, like a woman's right to choose.

When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.

Altman asked the podcast host about his own ChatGPT use, after Von said he hadn't spoken to the AI chatbot much because of his own privacy concerns.

"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot, like the legal clarity," Altman said.

