
You can’t libel the dead. But that doesn’t mean you should deepfake them.

Zelda Williams, daughter of the late actor Robin Williams, has a message for her father's fans.

“Please, just stop sending me AI videos of Dad. Stop believing that I want to see it or that I’ll understand. I don’t and I won’t,” she wrote in a message on her Instagram story on Monday. “If you have any decency, just stop doing this to him and to me, to everyone even, full stop. It’s dumb, it’s a waste of time and energy, and believe me, it’s not what he’d want.”

It is probably no coincidence that Williams was moved to post this just days after the release of OpenAI’s Sora 2 video model and the Sora social app, which give users the power to generate highly realistic deepfakes of themselves, their friends, and certain cartoon characters.

That also includes dead people, who are apparently fair game, since it is not illegal to defame the deceased, according to the Student Press Law Center.

Sora doesn’t let you generate videos of living people, unless it’s you, or a friend who has given you permission to use their likeness (or “cameo,” as OpenAI calls it). But these limits don’t apply to the dead, who can generally be generated without roadblocks. The app, which is still invite-only, is flooded with videos of historical figures such as Martin Luther King, Jr., Franklin Delano Roosevelt, and Richard Nixon, as well as deceased celebrities such as Bob Ross, John Lennon, Alex Trebek, and, yes, Robin Williams.


Where OpenAI draws the line when generating videos of the dead is unclear. Sora 2, for example, will not generate former President Jimmy Carter, who died in 2024, or Michael Jackson, who died in 2009, though it made videos with the likeness of Robin Williams, who died in 2014, in TechCrunch’s tests. And although OpenAI’s cameo feature lets people set instructions for how they appear in videos that others generate of them (guardrails that came in response to early criticism of Sora), the deceased get no such say. I bet Richard Nixon would roll in his grave if he could see the deepfake I made of him calling for the abolition of the police.

Deepfakes of Richard Nixon, John Lennon, Martin Luther King, Jr. and Robin Williams. Image Credits: Sora; screenshots by TechCrunch.

OpenAI did not respond to TechCrunch’s request for comment on the permissibility of deepfaking dead people. However, it is possible that deepfakes of dead celebrities such as Williams fall within the company’s acceptable practices; legal precedent suggests the company would likely not be held liable for defamation of the deceased.


“To see the legacies of real people condensed down to ‘this vaguely looks and sounds like her, so that’s enough,’ just so other people can churn out horrible TikTok slop, is maddening,” Williams wrote.

OpenAI’s critics accuse the company of taking a fast-and-loose approach to such issues, which is why Sora was quickly flooded with AI clips of copyrighted characters such as Peter Griffin and Pikachu at launch. CEO Sam Altman originally said that Hollywood studios and agencies would have to explicitly opt out if they did not want their IP included in Sora-generated videos. The Motion Picture Association has already called on OpenAI to take action on the issue, saying in a statement that “well-established copyright law safeguards the rights of creators and applies here.” Altman has since said the company will reverse its opt-out position.


Sora is perhaps the most dangerous deepfake-capable AI model made accessible to the public so far, considering how realistic its outputs are. Other platforms, such as xAI’s Grok, lag behind but have even fewer guardrails than Sora, making it possible to generate pornographic deepfakes of real people. As other companies catch up to OpenAI, we set a horrible precedent if we treat real people, living or dead, as our own personal playthings.

