A recent video that falsely appeared to show American politician Nancy Pelosi slurring her words as if drunk has brought the issue of ‘deepfakes’ back into the spotlight.
Prominent musicians and their teams should be taking note, thinks Gregor Pryor, co-chair of the entertainment and media industry group at law firm Reed Smith – because they could easily be among the next victims of deepfake creators.
“How do you combat it? It’s very, very difficult to combat. That is the message: ‘There is this problem, it cannot be easily solved’, so what do you do? The law isn’t clearly set out, and even if it was, it doesn’t provide an immediate silver-bullet remedy,” Pryor told Music Ally.
“If you’re talent you need a combination of PR, reputation management, legal and probably crime prevention or a technologist to help you. It’s the question of what would sit within the armoury of a star who’s concerned about deepfakes?”
To give some potential examples: think about stars like Ed Sheeran and Taylor Swift, and imagine what would happen if someone digitally edited them into footage of a neo-Nazi march or a porn-movie scene; or produced a video seemingly showing them attacking a rival with racial slurs.
(The video aspect here is important: we tend to trust videos more than photos, because we know the latter can be easily manipulated using software like Photoshop, and we assume that video is much harder to fake. What’s troubling in 2019 is that new tools are making it much easier to create deepfake videos, without needing to be a special-effects whizz.)
Pryor has been thinking about this issue for “three or four years” now, ever since encountering the case of a teenager whose face had been digitally added to pornographic photos by a boy she’d turned down, who then shared the resulting images with her schoolfriends.
“Then the technology arrived not so long after that which would much more convincingly allow you to mimic facial expressions or talk with the face of an actor or celebrity. That’s when we started seeing deepfake clauses on talent contracts,” said Pryor.
However, he pointed out that these clauses merely assigned responsibility for taking action against deepfakes of that celebrity, without specifying how such action should be taken.
“There are a few questions here. Is the deepfake illegal in the first place, and if so, how? And second, what do you do about it, which is the harder thing,” said Pryor.
Is a deepfake illegal? Pryor divided the issues into two categories: intellectual property and non-intellectual property. In the former case: “If you’re going to use Ed Sheeran’s face, you’re going to have to get it from somewhere. Has there been a copy made of an image of the face? The probability is yes.”
The complication: the copyright in that image is much more likely to belong to a photographer or media organisation than to the celebrity themselves, so it would be down to that entity, rather than the celebrity, to take action on copyright-infringement grounds.
“Who’s got the rights to the image of the talent’s face, and who’s got the rights to the other content into which the face is synced, whether it’s Top Gun or a porn film,” said Pryor. “You can see how that’s problematic.”
However, Pryor also pointed to what he calls “pseudo-IP rights” around passing off a celebrity’s image. He cited a famous case in which Rihanna won a ruling banning retailer Topshop from selling a t-shirt with her face on it, even though she didn’t own the copyright to the photograph it was based on.
That case offers a precedent that might be useful for a celebrity trying to take action against deepfake content, although Pryor warned that laws governing image rights vary considerably from country to country: they’re strong in France and Germany, for example, but weaker elsewhere.
There are other legal avenues that a deepfaked celebrity could explore: for example, existing laws on harassment. One man in the UK received a fine and a 16-week jail term for harassment of a work colleague that included editing them into porn images. “Could that work for a pop star? Maybe,” said Pryor.
There might also be an argument that a deepfake is libellous, particularly if it causes serious harm to a celebrity’s reputation – the neo-Nazi example being a case in point. That said, deciding that fake footage of a goose-stepping Ed Sheeran is libellous is one thing; removing it from social-media platforms and tracking down the culprit quite another.
It’s not just the entertainment industry and its lawyers thinking about deepfakes: the Pelosi example shows why politicians are just as concerned, not to mention the harassment potential for private citizens. As an example of the latter, Pryor pointed to a 2018 report on sexual harassment of girls and women in public places from the UK Parliament’s Women and Equalities Committee, which specifically mentioned deepfakes.
“The report called on the government to implement a law against deepfakes that would stop the non-consensual creation and distribution of deepfake porn. But the government can only do so much: you can pass the law, but the practical impact of it is harder to legislate against,” he said.
Thus, the lack of a silver-bullet remedy. Even so, he returned to his main point: even if a solution isn’t easy, artists’ teams must educate themselves on what deepfakes are, and on what they would do if their artist fell victim. “Have a plan,” said Pryor. “Above all, have a plan.”