New videos of a young Tom Cruise have fascinated the denizens of social media, with just one hiccup – the footage is not real. The videos are deepfakes – false images created using artificial intelligence (AI) programs. While deepfake videos have appeared over the last few years, the incredible realism of the new Cruise clips has prompted curiosity and shock online.
The series was posted to TikTok and reportedly gained over 11 million views before being removed. The videos also trended on Twitter. While they were initially of mysterious origin, Fortune pointed out that Belgian visual effects engineer Chris Ume had taken credit.
Along with the AI creations comes widespread apprehension over what such technology could mean for the future of news, information, democracy, security, human rights, criminal prosecutions, and a host of other issues.
In 2018, there was speculation that a video of Gabonese President Ali Bongo was a deepfake, since it had been used to “prove” that the president – absent from public life for some time – was alive and well. The episode culminated in an apparent attempted military coup. In Malaysia and Brazil, officials have tamped down sex scandals by claiming footage of their illicit behavior was deepfaked. While these incidents may appear relatively tame, the possibilities for abuse are endless. But what’s happening in the here and now?
Because It’s the Internet …
According to deepfake detection company Sensity, 85,047 such videos were available online as of December 2020, with the number doubling every six months – and 96% of them are pornographic.
In June, Hollywood actress Kristen Bell said she was “shocked” after discovering that her head had been superimposed onto pornographic footage:
“I was just shocked, because this is my face. [It] belongs to me … You know, we’re having this gigantic conversation about consent and I don’t consent, so that’s why it’s not OK. Even if it’s labeled as, ‘Oh, this is not actually her’ it’s hard to think about that I’m being exploited.”
According to Sensity, 99% of the subjects depicted in deepfake pornography are actresses and musicians – though YouTube videos feature a wider range of subjects, including politicians and business leaders.
A free bot called DeepNude was found in 2019 to be taking images of women and underage girls and “undressing” them by generating synthetic nude images, which were then distributed on social media platforms. Sensity found that a total of around 680,000 women had been affected. It added in October 2020 that “70% of targets are private individuals whose photos are either taken from social media or private material.” According to the study, the fake photos can be used for a variety of nefarious purposes:
“These findings also allude to broader threats presented by the bot … After ‘stripping’ the target’s image using the bot, the attacker deploys it to publicly shame the target, sharing the image openly on social media, or sending it privately to the target’s relatives, friends, and acquaintances. Alternatively, the attacker extorts the target by threatening the publication of the ‘stripped’ image online unless a sum of money is transferred by a certain time. The personal information and images of targets could also be sold to other malicious actors on underground forums and marketplaces.”
A DeepNude user poll found that only 16% were interested in targeting celebrities. The most popular response, at 63%, revealed most users wanted to target “Familiar girls, whom i [sic] know in real life.”
While the bot was taken offline in 2019, Wired asserts the code was backed up and copied to be used by others.
Few laws are yet equipped to deal with explicit deepfake images – the striking factor shared by victims’ stories is a powerlessness to address their plight. On Feb. 26, the U.K. Law Commission recommended creating criminal offenses covering “sharing altered intimate images, such as deepfakes.” As for the United States, according to the Cyber Civil Rights Initiative, 46 states now have laws against “revenge porn,” although, as Technology Review notes, only two of those specify fake or deepfake content.
Who Owns Your Face?
What is the legality of using a face – does a person hold copyright over their own visage? In 1985, actor Crispin Glover appeared in Back to the Future as George McFly but did not agree to appear in the sequels. The producers of Back to the Future II got around this by “simulating” a Glover performance, applying prosthetic makeup to another actor and carefully splicing in footage from the first film. Glover sued the studio over the alleged theft of his likeness, and the matter was settled out of court – with changes to the Screen Actors Guild rules to prevent studios from trying similar stunts in the future, at least according to Glover and his attorney.
But that was in the ’80s, before the internet hosted the public’s photos. Other lawsuits involving the use of celebrities’ likenesses have followed over the years, though the law remains murky.
Nowadays, practically everyone who has ever used a social media website has their face out there where it can be used and abused by just about anyone. YouTube influencers have felt the sting of foreign manufacturers using their images to advertise products without their permission – not to mention that some social media sites claim rights to any content they host. And that’s not even including facial recognition companies scraping the internet for images to add to their databases, a practice that has been criticized by digital rights groups but is being paid for by law enforcement agencies.
In terms of deepfakes, hip-hop star Jay-Z attempted to strike from the internet clips that depict his voice performing Hamlet’s “To be, or not to be” soliloquy and Billy Joel’s “We Didn’t Start the Fire.” The rapper made an unsuccessful copyright claim against YouTube channel Vocal Synthesis, which posts various fake celebrity readings.
Scarlett Johansson, the target of deepfake pornography and other depictions of her face and body, doesn’t hold out much hope when it comes to gaining control of this runaway train. “Even if you copyright pictures with your image that belong to you, the same copyright laws don’t apply overseas. I have sadly been down this road many, many times,” she said in 2018. “The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause, for the most part.”
Suffice it to say, questions surrounding deepfakes are only just beginning.
Read more from Laura Valkovic.