What is a deepfake and how are they used as a form of revenge porn?

Photo credit: Katie Wilde - Getty Images

We're living in a world more enmeshed with technology than ever before. While this can be a positive - thank God for Zoom keeping us connected to loved ones throughout the pandemic - sadly, it's also led to a rise in image-based sexual abuse.

This sad trend, which sees victims, usually women, have their nude images or videos stolen or shared without their consent, is often known as revenge porn. This in itself is a vile (and illegal) act - as is threatening somebody with it. However, perpetrators are now going one step further and creating fake naked videos of people, including celebrities. These images and videos can be so realistic that they've come to be called 'deepfakes'.

Famous faces, such as Harry Potter actress Emma Watson and singer Ariana Grande, have been listed as repeat victims of pornographic deepfake abuse, after multiple falsified clips of them were shared on adult websites. Earlier this year, an incredibly lifelike video supposedly 'featuring' actor Tom Cruise (albeit wearing clothes) also did the rounds on social media.

To learn more about pornographic deepfakes, including how they're made, Cosmopolitan spoke to two experts - Zara Roddis, of the Revenge Porn Helpline, and Rachel Atkins, a Partner at Schillings law firm:

What is a deepfake and how do you make one?

"Deepfakes are similar to Photoshopped images - and can be just that - but in their 'highest' form they can be AI-generated imagery that maps someone's body part (usually a face) and augments this onto another body, or into another situation that the person has not been in," explains Roddis. "I think they're called deepfakes because they require really intelligent software to make the 'fake' and a huge amount of data (or a 'deep dive of data')."

However, it's not just smart technology available only to those in the know that's part of the problem - deepfake apps, such as Fakeapp or FaceSwap, which are marketed as little more than frivolous fun, also make creating doctored images and videos something pretty much anyone can do.

Photo credit: Katie Wilde - Getty Images

"These apps are badged as fun, light-hearted ways to spend your time and, for many of us, that is exactly what they are used for," says Atkins. "Still, there's also real potential to cause harm, especially as the mechanisms to create deepfakes become easier and the output becomes more realistic."

She adds that one of the main issues with the technology is that it leads to an increasingly high rate of non-consensual deepfake porn, primarily targeting women. "According to AI firm Sensity, 96% of deepfakes are pornographic and used to target women – and the number of deepfake porn clips is doubling every six months. By summer 2021, there could be as many as 180,000 porn videos 'starring' innocent people online."

How can you spot or prove something is a deepfake?

It can be a real challenge, Roddis sadly confirms. "It’s hard to know at a glance if it’s real or not, and when it comes to intimate content this is even more difficult," she explains. "When a sexual deepfake is made of someone, it can be so realistic that you couldn’t say otherwise and then the damage is done... and that damage is very similar to that of someone who has had their genuine images or videos shared without consent. That’s where the abuse is: it’s an abuse of privacy and a violation of someone’s body."

Atkins agrees that it can often be difficult to differentiate. "Detecting an amateurish deepfake is relatively easy – the edges may be blurry, the audio inconsistent, or the video will simply look unnatural - but more credible deepfake porn, created with neural networks, will be harder to spot." She adds that ultimately, it could require machines or even digital forensics to identify a fake. "This is where the concern regarding deepfake porn lies – if it looks entirely credible then how will the victim be able to prove it isn’t real?"

What is the law on deepfake pornography?

Unfortunately, it's not clear-cut, says Atkins. "The main issue is around image ownership and rights, but if images are in the public domain, it becomes hard to fight that case legally in the UK."

She explains that the nature of the image is key, too. Citing US politician Alexandria Ocasio-Cortez, who was the victim of a widely-circulated deepfake bikini shot, Atkins says that would be a case steeped in defamation, whereas the convincing video of a fully-clothed Tom Cruise, saying nothing offensive, isn't.

"It’s the decision of the person to understand how a deepfake affects their livelihood and whether they want to take legal action. In the UK, the Domestic Abuse Act has just recently introduced a sentence of up to two years in jail for those who threaten 'to disclose intimate images with the intention to cause distress'," Atkins adds. "This is a step in the right direction to help people fight cases in this sphere and I believe more will come to help fight this. Hopefully it’s not too late."

Where can you get help if you're a deepfake victim?

"Firstly, I am so sorry you've experienced this. It's so problematic, as the law right now does not cover Photoshopped images, so it can be difficult to get support," says Roddis. "But it is still so important to report it to the website you've seen the content on, if you can." She adds that depending on where the content has been shared, the Report Harmful Content service may also be able to assist, and the Revenge Porn Helpline can offer further support too.

Atkins also recommends collecting evidence of the images on those platforms, saying it will help immensely should there be a subsequent legal battle.

What is being done to tackle deepfake porn?

"Some tech companies are looking at how the issue can be tackled by developing solutions that will recognise manipulated content," says Atkins. "One such project, Project Origin, is a result of a collaboration between Microsoft and media organisations such as the BBC."

She adds that many other tech companies are also taking action, such as Facebook, which last year said it would remove all videos that "had been edited in ways that weren't obvious to an average person, or if they misled a viewer into thinking that a person in a video said words they did not actually say". TikTok has also followed this approach.

"However, the main issue here is that detection of deepfakes is very difficult," Atkins continues. "There needs to be a combination of investment and effort from tech companies to prevent and identify deepfakes. As well as investing in deepfake detection, they must start removing deepfakes from their platform and, if deepfake non-consensual porn is found within an account, the user account should be banned for life. Laws also need to change to prevent technology companies from washing their hands of the issue."

For support and more information about deepfake revenge porn, contact the Revenge Porn Helpline, or for advice on building a legal case, reach out to Schillings.

