Victory! Sharing deepfakes is set to be illegal: EYNTK about the new image abuse laws coming into play

A huge victory: sharing deepfakes is being made illegal. Here's all you need to know.

By Jaime Lee

The word 'deepfake' is increasingly seeping into the public consciousness, thanks to incredible campaign work from organisations such as My Image My Choice and Not Your Porn, academics, women's charities, feminist activist Jess Davies in her BBC documentary, Deepfake Porn: Could You Be Next?, and publications like us here at Cosmopolitan UK, all of whom have been highlighting how the technology is used to disproportionately target women. If you're not familiar, deepfakes are a form of synthetically generated media: in layman's terms, a photo or video of someone is misappropriated and edited into an imagined scenario (audio can also be deepfaked, so it sounds like someone is saying something they aren't).

Right now, as deepfake AI tech becomes more accessible and easier to use – with creators taking requests in dodgy chatrooms and 'nudifying' apps simple to download – there has sadly been a staggering rise in the amount of non-consensual 'deepfake porn' being generated, which some have labelled a form of digitally-based sexual assault.

Research conducted in 2018 by fraud detection company Sensity AI found that, shockingly, over 90% of all deepfakes online are non-consensual pornographic clips targeting women – a figure which doesn't appear to have improved since.

This comes in tandem with a rise in other forms of image-based sexual abuse (commonly referred to as 'revenge porn'); women's charity Refuge recently reported that the number of survivors supported by its tech team rose by 258% between 2018 and 2022.

Now, the government's newly amended Online Safety Bill is not only promising to crack down on deepfakes, but also assuring that laws will soon be put in place to protect us from being filmed without consent, 'downblousing' (think upskirting, but focussed above the waist) and more 'traditional' forms of image-based sexual abuse (wherein genuine, not falsified, images or videos are shared without consent – with equal potential to cause huge repercussions).

Here's all you need to know about the new proposed laws...

What does the new Online Safety Bill say about deepfakes?

For the first time, an amendment to the Online Safety Bill recognises deepfake porn as legitimately harmful and wrong, and makes the sending, or sharing, of deepfake porn a criminal offence. Previously, laws around image-based abuse of any kind fell victim to a huge loophole: you had to prove the person shared those images with intent to cause distress or humiliation (so saying "It was just a laugh!" was an easy – literal – get out of jail free card). Now, committing the act, irrespective of reason, can be punished by a recommended minimum jail term of two years (although this is guidance only, and sentencing remains at each judge's discretion). It is the first time deepfaking has been clearly defined in a bill, which is a really positive step.

The amended Online Safety Bill is expected to come into full effect early next year. As for laws clamping down on the making of deepfake porn itself, that unfortunately does not fall under this new legislation, and is the next hurdle needing to be tackled.


These changes are, of course, being welcomed by those of us who've been saying for years that the legal system in this country massively lags behind the world we actually live in, where so much of our time is spent online. They're also part of the government's pledge to make vast improvements to the lives – and safety – of women and girls as a whole.

So, while this news is music to our ears – the government officially and publicly declaring that a) deepfake porn is a real issue and genuine cause for concern, and b) that it plans to tackle it – it's important to ask: what does the new Online Safety Bill mean for us in real terms, and for those who've already been victimised in this way?

To find out more, Cosmopolitan UK spoke to Alan Collins, a partner in the sex abuse team at Hugh James Solicitors, who said that until the bill becomes law – which unfortunately is no overnight process – it's "difficult to prosecute offenders, and the law has struggled to keep up with rapid social media innovation, which is also inherently difficult to police".

Meaning: whilst the message this pending law change sends out to society is really positive – that it's not okay to do this to women – policing it in real terms will still be a challenge.

Collins explains, "What appears to be proposed is a criminal offence that captures non-consensual intimate images being shared as a whole, and that this would include deepfakes [rather than a specific law on deepfakes alone]. This is intended to prohibit what has become a widespread problem of intimate images being shared online, without consent, which can be so harmful."

Keen to find out whether this will make justice easier to secure for survivors of non-consensual intimate image sharing (genuine or not), we asked Collins whether he agrees that, while the proposed bill is a great step in the right direction, it won't be a magic wand that sees harmful digital acts like this disappear overnight, or all perpetrators instantly caught and easily reprimanded.

"The devil is always in the detail and I can see arguments being played out in court about what was consensual and what was not," he comments. "What would the case be if consensual deepfake images were accessed by a third party, for instance? Another question that needs addressing is the voluntary posting of intimate images [or images that are sexual in nature] on social media. It could be argued, and I’m not saying this is right, that the person must be consenting to their use for sexual gratification.

"The challenge for legislators is to ensure prohibition of harmful behaviour does not over reach, so that it is not driven underground, making social media much harder than it already is to police."

As for whether these laws will allow those who've been non-consensually deepfaked, or had their genuine images shared inappropriately in the past, to retrospectively fight for justice, Collins believes not, unfortunately. "If these proposed changes in the law come about they are unlikely to be retrospective, but if you think today you are a victim of deepfaking or exploitation you should still contact the police."

The Revenge Porn Helpline is also a fantastic resource that can offer both practical and emotional support to anyone who has been a victim of any type of image-based abuse, whether the images are real or not.

Speaking about the victory today and the work still needing to be done, survivor and campaigner Kate Isaacs from Not Your Porn said in an Instagram post: "The proposals, especially the Government's focus on the lack of consent as the base for culpability, are a step in the right direction; however intention is not action [and] these laws need to be made with the mindset of getting it right first time."

Isaacs added that the laws must take on board the experience shared by relevant organisations, academics in the image-based sexual abuse sphere, campaigners and survivors. "We need a multi stakeholder commitment from government and other key state functions to tackling online violence at a systemic level [...] We are optimistic, but not naive. Intent is not ACTION, which we desperately need."

What to do if you're a victim of deepfake porn

  • Reach out to the Revenge Porn Helpline (call 0345 6000 459 or email help@revengepornhelpline.org.uk), who can offer you emotional support and advise on the best next steps.

  • Contact Stop Non-Consensual Intimate Image Abuse, who will be able to help you build a case for having the images removed from the internet. They have a 90% success rate in getting revenge porn images taken down.

  • Take screenshots as evidence and, if you feel comfortable doing so, report what has happened to the police. If the perpetrator is known, there may already be grounds for the authorities to get in touch with them – even without the new laws having yet kicked into effect.


To learn more about deepfakes and how they're impacting society, watch 'Deepfake Porn: Could You Be Next?' on BBC iPlayer
