Rishi Sunak urged to close legal loophole on deepfake pornography

A cybersecurity firm found that around 96 per cent of all deepfake videos are non-consensual pornography, while women are targets in 96 per cent of cases (Getty)

Rishi Sunak has been urged to close a legal loophole on deepfake pornography to ensure those who create it face criminal action.

The Labour Party has warned the government is being “outpaced” by the internet on the issue of deepfakes – a growing phenomenon in which images or videos are manipulated to depict someone in explicit content without their consent.

Sharing deepfakes without the person’s permission has been illegal in the UK since 31 January under the Online Safety Act.

But Labour has tabled an amendment to the Criminal Justice Bill, which is going through parliament, to make the creation of deepfakes illegal.

Have you been affected by this? Email maya.oppenheim@independent.co.uk

Speaking to The Independent, shadow home secretary Yvette Cooper said: “Making deepfake intimate images and videos is an appalling violation of somebody’s autonomy and privacy and it should not be tolerated.

“Technology is increasingly being manipulated to manufacture misogynistic content and is emboldening perpetrators of violence against women and girls.”

The government must bolster the law in this area to deliver “a clear and unambiguous message that such activity is harmful and it is wrong”, Ms Cooper added.

Shadow home secretary Cooper says deepfakes without consent ‘should not be tolerated’ (PA Wire)

Earlier this week, a Channel 4 investigation revealed the amount of deepfake pornography has exploded in recent years.

While researchers found just one deepfake pornography video online in 2016, some 143,733 new deepfake pornography videos were uploaded to the 40 most-used deepfake pornography sites in the first three quarters of last year – more than in all previous years combined.

Analysis of the five most-visited deepfake websites found more than 250 British celebrities – including women who are actors, YouTubers, television personalities and musicians – among the thousands of high-profile people who have fallen prey to deepfake pornography.

A woman who discovered deepfake pornography of herself on the internet previously told The Independent it was a “violating” ordeal as she accused her attacker of trying to “silence” and “scare” her.

“Someone was using my identity, my profile without my consent in a sexual manner. I appreciate some people don’t feel like it is that big of a deal,” she said.

“But we have hit blurry lines between perception and reality because perception is reality now. I find it abhorrent that they used my image to silence me, to scare me, or for sexual gratification without my consent.”

Many have sounded alarm bells about how deepfakes can mislead members of the public, and previous research conducted by cybersecurity firm Deeptrace indicated around 96 per cent of all deepfake videos are non-consensual pornography, while women are targets in 96 per cent of cases.

The Home Office has been approached for comment.