The Online Safety Act is now officially law – here's how it impacts women in particular

How the new Online Safety Act impacts women (F.J. Jimenez - Getty Images)


The Online Safety Bill has now become the Online Safety Act, after receiving Royal Assent on Thursday (26 October). Sharing the news, the government said it will make "the UK the safest place in the world to be online".

However, the road to this point has been long – and marred by controversy, with the Act facing years of delay after it was first proposed in a government white paper back in 2019. The overdue legislation, which aims to bring the UK's laws in line with the digital-focussed world we live in, looks to protect users from harmful material online and will force social media firms to remove illegal content uploaded to their platforms.

New offences also included in the Act tackle image-based sexual abuse, in the form of cyberflashing and the sharing of "deepfake" pornography – however, the creation of deepfake porn in the first place is still not outlawed.

Communications regulator Ofcom will enforce the Act, with social media firms facing substantial fines – and their bosses even jail time – if they fail to comply.

Speaking about the Act, technology minister Michelle Donelan told Cosmopolitan UK: "It really is a historic moment, not just because this bill has taken a long time to get to this place but also because of the impact this bill is going to have.

"It will make the UK the safest place to go online. It will particularly help children, as well, to be able to utilise social media and be safe (...) and it'll help people that, at the moment, go on social media and it really deeply affects their mental health or wellbeing because they're seeing content that is illegal. These are all things that the Bill will rectify."

So what does the new legislation mean for you and your socials? Cosmopolitan UK has got you covered…


The new laws on image-based sexual abuse (like 'revenge porn' and deepfakes)

The 300-page Act has strengthened pre-existing laws, as well as introduced new ones, to protect against the ongoing epidemic of violence against women and girls.

The legislation will make it easier to convict someone who shares intimate images without consent, whether they're genuine images or falsified ones (the new laws also cover the non-consensual sharing of pornographic deepfakes).

Those found guilty of sharing these images could face a maximum of six months' jail time.

Research conducted in 2018 by fraud detection company Sensity AI found that, shockingly, over 90% of all deepfakes online are non-consensual pornographic clips targeting women – a figure which doesn't appear to have improved since.

This comes as other forms of image-based sexual abuse (commonly referred to as 'revenge porn') are also on the rise; women's charity Refuge recently reported that the number of survivors supported by its tech team rose by 258% between 2018 and 2022.

If a social media platform does allow consensual pornographic material, the Online Safety Act insists that ‘robust’ age-verification processes must now be in place.

In response, Sophie Mortimer, Revenge Porn Helpline Manager at SWGfL, said: "The Online Safety Bill being passed means it will be easier to charge perpetrators who share intimate images and deepfakes without consent within the UK. This is a milestone development, that does not require the victim to evidence that the perpetrator intended to cause distress.

"Anyone who shares intimate images without consent is breaking the law and the penalties being brought forward in this new legislation will hopefully reaffirm that intimate image abuse is never okay.

"Despite this, there are still notable, important gaps that have not been addressed. Anyone who has had their intimate content shared online without consent is not protected to have that content removed from the internet, even if the perpetrator has been brought to justice. This is all because the content is not classed as ‘illegal’. Those experiencing this devastating harm deserve more than this and it is disappointing this has not been included despite the positive steps forward."

When it comes to deepfakes, there’s also still work to be done. Whilst it is now illegal to share sexual deepfakes without permission, there is still no law in place preventing such material from being created in the first place.

Cosmopolitan UK has done extensive work on image-based sexual abuse and deepfakes, which you can read more about on our website.

How the Online Safety Act covers trolling and bullying

The Act places a substantial duty of care on social media platforms to protect users from material that may be considered harmful, even if it is technically legal.

After tech companies complained that this made them unfairly liable for material on their platforms, an amendment was added to the Online Safety Act requiring social media sites to provide users with tools to hide material they don’t wish to see.

The Act also means children have to be protected from material that causes “serious trauma”, which includes trolling and cyberbullying.

Other types of material highlighted in the Act include misogyny and content promoting disordered eating.

NSPCC chief executive Sir Peter Wanless said: "We are absolutely delighted to see the Online Safety Bill being passed through parliament. It is a momentous day for children and will finally result in the ground-breaking protections they should expect online."


How social media sites will now handle illegal material

Stricter rules now require platforms to commit to removing all illegal content. This includes: child sexual abuse, extreme sexual violence, people smuggling, material that promotes or facilitates self-harm or suicide, controlling or coercive behaviour, terrorism, the sale of weapons or illegal drugs, and animal cruelty.

In response, social media platforms have to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm

  • prevent children from accessing harmful and age-inappropriate content

  • enforce age limits and age-checking measures

  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments

  • provide parents and children with clear and accessible ways to report problems online when they do arise

Even if the material was not filmed or created in the UK, the platform is required to remove it.

What the Online Safety Act says about encrypted messages

One of the most controversial aspects of the Act is that it contains powers which could, in effect, force messaging services such as WhatsApp and Signal to weaken or break the encryption of messages.

Currently, these services have end-to-end encryption, which ensures nobody other than the sender and intended recipient of a message can read it.

WhatsApp, which is owned by Meta, has warned it may pull its messaging service from British users rather than compromise their privacy under the new law.

However, child protection groups have warned that encrypted messaging is the ‘frontline’ of sexual abuse.


How will the Online Safety Act be regulated?

The new Act will be regulated by communications watchdog Ofcom.

Should social media platforms fail to adequately comply with the Online Safety Act’s regulations, Ofcom has the power to issue fines of up to £18 million or 10% of their global annual revenue, whichever is greater.

Huge platforms, such as Meta and X, could see fines run into the billions of pounds should they be found in breach.

It's a move that has received the backing of public figures, including Emily Clarkson, a self-love and body acceptance influencer who has previously been subjected to horrific online abuse. "For a very long time, if anybody's had any complaints about anything they've experienced on the internet, the attitude, kind of by the public, has been, 'If you don't like it, don't use social media.'

"You know, I've heard that firsthand. But I've also seen it happen so many times to women who've experienced much worse than what I have on the internet. It's victim blaming on a mass scale," she told Cosmopolitan UK.

"I've received abuse like you wouldn't believe. I've had so many rape threats, death threats, unsolicited sexual images, very regularly (...) it's incredibly commonplace. Point at anything and I've probably had it," Clarkson added.

"It is overdue in a sense, but it was always going to be a difficult thing to build the infrastructure for a world that already exists. So, the fact that steps are being taken, so tech companies can't just get away with what they have done in complacency for so long, and to actually have accountability and responsibility, is really exciting."
