Will UK’s online safety bill protect children from adult material?


The online safety bill is due to become law this year, imposing a duty of care on tech companies to protect children from harmful content. However, campaigners and peers are calling for the legislation’s provisions on pornography to be toughened. Here is what the act proposes to do about adult material.

Will the online safety bill prevent children from accessing pornography?

The bill requires all pornography websites, such as Pornhub, to ensure children do not encounter their content. This will require age-checking measures. The legislation refers to stringent age verification – checking a user’s age via government ID or an authoritative data source such as a person’s bank – as a means of doing so. Breaches of the act carry the threat of a fine of up to 10% of a company’s global turnover or, in extreme cases, blocking a website altogether.

What are the rules now?

MPs have described the legal approach to pornography in the UK as a “loose patchwork” comprising more than a dozen laws. Under the Obscene Publications Act, it is a criminal offence to publish work deemed “obscene”, and under the Criminal Justice and Immigration Act it is illegal to possess an “extreme” pornographic image. It is also an offence to make, possess or distribute indecent images of a child.

The primary regulator of legal pornography offline is the British Board of Film Classification, which gives pornography age ratings – R18 for the most extreme but legal content, or 18 – but it has no control over online content.

Ofcom, the communications watchdog, already has the power to regulate UK-based “video-sharing platforms” such as TikTok, Snapchat and OnlyFans. These platforms are required to protect under-18s from videos containing R18 material such as pornography.

The age appropriate design code was introduced in 2021 and is designed to prevent websites and apps from misusing children’s data. Under its terms, social media platforms would be breaching the code if their algorithms served adult material to under-18s.

How will pornography websites prevent children from accessing adult material?

Age verification has been a troublesome issue for the government. Age-checking for pornography was announced as a Conservative policy in 2015. However, plans to introduce a nationwide age verification system for online pornography were abandoned in 2019.

The bill will not mandate use of specific technologies for age checking, although Ofcom will issue codes of practice on age assurance, which is the umbrella term for assessing the age of people online. Age verification is the term for the toughest measures, such as requiring proof of official ID.

One solution is to use age verification companies that vet a user’s age – via a range of methods including checking official ID or bank statements – and then notify the porn provider that the person wishing to access its service is over 18, without revealing that person’s identity.

Ofcom has said it will launch a consultation on protecting children from pornographic content – including on user-generated platforms such as OnlyFans – in the autumn.

Will children be protected from adult material on mainstream social media platforms?

The government has indicated that there will be clear instructions to mainstream social media sites and search engines to prevent children accessing pornographic content on their services. The bill requires sites to prevent children encountering what it terms “primary priority content”. Because it qualifies as a “user-to-user” service, subscription site OnlyFans is also covered by this part of the bill.

The official definition of primary priority content will not be known until it is set out in a statutory instrument published after the bill becomes law. However, pornography is expected to be on that list, and it was named as primary priority content by the previous culture secretary, Nadine Dorries, in a parliamentary statement last year. According to a timeline published by Ofcom, though, it could be more than 18 months after the bill is passed before these provisions come into effect.

Social media sites and legal pornography sites will also be required to shield all users from illegal pornography such as obscene content and child sexual abuse material.

What does the bill do about non-consensual image sharing?

The bill will update the law on sharing intimate images without someone’s consent. In England and Wales there will be a new “base offence”, where it is an offence to share an intimate image of a person if they do not consent – and the perpetrator does not believe they have consented. Currently, these offences apply if the image is shared in order to cause humiliation or distress.

The base offence will now apply regardless of the motivation, including sharing it as a joke, for social status, financial gain or “where there is no motivation at all”.
