A new government-commissioned Review, "Creating a Safer World – the Challenge of Regulating Online Pornography", recommends overhauling the government's approach to online pornography, giving the state more power to police sexual content and its use. Conservative peer Baroness Gabby Bertin led the Review, scrutinising the pornography currently available and the legal response to it.
Waving it through
National news outlets have welcomed the Review, and we are yet to see any public figures question its recommendations. The Guardian, Independent, and ITV describe the Review and give it a broad thumbs-up. BBC News dedicated two sentences to a worry about policing sexual tastes, quoting content creator Madelaine Thomas, but otherwise praised the recommendations.
The Daily Mail headlined their piece with the shocking claim "Teenage boys are asking teachers how to CHOKE girls during sex," a claim repeated in the Independent, on the BBC, and on Woman's Hour by the BBC's Home Editor. The claim appears multiple times in the Review itself, but the actual question a teacher reported being asked was: "How can I choke someone safely?" Every reference to this claim so far has dropped the word "safely". It's a small difference, but it completely changes the meaning. The alteration of the quote, and its repetition, is an effective way of heightening public panic about our children, easing the jump to an ill-justified extension of state power.
Government ministers are promising urgent action, and the Secretary of State for Science, Innovation and Technology has indicated a readiness to use new powers to remove illegal content and punish providers, adding: "if I have to adapt the law in response to any gaps that emerge in these powers then of course I'll act as swiftly as I can."
Baroness Bertin might have expected resistance to the Review, but judging by the reception so far, it seems it could be adopted rapidly with little pushback.
Policing pornography
Currently, pornography is regulated through the Online Safety Act 2023, the Criminal Justice and Immigration Act 2008, and the Obscene Publications Act 1959. The British Board of Film Classification (BBFC) restricts publication of offline content, and Ofcom deals with online content. The Review recommends expanding the remit of these bodies: the BBFC would moderate online content, passing its decisions to Ofcom to enforce.
The Review recommends a new legislative review to ensure "effective prosecution and enforcement" of possession offences. Currently, police largely target individuals for possession of illegal pornography only when that person is suspected of a different crime. This approach is likely to generate a distorted impression of who uses illegal pornography. The Review recommends:
Illegal pornography offences should be accurately tracked in the police database and a nationally agreed and consistent approach should be implemented across police forces in the UK to better record incidences of these crimes. This would improve the understanding of links between illegal pornography and other offences, particularly those of a sexual nature.
What would this “consistent approach” look like? We don’t know, but given the Review’s assumption that pornography is linked to violence, and given how Stop and Search powers are used, we should be worried about how police will target and investigate pornography possession.
Broad and vague
The Review proposes making some content illegal (pornography depicting choking and incest) and making other content harder to find. The aim is to make anything the BBFC would not classify offline illegal online. This wouldn't just apply to choking and incest porn, but also to anything currently considered "legal but harmful." This category is broad and vague, and I am pessimistic about what the government may decide should count. These decisions will affect people on both sides of the screen.
Parts of the Review acknowledge existing injustices, including the stigmatisation and de-banking of pornography performers.
De-banking is when your bank closes your account without your consent. This can be devastating, preventing people from paying rent, buying food, and receiving wages and essential financial support. It is all too common for sex workers to be de-banked, even when their work is legal. Unfortunately, the Review suggests that only "illegitimate debanking" need be combatted. Apparently de-banking some workers is okay, if they work in the wrong kind of porn.
Current policy relating to pornography sits in multiple government departments. The Review recommends putting pornography squarely under the Home Office. This treats pornography as an essentially criminal matter. We should be very concerned about a policy shift that treats sexual entertainment, sexual labour, and sexual desire as matters of national security.
Dangerous myths
The message throughout this Review is: pornography is dangerous, and though we don’t have the data (yet) to prove how harmful it is, we need to act to protect our children.
When we treat sexual material as uniquely dangerous and dirty, we reinforce the stigmatisation of sex workers and misogyny more broadly. The myth that sexual desire is vulgar, and that good girls don’t engage with pornography, is a key element of the patriarchal standards that drive slut-shaming and violence against women, especially sex workers.
Many women do watch pornography. According to Pornhub's 2024 statistics, women users are more likely than men to search for 'hardcore' and 'bondage'. We must not perpetuate the myth that real women only desire romantic, vanilla sex, and only consume softcore, high-brow erotica. Women's tastes are varied, and the urge to police what counts as virtuous sexuality only empowers oppressors.
Giving the state the role of differentiating good, feminist content from bad, misogynistic content will never end well for marginalised people. The Audiovisual Media Services Regulations 2014 drew up a list of content deemed violent or obscene and banned it from UK-produced online pornography. This disproportionately impacted queer pornography and representations of women's pleasure (sparking the spectacular face-sitting protests). Why should we think that next time they'll get it right?
Similarly, utilising 'proactive technology', as the Review advocates, is asking for trouble. Technology replicates the prejudices of the society that designs it. For example, Instagram's content-moderation algorithms have produced multiple scandals. Sex workers find their posts removed and their accounts limited even when they have broken no rules. Black people, fat people, and queer people have had their photos removed while comparable posts from cis, straight, white people were left up.
Keeping us safe from pornography?
There is plenty wrong with online pornography today, just as there is plenty wrong with Hollywood movies, television series, advertising, and all kinds of media.
Expanding state powers, and equating pornography with violence, will put us in more danger.