The UK Government’s controversial new Online Safety Bill is facing more criticism today after Dan Squires KC and Emma Foubister of Matrix Chambers issued a legal opinion on its “prior restraint” clause. The opinion suggests that there is “likely to be significant interference with freedom of expression that is unforeseeable and which is thus not prescribed by law”.
The confusing and ugly mess of legislation that is the OSB was originally designed to clamp down on “harmful” internet content (i.e. via fines, website blocks by broadband ISPs and other sanctions), such as terrorist content, bullying, racism and hate speech, child abuse, self-harm and suicide imagery, and conspiracy theories that incite violence. The aim was a noble one and something that most people could generally understand.
However, over the years, the OSB has been repeatedly expanded and now risks placing a massive administrative and legal burden on even smaller sites / content providers, which could have a chilling effect on freedom of expression as such providers move to limit their liability by restricting speech. Not to mention that some of its technical measures may even be outright unworkable for many sites and services.
The latest example of this appears to come from the Open Rights Group (ORG), which has received legal advice from Matrix Chambers on Clause 9(2) of the Bill. The Clause places a duty on online platforms – such as Facebook and Twitter, among many others – to prevent users from “encountering” certain “illegal content”.
At present the law allows individuals to post without being restricted by the government or tech platforms, but if content is reported as illegal or breaches the site’s terms and conditions, the platform would take it down and would not then be liable for the content.
The new Clause turns that premise upside down. Under Clause 9(2)(a), tech companies are required to “prevent users encountering” any illegal content. The only way to ensure this is to stop such content from ever appearing on the platform in the first place (the economics of the internet mean this can only viably be done via automation / AI). This is said to represent “a sea change in the way public communication and debate are regulated in this country”.
The online platform is made strictly liable for policing the illegal content, with the threat of large fines and imprisonment of its management, if they fail to do so. This will incentivise companies to err on the side of caution and block content rather than risk sanctions. Such risks have been expressed before, albeit not previously with the backing of a professional legal opinion.
The platform must make its own determinations of what is illegal, creating the risk that “a significant number of lawful posts will be censored” (i.e. people who aren’t the police or the courts generally don’t get this stuff right). The Opinion highlights the particular challenges of identifying whether content that could appear to relate to fraud, terrorism, or immigration crimes is actually illegal.
For example: “Assisting unlawful immigration under section 25 of the Immigration Act 1971 is listed in Schedule 7. The government has expressly indicated that this provision may be used to prevent people from posting photos of refugees arriving in the UK by boat if they ‘show that activity in a positive light’. It is difficult to predict how an AI algorithm will decide what constitutes showing such an image in a “positive light”. What if the image was posted without any text? Removing these images prior to publication, without due process to decide whether they are illegal, may have a chilling effect on a debate of public importance.”
The Opinion notes that there is a significant risk of interference with freedom of expression which is not “prescribed by law”. In order to be prescribed by law, the Act must (a) have “some basis in domestic law” and (b) be “compatible with the rule of law”, which means that it should comply with the twin requirements of “accessibility” and “foreseeability”.
The law “must afford adequate legal protection against arbitrariness and accordingly indicate with sufficient clarity the scope of discretion conferred on the competent authorities and the manner of its exercise.” The Opinion concludes that Clause 9(2)(a) fails on all these counts and is therefore not itself lawful.
Dr Monica Horten, Policy Manager at the ORG, said:
“It’s one small clause with a huge impact. It up-ends the existing legal order on tech platforms. This legal opinion confirms that these measures could compel online platforms to remove content before it has even been posted, under threat of fines or imprisonment. There are no provisions for telling users why their content has been blocked.
As well as being potentially unlawful, these proposals threaten the free speech of millions of people in the UK. It is yet another example of the government expecting Parliament to pass a law without filling in the detail.
This stark warning cannot be ignored. As the Bill reaches its final parliamentary stages, it’s vital that peers press the government on whether it will require tech companies to pre-screen social media posts, and how, in those circumstances, the Bill could protect online freedom of expression.”
In addition, the Bill currently lacks a requirement for individuals to be notified that their content has been blocked, sets no timescale for responding to complaints, and does not oblige companies to give users a reason why their content has been removed.
The Opinion notes: “There do not appear to be any enforcement processes for a failure adequately to address complaints and no entitlement to compensation” and that “the complaints process is very likely to be ineffective”.
However, in fairness, if such provisions did exist then they would probably exacerbate the problem by increasing an already intolerable level of liability for many sites – opening them to attack from all sides. Figuring out compensation for not being allowed to post something on a platform that the user doesn’t pay to use would be an especially interesting debate.
The ORG has shared its findings with peers in parliament, although it remains to be seen whether either the Government or opposition MPs – most of whom have long supported the Bill – will make any significant changes to address these problems.
So far, MPs appear to have only paid lip service to such concerns, while at the same time continuing to extend the Bill’s powers. As such, we still seem to be heading for a Draconian censorship regime via the backdoor of cost and liability shock. Meanwhile, Ofcom will be hard-pressed to find enough resources to adequately police such a large sector. The only winners in all this seem to be the lawyers.
Nothing about safety and everything about censorship for the state
VPN yourself out of the UK if this sees light of day
We all know where this is headed. What started out as a “clampdown on ‘harmful’ internet content” will end up as an excuse to turn the UK even more towards an Orwellian government system policing everything we look at, everything we do, everything we spend our money on, until it goes down the road of predicting who will be likely to endanger “the system” or “what’s best for the population” and then starts policing our very thoughts too.
Can’t disagree with you there. Except that the moment they start to monitor/censor my leased line, I will stop using the internet when the contract ends. I have a VPN, as I was given one with my line, but I rarely use it. I was told only illegal sites would flag up on the line, like ammo/narcotics sites.
“The only winners in all this seem to be the lawyers.”
No, the only winners will be politicians, who will suddenly be able to control what people see and say via the internet. As with every other power of government, it will be implemented poorly against the stated requirements, but then abused to serve the personal interests of the governing party, to hide wrongdoing, to block criticism, and to silence dissenting voices. It’s quite horrifying that the MPs of both parties support this; equally, it’s unsurprising.
“The aim was a noble one”
The road to hell is paved with good intentions.
Can the Online Safety Bill be legally challenged in the courts, either before or after it’s passed? We need a judicial review. (Yes, I know UK courts cannot strike down primary legislation, but an official court ruling that the law is illegal would be an important step.)
The US constitution is amazing at preventing this kind of state tyranny. Unfortunately, there is no such thing in the UK.
Well that escalated quickly.
I think this bill is abhorrent, unworkable and sums up everything that’s wrong with the current government. But when you read the loonies literally posting their unrelated insane drivel in the comments above, often inspired by Russian troll farms, you sort of realise why these laws don’t get shot down at first reading.
What insane drivel are you talking about?
The reason why state tyranny happens is because people fail to see the full picture. Censorship on this site does not help.
If you stop people from talking, they will scream
If you stop people from screaming, they will burn things down
There are still people brainwashed to believe the conspiracy theory that a bioweapon (legal definition: created through gain of function in a lab) came from some bat, rather than the logical conclusion that the US funded research in china through the NIH happening in a lab with poor safety standards.
This is why free speech must be absolute
Why do people continue to post off topic conspiracy theory nonsense here? Hopefully MarkJ will delete this too.
In a post about a censorship politics, it is very relevant to point out the evils in censorship
The bat transmission is the conspiracy but the lab leak is factual, regardless of whether your software has been updated or not. Documents have been declassified long ago