DCMS Committee Finds Online Safety Bill Gets the Balance Wrong

Monday, Jan 24th, 2022 (8:23 am)

A new cross-party UK report from the DCMS Select Committee has warned that the Government’s new Online Safety Bill (OSB), which will task Ofcom with tackling “harmful” content online, fails to get the balance right and “neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.”

Much of the online content you see today is governed by a self-regulatory approach, which has often struggled to keep pace with rapid online change. Examples of “harmful” content abound, such as the rise of the ISIS terrorist group, child abuse, state-sponsored propaganda from hostile countries, online bullying, racism and the spread of COVID-19 conspiracy theories. Some of this is already illegal.

The big social media firms (Facebook, Twitter etc.) usually catch up with tackling such problems, but they’re often perceived to be too slow or unwilling to act unless pressured to do so, while other websites seem to exist solely to promote the worst of humanity. The OSB appears set to pressure all sites, from big social media firms to small blogs, to act or face the consequences (e.g. access blocked by ISPs, huge fines and making some people liable for what others may say on their website).

The OSB is the Government’s response. But striking the right balance between freedom of speech and outright censorship is difficult, which is what happens when you attempt to police the common and highly subjective public expression of negative human thought. Faced with such a heavy risk of liability, most sites are likely to become overzealous when filtering user-generated content, or to prevent people from speaking at all.

Suffice to say, balancing all of this against complex issues of context (e.g. people joking about blowing up a city in a video game vs actual terrorists), satire or parody, the application of such rules to non-UK based sites, political speech and the risk from overzealous automated filtering systems – which are only economically viable for the biggest sites, since manually moderating all content on major platforms would be impossible – is going to be a nightmare to get right.

Like it or not, this is not just about policing online platforms, but about policing what YOU all do and say online. Granted, some people go to extremes, but those often cross over into more clearly defined aspects of illegality, while what is being talked about in the OSB extends to the murkily ambiguous area of “lawful but still harmful” content.

Findings of the DCMS Committee Report

The new report warns that the draft OSB would fail to prevent the sharing of some of the most “insidious” images of child abuse and violence against women and girls, while at the same time failing to protect freedom of expression.

The report proposes several amendments to the definition and scope of harms covered by the regime that would bring the Bill into line with the UK’s obligations to freedom of expression under international human rights law.

On top of that, it also recommends that the Government proactively address types of content that are “technically legal“, such as insidious parts of child abuse sequences like breadcrumbing and types of online violence against women and girls like tech-enabled ‘nudifying’ of women and deepfake pornography, by bringing them into scope either through primary legislation or as types of harmful content covered by the duties of care.

Moreover, it found that the current provisions that provide Ofcom with a suite of powers and users with redress are “similarly unclear and impractical … we urge the Government to provide greater clarity within the Bill on how and when these powers should be used to ensure they are both practical and proportionate.”

Julian Knight MP, Chair of the DCMS Committee, said:

“In its current form what should be world-leading, landmark legislation instead represents a missed opportunity.

The Online Safety Bill neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.

Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.

These are matters of important public debate to which we will return as the Bill makes its way through Parliament.”

As usual, the committee concludes with a long list of recommendations (see below), which helps to provide some context for its conclusions. These include, among other things, the need for greater consideration of context when assessing whether content is “harmful“, as well as the need to categorise whether something is “harmful” to adults or only to children.

However, at the end of the day, there’s no escaping the fact that the OSB is an immensely complex piece of legislation, which perhaps reflects the fact that it is trying to police an immensely complex problem. At the same time it’s difficult to see how Ofcom could ever realistically hope to find the resources to adequately police the whole internet.

Suffice to say that many people in the industry are concerned that, in practice, big parts of the new bill may prove unworkable in the real world, or at least not workable without causing significant harm (e.g. too many legitimate pieces of content being removed and Ofcom being swamped with requests).

Recommendations

➤ The Government should redraft the Online Safety Bill to state explicitly that the scope of the framework for addressing “illegal content”, which should be subject to the most stringent moderation measures, specifically applies to existing criminal offences, rather than regulatory offences and civil or administrative wrongs.

➤ We recommend that the Government respond to our concerns about the risk of content and activity that falls below the threshold of outright criminal activity but nonetheless forms part of the sequence for online CSEA [child sexual exploitation and abuse]. One starting point should be to reframe the definition of illegal content to explicitly add the need to consider context as a factor, and to explicitly include definitions of activity like breadcrumbing on the face of the Bill.

➤ We recommend that the Government reframes the language around considerations for freedom of expression to incorporate a ‘must balance’ test so Ofcom can interrogate and assess whether providers have duly balanced their freedom of expression obligations with their decision making.

➤ We recommend that the Government also reframes the definitions of harmful content and relevant safety duties for content that is harmful to children and content that is harmful to adults, to apply to reasonably foreseeable harms identified in risk assessments, and explicitly add the need to consider context, the position and intentionality of the speaker, the susceptibility of the audience and the content’s accuracy. These factors would bring the Bill into line with international human rights law and provide minimum standards against which a provider’s actions, systems and processes to tackle harm, including automated or algorithmic content moderation, should be judged.

➤ The Bill should include non-exhaustive, illustrative lists of preventative and remedial measures beyond takedowns for both illegal and ‘legal but harmful’ content, proportionate to the risk and severity of harm, to reflect a structured approach to content. This could include tagging or labelling, covering, redacting, factchecking, deprioritising, nudging, promoting counter speech, restricting or disabling specific engagement and/or promotional functionalities (such as likes and intra- and cross-platform sharing) and so on.

➤ We recommend that the definition of content that is harmful to adults should explicitly include content that undermines, or risks undermining, the rights or reputation of others, national security, public order and public health or morals, as also established in international human rights law.

➤ We recommend that, in addition to the factors listed above, the definition for content that is harmful to adults should be further clarified to explicitly account for any intention of electoral interference and voter suppression when considering a speaker’s intentionality and the content’s accuracy, and account for the content’s democratic importance and journalistic nature when considering the content’s context.

➤ The Government should add a new Schedule to the Bill providing at least the most relevant types of illegal content and non-exhaustive illustrative lists of proportionate preventative and remedial measures to mitigate and manage risk. It should also provide, in another new Schedule, a detailed procedure for designating new and/or additional offences that constitute illegal content in the Bill through regulations. Alongside the Bill, the Government should issue example Regulations to illustrate how this process would work and ensure it is done consistently post-implementation.

➤ We recommend that the Government produce new Schedules detailing procedures for designating, by regulations, content that is harmful to children and content that is harmful to adults. Any regulations designating types of harm should define the harms and provide non-exhaustive illustrative lists of factors and proportionate preventative and remedial measures.

➤ All regulations making designations under “content that is harmful to children” and “content that is harmful to adults” should be subject to the affirmative procedure. This will provide an important, additional safeguard for freedom of expression, recognising the need for additional parliamentary oversight in this area.

➤ We recommend that the Government should take forward the commitments made by the Prime Minister and work with charities, campaign organisations and children’s advocacy groups to identify, define and address legal but harmful content, such as content that advocates self-harm and types of online violence against women and girls, that are not currently illegal.

➤ We recommend that the Government provide Ofcom with the power to conduct confidential auditing or vetting of a service’s systems to assess the operation and outputs in practice (itself or through an independent third party) in Chapter 3 of the Bill. Alongside the power to request generic information about how “content is disseminated by means of a service”, the Government should also include in Section 49(b) a non-exhaustive list of specific information that may be requested, subject to non-disclosure, including:

• The provider’s objectives and the parameters for a system’s outputs, (such as maximising impressions, views, engagement and so on);

• Their metrics for measuring performance and references of success;

• The datasets on which systems are developed, trained and refined, including for profiling, content recommendation, moderation, advertising, decision-making or machine learning purposes;

• How these datasets are acquired, labelled, categorised and used, including who undertakes these tasks;

• Data on and the power to query a system’s outputs, including to request or scrape information on said outputs given particular inputs.

➤ We also recommend that the online safety regime should require providers to have designated compliance officers, similar to financial services regulation and which we have advocated previously, in order to bake compliance and safety by design principles into corporate governance and decision-making.

➤ We recommend that the Government provide greater clarity about the use of enforcement powers contained in the Bill. First, it should make explicit that these powers apply only to in-scope services.

Second, it should redraft the use of technology notices by more tightly defining the scope and application of the power, the actions required to bring providers to compliance and a non-exhaustive list of criteria that might constitute a test as to whether the use of such power is proportionate, such as:

• The evidential basis for intervention (including the time period this evidence covers);

• The level of risk and severity of harm that exists on the service and the existing systems used to identify and mitigate or manage these risks;

• The implications for human rights, including freedom of expression and user privacy;

• The cost to the service relative to factors such as its user base and revenue.

➤ Third, with regards to business disruption measures, we recommend that the Government provide greater clarity to other services in the supply chain by bringing forward more detailed proposals for how this would work in practice. This should include:

• Time frames and consultation requirements;

• Due consideration for human rights implications, including the unintended coverage of legal content;

• Consideration for the costs to services that might be required to enact the measure; and

• Processes for updating consumers.

➤ The Government should also give consideration to, and evaluate in its response to this Report, whether these powers are appropriately future-proofed given the advent of technology like VPNs and DNS over HTTPS.

➤ We recommend that the Government include a provision in the Bill to mandate publication of a breach notice by a service. This should include details of their breaches against the duty of care and be available to view on the platform.

➤ We recommend that the Government should include a provision in the Bill to clarify that the right of eligible entities to make super-complaints before Ofcom is without prejudice to the right of individuals to access courts and make judicial complaints on a case-by-case basis for breaches of user-to-user and search service providers’ duties of care laid down in the Bill and other acts or omissions that are unlawful under other applicable laws. The Government should amend Clauses 15(3) and 24(3) to impose a duty on providers to operate a complaints procedure that gives users notice of any restriction on their ability to access and use the service, along with the reasons for the restriction.

➤ We recommend that the Government should scrap any plans to introduce a Joint Committee to oversee online safety and digital regulation [this is perceived as duplicating the role of the DCMS Committee].

By Mark Jackson
Mark is a professional technology writer, IT consultant and computer engineer from Dorset (England). He founded ISPreview in 1999 and enjoys analysing the latest telecoms and broadband developments.