DCMS Committee Finds Online Safety Bill Gets the Balance Wrong

Monday, January 24th, 2022 (8:23 am)

A new cross-party UK report from the DCMS Select Committee has warned that the Government’s new Online Safety Bill (OSB), which will task Ofcom with tackling “harmful” content online, fails to get the balance right and “neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.”

Much of the online content you see today is governed by a self-regulatory approach, which has often struggled to keep pace with rapid online change. Examples of "harmful" content are plentiful, such as material from the ISIS terrorist group, child abuse, state-sponsored propaganda from hostile countries, online bullying, racism and the spread of COVID-19 conspiracy theories. Some of this is already illegal.

The big social media firms (Facebook, Twitter etc.) usually do catch up with tackling such problems, but they're often perceived to be too slow or unwilling to act unless pressured to do so, while other websites seem to exist solely to promote the worst of humanity. The OSB appears set to pressure all sites, from big social media firms to small blogs, to act or face the consequences (e.g. being blocked by ISPs, huge fines and, in some cases, personal liability for what others may say on their website).

The OSB is the Government’s response. But striking the right balance between freedom of speech and outright censorship will be difficult, as is inevitable when you attempt to police the common, and highly subjective, public expression of negative human thought. Faced with such a heavy risk of liability, most sites are likely to become overzealous when filtering user-generated content, or to prevent people from speaking at all.

Suffice to say, balancing all of this against complex issues of context (e.g. people joking about blowing up a city in a video game vs actual terrorists), satire or parody, the application of such rules to non-UK based sites, political speech and the risk from overzealous automated filtering systems (only economically viable for the biggest sites, since manually moderating all content on major platforms would be impossible) is going to be a nightmare to get right.

Like it or not, this is not just about policing online platforms, but about policing what YOU all do and say online. Granted, some people go to extremes, but those often cross over into more clearly defined aspects of illegality, while what is being talked about in the OSB extends into the murkily ambiguous area of “lawful but still harmful” content.

Findings of the DCMS Committee Report

The new report warns that the draft OSB would fail to prevent the sharing of some of the most “insidious” images of child abuse and violence against women and girls, while at the same time failing to protect freedom of expression.

The report proposes several amendments to the definition and scope of harms covered by the regime that would bring the Bill into line with the UK’s obligations to freedom of expression under international human rights law.

On top of that, it also recommends that the Government proactively address types of content that are “technically legal“, such as insidious parts of child abuse sequences like breadcrumbing and types of online violence against women and girls like tech-enabled ‘nudifying’ of women and deepfake pornography, by bringing them into scope either through primary legislation or as types of harmful content covered by the duties of care.

Moreover, it found that the current provisions that provide Ofcom with a suite of powers and users with redress are “similarly unclear and impractical … we urge the Government to provide greater clarity within the Bill on how and when these powers should be used to ensure they are both practical and proportionate.”

Julian Knight MP, Chair of the DCMS Committee, said:

“In its current form what should be world-leading, landmark legislation instead represents a missed opportunity.

The Online Safety Bill neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.

Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.

These are matters of important public debate to which we will return as the Bill makes its way through Parliament.”

As usual, the committee concludes with a long list of recommendations (see below), which helps to provide some context for its conclusions. These include the need for greater consideration of context when assessing whether content is “harmful“, as well as the need to categorise whether something is “harmful” to adults or only to children.

However, at the end of the day, there’s no escaping the fact that the OSB is an immensely complex piece of legislation, which perhaps reflects the immensely complex problem it is trying to police. At the same time, it’s difficult to see how Ofcom could ever realistically hope to find the resources to adequately police the whole internet.

Suffice to say that many people in the industry are concerned that, in practice, big parts of the new bill may prove unworkable in the real world, or at least not workable without causing significant harm (e.g. too many legitimate pieces of content being removed and Ofcom being swamped with requests).

Recommendations

➤ The Government should redraft the Online Safety Bill to state explicitly that the scope of the framework for addressing “illegal content”, which should be subject to the most stringent moderation measures, specifically applies to existing criminal offences, rather than regulatory offences and civil or administrative wrongs.

➤ We recommend that the Government respond to our concerns about the risk of content and activity that falls below the threshold of outright criminal activity but nonetheless forms part of the sequence for online CSEA [child sexual exploitation and abuse]. One starting point should be to reframe the definition of illegal content to explicitly add the need to consider context as a factor, and to explicitly include definitions of activity like breadcrumbing on the face of the Bill.

➤ We recommend that the Government reframes the language around considerations for freedom of expression to incorporate a ‘must balance’ test so Ofcom can interrogate and assess whether providers have duly balanced their freedom of expression obligations with their decision making.

➤ We recommend that the Government also reframes the definitions of harmful content and relevant safety duties for content that is harmful to children and content that is harmful to adults, to apply to reasonably foreseeable harms identified in risk assessments, and explicitly add the need to consider context, the position and intentionality of the speaker, the susceptibility of the audience and the content’s accuracy. These factors would bring the Bill into line with international human rights law and provide minimum standards against which a provider’s actions, systems and processes to tackle harm, including automated or algorithmic content moderation, should be judged.

➤ The Bill should include non-exhaustive, illustrative lists of preventative and remedial measures beyond takedowns for both illegal and ‘legal but harmful’ content, proportionate to the risk and severity of harm, to reflect a structured approach to content. This could include tagging or labelling, covering, redacting, factchecking, deprioritising, nudging, promoting counter speech, restricting or disabling specific engagement and/or promotional functionalities (such as likes and intra- and cross-platform sharing) and so on.

➤ We recommend that the definition of content that is harmful to adults should explicitly include content that undermines, or risks undermining, the rights or reputation of others, national security, public order and public health or morals, as also established in international human rights law.

➤ We recommend that, in addition to the factors listed above, the definition for content that is harmful to adults should be further clarified to explicitly account for any intention of electoral interference and voter suppression when considering a speaker’s intentionality and the content’s accuracy, and account for the content’s democratic importance and journalistic nature when considering the content’s context.

➤ The Government should add a new Schedule to the Bill providing at least the most relevant types of illegal content and non-exhaustive illustrative lists of proportionate preventative and remedial measures to mitigate and manage risk. It should also provide, in another new Schedule, a detailed procedure for designating new and/or additional offences that constitute illegal content in the Bill through regulations. Alongside the Bill, the Government should issue example Regulations to illustrate how this process would work and ensure it is done consistently post-implementation.

➤ We recommend that the Government produce new Schedules detailing procedures for designating, by regulations, content that is harmful to children and content that is harmful to adults. Any regulations designating types of harm should define the harms and provide non-exhaustive illustrative lists of factors and proportionate preventative and remedial measures.

➤ All regulations making designations under “content that is harmful to children” and “content that is harmful to adults” should be subject to the affirmative procedure. This will provide an important, additional safeguard for freedom of expression, recognising the need for additional parliamentary oversight in this area.

➤ We recommend that the Government should take forward the commitments made by the Prime Minister and work with charities, campaign organisations and children’s advocacy groups to identify, define and address legal but harmful content, such as content that advocates self-harm and types of online violence against women and girls, that are not currently illegal.

➤ We recommend that the Government provide Ofcom with the power to conduct confidential auditing or vetting of a service’s systems to assess the operation and outputs in practice (itself or through an independent third party) in Chapter 3 of the Bill. Alongside the power to request generic information about how “content is disseminated by means of a service”, the Government should also include in Section 49(b) a non-exhaustive list of specific information that may be requested, subject to non-disclosure, including:

• The provider’s objectives and the parameters for a system’s outputs, (such as maximising impressions, views, engagement and so on);

• Their metrics for measuring performance and references of success;

• The datasets on which systems are developed, trained and refined, including for profiling, content recommendation, moderation, advertising, decision-making or machine learning purposes;

• How these datasets are acquired, labelled, categorised and used, including who undertakes these tasks;

• Data on and the power to query a system’s outputs, including to request or scrape information on said outputs given particular inputs.

➤ We also recommend that the online safety regime should require providers to have designated compliance officers, similar to financial services regulation and which we have advocated previously, in order to bake compliance and safety by design principles into corporate governance and decision-making.

➤ We recommend that the Government provide greater clarity about the use of enforcement powers contained in the Bill. First, it should make explicit that these powers apply only to in-scope services.

Second, it should redraft the use of technology notices by more tightly defining the scope and application of the power, the actions required to bring providers to compliance and a non-exhaustive list of criteria that might constitute a test as to whether the use of such power is proportionate, such as:

• The evidential basis for intervention (including the time period this evidence covers);

• The level of risk and severity of harm that exists on the service and the existing systems used to identify and mitigate or manage these risks;

• The implications for human rights, including freedom of expression and user privacy;

• The cost to the service relative to factors such as its user base and revenue.

➤ Third, with regards to business disruption measures, we recommend that the Government provide greater clarity to other services in the supply chain by bringing forward more detailed proposals for how this would work in practice. This should include:

• Time frames and consultation requirements;

• Due consideration for human rights implications, including the unintended coverage of legal content;

• Consideration for the costs to services that might be required to enact the measure; and

• Processes for updating consumers.

➤ The Government should also give consideration to, and evaluate in its response to this Report, whether these powers are appropriately future-proofed given the advent of technology like VPNs and DNS over HTTPS.

➤ We recommend that the Government include a provision in the Bill to mandate publication of a breach notice by a service. This should include details of their breaches against the duty of care and be available to view on the platform.

➤ We recommend that the Government should include a provision in the Bill to clarify that the right of eligible entities to make super-complaints before Ofcom is without prejudice to the right of individuals to access courts and make judicial complaints on a case-by-case basis for breaches of user-to-user and search service providers’ duties of care laid down in the Bill and other acts or omissions that are unlawful under other applicable laws. The Government should amend Clauses 15(3) and 24(3) to impose a duty on providers to operate a complaints procedure that gives users notice of any restriction on their ability to access and use the service, along with the reasons for the restriction.

➤ We recommend that the Government should scrap any plans to introduce a Joint Committee to oversee online safety and digital regulation [this is perceived as duplicating the role of the DCMS Committee].

By Mark Jackson
Mark is a professional technology writer, IT consultant and computer engineer from Dorset (England). He also founded ISPreview in 1999 and enjoys analysing the latest telecoms and broadband developments.