Stephen Kinsella

Reflections on the draft Online Safety Bill

Updated: Jul 13, 2021

There’s much to welcome in the UK government's publication of a draft Online Safety Bill. More than two years have passed since the Online Harms White Paper appeared in April 2019, and the draft Bill marks an important step towards much-needed regulation.


The draft Bill contains many of the building blocks for an effective, evidence-based regulatory regime for social media platforms. These include the introduction of duties of care, and making Ofcom the regulator with the power to conduct audits of platforms’ systems and processes and to produce codes of practice.


Clean Up The Internet does, however, have some reservations about the current draft, which we summarise below. We would expect a draft Bill to be imperfect. The government’s decision to publish the Bill in draft, and to initiate a process of pre-legislative scrutiny, provides a welcome opportunity to address such imperfections. We intend to engage thoroughly with the process of pre-legislative scrutiny, which is due to start shortly.

Most significantly, we note the lack of any specific proposals in the draft Bill to address the issue of anonymity. This is a significant omission which would weaken the Bill as a whole in at least three respects:

1. The platforms’ current approach to anonymity is a key design flaw which fuels abuse and disinformation

There’s a significant body of evidence that the major social media platforms’ approach to anonymity is a key factor fuelling both online abuse and the spread of disinformation. These are two of the key “harms” which the government’s Online Safety agenda seeks to reduce. Tackling design issues such as the misuse of anonymity would reduce the amount of harmful activity on the platforms, and give users more options to protect themselves. Leaving such design flaws unaddressed by the Bill risks leading to over-reliance on content moderation, which is much more challenging to get right and poses greater trade-offs regarding freedom of expression.

2. Failing to tackle misuse of anonymity limits the effectiveness of other measures in the Bill, such as the reliance on platforms’ own Terms & Conditions

The draft Bill sees more consistent enforcement of the largest (“Category One”) platforms’ own Terms & Conditions as key to reducing harmful behaviour. However, at present users are able to exploit a laissez-faire approach to anonymity and verification to evade T&Cs. The ultimate enforcement sanctions in a platform’s T&Cs are suspending or banning an account. In the absence of any new measures regarding anonymity, it remains extremely simple for a banned user to start a new account and continue their harmful behaviour, including harassing other users.


3. A lack of measures on anonymity will weaken the credibility of the legislation in the eyes of the general public

Numerous opinion polls have found that the public sees harm from anonymous accounts as a key problem with social media platforms. This is informed by their own experience of using such platforms, and their awareness of high-profile figures who experience online abuse, such as politicians or footballers. The public will struggle to understand why no measures have been introduced to reduce the harm from anonymous accounts, and will have less confidence in the new Online Safety regime as a result.


Clean Up The Internet will be making the case for correcting this omission to the pre-legislative scrutiny committee. We are very encouraged by the growing number of parliamentarians of all parties who recognise the importance of strengthening the Bill to address the misuse of anonymity, and will work with them to develop detailed proposals on how the draft Bill could be amended to add such measures.

Alongside this priority focus, we will also work constructively with partners in civil society to support other proposals to improve the draft Bill. These are likely to include:

  • Ensuring that “democratic harms” such as disinformation and electoral manipulation are adequately covered. Our priority proposal, to restrict the misuse of anonymity, would make a significant contribution in this area, by limiting the ability of disinformation networks to use large numbers of fake accounts to distort online conversations. However, this is unlikely to address the whole problem on its own. Disinformation and electoral manipulation should be brought explicitly within scope.

  • Ensuring that there is sufficient overall focus on requiring platforms to act to improve the health of the online public sphere - for example, by addressing the harm caused by the cumulative impact of trolling and abuse, and the disproportionate impact on the freedom of expression of already under-represented groups.

  • Pushing for a clearer separation of powers between government ministers and Ofcom. The present draft includes a power for the Secretary of State to direct Ofcom to modify its codes of practice to ensure they “reflect government policy”. This hands too much power to the government of the day, with worrying implications for the independence of the regulator and for freedom of expression online.
