
Review of The Draft Online Safety Bill from the perspective of SPITE victims

Summary of findings by Mishcon de Reya LLP and Queen Mary Legal Advice Centre, as part of the SPITE Project

Published: 8 February 2022

Foreword

Today, 8 February 2022, is Safer Internet Day 2022, which has the theme 'All fun and games? Exploring respect and relationships online'. With the draft Online Safety Bill (the Bill) progressing through Parliament and attracting much scrutiny from key stakeholders, the public and political parties, Safer Internet Day takes on a new importance this year – especially given the theme of online relationships.

The Bill seeks to promote online safety in the UK through online content regulation, with the objective of better protecting the vulnerable, and imposes duties on internet service providers. If enacted, the Bill has the potential to dramatically change the face of the internet, the experience of users, the duties of service providers and the rights of those who have fallen victim to internet-based offences.

With this potential in mind, in 2021 the Queen Mary Legal Advice Centre and Mishcon de Reya LLP established a Policy Clinic to consider the complexity and breadth of the Bill specifically through the lens of victims of image-based sexual abuse (colloquially known as "revenge porn"), with the aim of analysing how the Bill might assist these victims and providing recommendations for its improvement.

Our decision to analyse the Bill specifically through the lens of these victims stems from the years of work both the Queen Mary Legal Advice Centre and Mishcon de Reya LLP have undertaken to provide legal advice and education to the community as part of the SPITE (Sharing and Publishing Images To Embarrass) project. The two organisations have collaborated on the project since the introduction of the Criminal Justice and Courts Act 2015. The SPITE project provides free legal advice to victims and undertakes community-based public legal education in secondary schools on image-based sexual abuse. The project and Policy Clinic's joint expertise is founded on front-line project experience and a wealth of legal knowledge in areas such as criminal law, reputation protection, data protection and civil law.

Today the Queen Mary Legal Advice Centre and Mishcon de Reya LLP publish a brief note on the findings of the Policy Clinic – summarising the weaknesses of the Bill, how the Bill as it stands affects SPITE victims, and the amendments the Policy Clinic proposes to strengthen protections and address the Bill's limitations. Nonetheless, we should state that even in its current form the Bill is a welcome step forward – we are simply of the view that it can be even better.

The Mishcon Policy Clinic team are: Harry Eccles-Williams, Joshua Edwards, Liz Barrett, Sophie Hollander and Saba Tavakoli; the Queen Mary Legal Advice Centre Policy Clinic team are: Frances Ridout, Maria Padczuk, Isabel Hope Tucker, Heloise Anne-Sophie Lauret, Xhulia Tepshi, Nikela Allidri and Gerasimos Ioannidis.

To read more about the SPITE project, or if you have been the victim of image-based sexual abuse, visit http://www.lac.qmul.ac.uk/clients/advice/revenge-porn-free-legal-advice/.

Summary of findings

The objective of the Online Safety Bill (the Bill) is to promote online safety in the UK. The Bill imposes duties of care on regulated services and grants OFCOM new responsibilities to protect people online, whilst balancing the right to freedom of expression against the right to privacy.

We note, and largely agree with, the broader concerns with the Bill raised by interest groups, politicians and the public, including that: it does not go far enough to protect the individual user; it places too much reliance on organisations to monitor and remove offending content; it is too broad and leaves much open to interpretation; and it is not clear how effective the enforcement regime will be.

The purpose of our review, however, was to look at the Bill from the perspective of SPITE[1] victims, and to propose modest amendments that would specifically benefit them. We are of the view that the amendments below would do this, whilst not having a material impact on freedom of speech or an unmanageable impact on business.

Definitions

  • The terms "illegal content" in section 41[2], "content harmful to children" in section 45 and "content harmful to adults" in section 46 do not explicitly include image-based sexual abuse or violence against women and girls ("VAWG"). The Online Harms Consultation Response indicates that these terms may include violent and/or pornographic content, but there is a lack of certainty around whether this is the case.
    • Image-based sexual abuse should be explicitly defined and included within the scope of harmful content under the Bill[3].
    • The Bill should explicitly recognise VAWG in all its forms (this has also been recommended by a Committee report published on 24 January 2022[4]).
  • Commercial pornography websites should be specifically named within the Bill and subject to a higher level of scrutiny by OFCOM, which should be empowered to issue take-down notices.
  • The Bill should make ‘cyber flashing’ a specific criminal offence as recommended by the Law Commission[5].
  • The tier system for regulated services (which is currently based on the number of users and functionalities of the service provider) should be revised to ensure that harmful and illegal content is caught on more sites. There should be a rebuttable presumption that websites that host pornographic material[6] are 'Category 1' services[7].

Guidance

  • The Bill should provide further guidance under section 106 on which entities are eligible to make a super-complaint to OFCOM. A suitable entity needs to be appointed to represent the interests of SPITE victims[8].
  • There should be a duty on OFCOM to assess the risks of harm to particular groups of users, such as SPITE victims, and to assess how such groups may be disproportionately exposed to online harms.
  • The guidance on risk assessments under section 62 of the Bill should set out how harm arises from image-based sexual abuse content and how OFCOM plans to regulate service providers that provide a platform for such content.

Online Anonymity

  • The Bill should address issues of online anonymity, which is a major concern for SPITE victims. Online anonymity often makes it very difficult to prove who is behind image-based sexual abuse (even where the victim knows there is only one person it could be).
  • A number of sensible recommendations have been made in this regard; those in the 10 December 2021 Joint Committee report[9] are a good starting point.[10] These include the proposal by Siobhan Baillie MP for social media platforms: "First, give all social media users a choice: the right to verify their identity. Secondly, give social media users the option to follow or be followed only by verified accounts. Thirdly, make it clear which accounts are verified."

Warning notices and collecting information

  • The Bill should require OFCOM to collect information linked to image-based sexual abuse (e.g. by consulting QMLAC and drawing on relevant data derived from its clients).[11]
  • The skilled persons' report under section 74 of the Bill should be prepared in consultation with a professional with relevant experience in developing algorithms to identify and remove image-based sexual abuse content.[12]
  • The provisions relating to OFCOM's power to issue a technology warning notice for content (sections 63 to 69 of the Bill) should be amended to include specific reference to image-based sexual abuse content.[13] OFCOM should require regulated services to use accredited technology to identify and remove this content.[14]

Media literacy

  • The duty to promote media literacy under section 103 of the Bill could be expanded to outline the specific goals it aims to achieve, e.g. removing terrorism, child sexual exploitation and abuse (CSEA) and image-based sexual abuse content.[15]
  • OFCOM should be required to work with schools to promote media literacy from an early age, with specific reference, where appropriate, to image-based sexual abuse content (e.g. by way of initiatives such as SPITE for Schools, led by Mishcon and QMLAC).
  • The Bill should be expanded to list the type, frequency and outreach objectives (e.g. number and demographic of people targeted) of media literacy campaigns that OFCOM must implement. Guidance about the evaluation of educational initiatives should also be included.

Footnotes

[1] By SPITE victims we mean victims of image-based sexual abuse who may engage with the SPITE (Sharing and Publishing Images To Embarrass) Project at the Queen Mary Legal Advice Centre, http://www.lac.qmul.ac.uk/clients/advice/revenge-porn-free-legal-advice/.

[2] Specifically, section 41(5) should include explicit reference to image-based sexual abuse and VAWG, and to content which amounts to them.

[3] See the Digital, Culture, Media and Sport Committee report (n 4), recommendations 13 and 14 regarding new Schedules for types of illegal content and harmful content. This could also include reference to pornography websites.

[4] The Draft Online Safety Bill and the legal but harmful debate, Digital, Culture, Media and Sport Committee, Eighth Report of Session 2021–22, 24 January 2022, recommendations 10, 13 and 14, https://publications.parliament.uk/pa/cm5802/cmselect/cmcumeds/1039/summary.html

[5] Modernising Communications Offences, 20 July 2021, Law Com No 399.

[6] This echoes issues flagged by the Joint Committee on the Draft Online Safety Bill, Report of Session 2021–22, Draft Online Safety Bill, HC 609, paras 202–4, 235–7, and the Digital, Culture, Media and Sport Committee report (n 4), para 11.

[7] Section 59 of the Bill sets out OFCOM's duty to establish a register of particular categories of regulated services for the purpose of determining the threshold conditions which will apply to each category. All services in Category 1 will be subject to a higher level of regulation. The Bill must therefore include wording under Schedule 4 to ensure that service providers on which image-based sexual abuse content is shared will fall under Category 1 regardless of their size or number of users, in order for harmful and illegal content to be better identified and regulated.

[8] Super-complaints enable eligible entities to make complaints to OFCOM regarding the conduct of a regulated service, if such conduct presents a material risk of harm. However, it is unclear what the definition of an 'eligible entity' entails (section 106(3)). The Bill sets out that an eligible entity must be a body representing the interests of users of regulated services. The Bill should include wording to specify whether such representatives must be registered charities or organisations, or whether a group of individuals would also be able to make super-complaints. SPITE victims are individuals who could benefit from the right to submit super-complaints; however, the Bill does not currently enable this, so it may be necessary to appoint an entity that represents the interests of SPITE victims accordingly.

[9] Joint Committee report (n 6), paras 92–94.

[10] ibid, para 87; also recommended in the Social Media Platforms (Identity Verification) Bill proposed by Siobhan Baillie MP.

[11] Wording to this effect should be included in sections 99 (research about users' experiences of regulated services), 100 (OFCOM's transparency reports), 101 (OFCOM's report about researchers' access to information) and 102 (reports by OFCOM). OFCOM should consider appointing a SPITE representative to its advisory committee on disinformation and misinformation under sections 98(3)(a) ("persons representing the interests of United Kingdom users of regulated services") and 98(3)(c) ("persons with expertise in the prevention and handling of disinformation and misinformation online").

[12] This can be achieved by amending sections 74(3) and (4) to include wording to this effect.

[13] This is in order that SPITE victims are afforded the same protections as CSEA victims under these sections as currently drafted. It can be achieved by inserting "image-based sexual abuse content" into the following subsections: 63(2); 63(4)(c); 64(5)(a) to (c); 65(5)(a) to (c); and 67(5).

[14] This can be achieved by adding the following additional subsections under sections 64 and 65 (an illustrative sketch of how such identification technology can work in practice follows the proposed wording below):

  • Section 64(4)(c): “use accredited technology to identify image-based sexual abuse content present on any part of the service (public or private), and to take down the content swiftly (either by means of technology alone or by means of the technology together with the use of human moderators to review image-based sexual abuse content identified by the technology).”
  • Section 65(4)(c): “use accredited technology to identify image-based sexual abuse content in search results and to swiftly take steps that result in that content no longer appearing in search results (either by means of technology alone or by means of the technology together with the use of human moderators to review image-based sexual abuse content identified by the technology).”
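
By way of illustration only – the Bill does not prescribe any particular technology, and the names, library and threshold below are entirely our own assumptions – an "accredited technology" for identifying known abuse images might take the form of perceptual hash matching, as used by existing industry initiatives. A minimal Python sketch:

    # Illustrative sketch only: one possible form of "accredited technology"
    # for identifying *known* image-based sexual abuse content.
    # All names and values are hypothetical; the Bill prescribes no design.
    from dataclasses import dataclass

    from PIL import Image   # pip install Pillow
    import imagehash        # pip install ImageHash (perceptual hashing)

    # Perceptual hashes of images previously reported and verified as abuse
    # content. In practice this would be a securely managed shared database.
    KNOWN_ABUSE_HASHES: set[imagehash.ImageHash] = set()

    MATCH_THRESHOLD = 8  # max Hamming distance treated as a match (tunable)

    @dataclass
    class ModerationResult:
        matched: bool
        needs_human_review: bool

    def check_upload(path: str) -> ModerationResult:
        """Compare an uploaded image against known abuse-content hashes."""
        upload_hash = imagehash.phash(Image.open(path))
        for known in KNOWN_ABUSE_HASHES:
            # Subtracting two ImageHash objects gives the Hamming distance.
            if upload_hash - known <= MATCH_THRESHOLD:
                # Flag for swift take-down, with a human moderator confirming
                # the match – mirroring the "technology together with ...
                # human moderators" limb of the wording proposed above.
                return ModerationResult(matched=True, needs_human_review=True)
        return ModerationResult(matched=False, needs_human_review=False)

Hash matching of this kind can only catch images that have already been reported; identifying new content would require classifiers and, realistically, human review – which is why the proposed wording expressly contemplates the use of human moderators alongside the technology.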

[15] Media literacy helps people identify what type of material they are seeing, how reliable and accurate it is, and how they can control the material they receive. The Bill should outline the specific aims of media literacy, including increasing awareness of:

  • The impact of image-based sexual abuse content on victims;
  • Any criminal sanctions for offenders;
  • How users can identify and report image-based sexual abuse content they encounter online; and
  • The steps SPITE victims can take to protect themselves before their images are posted (e.g. encryption of data) and after (e.g. reporting to OFCOM/the police), and their legal rights.

Section 103(3) should be expanded to list the type, frequency and outreach capacity of education initiatives that OFCOM must implement.