UK: New VSP regime provides a testbed for wider online harms reforms
Creating the right regulatory framework to address online harms is a uniquely difficult puzzle for governments around the world.
Measures are needed to protect users from legal but harmful content (e.g. disinformation and cyberbullying) and illegal content (e.g. terrorist content and child sexual abuse material (“CSAM”)). However, when regulating in this space, governments and regulators must grapple with difficult questions about the impact on freedom of speech and the risk of positioning social media companies as arbiters of fundamental rights. There are also practical difficulties: any measures must, for example, work effectively at scale.
Legislative responses
Governments across the globe are proposing new regulation and passing new laws to address these issues. Our publication Online Harms: A comparative analysis sets out the position in eight key jurisdictions, analysing the current law in Australia, France, Germany, Singapore and the United States, as well as the bold proposals put forward by the EU, Ireland and the United Kingdom.
The main legislative response in the UK is the draft Online Safety Bill. This is aimed at protecting users against harmful online content and activity, with the intention of making the UK “the safest place in the world to go online”. There is more information in our ‘At a glance’ summary of the draft Bill.
The draft Online Safety Bill is unlikely to take effect until 2022 at the earliest, but there is already a regime in place that requires UK-based video-sharing platforms (“VSPs”) to take appropriate measures to protect users from harmful content.
Although the VSP regime is much narrower in scope than the proposed online safety regime, there are similarities between the two, including the focus on protecting users against harm. The VSP regime will therefore provide a testbed for the broader reforms in the draft Online Safety Bill. Further, the approach Ofcom takes to the VSP regime may provide insight into how it intends to regulate the online safety regime when it comes into force.
New guidance for VSPs
The rules, set out in Part 4B of the Communications Act 2003, came into force on 1 November 2020. They require VSPs to proactively assess and address applicable risks, and to put in place proportionate and effective measures to minimise the chances of users encountering harmful content.
However, given that harmful online content comes in many guises, the onus is on providers to decide what steps are appropriate to mitigate the risks they identify. Compliance with the VSP regime is therefore far more than a tick-box exercise.
Ofcom has today published long-awaited guidance (here) intended to help VSP providers understand their regulatory obligations under the new regime. Although the guidance is non-binding, Ofcom states that in certain circumstances VSPs are unlikely to be able to protect users effectively (and therefore comply with the regime) without following the approach Ofcom has set out. Ofcom has also published a strategy paper (here) detailing its priorities for enforcing the regime.
Ofcom’s harm and measures guidance
Ofcom’s final guidance on “measures to protect users from harmful material” builds on its consultation paper published on 24 March 2021 and makes it clear that the VSP regime is about putting in place appropriate systems and processes, not about the regulation of individual videos. Ofcom will be looking to understand what risks platforms have considered, the steps they are taking to protect their users and how effective these measures are. Ofcom has said that the final guidance takes on board input from industry stakeholders.
The guidance offers the following insights into how Ofcom will be viewing and approaching the regulation of VSPs:
Assessing and managing risk
- Providers should put in place a risk management framework and senior decision-makers in the organisation should have good oversight of risk management activities.
- Ofcom will aim to understand providers’ existing risk management systems and will then work with providers to develop these for compliance with the VSP regime; but providers are also expected to be alert to emerging risks, and to regularly and rigorously perform risk identification exercises.
Measures to protect users from harm
- The legislation sets out examples of measures that providers can take to protect users. Although providers have the flexibility to take other measures, the guidance states that they will need to be able to explain to Ofcom why they chose to do so.
- Providers are encouraged to assess the limitations of their measures (including where user behaviour might hinder effectiveness), to continually monitor the efficacy of such measures in practice, and to collect information and evidence to demonstrate this efficacy.
- A lack of resources or an unwillingness to invest in new processes to protect users will not necessarily be a justification for non-compliance with the regime.
Terms and conditions
- Providers are encouraged to consider setting out in their terms and conditions all types of content that would be prohibited on the platform, together with sanctions for uploading such content.
- Providers must ensure that they update their terms and conditions as the risk environment evolves.
- Ofcom has provided guidance on what constitutes hatred and violence and on how providers might structure and implement their terms and conditions relating to these types of harm, for example by providing case studies to illustrate the types of content that are prohibited.
- Providers will need to demonstrate that their terms are user-friendly and have been effectively implemented.
A collaborative approach
- Ofcom states that it intends to work collaboratively with providers and other industry players in carrying out its supervision and enforcement duties.
- Ofcom has suggested that, in relation to particular types of harm, providers may also want to consult and collaborate with organisations that work to prevent those harms (e.g. the NSPCC).
Reporting and flagging mechanisms
- The processes for reporting and flagging issues to providers should encourage users to identify the specific type of harm they are reporting.
- If a user uploading content notifies the provider that it contains restricted material, the provider should either pass on warnings to viewers or restrict access to the material, as appropriate. Where a large proportion of a platform’s users are under 18, providers should consider more sophisticated processes to differentiate the suitability of content for different age groups.
- Ofcom has suggested that providers put in place classification frameworks from established ratings bodies, while acknowledging that ratings systems alone are unlikely to be sufficient to protect under 18s from harm.
Ofcom’s plans and approach for the coming year
Ofcom’s strategy paper sets out its five priority areas of focus for the first year of VSP regulation, which are summarised below.
1. Reducing the risk of CSAM
Ofcom expects all VSPs to: (a) have put in place, and effectively implemented, clear and appropriate terms and conditions to prohibit CSAM; and (b) have robust processes for identifying and dealing with CSAM content, including swift removal of any such content.
Ofcom will be looking closely at the processes for registration and moderation on VSPs hosting adult content. Registration journeys and subsequent verification must be robust enough to “significantly reduce the risk of CSAM being uploaded and shared”.
2. Tackling hate and terror
Ofcom expects all VSPs to have put in place, and effectively implemented, clear and appropriate terms and conditions that prohibit the uploading of content that would be a criminal offence under English laws relating to terrorism, racism and xenophobia, as well as material likely to incite violence or hatred.
VSPs’ terms and conditions will be a particular focus for Ofcom in the context of hate and terror. Ofcom expects terms and conditions to be aligned with the legal requirements for this type of harm and to be implemented in a way that achieves the desired outcome of protecting users. Ofcom will also focus on how terms and conditions are communicated to users and how they are enforced by VSPs.
3. Protections for under 18s
Ofcom’s own research shows that 79% of 13- to 17-year-old VSP users encountered a potential online harm in the last three months. Under 18s are also more likely to say that they have been exposed to harms such as negative body image and eating disorder content, or content that glamorises unhealthy or abusive lifestyles or promotes self-harm.
Ofcom intends to work with VSPs that are likely to be accessed by children to ensure that effective solutions are in place to provide an age-appropriate experience for under 18s, whilst also continuing to collaborate with other key players in the industry on issues such as the tools available to protect under 18s.
4. Age verification on adult VSPs
Ofcom expects VSPs to take the most stringent steps in relation to access to, and control over, the content with the most potential to harm under 18s. Pornographic material should be shielded by strict age verification systems, whether age-gates that prevent access to the content or filters that ensure under 18s do not come across it.
Ofcom expects adult VSPs to continue to investigate age verification options for adult services.
5. Reporting and flagging
Ofcom expects all VSPs to: (a) have effective flagging and reporting processes; and (b) work toward increasing the engagement of their users with the various safety measures available on their platforms.
Ofcom considers that having a clear and effective process for flagging harmful content is essential to protecting users and holding platforms to account. Ofcom will therefore be looking to understand what reporting and flagging processes VSPs currently have, including how easy they are to locate and use, how effectively reports are actioned and the overall experience for the user.
Conclusions
The Online Safety Bill will eventually supersede the VSP legislation and will apply to a much wider range of services. Organisations in scope of the Online Safety Bill will no doubt be watching carefully to see how Ofcom exercises its powers under the VSP regime for a possible indication of how the online safety regime is likely to play out.
To comply with the new regime, VSPs are likely to need new systems and processes. But with the Online Safety Bill (and equivalent overseas regimes) coming down the track, they will need to keep one eye on this wider legislation when designing those processes.