A third of people who use online video-sharing services have come across hateful content in the past three months, according to new Ofcom research. So what can, and should, be done about it?
As we’ve previously reported, since 1 November 2020, UK-established providers of video-sharing platforms (VSPs) have been required to comply with new rules, including requirements to protect users from harmful content. Our previous article on VSPs explains what VSPs are and outlines the rules they must comply with. However, Ofcom is now providing more detail on various aspects of the legislative requirements.
Last week we reported on guidance which helps VSP providers assess whether they are legally obliged to submit a formal notification of their service to Ofcom (see our blog here). Hot on the heels of this, Ofcom has now published draft guidance on the regulatory requirements relating to the protection of users against harmful content.
This draft guidance is out for consultation (open until 2 June 2021) and elaborates on the measures that VSPs will be expected to take to comply with the statutory framework (as set out in the Communications Act 2003). The legislation requires VSP providers to have in place appropriate measures to protect:
- under-18s from material in videos which might impair their physical, mental or moral development; and
- the general public from criminal content and material likely to incite violence or hatred.
Ofcom’s draft guidance on the required protection measures is not prescriptive; instead, it offers suggestions to aid understanding of how users can be appropriately protected from harmful material. These include:
- Having, and enforcing, terms and conditions prohibiting harmful material;
- Having, and effectively implementing, flagging and reporting mechanisms;
- Applying appropriate age assurance measures to protect under-18s, including age verification for pornography;
- Implementing complaints processes (including the requirement to provide for an impartial procedure for the resolution of disputes); and
- Providing “media literacy” tools and information.
The guidance goes into significant detail, urging providers in particular to consider five key principles when implementing the measures above. These are that the measures should be: Effective; Easy to Use; Transparent; Fair; and Evolving (ie regularly reviewed and updated).
Ofcom acknowledges that there may be various ways to implement a measure to achieve the same purpose. The draft guidance also acknowledges that the appropriateness of a particular measure, and how it is implemented, will vary depending on various factors, including the size and nature of the VSP, the nature of the material being shared and the legitimate interests of those uploading it. As such, this is not a one-size-fits-all approach to compliance.
Ofcom plans to issue its final guidance on the protection of users against harmful content later this year. In the meantime, we can expect more draft guidance from Ofcom on the advertising-specific rules applying to VSP providers, including around transparency, and prohibited and restricted products. Ofcom will consult separately on these advertising-specific requirements, including proposals for guidance on the control of advertising and a proposal to designate advertising enforcement functions in respect of VSPs to the Advertising Standards Authority. More on this in due course.
And finally…as we’ve flagged in previous articles, the UK government has stated that it intends for the UK’s VSP regime to be superseded by the online harms regime under the proposed Online Safety Bill. Sadly, this doesn’t mean VSP providers can ignore the current regime. The government has indicated that the VSP regime will provide a solid foundation to inform and develop the future online harms regulatory framework, so implementing compliant measures now should put you in a good position to comply with whatever regime eventually supersedes it.
As Ofcom puts it in the draft guidance: “We recognise that it is impossible to prevent all instances of harm occurring, but we expect providers to take a proactive approach to identifying and mitigating the risk of harmful material occurring on their platform”.