
Preventing Online Harm
9th March 2020
Introduction
The start of the year saw the release by the Information Commissioner’s Office (ICO) of the Age-appropriate Design Code of Practice for online service providers (you can read more about the code in our previous blog post here). In a nutshell, the code requires children’s best interests to be a primary consideration when designing and developing online services.
The focus on holding online providers to account also remains high on the Government’s agenda, with last month seeing the release of its initial response to last year’s consultation on the Online Harms White Paper.
Background: the Online Harms White Paper Consultation
The consultation responses were varied: fundamental objections were voiced, but some suggestions appear to have been taken on board. The overall impression given by the Government’s response, however, is that very little is certain: it gives only an indication of the Government’s direction of travel, acknowledging that policy development is ongoing. The fundamentals remain clear, in that the focus is on the introduction of a statutory duty to protect users of services that enable or facilitate the sharing of user-generated content or online interaction between users. The Government’s aim is to ensure that the UK is the safest place in the world to be online whilst allowing technology to flourish and innovate. These two objectives may be challenging to reconcile.
The Proposals – In a nutshell
The devil will, as ever, be very much in the detail yet to be made available. However, the regulatory approach will be ‘broadcast-style’ oversight, with Ofcom likely to be appointed as regulator. Ofcom would, in a broadening of its current communications and broadcasting remit, become responsible for drawing up new codes of practice and enforcing compliance with the statutory duty. The Government will ensure that Ofcom has a clear responsibility, and the powers, to protect users’ rights online and to ensure there are appropriate systems and processes in place to do so. The extent of such powers is not yet determined. There will, of course, be a balancing act for the regulator: safeguarding free speech, defending the role of the press, promoting tech innovation and ensuring businesses do not face disproportionate burdens. Ofcom’s powers are likely to extend to oversight of companies’ wider systems and processes, but not to the adjudication of individual complaints.
It is clear that the Government wants to place a statutory duty of care on businesses within the scope of the new regulation to protect users from harmful and abusive content. What is less clear is which businesses would find themselves within that scope: the facilitation of user-generated content or of online interaction between users is key, but where the cut-off falls between global social media giants and small traders who allow customer feedback to be posted on their websites remains to be seen. It is important to note, however, that the proposed regulations will not apply to every company: a business will not necessarily fall within the scope of the legislation simply because it has a presence on social media.
Some key questions remain to be addressed. There is currently no clarity as to how ‘harm’ would be defined, how the legality of content would be assessed, or how the line between lawful and unlawful content would be drawn. This would include consideration, in many cases, of the link between lawful content and its potential to cause harm. Under the proposals, platforms will need to ensure that illegal content is removed quickly and to minimise the risk of it appearing in the first place. It will clearly be important for illegal content to be readily identifiable, which may pose a significant challenge. Freedom of speech considerations may continue to allow adults to access or post content which others may find offensive, and intermediaries may be given responsibility for deciding, and explicitly stating, what content is acceptable on their sites, subject to an ongoing duty to enforce their policies consistently and transparently.
What’s next?
As mentioned above, these proposals are ‘iterative’: the Government is set to publish a full consultation response in Spring 2020, providing more detail on Ofcom’s enforcement powers, alongside interim codes of practice on tackling online terrorist content and child sexual exploitation and abuse. These codes will be voluntary but are intended to bridge the gap until the new regulator becomes operational.
Article Info
- 9th March 2020
- Becky Brook
- Corporate Law