The Online Harms White Paper, issued on 8 April 2019, proposes a new system of accountability and oversight for tech companies, moving far beyond self-regulation. This regime would be supervised by an independent regulator, funded by industry.
There would be a new statutory duty of care on relevant companies to take reasonable steps to keep their users safe and tackle illegal and harmful activity on their services.
The proposals would affect not just social media platforms and search engines but companies of all types and sizes. The government says that the regulatory framework should apply to companies that ‘allow users to share or discover user-generated content or interact with each other online’.
‘Every company within scope will need to fulfil their duty of care, particularly to counter illegal content and activity, comply with information requests from the regulator, and, where appropriate, establish and maintain a complaints and appeals function which meets the requirements to be set out by the regulator.’
The government is now consulting on these plans, including which type of regulator should be in charge of this field and what powers that body should have.
If the government opts for an existing regulator, the Information Commissioner’s Office (ICO) could be one option.
The government says that it wants to ensure a level playing field between companies that have a legal presence in the UK and those that operate entirely from overseas.
The consultation ends on 1 July 2019. See the Online Harms White Paper.