New UK Regulator Proposed for Technology & Social Media Firms


Companies such as Facebook, Twitter and Google will all be required to adhere to the new “code of practice” and could be fined for breaking rules on content. 

New proposals from the UK Government could see websites fined if they fail to tackle harmful content such as child abuse or terrorist propaganda.

An independent watchdog could also be launched to establish a “code of practice” for technology firms, the Department for Digital, Culture, Media and Sport (DCMS) said.

As part of the Online Harms White Paper – a joint proposal from the DCMS and the Home Office – senior managers at tech companies could also be held accountable for breaches.

Other proposals include additional powers to force internet service providers to block sites which break the rules.

Commenting on the proposals, Digital Secretary Jeremy Wright said: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.

“Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action.”

“We want the UK to be the safest place in the world to go online, and the best place to start and grow a digital business, and our proposals for new laws will help make sure everyone in our country can enjoy the internet safely.”


Taking Action

Plans outlined in the white paper would see the regulator funded by a new levy on the technology sector. Currently, the UK Government has not detailed whether a new regulator will be established, or if an existing one will be handed additional powers.

Companies such as Facebook, Twitter and Google will all be required to adhere to the new “code of practice”, while messaging services, such as Snapchat, and cloud storage services will also fall within the regulator’s remit.

A range of areas will be covered under the proposals to tackle ‘online harms’. These include the distribution of terrorist content, revenge porn, hate crimes and the sale and distribution of illegal goods.

While these issues are clearly defined in law throughout the United Kingdom, the proposals will also cover other online behaviours and activities, including cyberbullying, ‘trolling’, and the creation and spread of fake news and misinformation.

Social networks could be required to employ fact-checkers and promote ‘legitimate’ news sources under the code of practice. The white paper also suggests that annual reports should be published by the firms to highlight the volume of harmful content which has been discovered on their platform.

Content pertaining to self-harm and suicide will also be an area in which social networks are required to take action.

Facebook’s Head of UK Policy, Rebecca Stimson, said in a statement: “New regulations are needed so that we have a standardised approach across platforms and private companies aren’t making too many important decisions alone.

“New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.”


