Online Harms Legislation

Part of the debate – in the House of Commons at 10:32 am on 13th February 2020.

Chi Onwurah, Shadow Minister (Department for Business, Energy and Industrial Strategy) (Industrial Strategy), 10:32 am, 13th February 2020

Molly Russell was only 14 when she killed herself after viewing posts on Instagram. David Turnball was 75 when he lost his pension through an unregulated financial product that was prominently advertised by Google. Last year TikTok live-streamed a teenager’s suicide. Misinformation on the coronavirus is spreading on social media. An online abuse offence against a child is recorded every 16 minutes. When we talk about online harms, these are real people, real stories, real pain and real hurt.

Before becoming an MP, I was an engineer. I helped build out the internet. I am proud of my work, which enabled people to better communicate and connect, but it has been clear for years that the internet requires regulation. Tim Berners-Lee, the inventor of the world wide web, has said it; the National Society for the Prevention of Cruelty to Children has said it; and Facebook has said it.

This response on online harms is overdue, weak and ultimately ineffective. Social media companies will have a duty of care, which Ofcom will regulate—good. Tech companies have always had a duty of care, in my opinion, but the first online suicide was over 10 years ago, and victims still await legislation. When will these proposals become law?

Instead of creating a new regulator, the Government have given responsibility to Ofcom. I like Ofcom—I used to work for it—but in the last ten years it has had the BBC, postal services and more added to its remit. What additional resource will it have? What powers of enforcement will it have? Companies will regulate complaints themselves, although we are told that it will be transparent—how? The transparency working group has been mentioned, so could we have some transparency on that?

New online harms are emerging. Just a few weeks ago the smart doorbell system Ring was hacked, putting children at risk. Algorithms, facial recognition and artificial intelligence are not addressed—why not? In a week’s time the European Union will announce measures for digital services regulation. Has the Minister spoken with the EU about alignment, and if not, why not?

Online harms cause untold damage in the real world. If the Minister cannot give clear answers to these questions, victims past and present will have lost out in another wasted year.