Online Harms - Statement

Part of the debate – in the House of Lords at 5:25 pm on 8th April 2019.


Lord Ashton of Hyde, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, 5:25 pm, 8th April 2019

My Lords, with the leave of the House, I would like to repeat a Statement made by my right honourable friend the Secretary of State for Digital, Culture, Media and Sport, in the other place recently, as follows:

“The Government have today published a White Paper setting out our proposals to make the internet a safer place. For so many people, the internet is an integral part of daily life. Nearly nine in 10 UK adults are online and, significantly, 99% of 12 to 15 year-olds are too. As the internet continues to grow and transform our lives, we need to think carefully about how we want it to develop. In many ways it is a powerful force for good. It can forge connections, share knowledge and spread opportunity across the world, but it can also be used to circulate terrorist material, undermine civil discourse, spread disinformation, and abuse or bully. Our challenge as a society is to help shape an internet that is open and vibrant, but which also protects its users from harm. There is clear evidence that we are not succeeding. Over 8,000 sexual offences against children with an online element were reported to the police in 2017, a figure that is continuing to rise. Up to 20% of young people in the UK have experienced bullying online. The White Paper sets out many, many more examples of harms suffered.

People are closing their social media accounts following unacceptable online abuse. For the vulnerable, online experiences can mean cyberbullying, exposure to abusive content and the risk of grooming and exploitation. We cannot allow this behaviour to undermine the very real benefits that the digital revolution can bring. If we surrender our online spaces to those who spread hate, abuse and fear, then we all lose. This is a serious situation and it requires a serious response.

The Government have taken time to consider what we might do and how we might do it. I am grateful to Members across the House and indeed in the other place for their consideration of these issues, in particular the DCMS Select Committee. I am grateful too for the discussions I have had, including with the honourable gentleman opposite and his Front-Bench colleagues. We intend to continue these conversations and to consult on what we propose, because it is vital we get this right. No one has done it before. There is no comprehensive international model to follow and there are important balances to strike in sustaining innovation in the digital economy and promoting freedom of speech as well as reducing harm. None of that is straightforward and the Government should not claim a monopoly of wisdom. That is why the consultation which will follow will be a genuine opportunity for Members of this House and others to contribute to these proposals.

It is also right to recognise that some work is already being done to make the internet a safer place, including by online companies themselves, but it has not been enough and it has been too reactive. It can no longer be right to leave online companies to decide for themselves what action should be taken, as some of them are beginning to recognise. That is why my right honourable friend the Home Secretary and I concluded that the Government must act, and that the era of self-regulation of the internet must end, so the Government will create a new statutory duty of care, establishing in law that online companies have a responsibility for the safety of their users. It will require companies to do what is reasonable to prevent harmful material reaching those users. Compliance will be overseen and enforced by an independent regulator.

The White Paper sets out expectations for the steps that companies should take to fulfil the duty of care towards their users. We expect the regulator to reflect these expectations in new codes of practice. In the case of the most serious harms—such as child sexual exploitation and abuse, and the promotion of terrorism—the Home Secretary will need to approve these codes of practice and also have the power to issue directions to the regulator about their content. The Home Office will publish interim codes of practice on these subjects later this year, and we are consulting about the role that Parliament should have in relation to these codes too.

If online companies are to persuade the regulator that they are meeting their duty of care to keep their users safe, there will need to be transparency about what is happening on their platforms and what they are doing about it. If they are unwilling to provide the necessary information voluntarily, the regulator will have the power to require annual transparency reports and to demand information from companies relating to the harms on their platforms.

It is also important to give users a voice in this system, so that they can have confidence that their concerns are being treated fairly. We will therefore expect companies to have an effective and easy-to-access complaints function, and we are consulting on two further questions: how we might provide users with an independent review mechanism, and how we might allow designated bodies to make ‘super complaints’ to defend the needs of users.

For a duty-of-care-based model to work, those subject to it must be held to account for how they fulfil that duty. That is why we have concluded that a regulator will be necessary, whether a new entity or an extension of the responsibilities of an existing regulatory body. The regulator must be paid for by the online companies, but it is essential that it commands public confidence in its independence, its impartiality and its effectiveness. To ensure that the regulatory framework remains effective within this fast-changing landscape, we believe it is right to define its scope by activity, not by the name of the company or even the type of company.

We propose that the scope of the regulatory framework will be companies that allow users to share or discover user-generated content, or interact with each other online. This includes a wide variety of organisations, both big and small, from a range of sectors. The new regulatory regime will need to be flexible enough to operate effectively across them all. There are two key principles to such an approach. The first is that the regulator will adopt a risk-based approach, prioritising regulatory action to tackle harms that have the greatest impact on individuals or wider society. The second is proportionality. The regulator will require companies to take reasonable and proportionate actions to tackle harms on their services, taking account of their size and resources. The regulator will expect more of global giants than of small start-ups.

It is also necessary for the regulator to have sufficient teeth to hold companies to account when they are judged to have breached their statutory duty of care. That will include the power to serve remedial notices and to issue substantial fines, and we will consult on even more stringent sanctions, including senior management liability and the blocking of websites, but this is a regulatory approach designed to encourage good behaviour as well as punish bad behaviour. Just as technology has created the challenges we are addressing here, technology will provide many of the solutions—for example, in the identification of terrorist videos online and images of child sexual abuse, or in new tools to identify online grooming. The regulator will have broader responsibilities to promote the development and adoption of these technologies and to promote safety by design.

The truth is that, if we focus only on what the Government or the online companies do, we miss something important. We all need the skills to keep ourselves safe online, and too few of us feel confident that we have them. We will therefore task the regulator with promoting those skills, and we will develop a national media literacy strategy.

This White Paper does not aspire to deal with all that is wrong with the internet—no single piece of work could sensibly do so. It forms part of the Government’s response to the many challenges the online world brings. But it is focused on some of the most pernicious harms found online and it expects much more of the companies that operate there in tackling those harms. These are big steps, but they need to be taken.

Some say the internet is global so no country can act alone, but I believe we have both a duty to act to protect UK citizens and an opportunity to lead the world on this. With well-deserved worldwide reputations for fostering innovation and respect for the rule of law, the United Kingdom is well placed to design a system of online regulation that the world will want to emulate. The more we do online, the less acceptable it is that content which is controlled in any other environment is not controlled online.

A safer internet is in the interests of responsible online companies that want their customers to spend more time online, and is a legitimate expectation of those we represent. That is what this White Paper will deliver and I commend it and this Statement to the House”.