My Lords, we, too, on these Benches welcome the fact that the Government’s proposals have come forward today, and we support the placing of a statutory duty of care on social media companies. We agree that the new arrangements should apply to any sites,
“that allow users to share or discover user-generated content, or interact with each other online”.
We think that is a fair definition.
We are all aware of the benefits of social media networks and the positive role they can play. There is, however, far too much illegal content and harmful activity on social media that the platforms fail to deal with and that creates social harm. The self-harming material on Instagram and the footage of the Christchurch killings are perhaps the most recent examples.
Proper enforcement of existing laws is, of course, vital to protect users from harm, but, as the White Paper proposes, social media companies should have a statutory duty of care to their users—above all, to children and young people—and, as I say, we fully support the proposed duty of care. It follows that, through the proposed codes, Parliament and Government have an important role to play in defining that duty clearly. We cannot leave it to big private tech firms, such as Facebook and Twitter, to decide the acceptable bounds of conduct and free speech on a purely voluntary basis, as they have been doing to date.
It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly. I welcome the Government’s stated commitment to these two aspects.
We also very much welcome the Government’s adherence to the principle of regulating on the basis of risk and proportionality when enforcing the duty of care and drawing up the codes. Will the codes, as the Lords Communications Committee called for, clearly set out the distinction between criminal content, harmful content and antisocial content when powers of oversight are exercised? By the same token, upholding the right to freedom of expression does not mean a laissez-faire approach. Does the Minister agree that bullying and abuse prevent people expressing themselves freely and must be stamped out? Will there be a requirement that users must be able to report harmful or illegal content to platforms and have their reports dealt with appropriately, including being kept informed of the progress and outcome of any complaint?
Similarly, there must be transparency about the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. We welcome the proposed three-month consultation period; indeed, I welcome the Government’s intention to achieve cross-party consensus on the crucial issue of regulating online harms. I agree that with a national consensus we could indeed play an international leadership role in this area.
Then we come to the question of the appropriate regulator to enforce this code and duty. Many of us assumed that this would naturally fall to Ofcom, with its experience and expertise, particularly in upholding freedom of speech. If it is not to be Ofcom, with all its experience, what criteria will be used in determining what new or existing body will be designated? The same appears to me to apply to the question of whether the ICO is the right regulator for the algorithms used by social media. I see that the Home Office will be drawing up certain codes. Who will be responsible for the non-criminal codes? Have the Government considered the proposals by Doteveryone and the Lords Communications Select Committee for a new “Office for Internet Safety” as an advisory body to analyse online harms, identify gaps in regulation and enforcement and recommend new regulations and powers to Parliament?
At the end of the day, regulation alone cannot address all these harms. As the noble Baroness, Lady Kidron, has said, children have the right to a childhood. Schools need to educate children about how to use social media responsibly and be safe online, as advocated by the PSHE Association and strongly supported by my party. Parents must be empowered to protect their children through digital literacy, advice and support. I very much hope that that is what is proposed by the online media literacy strategy.
Ultimately, we all need to recognise that this kind of regulation can only do so much. We need a change of culture among the social media companies, which should be proactively seeking to prevent harm. The Government refer to a culture of continuous improvement as a desired goal. We on these Benches thoroughly agree that that is vital.