Amendment 10

Part of Online Safety Bill - Committee (2nd Day) – in the House of Lords at 7:00 pm on 25 April 2023.

Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State (Department for Culture, Media and Sport) 7:00 pm, 25 April 2023

My Lords, I agree that this has been a rather unfortunate grouping and has led to a slightly strange debate. I apologise if it is the result of advice given to my noble friend. I know there has been some degrouping as well, which has led to slightly odd combinations today. However, as promised, I shall say a bit more about Wikipedia in relation to my noble friend’s Amendments 10 and 11.

The effect of these amendments would be that moderation actions carried out by users—in other words, community moderation of user-to-user and search services—would not be in scope of the Bill. The Government support the use of effective user or community moderation by services where this is appropriate for the service in question. As I said on the previous group, as demonstrated by services such as Wikipedia, this can be a valuable and effective means of moderating content and sharing information. That is why the Bill does not impose a one-size-fits-all requirement on services, but instead allows services to adopt their own approaches to compliance, so long as these are effective.

The noble Lord, Lord Allan of Hallam, dwelt on this. I should be clear that duties will not be imposed on individual community moderators; the duties are on platforms to tackle illegal content and protect children. Platforms can achieve this through, among other things, centralised or community moderation. Ultimately, however, it is they who are responsible for ensuring compliance, and it is platforms, not community moderators, who will face enforcement action if they fail to do so.

The amendments in the name of my noble friend Lord Moylan appear intended to take services which rely only on user moderation entirely out of the scope of the Bill, so that those services are not subject to the new regulatory framework in any way. That would create a gap in the protections created by the Bill and an incentive for services to adopt nominal forms of user moderation to avoid being subject to the illegal content and child safety duties. This would significantly undermine the efficacy of the Bill and is therefore not something we could include in it. His Amendment 26 would remove the duties on providers in Clause 11(3) to prevent children encountering primary priority content, and to protect children in age groups at risk of harm from other content that is harmful to children. This is a key duty which must be retained.

Contrary to what some have said, there is currently no requirement in the Bill for users to verify their age before accessing search engines and user-to-user services. We expect that only services which pose the highest risk to children will use age-verification technologies, but this is indeed a debate to which we will return in earnest and in detail on later groups of amendments. Amendment 26 would remove a key child safety duty, significantly weakening the Bill's protections for children. The Bill takes a proportionate approach to regulation, one which recognises the diverse range of services in its scope. My noble friend's amendments run counter to that and would undermine the protections in the Bill. I hope he will feel able not to press them and allow us to return to the debates on age verification in full on another group.