Amendment 266

Part of Crime and Policing Bill - Committee (4th Day) (Continued) – in the House of Lords at 5:00 pm on 27 November 2025.


Lord Bethell (Conservative), 5:00 pm, 27 November 2025

My Lords, I too support the amendments in this group, particularly those tabled by my noble friend Lord Nash on security software and by the noble Baroness, Lady Kidron, on AI-generated child sexual abuse material. I declare my interest as a trustee of the Royal Society for Public Health.

As others have noted, the Online Safety Act was a landmark achievement and, in many ways, something to be celebrated, but technology has not stood still—we said it at the time—and nor can our laws. It is important that we revisit it in examining this legislation, because generative AI presents an egregious risk to our children that was barely imaginable even two years ago when we were discussing that Act. These amendments would ensure that our regulatory architecture keeps pace.

Amendment 266, on AI CSAM risk assessment, is crucial. It addresses a simple but profound question: should the provider of a generative AI service be required to assess whether that service could be used to create or facilitate child sexual abuse material? Surely the answer is yes. This is not a theoretical risk, as we have heard in testimony from many noble Lords. We know that AI can generate vivid images, optimised on datasets scraped from children themselves on the open internet, and can be prompted to create CSAM-like content. On this, there is no ambiguity at all. We know that chatbots trained on vast corpora of text from children can be manipulated to generate grooming scripts and sexualised narratives to engage children and make them semi-addicted to those conversations. We know that these tools are increasingly accessible, easy to use and almost impossible to monitor by parents and, it seems, regulators.

These amendments create a proportionate duty. Providers regulated by the Online Safety Act would report identified risks to Ofcom and agree steps to mitigate them, backed by Ofcom’s enforcement powers. That must be the right thing to do.

Amendments 479 and 480, on large language models and chatbots, address a related gap. The Online Safety Act’s user-to-user and search services categories were designed for the pre-generative AI world, despite our efforts to bring it up to speed. They do not neatly capture LLMs used as search interfaces, or conversational agents that retrieve and synthesise content in response to user prompts, and we have heard such vivid testimony of what those dangers are to children.

I will bring a public health dimension to this debate, since these amendments are consistent with a public health approach. In health systems, we do not wait for harm to manifest itself or show symptoms before we act. We assess risk, implement controls and monitor outcomes; that is the framework in which we apply ourselves. The same logic should apply here. We do not need to wait for an epidemic of AI-generated CSAM before requiring providers to assess and mitigate risk. Prevention is always preferable to cure and, in the context of child sexual abuse, in which every image represents a real or simulated violation, prevention is a moral imperative, and there are enough biomarkers already in the digital world for us to know that there is a severe risk.

I hope very much indeed that the Government will look seriously at this group of amendments.


