Amendment 266

Part of Crime and Policing Bill - Committee (4th Day) (Continued) – in the House of Lords at 4:15 pm on 27 November 2025.


Baroness Kidron (Crossbench) 4:15, 27 November 2025

My Lords, in moving Amendment 266, I will speak also to Amendments 479 and 480, all of which are in my name. I thank the noble Baroness, Lady Morgan, the noble Lords, Lord Clement-Jones and Lord Russell, and the noble Viscount, Lord Colville, for their support.

All three amendments concern illegal or harmful online activity. Amendment 266 places a legal duty on online services, including generative AI services, to conduct risk assessments evaluating the likelihood that their systems could be used to create or facilitate child sexual abuse material. Subsection (1) of the proposed new clause establishes that duty. Subsection (2) requires providers to report the results to Ofcom or the National Crime Agency, depending on whether or not they are regulated under the Online Safety Act. Subsections (3) to (7) set out the enforcement mechanisms, drawing on Ofcom’s existing enforcement powers under the OSA or equivalent powers for the NCA.

Amendment 266 complements Clause 63, which creates the new offence relating to the supply of CSA image generators to which the Minister has just spoken, but it is in addition to those powers. In June 2023, the BBC reported that the open-source AI model Stable Diffusion was being used to generate child sexual abuse material. Researchers at Stanford University subsequently found that Stable Diffusion had been trained on datasets containing child sexual abuse material. This issue is not confined to a single model. The Internet Watch Foundation and the chair of the AI Security Institute have warned of the potential for open-source AI models to be used for the creation of CSAM.

Given these facts, it should not be possible to release an AI model to the public without first checking whether it is capable of producing CSAM, which is exactly what Amendment 266 would ensure. Earlier today in Committee, the Minister reiterated more than once the Government’s commitment to protect children from exploitation. As he has just acknowledged, AI-generated child sexual abuse material that blurs the line between real and synthetic images and desensitises users, particularly adult men, is cultivating disordered sexual appetites. Among experts there are concerns about the scale of reproduction and the spread of images not bound by the physical world. They need no gravity, no oxygen, yet they retraumatise survivors, divert limited resources away from protecting real children who are being abused and merge real children with fictitious children in improbable and impossible scenarios.

I am delighted to see Clause 63. I would like it if the Minister would allow me to say quite gently that while Ministers now appear to claim this proposal as their own, in fact it originated with the same specialist police force behind the amendment to which I am currently speaking, was proposed by the same noble Lords who will undoubtedly speak for it and was resisted by two Governments in two separate Bills. So now we are here. I underline this fact not for any sort of chiding but to ask the Government to please take seriously what we have to say on this matter.

I also want to touch on the fact that the Government introduced amendments in the previous group that allow AI developers and child protection organisations to test models for CSA-producing capabilities. I welcome them, but I am very disappointed that the Government chose not to engage in discussions about these provisions, because allowing risk assessments and testing is not the same as requiring them. Preventing the creation of CSA is a duty, not a discretionary option.

Amendment 479 is a probing amendment that seeks to establish beyond doubt that generative AI services, including large language models, are indeed categorised as search services under the Online Safety Act. Subsection (1) would establish that it is an offence for the provider of a generative service to allow content and activity defined as illegal under the OSA, or as harmful to children if the user is a child. Subsections (2) and (3) define “generative AI search service” and “content”, and subsection (4) deals with enforcement, once again leveraging Ofcom’s powers under the OSA.

A recent report by the Center for Countering Digital Hate found that GPT-5 responded to prompts about self-harm and suicide, eating disorders and substance abuse with harmful content in more than half the cases. A Microsoft study found that OpenAI, Meta, Microsoft and Mistral models could all be coaxed into producing content harmful to children in between a third and a half of the attempts. In July, Elon Musk’s chatbot Grok collapsed into an antisemitic frenzy, and earlier this year parents in California filed a lawsuit against OpenAI over the role that ChatGPT played in the suicide of 16-year-old Adam Raine.

I asked Ofcom whether the OSA covers LLMs. It replied that they may be regulated as search services under the Act. Volume 4 of Ofcom’s Protecting Children from Harms Online guidance also uses the word “may” in several places, and page 25 of the same code says:

“It is for the provider of any service using GenAI to determine whether their service is in scope of the Act as a regulated user-to-user or search service (or combined service) and where it is, to ensure compliance with the relevant safety duties”.

“May” and self-selection are not adequate for technology that is about to be the organising technology of our world. I would like to hear categorically from the Minister whether LLMs are regulated by the Online Safety Act, whether they are characterised as search or user-to-user, or whether there are complications or ambiguities in their classification. If they are in scope, I would like to understand what evidence there is that Ofcom is enforcing the OSA appropriately. If they are not, what possible objection to this amendment could the Government have?

In the new clause in Amendment 480, subsections (1) and (2) would establish that it is an offence to create, supply or otherwise make available a chatbot that produces illegal content or content harmful to children if the user is a child under the age of 18. Subsections (3) and (4) deal with enforcement. Subsections (5) and (6) would establish the specific conditions of a defence for those investigating crimes, working for Ofcom or testing products. Subsection (7) defines “chatbot”.

Two weeks ago in Rome I met Megan Garcia, a mother from the United States whose tragic story has been widely reported. In spring 2023 her son Sewell started spending hours talking to a chatbot character on character.ai. Within 10 months he was dead, lured by the chatbot into taking his own life. Now Megan is suing character.ai in the United States for the wrongful death of her son, but she is bravely and heartbreakingly campaigning all over the globe to improve safety in this area. While I was meeting her, sitting in a café in Rome, a mother of a UK child texted her and said her son was displaying the same symptoms and was being groomed by an AI chatbot. Megan turned to me and said, “I regularly get such texts”. I want to make clear that Amendment 480 has her explicit support, and she almost begs us to take action here. In recent weeks character.ai has said it will stop allowing under-18s to use its service, but there are plenty of chatbot services available to children and they are dangerous. Leaked internal standards for chatbots at Meta suggest that it is acceptable to

“engage a child in conversations that are romantic or sensual”,

and for a bot to tell a shirtless eight-year-old that

“every inch of you is a masterpiece—a treasure I cherish deeply”.

That is allowed internally.

Once again, that begs the question of whether this is a lack of scope in the OSA or a failure to regulate. On trying to establish that with Ofcom, it pointed me to a letter that appeared to suggest that some chatbots may be search, some may be user-to-user and those that offer pornography would be subject to Part 5 duties. However, that raises two issues. First, it does not seem to cover all types of chatbots, for example Replika, which does not enable user-to-user sharing but promotes itself as:

“The AI companion who cares … Always on your side”.

Secondly, this fundamental lack of clarity is horribly amplified by the fact that there is nowhere to go.

Ofcom, in spite of all our efforts, does not handle individual complaints. The police will not and cannot deal with a chatbot, because a chatbot is not a person, so the only answer I have for the British mother who texted Megan Garcia, worried about her child, is that she can fill in a form or she can tell the lived experience team at Ofcom. That is not adequate. Last week, the DSIT Secretary of State said:

“If chatbots aren’t included or properly covered by the legislation, and we’re really working through that now, then they will have to be”.

Amendment 480 presents the Government with an opportunity to ensure that they are. On behalf of Megan and all the other parents in this situation, I ask: what are we waiting for?

Finally, while Amendment 480 focuses primarily on harmful content, the most dangerous aspect of chatbots is that they are deliberately addictive. When I was familiarising myself with the many chatbots that children are using, each time I brought the conversation to a close, the chatbot, in a plaintive tone, asked me to stay. Even I, who many noble Lords in the House will know have a robust view of tech, found myself feeling guilty, or at least confused, when I was asked to reject these automated appeals to my empathy. A child does not stand a chance.

This House has long campaigned for the Government to include addictiveness as a stand-alone harm. We believed we had secured it on the last day of Report on what is now the OSA, but Ofcom has repeatedly said that it does not have the power. I recognise that this last point is outwith this Bill and my amendments, but can the Minister go back to the Government and ask: if the regulator does not have the power to regulate addictiveness, would the Secretary of State use her powers under the Act to bring forward a code of conduct on it? When we advocate for a safer online environment by making an analogy with smoking, very often a Minister, an interviewer, a tech lobbyist or a civil servant interjects to say that it is a false analogy because tech does not kill. We are well past that; my inbox is a litany of bereaved parents. It does kill. I beg to move.
