Part of Crime and Policing Bill - Committee (4th Day) (Continued) – in the House of Lords at 5:15 pm on 27 November 2025.
Lord Hanson of Flint
The Minister of State, Home Department
I am grateful to the noble Baroness, Lady Kidron, for the way she introduced this group of amendments and for her tireless work to protect children online. I say on behalf of all noble Lords that the support she has received today across the Committee shows that her work is vital, especially in the face of emerging technologies, such as generative AI, which present opportunities but, sadly, also have a darker side with new risks for criminal misuse.
She has received the support of the noble Baronesses, Lady Morgan of Cotes, Lady Boycott, Lady Bertin and Lady Doocey, my noble friends Lady Berger, Lady Royall of Blaisdon and Lord Hacking, the noble Lords, Lord Bethell, Lord Russell of Liverpool, Lord Hampton and Lord Davies of Gower, the noble Viscount, Lord Colville of Culross, and others to whom I will refer later. That is quite an array of colleagues in this House. It is my job to respond to this on behalf of the Government, and I will try to be as helpful as I can to the noble Baroness.
The Government share her desire to protect the public, especially children, online, and are committed to protecting all users from illegal online content. We will continue to act to keep citizens safe. Amendment 266 seeks to create a new duty on online service providers—including those already regulated under the Online Safety Act—to assess and report to Ofcom or the National Crime Agency on the risk that their services could be used to create or facilitate the generation of AI child sexual abuse material. The amendment would also require online service providers to implement measures to mitigate and manage the risks identified.
I say to the noble Baroness that UK law is already clear: creating, possessing or distributing child sexual abuse images, including those generated by AI, is already illegal, regardless of whether they depict a real child or not. Child sexual abuse material offences are priority offences under the Online Safety Act. The Act requires in-scope services to take proactive steps to prevent such material from appearing on their services and to remove it swiftly if it does.
As she will know, the Government have gone even further to tackle these appalling crimes through the measures in the Bill. I very much welcome her support for Clause 63. We are introducing a world-leading offence criminalising the possession, adaptation and supply of, or offer to supply, an AI model that has been fine-tuned by offenders to create child sexual abuse material. As I mentioned earlier, we are also extending the existing paedophile manual offence to cover advice on how to use AI to create child sexual abuse material.
We have also introduced measures that reflect the critical role that AI developers play in ensuring their systems are not misused. To support the crucial work of the Government’s AI Security Institute, we have just debated and agreed a series of amendments in the previous group to provide authorised bodies with the powers legally to test commercial AI models for child sexual abuse material, extreme pornography and other illegal content. That is essential to allow experts to carry out such testing safely, and I am pleased that we received the Committee’s support earlier.
I recognise the intent of Amendment 266 but—I say this as humbly as I can—the assessment of the Government is that it would be unworkable and would place unmanageable and unnecessary operational burdens on both the National Crime Agency and Ofcom. The National Crime Agency is a law enforcement body. It has neither the statutory powers nor the people and expertise needed to regulate companies or enforce compliance with safety standards for online service providers. The amendment would further create significant legal challenges for businesses or individuals looking to operate an online service in the UK. While it is right that we protect online users from the risks posed by technology, it is vital that we do not criminalise people or businesses without demonstrable culpability.
I agree with the noble Baroness in seeking to prevent abhorrent activity. Child sexual exploitation and abuse is an atrocious crime. The Government have acted and will continue to act to bring perpetrators to justice and keep our children safe. While I appreciate the intention behind the amendment, for the reasons I have set out, I hope that she will reflect on what I have said and not press the amendment further at this stage. I have already outlined the legislation and regulation that we have implemented, and will continue to implement, to tackle these risks. I will leave the noble Baroness to reflect upon that in due course.