Amendment 209

Crime and Policing Bill - Report (2nd Day) – in the House of Lords at 3:23 pm on 2 March 2026.

Moved by Baroness Kidron

209: Clause 65, page 84, line 42, at end insert—

“46D Child sexual abuse image-generation risk assessment

(1) A provider of an online service, including but not limited to a generative AI large language model, must risk assess the likelihood of their service being used to create or facilitate the creation of a CSA image or images as defined by section 46A.

(2) If a risk is identified in a CSA image-generation risk assessment—
(a) where the provider is regulated by the Online Safety Act 2023, a provider must report the risk within two working days to OFCOM, and agree to steps to reduce, mitigate and manage the risks within 14 days;
(b) where the provider is not regulated by the Online Safety Act 2023, a provider must notify the National Crime Agency within two working days and agree to steps to reduce, mitigate and manage the risks of the online service being used to create or facilitate the creation of CSA images within 14 days.

(3) Where a provider regulated by the Online Safety Act 2023 fails to agree to or implement steps to reduce, mitigate and manage the risks with OFCOM (see subsection (2)(a)), they can be subjected to OFCOM’s enforcement powers as set out in Part 7, Chapter 6 (enforcement powers) of that Act.

(4) Where a provider not regulated by the Online Safety Act 2023 fails to agree to or implement steps to reduce, mitigate and manage the risks with the National Crime Agency (see subsection (2)(b)), they commit an offence.

(5) A provider that commits an offence under this section is liable to be issued with a penalty notice by the National Crime Agency.

(6) In this section a “penalty notice” means a notice requiring its recipient to pay a penalty of an amount not exceeding whichever is the greater of—
(a) £18 million, or
(b) 10% of a provider’s qualifying worldwide revenue for the most recent complete accounting period.

(7) A penalty notice may be reissued where a provider continues to commit an offence under this section.

(8) In carrying out its duties set out in this section, the National Crime Agency may consult with OFCOM.”

Member’s explanatory statement

The Bill includes amendments which prohibit the creation of Gen-AI models specifically designed to create CSA images, but it is still possible for general-purpose models to be used to create CSA images. The Government has committed to allow providers of other Gen-AI services to risk assess how their services could be used for this purpose. This amendment makes that a requirement.

Baroness Kidron (Crossbench)

My Lords, I thank the noble Baroness, Lady Morgan of Cotes, and the noble Lords, Lord Russell and Lord Clement-Jones, for adding their names. I also thank the noble Baroness, Lady Barran, for trying to add her name. Such was the enthusiasm that there was no space.

As already discussed, the Government have brought in new Clauses 92 and 93 to allow companies and responsible third parties to risk-assess the creation of CSA by gen AI models. That is an important detail. If the company is red teaming, or the regulator needs to test, it must not be guilty of an offence for doing so. But this new measure is permission, not obligation—and permission is not enough.

Amendment 209 seeks to do three things: to make risk assessment mandatory; to require mitigation within 14 days; and to hold companies not covered by the Online Safety Act to the same standard via the National Crime Agency.

A report from UNICEF last month referenced an Interpol study across 11 countries which found that at least 1.2 million children have disclosed having their images manipulated into sexually explicit images in the past year. In some countries that is equivalent to one child in every classroom being subjected to this new form of child sexual abuse. The report recommended the introduction of guardrails for AI developers at the design stage. In a meeting earlier in your Lordships’ House, we were told repeatedly and reminded graphically that AI CSAM creates appetite in offenders and that what happens online does not stay online.

We have consulted, and Ofcom has consulted—Parliament has debated this for years—and now we are consulting again. I argue that there are three reasons for accepting the amendment right now.

First, in Committee, I laid this amendment and a similar one about LLMs to establish exactly which services were covered by the Online Safety Act. So far, neither Ministers nor officials in repeated meetings have given a clear answer. I ask the Minister today to say whether all large language models are already in scope and required to do CSEA risk assessments. If the confusion remains, the amendment is necessary. We just cannot wait.

Secondly, there is confusion about Ofcom’s ability to demand mitigation in a timely manner, or indeed at all. The amendment would deal with that by requiring mitigation within 14 days. There is nothing that I have seen yet in the consultation about enforcement, and unless and until we can act swiftly, the Online Safety Act will continue to disappoint.

Thirdly, the Government plan to consult and then bring changes in secondary legislation. Irrespective of whether it is an affirmative or negative procedure, in practice this is beyond meaningful parliamentary oversight. I have been fighting for this group of amendments since 2023, and the best I have had is for the Government to take a key offence, which I am very happy to see, but make it so narrow that it does not cover most service providers. Risk assessing for CSAM should not be controversial; it should be done, and it should be done in sight of Parliament.

In Committee, the Government said there were already clear laws prohibiting CSA creation, but they also made clear there are gaps. I do not understand how any reasonable person, let alone a Government that has made tackling violence against women and girls a core purpose, could in good faith reject the amendment. Yet in Committee, the Minister said that the amendments would place an “unmanageable and unnecessary” burden. Are the Government saying that the burden on a public sector crime agency or regulator is a reasonable justification for enabling widespread child sexual abuse? If the argument is that it is too much of a burden, I ask noble Lords themselves to consider if it is too much of a burden when it is their daughters, or their granddaughters, who have been made to look younger, put on all fours, and turned to the camera with a stranger abusing them. Is it too much of a burden when children who cannot yet walk and talk are abused?

It is not a burden. It is a privilege to check that something you have built and profit from does not allow, by accident or deliberately, the sexual abuse of children. If this does go to the vote, it will not be because I enjoy the support of the Government or Opposition, but so that we in this House can put on record the unwillingness of either party, despite the obvious need and the performative outrage at Grok, to do anything at all about it.

I have made clear at every stage that I am willing to look at drafting, and if it is simply a case of drafting, I will happily bring something back at Third Reading that is drafted as the Government would accept. But this is a harm that organisations such as the IWF have long raised concerns about. It was a risk reported in relation to Grok as early as July last year. It is something the police have asked for for years, and we have been discussing it in Parliament since 2023. What these models can do reflects how they are designed, tested and deployed. When they fail to protect children, that is not an accident. It is a design choice.

Government Ministers, like the rest of the public, rightly expressed outrage at Grok, but, according to the Centre for Countering Digital Hate, at least 23,000 images were made using its nudification tool that featured children. Last week, Ofcom announced that it could not enforce against Grok but would instead take action against X for distribution. It is not enough, and it is after the fact—not prevention but huge resources deployed too late and too narrowly in the aftermath.

A risk is something that can be mitigated. A harm is something already done. This is an opportunity to deal with a risk here and now—not maybe and not sometime in the future. I beg to move.

Lord Russell of Liverpool (Deputy Chairman of Committees, Deputy Speaker (Lords)) 3:30, 2 March 2026

My Lords, I support my noble friend Lady Kidron—I was very happy to put my name to this. The noble Baroness and others in this Chamber were at a meeting that we had at lunchtime today with a variety of really knowledgeable experts in this area. Even for those of us who have been to these sorts of meetings in the last few years fairly regularly, the latest news is really deeply shocking. I cannot even begin to tell your Lordships how shocking it is.

Indeed, there was an expert from Finland there who is about to deliver a very comprehensive analysis of the status quo, which will go to Ofcom and be published shortly. She was unable to give any details; however, she did tell us—I must confess that I am not that shockable, but I did find this pretty shocking—that the earliest instance this research has discovered of a child being sexually abused was a child who was seven hours old, if noble Lords can believe that. What is more, we were told that there are manuals available on the web and the dark web which tell perpetrators, if they wish to sexually abuse newly born infants, how to do so in such a way that it cannot be medically identified.

It reminds one slightly of the recent, very brave interview that Gisèle Pelicot gave, which some of your Lordships may have seen—if noble Lords have not, I recommend it—in which it appeared that the reason Gisèle did not realise what was happening to her was that her husband had availed himself of sufficient medical knowledge to know that, when he drugged her, he should also put muscle relaxants into the medication. The normal physical reaction of anyone’s body, particularly a female body, when it is being violated is to resist and seize up; where muscle relaxants have been administered, that does not happen, so, when Gisèle woke up, she did not feel well, but she did not realise what had happened. There are manuals on the web telling perpetrators how to do that with newly born infants in order that it is not identified. This is the world we are living in.

I am reminded of an analogy that we often used to use when I was a management consultant, when we were trying to indicate to a business that things were getting slightly out of control and not going the way they wanted: the parable of the frog in the water, which is gently increasing in temperature until the point that it realises it is being boiled alive, by which time it is too late. If you look at the scale of the abuse that is happening and the way in which artificial intelligence is accelerating this exponentially, it is never too late, but I can only add to the words of my noble friend Lady Kidron: how much longer do we have to keep on beseeching the Government to listen?

I reminded the meeting of a meeting I had a few months ago with a Minister from another department and her team. The Minister was female and all the advisers were female. We asked them, “How many of you have children, and what age are they?” They told us, and we then described some of the things that are happening to children of that age. You could see a visible change in demeanour and body language. This is not something that is happening to other people, or happening remotely on the BBC news or online; it is happening to us and our children, and it becomes deeply personal. The reason why the noble Baroness and others of us feel so passionately about this is that it is happening all around us—to our children, grandchildren, nephews and nieces—and we appear to be blind to what is going on.

We are blind in the sense of finding solutions that will work and blind to even trying solutions that may not be perfect but at least indicate a level of intent to do something about it. The companies that are the target of the noble Baroness’s amendment know what they are doing; they are aware of what they are allowing. They are probably doing some risk analysis, which is probably not very good reading, but they know exactly what they are doing. To limit the Government’s approach to only those engines that have clearly been designed primarily to produce child sexual abuse material is to address only the tip of the iceberg. It is all the other ones that are doing the damage. Until and unless we face up to that, zero in on them in such a way that they have to pay attention, and make it seriously painful for them, we are not going to change anything.

I appeal to the House, should the noble Baroness decide to take this to a vote, to send a clear signal to the Government about what is going on. Those of us in this House who are involved in this are frequently approached by the Government’s own Back-Benchers from another place—many of whom have young children—who are deeply concerned about what is going on. They are desperate for their Government to show real leadership and, rather than having consultation after consultation, to take action. So I appeal to the Government to look at this very seriously and I appeal to the House, if the noble Baroness decides to divide, to go with her.

Baroness Benjamin (Liberal Democrat)

My Lords, I support Amendment 209, in the name of the noble Baroness, Lady Kidron. I was at the meeting that the noble Lord, Lord Russell, so graphically described. I wish all noble Lords had been there too. If they had been, they too would support this amendment. It makes me weep to think of the harm and damage being done to babies—babies—and young children. It is shocking, and if we do not vote for this amendment, we should be ashamed of ourselves. It might not affect you personally, but you have to care about the millions of children out there who are having to face this abuse.

The growth in artificial intelligence tools is exposing children to new and enhanced harms. Perpetrators are using image generators to create hyper-realistic child sexual abuse material that can be used to abuse and extort children, including to financially blackmail young people. Devastatingly, Childline is hearing from more and more children who are experiencing this type of abuse. For example, a 16-year-old boy contacted the charity saying that a girl claiming to be his age made fake sexual images of him and threatened to share them with his friends unless he sent her £200. What is this world coming to, with children being blackmailed like this? Children are speaking about feeling incredibly scared, distressed and isolated in these situations. They are unsure about why it is happening or where to turn for help.

The Internet Watch Foundation has reported a dramatic surge in AI-generated child sex abuse material: a fourfold increase in just one year. Because this damaging content is harming children, I welcome the steps by the Government to tackle AI-generated child sex abuse material in the Bill and the fact that they recently announced changes to the Online Safety Act. The recent horrific case of Grok being used to generate child abuse images showed that the Act does not include the relevant provisions to tackle such abuse. It is therefore important that we have a regulatory system that tackles risks before they occur by embedding safety by design in AI platform models.

The Online Safety Act currently focuses on content detection and moderation. Adding AI chatbots to the Act therefore requires a different approach, as these systems have the ability to generate such content in the first place. The amendment from the noble Baroness, Lady Kidron—hallelujah for her—would ensure that this risk assessment process tackles the distinct risk of AI platforms. This should involve auditing training data and testing systems, using methods such as red-teaming, to ensure that models cannot be exploited to create child sexual abuse material. Only after we know that these models are safe should they be released to the public.

How do the Government intend to risk-assess AI chatbots so that we do not find ourselves back here in a few years’ time, after more and more children have been victimised, with us asking the question: “How can we find better ways to regulate this technology?” This is why I urge the whole House and the Government to accept this amendment. I look forward to the Minister’s response.

Lord Pannick (Crossbench) 3:45, 2 March 2026

My Lords, this is a grim subject, like, I am afraid, many of those that we are going to discuss in our proceedings today. An overwhelming case has been made by those who have spoken, particularly the noble Baroness, Lady Kidron. I very much hope that the Front Benches—Government and Opposition—are listening to the views that have been expressed.

I shall offer one argument additional to those that the noble Baroness has set out. In addition to regulatory sanctions against the providers of these online services, and in addition to any possible criminal remedies that may arise, there is also the possibility of civil sanctions: claims for damages brought by groups of parents who have the misfortune to have had their children dealt with in this appalling way. Any such claim for damages would be immeasurably assisted were the providers of the online services to have a legal duty to risk-assess the likelihood of their services being used in this way.

Lord Stevenson of Balmacara (Labour)

My Lords, I was also at the meeting, which has been referred to, that was held this lunchtime and dealt with the troubling question of what seems to be an epidemic of growth in the exploitation of children on the internet. I must say that it revealed figures that I was not aware of, and I regard myself as relatively well briefed on this matter.

Further information came out today—particularly from the work, which has already been alluded to, by Members who were present at that meeting—that much of the material that is seen online also moves across into the real world. The use of these elements on the internet to groom children, to set up meetings with them and then to participate with them in illegal acts has been growing to a point where it is quite clearly an epidemic that must be dealt with. We are at the start of something extraordinarily unpleasant that needs to be looked at in the round, in a way that we have not yet done or been able to do.

Having been heavily involved in the Online Safety Act, I am conscious of the fact that we are dealing with legislation which has been overtaken by technology. The developments that have happened since the Bill became an Act have meant that the tools we thought were being given to Ofcom and being used by the Government are very often no longer appropriate. They are probably not as far-reaching and certainly do not deal with the speed with which this technology is moving forward.

I have not been able to attend any meetings which Ministers may have had with my own side on this, but I gather that there is a Whip on against this amendment. I wonder whether the Minister could think hard about how he wants to play this issue out. It seems that one of the problems we have in dealing with legislation in this area is that we are never dealing with the right legislation. We want to amend the Online Safety Act but obviously, by moving an amendment to this Bill, which is from another department, we are not maximising the chances of having an output which will work. In addition, the way Ofcom is interpreting the Act seems to make it very difficult for it to reach out on new technologies, such as those described by the noble Baroness, Lady Kidron, in her excellent speech introducing the amendment.

In a moment of transition, when we are so keen to try to grasp things so that they do not get out of our control, there may be a case for further work to be done. The noble Baroness, Lady Kidron, mentioned that she was happy to try to look again at the wording of her amendment if it is not appropriate for the Government. I am conscious that the Government are also trying to move in other areas and that other departments are also issuing measures which may or may not bear directly on the issue. It seems that there is a very strong case—although I do not know how my noble friend will respond—for asking for this issue to be kept alive and brought back, perhaps at Third Reading, where a joint amendment might be brought between the noble Baroness and her supporters and the Government to try to make sure that we do what we can, even if it is not the complete picture, to take this another step down the road.

Baroness Bertin (Conservative)

I will make a very small intervention because people have spoken so eloquently before me. I support the amendment 100% and I am surprised that the Front Benches are not taking a different view. For crying out loud, I am not easily shocked, but the briefing this afternoon, which we have all spoken about, shocked me. We are so behind the curve on this and we have to get ahead of it, so I support the amendment.

The Earl of Erroll (Crossbench)

My Lords, I can see what the noble Lord, Lord Stevenson, is saying about Third Reading, but it would be wiser to vote for this amendment now—if noble Lords have any conscience at all, they have to vote for it—and if it is slightly defective it can be amended at Third Reading. If we do not do it now, there is a huge risk of it not coming back.

Lord Clement-Jones (Liberal Democrat Lords Spokesperson (Science, Innovation and Technology))

My Lords, from these Benches, I strongly support Amendment 209, which was so convincingly spoken to by the noble Baroness, Lady Kidron. I was very pleased to have signed it, alongside the noble Lord, Lord Russell of Liverpool, and the noble Baroness, Lady Morgan of Cotes.

This amendment is a vital safeguard against the “innovation first, safety later” culture of big tech. Although the Bill will rightly prohibit the creation of models specifically designed to generate CSA images, it remains silent on general-purpose models that can be easily manipulated or jailbroken to produce the same horrific results. As the unacceptable use of tools such as Grok—referred to by my noble friend Lady Benjamin in her powerful speech—has recently illustrated, we cannot leave the safety of our children to chance. We face a technological and moral emergency. The Internet Watch Foundation, represented at the meeting today which the noble Lord, Lord Russell, and my noble friend mentioned, has warned of a staggering 380% increase in confirmed cases of AI-generated child exploitation imagery. The noble Lord, Lord Russell, is right that the extent of this abuse is sickening beyond imagination.

The amendment would mandate a safety-by-design intervention, requiring providers to proactively risk-assess their services and report identified risks to Ofcom within 48 hours. In Committee, the Minister, the noble Lord, Lord Hanson, pushed back against this proposal, arguing that it

“would place unmanageable and unnecessary operational burdens on … the National Crime Agency and Ofcom”.

He further claimed that these measures risk creating “legal uncertainty” by “duplicating” the Online Safety Act. Both assertions need rebutting. First, protecting children from an industrial-scale explosion of AI-generated abuse is not an unnecessary burden; it is the primary duty of our law enforcement and regulatory bodies. Secondly, we cannot rely on the theoretical protections of an Online Safety Act designed for a world before generative AI. Ofcom itself has maintained what might be called a tactical ambiguity about how the Act applies to stand-alone AI chatbots and large language models.

Alongside the noble Baroness, Lady Kidron, who we will support if she puts the amendment to a vote, we ask for an ex ante duty: providers must check whether their models can be used to generate CSAM before they are released to the public. Voluntary commitments and retrospective enforcement are simply not enough. The Government have already committed to this principle; it is time to put that commitment into statute. I urge the Minister to accept Amendment 209 and ensure that we move away from ex post measures that address harm only after a child has been victimised.

The current definitions of “search” and “user-to-user” services do not neatly or comprehensively capture these new generative technologies. We cannot allow a situation where tech developers release highly capable models to the public without first explicitly checking whether they can be used to generate CSAM. We need this explicit statutory duty in the Bill today.

Lord Davies of Gower (Shadow Minister (Home Office))

My Lords, Amendment 209, in the name of the noble Baroness, Lady Kidron, would require providers of relevant online services to assess and address the risks that their platforms may be used for the creation, sharing or facilitation of child sexual abuse material, placing a strengthened duty on them to take preventive action. More than anyone in this Chamber, I fully recognise the intention behind strengthening preventive mechanisms and ensuring that providers properly assess and mitigate risks to children. Requiring companies to examine how their services may facilitate abuse is, in principle, entirely sensible. The scale and evolving nature of online exploitation means that proactive duties are essential.

However, I have some concerns about the proposed mechanism, on which I hope the Minister may also be able to provide some input. The amendment appears to rely on providers conducting their own risk assessments. That immediately raises several practical questions, such as what objective standard those assessments would be measured against, whether there would be statutory guidance setting out minimum criteria, and how consistency would be ensured across companies of vastly different sizes and capabilities. There also remains the crucial question of what enforcement mechanisms would apply if an assessment was superficial or inadequate. Without clear parameters and oversight, there is a danger that such a system could become uneven in practice.

I would welcome reassurance from the Minister as to how the Government intend to ensure that risk-based duties in this space are transparent and robust for the purposes of child protection. The question is not whether we act, but how. We all share the same objective of reducing the prevalence of child sexual abuse material and protecting children from exploitation. The challenge is ensuring that the mechanisms we legislate for are clear and enforceable in practice. I look forward to the Minister’s response.

Lord Hanson of Flint (The Minister of State, Home Department)

I am grateful to the noble Baroness, Lady Kidron, for tabling Amendment 209 and for her commitment to doing all we can to prevent online harms. I was struck strongly by the contributions from the noble Baronesses, Lady Benjamin and Lady Bertin, the noble Lords, Lord Pannick and Lord Russell of Liverpool, my noble friend Lord Stevenson of Balmacara and the noble Earl, Lord Erroll.

This is a really serious issue. The Government are committed to making sure that we have constructive engagement with the noble Baroness, as I have tried to do, including one formal and one informal meeting this very day, to ensure that we can make this work in the interests of what everybody in this House wants to do: to ensure, particularly given the rapid development of technology, that the public, and especially children, are safeguarded from harm. This Government are committed to tackling sexual exploitation and abuse and ensuring that new technologies are developed and deployed responsibly. I know that that matters; I know that it is important, and I know that this Government want to make sure that we deal with it.

A few weeks ago, the Grok AI chatbot was used to create and share vile, degrading and non-consensual intimate deepfakes. This House should ensure that no one lives in fear of having their image sexually manipulated by technology. From the Prime Minister to the DSIT Secretary, we said at the time that we would do something to stamp out this demeaning and illegal image production.

I speak today for the Government, the Home Office, DSIT and my right honourable friend the Prime Minister when I say that the Government have taken decisive action in the Bill. I draw noble Lords’ attention to Clauses 65 to 76, which we have brought forward to prohibit the creation of AI models that are designed specifically to generate child sexual abuse material. I will be frank with the House that we do not think that we have done enough so far in the Bill.

Things are moving, and that is why we have tabled an amendment today that will be debated later in the month—I hope on 18 March. It will confer a regulation-making power to enable us to expand the scope of the Online Safety Act to include unregulated generative AI services, such as chatbots. The amendment should have been tabled no later than 4 pm today; I apologise that the House has not had sight of it. I have tried to give details of the amendment to the noble Baroness, Lady Kidron. It will ensure that providers are bound to assess the level of risk held on their services and to take steps to protect users from illegal material, including by preventing CSAM.

I am happy to confirm to noble Lords that this will allow us to impose duties on these services that correspond to, or are similar to, the Act’s duties on tackling child sexual exploitation and abuse content. Chatbot providers will have a legal duty to protect all users from illegal content, including non-consensual sexual deepfakes, and where chatbots continue to generate such content, the providers should expect to face the consequences of breaking the law that we hope that this House will pass.

Baroness Kidron (Crossbench) 4:00, 2 March 2026

If I was in the same meeting as the Minister, officials were unable to say that LLMs and generative models would be covered by that amendment. Indeed, they said that the policy of the Government was chatbots only. Chatbots are the subject of another amendment that I have tabled, which we will come to later. We have to be clear that the amendment in front of us remains only because I was told this afternoon that the new government amendment would not cover the same territory.

Lord Hanson of Flint (The Minister of State, Home Department)

The government amendment has been tabled. I am asking the noble Baroness—whether she does this is self-evidently a matter for her—to withdraw her amendment and look at the amendment that we have tabled today on a cross-party basis and on behalf of DSIT and the Home Office, the department that I represent. That amendment will be debated around 18 March, and she can make comments on it at that stage. I am trying to meet the needs of the House and the Government to respond to what are complex and difficult challenges. All I will say is that, by bringing more AI services into the scope of the Online Safety Act, we will ensure that there is a clear and consistent regulatory framework that will allow us to hold companies to account.

In Clause 93, we have introduced the technology testing defence that will enable persons authorised by the Secretary of State to test technology for these harms. The defence will give providers reassurance to test the robustness of their models’ safeguards, identify weaknesses and design out harmful inputs. This, in turn, will reduce the risk of their models being criminally misused, particularly to abuse women and children. This further supports all AI companies in scope of the Online Safety Act with their risk-assessment obligations.

Given those measures—the noble Baroness will have to make a judgment on this—the Government consider that Amendment 209 is unnecessary, as it cuts across the approach that I have outlined to date both in the Bill, in Clause 93 and the clauses I outlined earlier, and in the proposed amendment that I shared with her as best I could prior to this debate. The House has a chance to look at that now that it is published. Amendment 209 cuts across that duty and imposes a broad statutory duty on online services, duplicating regulatory mechanisms, and it could create legal uncertainty. The noble Lord, Lord Clement-Jones, challenged me on that, but that is the view of Ministers, officials and our legal departments. We are also concerned about creating similar enforcement routes outside the Online Safety Act framework.

We take this seriously. The points that the noble Baroness, Lady Benjamin, made are extremely important. I was not able to attend the briefing earlier, but I know how much that has impacted Members who have spoken today. The National Crime Agency and the police will play a key role in protecting children in the UK from child abuse. They warn that the scale and complexity of online child sexual abuse are resulting in tens of millions of referrals of suspected online sexual abuse each year. Policing resources are best spent on protecting children and arresting offenders, so it is appropriate that Ofcom continues to play a critical regulatory role in preventing and tackling the AI generation of child sexual abuse material.

I have tried to persuade the noble Baroness but, if I have not succeeded, there will have to be a Division. I do not want there to be one, because I think this House should speak with one voice on tackling this issue. The laudable objectives of the amendment are, we believe, better addressed through both the existing legislative framework and the targeted government amendment we have tabled today to expand the scope of the Online Safety Act and bring chatbots within the illegal content duties. This will mean that providers need to mitigate potential risks to prevent children facing such abuse.

I hope I have convinced the noble Baroness. Again, I apologise to the House for the lateness of the tabling of the amendment. We are trying to work across government on this, and that amendment will be debated on 18 March. In light of that, I hope the noble Baroness feels able to withdraw her amendment.

Baroness Kidron (Crossbench)

My Lords, as a point of information, I feel it would be useful to say that Clauses 64 and 65, to which the Minister refers, are in fact a narrowing of an original amendment, laid by me and other noble Lords, which the Government deliberately narrowed so that it deals only with electronic files that have been created deliberately and exclusively to generate child sexual abuse material. I very much welcome those clauses. However, if the Government had not narrowed that amendment, I would not be standing here today with this amendment.

I am grateful for the Minister’s time, and I am happy with the chatbot amendment as far as it goes—and inasmuch as I have seen it an hour before everyone else—but it does not deal with this issue. I rang the Minister this morning and asked for a meeting to say, “If you can tell me that this is covered by the chatbot amendment or that it’s already covered in another way, I will back down”. But I am afraid that nobody could tell me that, because it is not. That is just how it is.

I say to the noble Lord speaking for the Official Opposition, no, no, no. It is not okay to say, “We must work out how to do this”. This is an opportunity to work out how. We always do it this way: we pass an amendment; we get a power; and Ofcom and the Government do the guidance. I say to the whole House, and particularly to my friends on the Labour Benches who may be considering voting against this, have any of you seen child sexual abuse made out of your image? I have. It is not funny; it is serious and it is easily done. I think it is unacceptable to vote against an amendment that says only, “Risk assess”. It is not okay to put a product out in the world if you do not have any responsibility for the harm it causes. So, I do not expect to win, because the Government are whipping against and the Opposition are sitting on their hands, but I think it is important to say to the people who are in a vortex of this kind of abuse that at least some of us in this House have their backs.

Lord Hanson of Flint (The Minister of State, Home Department)

When the noble Baroness says that some of us in this House are concerned about this issue, I want to say to her that all of us in this House are concerned about this issue. The noble Lord, Lord Davies of Gower, and I have many differences in this House, but we are at one in trying to improve the position of the regulations to tackle this issue. The amendment that I have tabled is a very important step forward on behalf of the Government, on a DSIT and Home Office basis, and I am grateful for the support of the noble Lord. I do not want to have a Division in this House. The Government and the Opposition may well win that vote, but I do not want that Division to happen; I want us to go forward in a constructive way, to look at the amendments that are tabled and to make a change that really benefits people.

Baroness Kidron (Crossbench)

I say to the noble Lord that there is only one way to prevent a Division on this issue, which is either to stand at the Dispatch Box and say that it is covered, or to say that we will keep it alive until Third Reading so that we can make sure that it is covered. If I have insulted anyone by suggesting that only some of us are willing to walk through the Lobby to protect children from child sexual abuse, forgive me, but unless the Minister has something to say, then as a matter of principle I shall divide the House.

Ayes 121, Noes 145 (including tellers).

Division number 1 Crime and Policing Bill - Report (2nd Day) — Amendment 209

Aye: 119 Members of the House of Lords

No: 143 Members of the House of Lords


Amendment 209 disagreed.

Clause 66: Child sexual abuse image-generators: Northern Ireland
