Part of Crime and Policing Bill - Committee (4th Day) (Continued) – in the House of Lords at 4:45 pm on 27 November 2025.
Viscount Colville of Culross
Deputy Chairman of Committees, Deputy Speaker (Lords)
My Lords, I put my name to Amendments 479 and 480, and I support the other amendments in this group. I must once again thank my noble friend Lady Kidron for raising an issue which I had missed and which, I fear, the regulator might have missed as well. After extensive research, I too am very worried about the Online Safety Act, which many of your Lordships spent many hours refining. It does not cover some of the new developments in the digital world, especially personalised AI chatbots. They are hugely popular with children under 18; 31% use Snapchat’s My AI and 32% use Google’s Gemini.
The Online Safety Act Network set up an account on ChatGPT-5 using a 13 year-old persona. Within two minutes, the chatbot was engaged with the user about mental health, eating disorders and advice about how to safely cut yourself. Within 40 minutes, it had generated a list of pills for overdosing. The OSA was intended to stop such online behaviour. Your Lordships worked so hard to ensure that the OSA covered search and user-to-user functions in the digital space, but AI chatbots have varied functionalities that, as my noble friend pointed out, are not clearly covered by the legislation.
My noble friend Lady Kidron pointed out that, although Dame Melanie Dawes confirmed to the Communications and Digital Committee that chatbots are covered by the OSA, Ofcom in its paper Era of Answer Engines admits:
“Under the OSA, a search service means a service that is, or which includes, a search engine, and this applies to some (though not all) GenAI search tools”.
There is doubt about whether the AI interpretive process, which can change the original search findings, excludes it from being in the scope of search under the OSA. More significantly, AI chatbots are not covered where the provider creates content that is personalised for one user and cannot be forwarded to another user. I am advised that this is not a user-to-user service as defined under the Act.
One chatbot that seems to fall under this category is Replika. I had never heard of it until I started my research for this amendment. However, 2% of all children aged nine to 17 say that they have used the chatbot, and 18% have heard of it. Its aim is to simulate human interaction by creating a replica chatbot personal to each user. It is very sophisticated in its output, using avatars to create images of a human interlocutor on screen and a speaking voice to reply conversationally to requests. The concern is that, unlike traditional search engines, it is programmed for sycophancy, or, in other words, to affirm and engage the user’s response—the more positive the response, the more engaged the child user. This has led to conversations with the AI companion talking the child user into self-harm and even suicidal ideation.
Research by Internet Matters found that a third of child users think that interacting with chatbots is like talking to a friend. Most concerning is the level of trust they generate in children, with two in five saying that they have no concerns about the advice they are getting. However, because the replies are supposed to be positive, what might have started as trustworthy advice develops into unsafe advice as the conversation continues. My concern is that chatbots are not only affirming the echo chambers that we have seen developing for over a decade as a result of social media polarisation but are reducing yet further children’s critical faculties. We cannot leave the development of critical faculties to the already inadequate media literacy campaigns that Ofcom is developing. The Government need to discourage sycophancy and a lack of critical thinking at its digital source.
A driving force behind the Online Safety Act was the realisation that tech developers were prioritising user engagement over user safety. Once again, we find new AI products that are based on the same harmful principles. In looking at the Government’s headlong rush to surrender to tech companies in the name of AI growth, I ask your Lordships to read the strategic vision for AI laid out in the AI Opportunities Action Plan. It focuses on accelerating innovation but fails to mention once any concern about children’s safety. Your Lordships have fought hard to make children’s safety a priority online in legislation. Once again, I ask for these amendments to be scrutinised by Ofcom and the Government to ensure that children’s safety is at the very centre of their thinking as AI develops.
As a bill passes through Parliament, MPs and peers may suggest amendments - or changes - which they believe will improve the quality of the legislation.
Many hundreds of amendments are proposed by members to major bills as they pass through committee stage, report stage and third reading in both Houses of Parliament.
In the end only a handful of amendments will be incorporated into any bill.
The Speaker - or the chairman in the case of standing committees - has the power to select which amendments should be debated.
Ofcom is the independent regulator and competition authority for the UK communications industries, with responsibilities across television, radio, telecommunications and wireless communications services.