Online Safety Bill - Committee (4th Day) – in the House of Lords at 4:18 pm on 2 May 2023.
My Lords, this large group of 33 amendments is concerned with preventing harm to children, by creating a legal requirement to design the sites and services that children will access in a way that will put their safety first and foremost. I thank my co-sponsors, the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lord, Lord Knight. First of all, I wish to do the most important thing I will do today: to wish the noble Baroness, Lady Kidron, a very happy birthday.
My co-sponsors will deal with some of the more detailed elements of these amendments, including safety duties, functionality and harm, and codes of practice. I am sure that the noble Lords, Lord Stevenson and Lord Knight, and the right reverend Prelate the Bishop of Oxford will speak to their own amendments.
I will provide a brief overview of why we are so convinced of the paramount need for a safety by design approach to protect children and remind digital companies and platforms, forcibly and legally, of their obligation to include the interests and safety of children as a paramount element within their business strategies and operating models. These sites and services are artificial environments. They were designed artificially and can be redesigned artificially.
In her testimony to the US Senate in July 2021, the Facebook whistleblower Frances Haugen put her finger on it rather uncomfortably when talking about her erstwhile employer:
“Facebook know that they are leading young users to anorexia content … Facebook’s internal research is aware that there are a variety of problems facing children on Instagram … they know that severe harm is happening to children”.
She was talking about what was happening, probably, three years ago.
On the first day of Committee, the noble Lord, Lord Allan, who is not with us today, used the analogy of the legally mandated and regulated safe design of aeroplanes and automobiles and the different regimes that cover their usage to illustrate some of our choices in dealing with regulation. We know why aeroplanes and cars have to be designed safely; we also know that either form of transportation could be used recklessly and dangerously, which is why we do not allow children to fly or drive them.
First, let us listen to the designers of these platforms and services through some research done by the 5Rights Foundation in July 2021. These are three direct quotes from the designers:
“Companies make their money from attention. Reducing attention will reduce revenue. If you are a designer working in an attention business, you will design for attention … Senior stakeholders like simple KPIs. Not complex arguments about user needs and human values … If a senior person gives a directive, say increase reach, then that’s what designers design for without necessarily thinking about the consequences”.
Companies know exactly what they need to do to grow and to drive profitability. However, they mostly choose not to consider, mitigate and prioritise to avoid some of the potentially harmful consequences. What they design and prioritise are strategies to maximise consumption, activity and profitability. They are very good at it.
Let us hear what the children say, remembering that some recent research indicates that 42% of five to 12 year-olds in this country use social media. The Pathways research project I referred to earlier worked closely with 21 children aged 12 to 18, who said: “We spend more time online than we feel we should, but it’s tough to stop or cut down”. “If we’re not on social media, we feel excluded”. “We like and value the affirmations and validations we receive”. “We create lots of visual content, much of it about ourselves, and we share it widely”. “Many of us are contacted by unknown adults”. “Many of us recognise that, through using social media, we have experienced body image and relationships problems”.
To test whether the children in this research project were accurately reporting their experiences, the project placed a series of child avatars—ghost children, in effect—on the internet, whose profiles very clearly stated that they were children.
They found—in many cases within a matter of hours of the profiles going online—proactive contacting by strangers and rapid recommendations to engage more and more. If searches were conducted for eating disorders or self-harm, the avatars were quickly able to access content irrespective of their stated ages and clearly evident status as children. At the same time they were being sent harmful or inappropriate content, they also received age-relevant advertising for school revision and for toys—the social media companies knew that these accounts were registered as children.
This research was done two years ago. Has anything improved since then? It just so happens that 5Rights has produced another piece of research which is about to be released, and which used the exact same technique—creating avatars to see what they would experience online. They used 10 avatars based on real children aged between 10 and 16, so what happened? For an 11 year-old avatar, Instagram was recommending images of knives with the caption “This is what I use to self-harm”; design features were leading children from innocent searches to harmful content very quickly.
I think any grandparents in the Chamber will be aware of an interesting substance known as “Slime”—a form of particularly tactile playdough which one’s grandchildren seem to enjoy. Typing “Slime” into Reddit left the avatar one search, and one click, away from pornography; exactly the same thing happened on Reddit when the avatar typed in “Minecraft”, another game very popular with our children and grandchildren. A 15 year-old female avatar was private-messaged on Instagram by a user that she did not follow—an unknown adult who encouraged her to link on to pornographic content on Telegram, another instant messaging service. On the basis of this evidence, it appears that little or nothing has changed; it may even have got slightly worse.
By an uncomfortable coincidence, last week, Meta, the parent company of Facebook and Instagram, published better than expected results and saw its market value increase by more than $50 billion in after-hours trading. Mark Zuckerberg, the founder of Meta, proudly announced that Meta is pouring investment into artificial intelligence tools to make its platform more engaging and its advertising more effective. Of particular interest and concern, given the evidence of the avatars, was his announcement that since the introduction of Reels, a short-form video feed designed specifically to respond to competition from TikTok, its AI-driven recommendations had boosted the average time people spend on Instagram by 24%.
To return to the analogy of planes and cars used by the noble Lord, Lord Allan, we are dealing here with planes and cars in the shape of platforms and applications which we know are flawed in their design. They are not adequately designed for safety, and we know that they can put users, particularly children and young people, in the way of great harm, as many grieving families can testify.
In conclusion, our amendments propose that companies must design digital services that cater for the vulnerabilities, needs, and rights of children and young people by default; children’s safety cannot and must not be an afterthought or a casualty of their business models. We are asking for safety by design to protect children to become the mandatory standard. What we have today is unsafe design by default, driven by commercial strategies which can lead to children becoming collateral damage.
Given that it is the noble Baroness’s birthday, I am sure we can feel confident that the Minister will have a positive tone when he replies. I beg to move.
It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.
As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach for the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.
Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as
“anything communicated by means of an internet service”, but the examples in the Bill, including
“written material … music and data of any description”, once again fail to include design features that are so often the key drivers of harm to children.
On day three of Committee, the Minister said:
“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]
However, in looking at the child safety duties, Clause 11(5) says:
“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”, but subsection (14) says:
“The duties set out in subsections (3) and (6)”— which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—
“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.
Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that
“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.
Then, he said that
“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.
His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.
I turn now to Amendments 28 and 82, which cut the reference to the
“size and capacity of the provider of the service” in deeming what measures are proportionate. We have already discussed that small is not safe. Platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale towards creating loopholes for smaller services.
Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.
By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply on any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.
Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words
“the volume of the content and the frequency with which the content is accessed” to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.
My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.
I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.
My noble friend may have left one out.
I am afraid I may well have done.
That list shows your Lordships some of the challenges we all have with the Bill. All these amendments seek to ensure that the codes of practice relating to child safety are binding. Such codes should be principles-based and flexible to allow companies to take the most appropriate route of compliance, but implementing these codes should be mandatory, rather than, as the Bill currently sets out, platforms being allowed to use “alternative measures”. That is what all these amendments do—they do exactly the same thing. That was a clear and firm recommendation from the joint scrutiny committee. The Government’s response to that joint scrutiny committee report was really quite weak. Rather than rehearse the joint scrutiny committee’s views, I will rehearse the Government’s response and why it is not good enough to keep the Bill as it stands.
The first argument the Government make in their response to the joint scrutiny report is that there is no precedent for mandatory codes of conduct. But actually there are. There is clear precedent in child protection. In the physical world, the SEND code for how we protect some of our most vulnerable children is mandatory. Likewise, in the digital world, the age-appropriate design code, which we have mentioned many a time, is also mandatory. So there is plenty of precedent.
The second concern—this is quite funny—was that stakeholders were concerned about having multiple codes of conduct because it could be quite burdensome on them. Well, forgive me for not crying too much for these enormous tech companies relative to protecting our children. The burden I am worried about is the one on Ofcom. This is an enormous Bill, which places huge amounts of work on a regulator that already has a very wide scope. If you make codes of conduct non-mandatory, you are in fact making the work of the regulator even harder. The Government themselves in their response say that Ofcom has to determine what the minimum standards should be in these non-binding codes of practice. Surely it is much simpler and more straightforward to make these codes mandatory and, yes, to add potentially a small additional burden to these enormous tech companies to ensure that we protect our children.
The third challenge is that non-statutory guidance already looks as if it is causing problems in this space. On the video-sharing platform regime, which is non-mandatory, Ofcom has already said that in its first year of operation it has
“seen a large variation in platforms’ readiness to engage with Ofcom”.
All that will simply make it harder and harder, so the burden will lie on this regulator—which I think all of us in this House are already worried is being asked to do an awful lot—if we do not make it very clear what is mandatory and what is not. The Secretary of State said of the Bill that she is
“determined to put these vital protections for … children … into law as quickly as possible”.
A law that puts in place a non-mandatory code of conduct is not what parents across the country would expect from that statement from the Secretary of State. People out there—parents and grandparents across the land—would expect Ofcom to be setting some rules and companies to be required to follow them. That is exactly what we do in the physical world, and I do not understand why we would not want to do it in the digital world.
Finally—I apologise for having gone on for quite a long time—I will very briefly talk specifically to Amendment 32A, in the name of the noble Lord, Lord Knight, which is also in this group. It is a probing amendment which looks at how the Bill will address and require Ofcom and participants to have due regard to VPNs: the ability for our savvy children—I am the mother of two teenage girls—to get round all this by using a VPN to access the content they want. This is an important amendment and I am keen to hear what my noble friend the Minister will say in response. Last week, I spoke about my attempts to find out how easy it would be for my 17 year-old daughter to access pornography on her iPhone. I spoke about how I searched in the App Store on her phone and found that immediately a whole series of 17-plus-rated apps came up that were pornography sites. What I did not mention then is that with that—in fact, at the top of the list—came a whole series of VPN apps. Just in case my daughter was naive enough to think that she could just click through and watch it, and Apple was right that 17 year-olds were allowed to watch pornography, which obviously they are not, the App Store was also offering her an easy route to access it through a VPN. That is not about content but functionality, and we need to properly understand why this bundle of amendments is so important.
My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. A few initial reflections: these machines are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven by children and used by children—so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.
A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords. It is a book by the name of Risk written in 1995 by Professor John Adams, then professor of geography at University College London. He is still an emeritus professor of geography there. It was a most interesting work on risk. First, it reflected how little we actually know about many of the things whose risk we are trying to assess.
More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.
The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.
Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.
There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—
My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, I would say that it is incumbent on us all, and germane, to focus on the amendment in question and stay on it, to save time and get through the business.
Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—
If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?
My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.
I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.
Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.
More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.
My Lords, I assure your Lordships that I rise to speak very briefly. I begin by reassuring my noble friend Lord Moylan that he is loved in this Chamber and outside. I was going to say that he is the grit in the oyster that ensures that a consensus does not establish itself and that we think hard about these amendments, but I will revise that and say he is now the bungee jumper in our ravine. I think he often makes excellent and worthwhile points about the scope and reach of the Bill and the unintended consequences. Indeed, we debated those when we debated the amendments relating to Wikipedia, for example.
Obviously, I support these amendments in principle. The other reason I wanted to speak was to wish the noble Baroness, Lady Kidron—Beeban—a happy birthday, because I know that these speeches will be recorded on parchment bound in vellum and presented to her, but also to thank her for all the work that she has done for many years now on the protection of children’s rights on the internet. It occurred to me, as my noble friend Lady Harding was speaking, that there were a number of points I wanted to seek clarity on, either from the Minister or from the proponents of the amendments.
First, the noble Baroness, Lady Harding, mentioned the age-appropriate design code, which was a victory for the noble Baroness, Lady Kidron. It has, I think, already had an impact on the way that some sites that are frequented by children are designed. I know, for instance, that TikTok—the noble Baroness will correct me—prides itself on having made some changes as a result of the design code; for example, its algorithms are able, to a certain extent, to detect whether a child is under 13. I know anecdotally that children under 13 sometimes do have their accounts taken away; I think that is a direct result of the amendments made by the age-appropriate design code.
I would like to understand how these amendments, and the issue of children’s rights in this Bill, will interact with the age-appropriate design code, because none of us wants the confetti of regulations that either overlap or, worse, contradict one another.
Secondly, I support the principle of functionality. I think it is a very important point that these amendments make: the Bill should not be focused solely on content but should take into account that functionality leads to dangerous content. That is an important principle on which platforms should be held to account.
Thirdly, going back to the point about the age-appropriate design code, the design of websites is extremely important and should be part of the regulatory system. Those are the points I wanted to make.
In relation to how my noble friend Lord Moylan is approaching the Bill, I would say this: having been a Minister when the British Government—and, indeed, other Governments—had no power at all, I found it very telling when the then Prime Minister threatened Google with legislation on the issue of child abuse images, saying, “If you do not do something, I will legislate”.
At that time, I was on the tech side of the argument. Google went from saying, “It is impossible to do anything” to identifying 130,000 phrases that people might type into search engines when searching for child abuse images, which, in theory—I have not tried this myself, I hasten to add—would come up with no return and, indeed, a warning that the person in question was searching for those images.
Again, I say to my noble friend Lord Moylan—who I encourage to keep going with his scepticism about the Bill; it is important—that it is a bit of a dead end at any point in his argument to compare us with China. That is genuinely comparing apples with oranges. When people were resisting regulation in this sphere, they would always say, “That’s what the Chinese want”. We have broadcasting regulation and other forms of health and safety regulation. It is not the mark of an autocratic or totalitarian state to have regulation; platforms need to be held to account. I simply ask the proponents of the amendments to make it clear as they proceed how this fits in with existing regulations, such as the age-appropriate design code.
My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.
I read these amendments, noted their use of “eliminate”—we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult but we are required as legislators to try to understand what each of us is trying to change, or how we are going to try to change the law.
I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?
Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.
I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.
I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.
Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is about these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.
I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.
Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.
I want to answer the point that amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment between many people in the Chamber whereby we and Parliament would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.
On the issue of the system versus the content, I am not sure that this is the exact moment but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament, but only the tech sector, has a say in what those unintended consequences are. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.
My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.
I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.
As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.
Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.
The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.
My Lords, I support this group of amendments. Last week, I was lucky—that is not necessarily the right word—to participate in a briefing organised by the noble Lord, Lord Russell of Liverpool, with the 5Rights Foundation on its recent research, which the noble Lord referred to. As the mother of a 13 year-old boy, I came away wondering why on earth you would not want to ensure safety by design for children.
I am aware from my work with disabled children that we know, as Ofcom knows from its own research, that children—or indeed anyone with a long-term health impact or a disability—are far more likely to encounter and suffer harm online. As I say, I struggle to see why you would not want to have safety by design.
This issue must be seen in the round. In that briefing we were taken through how quickly you could get from searching for something such as “slime” to extremely graphic pornographic content. As your Lordships can imagine, I went straight back to my 13 year-old son and said, “Do you know about slime and where have you seen it?” He said, “Yes, Mum, I’ve watched it on YouTube”. That echoes the point made by the noble Baroness, Lady Kidron—to whom I add my birthday wishes—that these issues have to be seen in the round because you do not just consume content; you can search on YouTube, shop on Google, search on Amazon and all the rest of it. I support this group of amendments.
I too wish my noble friend Lady Kidron a happy birthday.
I will speak to Amendment 261. Having sat through the Communications Committee’s inquiries on regulating the internet, it seemed to me that the real problem was the algorithms and the way they operated. We have heard that again and again throughout the course of the Bill. It is no good worrying just about the content, because we do not know what new services will be created by technology. This morning we heard on the radio from the Google AI expert, who said that we have no idea where AI will go or whether it will become cleverer than us; what we need to do is to keep an eye on it. In the Bill, we need to make sure that we are looking at the way technology is being developed and the possible harms it might create. I ask the Minister to include that in his future-proofing of the Bill, because, in the end, this is a very fast-moving world and ecosystem. We all know that what is present now in the digital world might well be completely changed within a few years, and we need to remain cognisant of that.
My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.
This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.
The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.
The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view about the level of risk that children should have to face. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.
As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.
I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.
However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are undertaking.
We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.
I am not going to go through all the amendments. I support all of them: ensuring functionalities for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.
The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.
This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business case and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.
I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability of researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.
My Lords, I too agree that this has been a really useful and interesting debate. It has featured many birthday greetings to the noble Baroness, Lady Kidron, in which I obviously join. The noble Lord, Lord Moylan, bounced into the debate, testing the elasticity of the focus of the group, and bounced out again. Like the noble Lord, Lord Clement-Jones, I was particularly struck by the speech from the noble Baroness, Lady Harding, on the non-mandatory nature of the codes. Her points about reducing Ofcom’s workload, and mandatory codes having precedent, were really significant and I look forward to the Minister’s response.
If I have understood it correctly, the codes will be generated by Ofcom, and the Secretary of State will then table them as statutory instruments—so they will be statutory, non-mandatory codes, but with statutory penalties. Trying to unravel that in my mind was a bit of a thing as I was sitting there. Undoubtedly, we are all looking forward to the Minister’s definition of harm, which he promised us at the previous meeting of the Committee.
I applaud the noble Lord, Lord Russell, for the excellent way in which he set out the issues in this grouping and—along with the Public Bill Office—for managing to table these important amendments. Due to the Bill’s complexity, it is an achievement to get the relatively simple issue of safety by design for children into amendments to Clause 10 on children’s risk assessment duties for user-to-user services; Clause 11 on the safety duties protecting children; and the reference to risk assessments in Clause 19 on record-keeping. There is a similar set of amendments applying to search; to the duties in Clause 36 on codes of practice duties; to Schedule 4 on the content of codes of practice; and to Clause 39 on the Secretary of State’s powers of direction. You can see how complicated the Bill is for those of us attempting to amend it.
What the noble Lord and his amendments try to do is simple enough. I listened carefully to the noble Baroness, Lady Fox, as always. The starting point is, when designing, to seek to eliminate harm. That is not to say that they will eliminate all potential harms to children, but the point of design is to seek to eliminate harms if you possibly can. It is important to be clear about that. Of course, it is not just the content but the systems that we have been talking about, and ensuring that the codes of practice that we are going to such lengths to legislate for are stuck to—that is the point made by the noble Baroness, Lady Harding—relieving Ofcom of the duty to assess all the alternative methods. We certainly support the noble Lord, Lord Russell, in his amendments. They reinforce that it is not just about the content; the algorithmic dissemination, in terms of volume and context, is really important, especially as algorithms are dynamic—they are constantly changing in response to the business models that underpin the user-to-user services that we are debating.
The business models want to motivate people to be engaged, regardless of safety in many ways. We have had discussion of the analogy of cars and planes from the noble Lord, Lord Allan. As I recall, in essence he said that in this space there are some things that you want to regulate like planes, to ensure that there are no accidents, and some where you trade off freedom and safety, as we do with the regulation of cars. In this case, it is a bit more like regulating for self-driving cars; in that context, you will design a lot more around trying to anticipate all the things that humans when driving will know instinctively, because they are more ethical individuals than you could ever programme an AI to be when driving a car. I offer that slight adjustment, and I hope that it helps the noble Lord, Lord Moylan, when he is thinking about trains, planes and automobiles.
In respect of the problem of business models prioritising engagement over safety, I had contact this weekend and last week from friends much younger than I am, who are users of Snap. I am told that there is an AI chatbot on Snap, which I am sure is about engaging people for longer and collecting more data so that you can engage them even longer and, potentially, collect data to drive advertising. But you can pay to get rid of that chatbot, which is the business model moving somewhere else as and when we make it harder for it to make money as it is. Snap previously had location sharing, which you had to turn off; it created various harms and risks for children, whose locations were being shared with other people without them necessarily authorising it. We can all see how that could create issues.
Does the noble Lord have any reflections, talking about Snap, as to how the internet has changed in our time? It was once really for adults, when it was on a PC and it was only adults who had access to it. There has, of course, been a huge explosion in child access to the internet because of the mobile phone—as we have heard, two-thirds of 10 year-olds now have a mobile phone—and an app such as Snap now has a completely different audience from the one it had five or 10 years ago. Does the noble Lord have any reflections on what the consequences of the explosion of children’s access to applications such as Snap has been on those thinking about the harms and protection of children?
I am grateful to the noble Lord. In many ways, I am reminded of the article I read in the New York Times this weekend and the interview with Geoffrey Hinton, the AI pioneer who has just left Google. He said that as companies improve their AI systems, they become increasingly dangerous. He said of AI technology:
“Look at how it was five years ago and how it is now. Take the difference and propagate it forwards. That’s scary”.
Yes, the huge success of the iPhone, of mobile phones and all of us, as parents, handing our more redundant iPhones on to our children, has meant that children have huge access. We have heard the stats in Committee around the numbers who are still in primary school and on social media, despite the terms and conditions of those platforms. That is precisely why we are here, trying to get things designed to be safe as far as is possible from the off, but recognising that it is dynamic and that we therefore need a regulator to keep an eye on the dynamic nature of these algorithms as they evolve, ensuring that they are safe by design as they are being engineered.
My noble friend Lord Stevenson has tabled Amendment 27, which looks at targeted advertising, especially that which requires data collection and profiling of children. In that, he has been grateful to Global Action Plan for its advice. While advertising is broadly out of scope of the Bill, apart from in respect of fraud, it is important for the Minister to reflect on the user experience for children. Whether it is paid or organic content, it is pertinent in terms of their safety as children and something we should all be mindful of. I say to the noble Lord, Lord Vaizey, that as I understand it, the age-appropriate design code does a fair amount in respect of the data privacy of children, but this is much more about preventing children from encountering the advertising in the first place, aside from the data protections that apply in the age-appropriate design code. But the authority is about to correct me.
Just to add to what the noble Lord has said, it is worth noting that we had a debate, on Amendment 92, about aligning the age-appropriate design code’s “likely to be accessed” test with the Bill—the very important issue that the noble Lord, Lord Vaizey, raised about alignment of these two regimes. I think we can say that these are kissing cousins, in that they take a by-design approach. The noble Lord is completely right that the scope of the Bill is much broader than data protection only, but they take the same approach.
I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.
Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. I understand, and he can confirm whether my understanding is correct, that the duties on the platforms to be safe apply regardless of whether a VPN has been used to access the systems and the content. The platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used, one would suppose, in order to ensure that children are being protected and that a user genuinely is a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to be able to detect whether someone is landing on their site via a VPN or otherwise? In my mind, the anecdote that the noble Baroness, Lady Harding, related, about how Apple’s App Store algorithm pushed VPN apps alongside searches for pornography, reinforces the need for app stores to be brought into scope, so that we can get some of that age filtering at that distribution point, rather than just relying on the platforms.
Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.
My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.
My birthday is in October, so I hope not.
Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.
The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.
I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.
Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.
It is part of the philosophical ruminations that we have had, but the point here is that elimination is not possible through design, through any drafting of legislation or through the work that flows from it. I will come on to talk a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.
Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as
“anything communicated by means of an internet service”.
Under this definition, in essence, all communication and activity is facilitated by content.
I hope that the Minister has in his brief a response to the noble Baroness’s point about Clause 11(14), which, I must admit, comes across as extraordinary in this context. She quoted it, saying:
“The duties set out … are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.
Is not that exception absolutely at the core of what we are talking about today? It is surely therefore very difficult for the Minister to say that this applies in a very broad way, rather than purely to content.
I will come on to talk a bit about dissemination as well. If the noble Lord will allow me, he can intervene later on if I have not done that to his satisfaction.
I was about to talk about the child safety duties in Clause 11(5), which also specifies that they apply to the way that a service is designed, how it operates and how it is used, as well as to the content facilitated by it. The definition of content makes it clear that providers are responsible for mitigating harm in relation to all communications and activity on their service. Removing the reference to content would make service providers responsible for all risk of harm to children arising from the general operation of their service. That could, for instance, bring into scope external advertising campaigns, carried out by the service to promote its website, which could cause harm. This and other elements of a service’s operations are already regulated by other legislation.
I apologise for interrupting. Is that the case, and could that not be dealt with by defining harm in the way that it is intended, rather than as harm from any source whatever? It feels like a big leap that, if you take out “content”, instead of it meaning the scope of the service in its functionality and content and all the things that we have talked about for the last hour and a half, the suggestion is that it is unworkable because harm suddenly means everything. I am not sure that that is the case. Even if it is, one could find a definition of harm that would make it not the case.
Taking it out in the way that the amendment suggests throws up that risk. I am sure that it is not the intention of the noble Lord or the noble Baroness in putting it forward, but that is a risk of the drafting, which requires some further thought.
Clause 11(2), which is the focus of Amendments 32, 85 and 295, already means that platforms have to take robust action against content which is harmful because of the manner of its dissemination. However, it would not be feasible for providers to fulfil their duties in relation to every instance of content which is harmful only by the manner of its dissemination. Such content may not meet the definition of content which is harmful to children in isolation, but may be harmful when targeted at children in a particular way. One example could be content discussing a mental health condition such as depression, where recommendations are made repeatedly or in an amplified manner through the use of algorithms. The nature of that content per se may not be inherently harmful to every child who encounters it but, when aggregated, it may become harmful to a child who is sent it many times over. That, of course, must be addressed, and is covered by the Bill.
The Bill requires providers to specifically consider as part of their risk assessments how algorithms could affect children’s exposure to illegal content and content which is harmful to children on their service. Service providers will need specifically to consider the harm from content that arises from the manner of dissemination—for example, content repeatedly sent to someone by a person or persons, which is covered in Clause 205(3)(c). Providers will also need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet their illegal content and child safety duties. Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties. That includes the power to require information from providers about the operation of their algorithms.
Can the Minister assure us that he will take another look at this between Committee and Report? He has almost made the case for this wording to be taken out—he said that it is already covered by a whole number of different clauses in the Bill—but it is still here. There is still an exception which, if the Minister is correct, is highly misleading: it means that you have to go searching all over the Bill to find a way of attacking the algorithm, essentially, and the way that it amplifies, disseminates and so on. That is what we are trying to get to: how to address the very important issue not just of content but of the way that the algorithm operates in social media. This seems to be highly misleading, in the light of what the Minister said.
I do not think so, but I will certainly look at it again, and I am very happy to speak to the noble Lord as I do. My point is that it would not be workable or proportionate for a provider to prevent or protect all children from encountering every single instance of the sort of content that I have just outlined, which would be the effect of these amendments. I will happily discuss that with the noble Lord and others between now and Report.
Amendment 27, by the noble Lord, Lord Stevenson, seeks to add a duty to prevent children encountering targeted paid-for advertising. As he knows, the Bill has been designed to tackle harm facilitated through user-generated content. Some advertising, including paid-for posts by influencers, will therefore fall under the scope of the Bill. Companies will need to ensure that systems for targeting such advertising content to children, such as the use of algorithms, protect them from harmful material. Fully addressing the challenges of paid-for advertising is a wider task than is possible through the Bill alone. The Bill is designed to reduce harm on services which host user-generated content, whereas online advertising poses a different set of problems, with different actors. The Government are taking forward work in this area through the online advertising programme, which will consider the full range of actors and sector-appropriate solutions to those problems.
I understand the Minister’s response, and I accept that there is a parallel stream of work that may well address this. However, we have been waiting for the report from the group that has been looking at that for some time. Rumours—which I never listen to—say that it has been ready for some time. Can the Minister give us a timescale?
I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.
Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.
Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.
Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.
Is this technically possible?
Technical possibility is a matter for the sector—
I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?
This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.
The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, providers would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.
Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.
Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.
As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.
I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.
My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—
I do not think that it will. We have provided further resource for Ofcom to take on the work that this Bill will give it; it has been very happy to engage with noble Lords to talk through how it intends to go about that work and, I am sure, would be happy to follow up on that point with my noble friend to offer her some reassurance.
Responding to the point from my noble friend Lord Vaizey, the Bill is part of the UK’s overall digital regulatory landscape, which will deliver protections for children alongside the data protection requirements for children set out in the Information Commissioner’s age-appropriate design code. Ofcom has strong existing relationships with other bodies in the regulatory sphere, including through the Digital Regulation Co-operation Forum. The Information Commissioner has been added to this Bill as a statutory consultee for Ofcom’s draft codes of practice and relevant pieces of guidance formally to provide for the ICO’s input into its areas of expertise, especially relating to privacy.
Amendment 138 from the noble Lord, Lord Russell of Liverpool, would amend the criteria for non-designated content which is harmful to children to bring into scope content whose risk of harm derives from its potential financial impact. The Bill already requires platforms to take measures to protect all users, including children, from financial crime online. All companies in scope of the Bill will need to design and operate their services to reduce the risk of users encountering content amounting to a fraud offence, as set out in the list of priority offences in Schedule 7. This amendment would expand the scope of the Bill to include broader commercial harms. These are dealt with by a separate legal framework, including the Consumer Protection from Unfair Trading Regulations. This amendment therefore risks creating regulatory overlap, which would cause confusion for business while not providing additional protections to consumers and internet users.
Amendment 261 in the name of the right reverend Prelate the Bishop of Oxford seeks to modify the existing requirements for the Secretary of State’s review into the effectiveness of the regulatory framework. The purpose of the amendment is to ensure that all aspects of a regulated service are taken into account when considering the risk of harm to users and not just content.
As we have discussed already, the Bill defines “content” very broadly and companies must look at every aspect of how their service facilitates harm associated with the spread of content. Furthermore, the review clause makes explicit reference to the systems and processes which regulated services use, so the review can already cover harm associated with, for example, the design of services.
Amendments 291, 292, and 293 seek to ensure that companies’ child safety duties apply to a broader range of functionalities which can facilitate harm online. The current list of functionalities in the Bill is not exhaustive. Services will therefore need to assess the risk from any feature or functionality of their service which enables user interaction and could cause harm to users.
The points raised in these amendments are covered already in the Bill in the places I have set out. I will consult the official record of this debate to see whether there are any areas which I have not followed up, but I invite noble Lords not to press their amendments in this group.
My Lords, I thank the Minister for his response. I think the entire Chamber will be thankful that I do not intend to respond in any great detail to almost one hour and three-quarters of debate on this series of amendments—I will just make a few points and suggestions.
The point that the noble Baroness made at the beginning about understanding the design and architecture of the systems and processes is fundamental, both for understanding why they are causing the sorts of harm that they are at the moment and for trying to ensure that they are designed better in future than they have been to date. Clearly, they are seriously remiss in the harms that they are inflicting on a generation of young people.
On the point made by the noble Baroness, Lady Harding, about trying to make Ofcom’s job easier—I can see the noble Lord, Lord Grade, in the corner—I would hope and anticipate that anything we could suggest that would lead the Government to make Ofcom’s job slightly easier and clearer would be very welcome. The noble Lord appears to be making an affirmatory gesture, so I will take that as a yes.
I say to the noble Lord, Lord Moylan, that I fully understand the importance of waving the flag of liberty and free speech, and I acknowledge its importance. I also acknowledge the always-incipient danger of unintentionally preventing things from happening that can and should happen when you are trying to make things safer and prevent harm. Trying to get the right balance is extraordinarily difficult, but I applaud the noble Lord for standing up and saying what he said. If one were to judge the balance of the contributions here as a very rough opinion poll, the noble Lord might find himself in the minority, but that does not necessarily mean that he is wrong, so I would encourage him to keep contributing.
I sympathise with the noble Baroness, Lady Fox, in trying to find the right balance; it is something that we are all struggling to do. One of the great privileges we have in this House is that we have the time to do it in a manner which is actively discouraged in the other place. Even if we go on a bit, we are talking about matters which are very important—in particular, the pre-legislative scrutiny committee was able to cover them in greater detail than the House of Commons was able to do.
The noble Lord, Lord Clement-Jones, was right. In the same way as they say, “Follow the money”, in this case it is “follow the algorithms”, because it is the algorithms which drive the business model.
On the points made by the noble Lord, Lord Knight, regarding the New York Times article about Geoffrey Hinton, one of the architects of AI at Google, I would recommend that all your Lordships read it to see somebody who has been at the forefront of developing artificial intelligence. Rather like a character in a novel suddenly aghast at what he has created—Frankenstein comes to mind—he gives one pause for thought. Even as we are talking about these things, AI is racing ahead like a greyhound in pursuit of a very fast rabbit, and there is no way that we will be able to catch up.
While I thank the Minister for his reply, I think there is a real point here about the fact that this Bill is very complex to follow and understand. When we debated some of the amendments last week, the noble Baroness, Lady Harding, spoke about the train journey on which she tried to interrogate and interpret the different parts of the Bill, following the trail to understand what was going on; she became so involved that she missed her station. Indeed, the way in which the Minister had to point to all the different points of the compass—so to speak—both within the Bill and outside it in many of the answers that he gave to some of the amendments indicates to me that the Bill team is finding it challenging to respond to some of them. It is like filling in one of those diagrams where you join the dots: you cannot quite see what it is until you have nearly finished. I find it slightly disturbing if the Bill team and some of the officials appear to be having a challenging time trying to interpret, understand and explain some of the points we are raising; I would hope and expect that that could be done much more simply.
One of the pleas from all of us in a whole variety of these amendments is to get the balance right between legislating for what we want to legislate and making it simple enough to be understandable. At the moment, a criticism of this Bill is that it is extraordinarily difficult to understand in many parts. I will not go through all the points, but there are some germane areas where it would be extremely helpful to pursue with the Minister and the Bill team some of the points we are trying to make. Many of them are raised by a variety of outside bodies which know infinitely more about this than I do, and which have genuine concerns. We have the time between Committee and Report to put some of those to bed or at least to understand them better than we do at the moment. We will probably be happy and satisfied with some of the responses that we receive from the department once we feel that we understand them and, perhaps more importantly, once we feel that the department and the Bill team themselves fully understand them. It is fair to say that at the moment we are not completely comfortable that they do. I do not blame the Minister for that. If I were in his shoes, I would be on a very long holiday and I would not be returning any time soon. However, we will request meetings—one meeting would be too much, so we will try to break this into bite-sized units and dig into the detail in a manageable way, without taking too much time, to make sure that we understand each other.
With that, I beg leave to withdraw the amendment.
Amendment 23 withdrawn.
Amendment 24 not moved.