Amendment 20

Online Safety Bill - Committee (3rd Day) (Continued) – in the House of Lords at 5:00 pm on 27 April 2023.


Moved by Baroness Kidron

20: Clause 10, page 9, line 11, leave out paragraphs (a) to (h) and insert—

“(a) the level of risk that children who are users of the service encounter the harms as outlined in Schedule (Online harms to children) by means of the service;

(b) any of the level of risks to children encountered singularly or in combination, having regard to—

(i) the design of functionalities, algorithms and other features that present or increase risk of harm, such as low-privacy profile settings by default;
(ii) the business model, revenue model, governance, terms of service and other systems and processes or mitigation measures that may reduce or increase the risk of harm;
(iii) risks which can build up over time;
(iv) the ways in which level of risks can change when experienced in combination with others;
(v) the level of risk of harm to children in different age groups;
(vi) the level of risk of harm to children with certain characteristics or who are members of certain groups; and
(vii) the different ways in which the service is used, including but not limited to via virtual and augmented reality technologies, and the impact of such use on the level of risk of harm that might be suffered by children;

(c) whether the service has shown regard to the rights of children as set out in the United Nations Convention on the Rights of the Child (see general comment 25 on children’s rights in relation to the digital environment).”

Member’s explanatory statement

This amendment would require providers to look at and assess risks on their platform in the round and in line with the 4 Cs of online risks to children (content, contact, conduct and contractual/commercial risks). Although these risks will not be presented on every service, this amendment requires providers to reflect on these risks, so they are not forgotten and can be built into future development of the service.

Baroness Kidron (Crossbench)

My Lords, this amendment and Amendments 74, 93 and 123 are part of a larger group that have been submitted as a package loosely referred to as the AV and harms package. They have been the subject of much private debate with the Government, for which we are grateful, and among parliamentarians, and have featured prominently in the media. The amendments are in my name and those of the noble Lord, Lord Bethell, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, but enjoy the support of a vast array of Members of both Houses. I thank all those who have voiced their support.

The full package of amendments defines and sets out the rules of the road for age assurance, including the timing of its introduction, and the definition of terms such as age verification and age assurance. They introduce the concept of measuring the efficacy of systems with one eye on the future so that we as parliamentarians can indicate where and when we feel that proportionality is appropriate and where it is simply not—for example, in relation to pornography. In parallel, we have developed a schedule of harms, which garners rather fewer column inches but is equally important in establishing Parliament’s intention. It is that schedule of harms that is up for debate today.

Before I lay out the amendment, I thank the 26 children’s charities which have so firmly got behind this package and acknowledge, in particular, Barnardo’s, CEASE and 5Rights, of which I am chair, which have worked tirelessly to ensure that the full expertise of children’s charities has been embedded in these amendments. I also pay tribute to the noble Baroness, Lady Benjamin, who in this area of policy has shown us all the way.

The key amendment in this group is Amendment 93, which would place a schedule of harms to children in the Bill. There are several reasons for doing so, the primary one being that by putting them in the Bill we are stating the intention of Parliament, which gives clarity to companies and underlines the authority of Ofcom to act on these matters. Amendments 20, 74 and 123 ensure that the schedule is mirrored in risk assessments and task Ofcom with updating its guidance every six months to capture new and emerging harms, and as such are self-evident.

The proposed harms schedule is centred around the four Cs, a widely used and understood taxonomy of harm used in legislation and regulation around the globe. Importantly, rather than articulate individual harms that may change over time, it sets its sights on categories of harm: content, contact, conduct and contract, which is sometimes referred to as commercial harm. It also accounts for cumulative harms, where two or more risk factors create a harm that is greater than any single harm or is uniquely created by the combination. The Government’s argument against the four Cs is that they are not future-proof, which I find curious since the very structure of the four Cs is to introduce broad categories of harm to which harms can be added, particularly emerging harms. By contrast, the Government are adding an ever-growing list of individual harms.

I wish to make three points in favour of our package of amendments relating first to language, secondly to the nature of the digital world, and finally to clarity of purpose. It is a great weakness of the Bill that it consistently introduces new concepts and language—for example, the terms “primary priority content”, “priority content” and “non-designated content”. These are not terms used in other similar Bills across the globe, they are not evident in current UK law and they do not correlate with established regimes, such as equalities legislation or children’s rights under the convention, more of which in group 7.

The question of language is non-trivial. It is the central concern of those who fight CSAE around the world, who frequently find that enforcement against perpetrators or takedown is blocked by legal systems that define child sexual abuse material differently—not differently in some theoretical sense but because the same image can be categorised differently in two countries and then be a barrier to enforcement across jurisdictions. Leadership from WeProtect, the enforcement community and representatives that I recently met from Africa, South America and Asia have all made this point. It undermines the concept of UK leadership in child protection that we are wilfully and deliberately rejecting accepted language which is embedded in treaties, international agreements and multilateral organisations to start again with our own, very likely with the same confused outcome.

Secondly, I am concerned that while both the Bill and the digital world are predicated on system design, the harms are all articulated as content with insufficient emphasis on systems harms, such as careless recommendations, spreading engagement and the sector-wide focus on maximising engagement, which are the very things that create the toxic and dangerous environment for children. I know, because we have discussed it, that the Minister will say that this is all in the risk assessment, but the risk assessment asks regulated companies to assess how a number of features contribute to harm, mostly expressed as content harm.

What goes through my mind is the spectre of Meta’s legal team, which I watched for several days during Molly Russell’s inquest; they stood in a court of law and insisted that hundreds, in fact thousands, of images of cut bodies and depressive messages did not constitute harm. Rather, they regarded them as cries for help or below the bar of harm as they interpreted it. Similarly, there was material that featured videos of people jumping off buildings—some of them sped-up versions of movie clips edited to suggest that jumping was freedom—and I can imagine a similar argument that says that kind of material cannot be considered harmful, because in another context it is completely legitimate. Yet this material was sent to Molly at scale.

It is not good enough to characterise harms simply by establishing what is or is not harmful content. The previous debate really underlined that it takes a long time and it is very complicated to see what is harmful. But we must make utterly clear that the drip feed of nudges, enticements and recommendations and the creation of a toxic environment, overwhelming a child of 14 with more than 1,400 messages, whether they meet that bar of harmful content or not, is in itself a harm. A jukebox of content harms is not future-proof, and it fails to name the risks of the system. It is to misunderstand where the power of digital design actually lies.

Finally, there is the question of simplicity and clarity. As we discussed on the first day of Committee, business wants clarity, campaigners want clarity, parents want clarity, and Ofcom could do with some clarity. If not the four Cs, my challenge to the Government is to deliver a schedule that has the clarity and simplicity of the amendments in front of us, in which harm is defined by category not by individual content measurements, so that it is flexible now and into the future, and foregrounds the specific role of the system design not only as an accomplice to the named harm but as a harm itself. I beg to move.

Baroness Ritchie of Downpatrick (Non-affiliated) 5:15, 27 April 2023

My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I have listened intently today, and there is no doubt that this Bill not only presents many challenges but throws up the complexity of the whole situation. I think it was the noble Lord, Lord Kamall, in an earlier group who raised the issues of security, safety and freedom. I would add the issue of rights, because we are trying to balance all these issues and characterise them in statute, vis-à-vis the Bill.

On Tuesday, we spoke about one specific harm—pornography—on the group of amendments that I had brought forward. But I made clear at that time that I believe this is not the only harm, and I fully support the principles of the amendments from the noble Baroness, Lady Kidron. I would obviously like to get some clarity from her on the amendments, particularly as to how they relate to other clauses in the Bill.

The noble Baroness has been the pioneer in this field, and her expertise is well recognised across the House. I believe that these amendments really take us to the heart of the Bill and what we are trying to achieve—namely, to identify online harms to children, counteract them and provide a level of safety to young people.

As the noble Lord, Lord Clement-Jones, said on Tuesday,

“there is absolutely no doubt that across the Committee we all have the same intent; how we get there is the issue between us”.—[Official Report, 25/4/23; col. 1196.]

There is actually not that much between us. I fully agree with the principle of putting some of the known harms to children in the Bill. If we know the harms, there is little point in waiting for them to be defined in secondary legislation under Clause 54.

It is clear to me that there are harms to children that we know about, and those harms will not change. It would be best to name those harms clearly in the Bill when it leaves this House. That would allow content providers, search engines and websites in scope of the Bill to prepare to make any changes they need to keep children safe. Perhaps the Minister could comment on that aspect. We also know that parents will expect some harms to be in the Bill. The noble Baroness, Lady Kidron, laid out what they are, and I agree with her analysis. These issues are known and we should not wait for them to be named.

While known harms should be placed into the Bill, I know, understand and appreciate that the Government are concerned about future-proofing. However, I am of the view that a short list of key topics will not undermine that principle. Indeed, the Joint Committee’s report on the draft Bill stated,

“we recommend that key, known risks of harm to children are set out on the face of the Bill”.

In its report on the Bill, the DCMS Select Committee in the other place agreed, saying

“that age-inappropriate or otherwise inherently harmful content and activity (like pornography, violent material, gambling and content that promotes or is instructive in eating disorders, self-harm and suicide) should appear on the face of the Bill”.

Has there been any further progress in discussions on those issues?

At the beginning of the year, the Children’s Commissioner urged Parliamentarians

“to define pornography as a harm to children on the face of the … Bill, such that the regulator, Ofcom, may implement regulation of platforms hosting adult content as soon as possible following the passage of the Bill”.

I fully agree with the Children’s Commissioner. While the ways in which pornographic content is delivered will change over time, the fact that pornography is harmful to children will not change. With the speed of technology—something that the noble Lord, Lord Allan of Hallam, knows a lot more about than the rest of us, having worked in this field—it will undoubtedly change, and we will be presented with new types of challenges.

I therefore urge the Government to support the principle that the key risks are in the Bill, and I thank the noble Baroness, Lady Kidron, for raising this important principle. However, I hope she will indulge me as I seek to probe some of the detail of her amendments and their interactions with the architecture of other parts of the Bill. As I said when speaking to Clause 49 on Tuesday, the devil is obviously in the detail.

First, Clause 54 defines what constitutes

“Content that is harmful to children”,

and Clause 205 defines harm, and Amendment 93 proposes an additional new list of harms. As I have already said, I fully support the principle of harms being in the Bill, but I raise a question for the noble Baroness. How does she see these three definitions working together? That might refer back to a preliminary discussion that we had in the tearoom earlier.

These definitions of harms are in addition to the content to be defined as primary priority content and priority content. Duties in Clauses 11 and 25 continue to refer to these two types of content for Part 3 services, but Amendments 20 and 74 would remove the need for risk assessments in Clauses 10 and 24 to address these two types of content. It seems that the amendments could create a tension in the Bill, and I am interested to ascertain how the noble Baroness, Lady Kidron, foresees that tension operating. Maybe she could give us some detail in her wind-up about that issue. An explanation of that point may bring some clarity to understanding how the new schedule that the noble Baroness proposes will work alongside the primary priority content and the priority content lists. Will the schedule complement primary priority content, or will it be an alternative?

Secondly, as I said, some harms are known but there are harms that are as yet unknown. Will the noble Baroness, Lady Kidron, consider a function to add to the list of content in her Amendment 93, in advance of us coming back on Report? There is no doubt that the online space is rapidly changing, as this debate has highlighted. I can foresee a time when other examples of harm should be added to the Bill. I accept that the drafting is clear that the list is not exclusive, but it is intended to be a significant guide to what matters to the public and Parliament. I also accept that Ofcom can provide guidance on other content under Amendment 123, but, without a regulatory power added to Amendment 93, it feels that we are perhaps missing a belt-and-braces approach to online harms to children. After all, our principal purpose here is to protect children from online harm.

I commend the noble Baroness, Lady Kidron, on putting these important amendments before the Committee, and I fully support the principle of what she seeks to achieve. But I hope that, on further reflection, she will look at the points I have suggested. Perhaps she might suggest other ideas in her wind-up, and we could have further discussions in advance of Report. I also look forward to the Minister’s comments on these issues.

The Bishop of Oxford

My Lords, I support Amendments 20, 93 and 123, in my name and those of the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Stevenson. I also support Amendment 74 in the name of the noble Baroness, Lady Kidron. I pay tribute to the courage of all noble Lords and their teams, and of the Minister and the Bill team, for their work on this part of the Bill. This work involves the courage to dare to look at some very difficult material that, sadly, shapes the everyday life of too many young people. This group of amendments is part of a package of measures to strengthen the protections for children in the Bill by introducing a new schedule of harms to children and plugging a chronological gap between Part 3 and Part 5 services, on when protection from pornography comes into effect.

Every so often in these debates, we have been reminded of the connection with real lives and people. Yesterday evening, I spent some time speaking on the telephone with Amanda and Stuart Stephens, the mum and dad of Olly Stephens, who lived in Reading, which is part of the diocese of Oxford. Noble Lords will remember that Olly was tragically murdered, aged 13, in a park near his home, by teenagers of a similar age. Social media played a significant part in the investigation and in the lives of Olly and his friends—specifically, social media posts normalising knife crime and violence, with such a deeply tragic outcome.

Last year in June, “Panorama” dared to look into this world. The programme revealed the depth and extent of the normalisation of knives and knife crime in posts offered to young people. I was struck by the comments of Frances Haugen, filmed when she met Stuart and Amanda. She said that each of us sees social media through a pinhole: a tiny snapshot of the total content. We have no idea how much darkness and evil are shaping children and young people, destroying their sense of proportion and deeply affecting offline behaviour. The only group that has the whole picture, of course, is the companies themselves.

The noble Baroness, Lady Kidron, and others have outlined the remarkable degree of support for this raft of amendments from charities working to protect children. We should listen. These amendments will ensure a much wider definition of “harm” and will again future-proof the Bill in terms of technology which is even now coming over the horizon.

The Center for Countering Digital Hate speaks about an arms race to devise ever more effective ways of keeping users’ attention, even if it means putting them at risk. Its researchers set up new accounts in the United States, United Kingdom, Canada and Australia at the minimum age TikTok allows: 13 years old. Those accounts paused briefly on videos about body image and mental health and liked them. What the researchers found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content relating to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens. CCDH researchers found a community for eating disorder content on the platform amassing 13.2 billion views across 56 hashtags, often designed to evade moderation.

As the noble Baroness, Lady Kidron, said, this fourfold classification of harms to children is being adopted elsewhere in the world, including in the European Union. The schedule in the amendment gives clear but non-exhaustive examples to guide service providers on the meaning of each of the four Cs. It is vital to have more comprehensive agreed definitions of harm in the Bill.

I will reflect for a moment on what each of the four Cs means. Content harms are the most familiar. At the moment, children who go online are likely to encounter age-inappropriate content, including violent, gory and graphic communication, hate speech, terrorism, online prostitution, drugs, eating disorders and self-harm. Research also shows that exposure to different types of harmful content is interrelated: so, if a child reports seeing one type of disturbing content, it is likely that they have seen others as well.

Secondly, contact harms encourage harmful actions in the non-virtual world. A 10 year-old girl was left with burns after spraying an aerosol deodorant with the nozzle right up against her skin to create a freezing sensation. Jane Platt’s daughter Sarah, aged 15, was rushed to hospital in February 2020 after doing the “skull-breaker challenge”, which involves two people kicking the legs from under a third, making them fall over. These suggestions could never be offered in young people’s magazines or broadcast media.

Thirdly, there are conduct harms. In a global survey, 54% of young people—57% of girls and 48% of boys—reported having experienced online sexual harms before they were 18 years old, including within interaction with adults and being asked something sexually explicit or being sent sexually explicit content.

Finally, there are commercial harms. Over half of the games on Google Play now include loot boxes and more than 93% of games that feature loot boxes are marked suitable for children aged 12 years-plus.

As the noble Baroness, Lady Kidron, and others have argued, these harms are often cumulative and interrelated. The social media companies are the only ones not looking through a keyhole but monitoring social media in the round and able to assess what is happening, but evidence suggests that they will not do so until compelled by legislation. These amendments are a vital step forward in fulfilling the Bill’s purpose of providing additional protection from harm for children. I urge the Government to adopt them.

Baroness Fox of Buckley (Non-affiliated) 5:30, 27 April 2023

My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.

Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the

“widely understood and used 4 Cs of online risk to children”.

They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.

I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four year-olds, 14 year-olds and 18 year-olds as if they were the same. I am glad that that is there, and I hope that we will look at it again in future.

I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an expanding concept of what is considered harmful and of what psychological harm really means.

As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds. “When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well gain if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.

I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?

The other thing is that Amendment 93 says:

“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.

As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—

Baroness Kidron (Crossbench)

The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.

Baroness Fox of Buckley (Non-affiliated)

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.

In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 is posing this programme with naked adults and children as educational by saying that it is introducing children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.

The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominately young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online at lot, because much of this is discussed, advertised or promoted online.

This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.

The amendment states that the Bill should target any platform that posts

“links to, or … encourages child users to seek” out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.

Lord Russell of Liverpool (Deputy Chairman of Committees)

To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.

Baroness Fox of Buckley (Non-affiliated) 5:45, 27 April 2023

I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—

Baroness Kidron (Crossbench)

I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.

Baroness Fox of Buckley (Non-affiliated)

I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.

Baroness Kidron (Crossbench)

Let us leave it there.

Baroness Benjamin (Liberal Democrat)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.”—[Official Report, 19/4/23; cols. 274-75.]

This is excellent and I thank the Government for saying it. But the full range of harms and risk to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

Lord Knight of Weymouth (Labour)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. I think it would be helpful if the Committee could understand that there may be ways that the Bill already deals with some of the issues so wonderfully raised by the noble Baroness; it would be helpful if we can flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, rather less so here; and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity on that.

Lord Bethell (Conservative)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford, and the noble Lord, Lord Stevenson, and the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on why we have brought forward these amendments in order to plug these gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the degree of that inflection point in the last few years has been understated, as has how much further it has got to go. The penetration of mobile phones is already around 75% of 10 year-olds—it is getting younger, and it is getting broader.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers—past parents and regulation to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and what we understand from the internet. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an 11 year-old persona online who it called Bailey. Bailey’s online persona clearly shows that she is an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

The Bill does try to address contact harms. I am supportive of the Bill and its principles, and am a big fan of the team that is trying to drive it through Parliament. For example, the Bill specifies that features that enable adults to search for and contact children must be risk-assessed. However, that is where the Bill currently stops. There is no comprehensive list of harms or features to inform that risk assessment. For instance, a feature such as live-streaming, which can enable adults to access children directly, is not specifically referenced, meaning that there is no explicit obligation for services to risk-assess such features—a big gap. Services cannot be expected to read between the lines of what the Government’s intentions may be. We must be explicit and clear in the Bill if we are serious about delivering on children’s safety.

That is why our amendments would do four things. First, they would introduce into the Bill a new schedule of harms to children, framed around the four categories of risk: content, contact, conduct and contract or commercial. This will ensure that harms to children in the Bill reflect the full range of harms that children encounter online, including from design features that facilitate pathways to harm to pornography, self-harm, pro-suicide content and grooming. Secondly, we seek to ensure that services must risk-assess for the harms listed in this proposed new schedule in order to ensure that these risk assessments are comprehensive. Thirdly, we seek to task Ofcom with producing guidance, to be updated every 12 months, on new and emerging harms. Fourthly, we seek to ensure that Ofcom consults with children’s advocates and charities when producing this guidance.

The timing of this is very important. The real-life Bailey cannot wait for harms to be outlined in secondary legislation. These problems have been around for a long time but, as I said, the inflection curve on technology is shooting right up the hockey stick at the moment. If the primary purpose of the Bill is to protect children, it must include the harms that children face every day in primary legislation—not in secondary legislation, not after Royal Assent, not in a year or two, but now. Our Amendment 93 would introduce in the Bill a schedule of harms to children. This non-exhaustive list would produce a robust and future-proof framework, and have the clarity and mandate to produce comprehensive risk assessments to ensure that the Bill delivers on its chief purpose: protecting children.

Baroness Harding of Winscombe (Conservative) 6:00, 27 April 2023

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what harms are genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, is clear that it can be updated, developed and, as my noble friend Lord Bethell, said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson for Health)

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references to psychological harm, some of which have been removed, but those that remain are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service, then we can look at Schedule 7, where it is prohibited from assisting suicide, but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

As well as platforms, if ordinary people want to find out what is happening, then, just like those platforms with the terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often with very tragic circumstances, and asking what Parliament has done. Again, if they have the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to explain, if not in this way, that they have a commitment to ensuring that we have that clarity for everybody involved in this process.

Lord Stevenson of Balmacara (Shadow Spokesperson for Science, Innovation and Technology) 6:15, 27 April 2023

My Lords, over the last few hours I have praised us for having developed a style of discussion and debate that is certainly relatively new and not often seen in the House, where we have tried to reach out to each other and find common ground. That was not a problem in this last group of just over an hour; I think we are united around the themes that were so brilliantly introduced in a very concise and well-balanced speech by the noble Baroness, Lady Kidron, who has been a leading and inspirational force behind this activity for so long.

Although different voices have come in at different times and asked questions that still need to be answered, I sense that we have reached a point in our thinking, if not in our actual debates, where we need a plan. I reached this point too; that was exactly my motivation in tabling Amendment 1, which was discussed on the first day. Fine as the Bill is—it is a very impressive piece of work in every way—it lacks what we need as a Parliament to convince others that we have understood the issues and have answers to their questions about what this Government, or this country as a whole, are going to do about this tsunami of difference in the way we do our business and live our lives these days, which has arrived in the wake of the social media companies and search engines. There is consensus, but it is slightly different from the consensus we had in earlier debates, where we were reassuring ourselves about the issues rather than asking the Government to change anything, happy that we were speaking the same language and that they were gradually coming to the same place as us as a group.

Just before we came back in after the lunch break, I happened to talk to the noble Lord, Lord Grade, who is the chair of Ofcom and is listening to most of our debates and discussions when his other duties allow. I asked him what he thought about it, and he said that it was fascinating for him to recognise the level of expertise and knowledge that was growing up in the House, and that it would be a useful resource for Ofcom in the future. He was very impressed by the way in which everyone was engaging and not getting stuck in the niceties of the legislation, which he admitted he was experiencing himself. I say that softly; I do not want to embarrass him in any way because he is an honourable man. However, the point he makes is really important.

I say to the Minister that I do not think we are very far apart on this. He knows that, because we have discussed it at some length over the last six to eight weeks. What I think he should take away from this debate is that a decision now has to be taken about whether the Government will go with the consensus view expressed here and deliberately put into the Bill a statement, repetitive perhaps but clear and unambiguous, of the intention behind bringing the Bill forward and behind the Opposition and other Members of this House supporting it: that we want a safe internet for our children. The way we are going to do that is by having in place, up front and clearly in one place, the things that matter when the regulatory structure sits in place and has to deal with the world as it is: companies with business plans and business models that are at variance with what we think should be happening, and that we know are destroying the lives of the people we love and the future of our country—our children—in a way that is quite unacceptable when you analyse it down to its last detail.

It is not a question of saying back to us across the Dispatch Box—I know he wants to but I hope he will not—“Everything that you have said is in the Bill; we don’t need to go down this route, we don’t need another piece of writing that says it all”. I want him to forget that and say that actually it will be worth it, because we will have written something very special for the world to look at and admire. It is probably not in its perfect form yet, but that is what the Government can do: take a rough and ready potential diamond, polish it, chamfer it, and bring it back and set it in a diadem we would all be proud to wear—Coronations excepted—so that we can say, “Look, we have done the dirty work here. We’ve been right down to the bottom and thought about it. We’ve looked at stuff that we never thought in our lives we would ever want to see and survived”.

I shake at some of the material we were shown that Molly Russell was looking at. But I never want to be in a situation where I will have to say to my children and grandchildren, “We had the chance to get this right and we relied on a wonderful piece of work called the Online Safety Act 2023; you will find it in there, but it is going to take you several weeks and a lot of mental harm and difficulty to understand what it means”.

So, let us make it right. Let us not just say, “It’ll be all right on the night”. Let us have it there. It is almost right but, as my noble friend Lord Knight said, it needs to be patched back into what is already in the Bill. Somebody needs to look at it and say, “What, out of that, will work as a statement to the world that we care about our kids in a way that will really make a difference?” I warn the Minister that, although I said at Second Reading that I wanted to see this Bill on the statute book as quickly as possible, I will not accept a situation where we do not have more on this issue.

Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State (Department for Culture, Media and Sport)

I am grateful to all noble Lords who have spoken on this group and for the clarity with which the noble Lord, Lord Stevenson, has concluded his remarks.

Amendments 20, 74, 93 and 123, tabled by the noble Baroness, Lady Kidron, would significantly revise the Bill’s approach to content that is harmful to children. They would set a new schedule of harmful content and risks to children—the four Cs—on the face of the Bill and revise the criteria for user-to-user and search services carrying out child safety risk assessments.

I start by thanking the noble Baroness publicly—I have done so privately in our discussions—for her extensive engagement with the Government on these issues over recent weeks, along with my noble friends Lord Bethell and Lady Harding of Winscombe. I apologise that it has involved the noble Baroness, Lady Harding, missing her stop on the train. A previous discussion we had also very nearly delayed her mounting a horse, so I can tell your Lordships how she has devoted hours to this—as they all have over recent weeks. I would like to acknowledge their campaigning and the work of all organisations that the noble Baroness, Lady Kidron, listed at the start of her speech, as well as the families of people such as Olly Stephens and the many others that the right reverend Prelate the Bishop of Oxford mentioned.

I also reassure your Lordships that, in developing this legislation, the Government carried out extensive research and engagement with a wide range of interested parties. That included reviewing international best practice, including the four Cs framework on the online risks of harm to children; we want this to be world-leading legislation. The Government share the objective, which all noble Lords have echoed, of making sure that children are protected from harm online. I was grateful to the noble Baroness, Lady Benjamin, for echoing the remarks I made earlier in Committee on this. I am glad we are on the same page, even if we are still looking at points of detail, as we should be.

As the noble Baroness, Lady Kidron, knows, it is the Government’s considered opinion that the Bill’s provisions already deliver these objectives. I know that she remains to be convinced, but I am grateful to her for our continuing discussions on that point, and for continuing to kick the tyres on this to make sure that this is indeed legislation of which we can be proud.

It is also clear that there is broad agreement across the House that the Bill should tackle content that is harmful to children, such as content that promotes eating disorders; illegal behaviour, such as grooming; and risk factors for harm, such as the method by which content is disseminated and the frequency of alerts. I am pleased to be able to put on record that, in the Government’s opinion, the Bill as drafted already does this and reflects the principles of the four Cs framework, covering each of those: content, conduct, contact and commercial or contract risks to children.

First, it is important to understand how the Bill defines content, because that question of definition has been a source of confusion in some of the discussions hitherto. When we talk in general terms about content, we mean the substance of a message. The Bill defines “content” for the purposes of this legislation, in Clause 207, extremely broadly as

“anything communicated by means of an internet service”.

Under this definition, in essence, all user communication and activity, including recommendations by an algorithm, interactions in the metaverse, live streams, and so on, is facilitated by “content”. So, for example, unwanted and inappropriate contact from an adult to a child would be treated by the Bill as a content harm. The distinctions that the four Cs make between content, conduct and contact risks are therefore not necessary. For the purposes of the Bill, they are all content risks.

Secondly, I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill.

Baroness Harding of Winscombe (Conservative)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Labour)

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State (Department for Culture, Media and Sport)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess, under the illegal content and child safety duties, the risks to children from the design and operation of their service, including their business model and governance. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess the contact risks of functionalities that enable adults to search for and contact other users, including children, in the way set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Baroness Kidron (Crossbench)

I agree that content, although unfathomable to the outside world, is defined as the Minister says. However, does that mean that when we see that

“primary priority content harmful to children” will be put in regulations by the Secretary of State under Clause 54(2)—ditto Clause 54(3) and (4)—we will see those contact risks, conduct risks and commercial risks listed as primary priority, priority and non-designated harms?

I do not want to make my speech twice but, as I said in my final sentence, my challenge to the Government is that there would be a very simple way forward by other means if those things were articulated. My understanding, however, is that they intend to bring forward content harms that describe only content as we normally understand it.

Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State (Department for Culture, Media and Sport), 6:30, 27 April 2023

I have tried to outline the Bill’s definition of content, which I think will give some reassurance that the other concerns noble Lords have raised are covered. I will turn in a moment to primary priority and priority content, if the noble Baroness will allow me, and she can intervene again if I have not addressed the point to her satisfaction. I want to set that out and try to keep track of all the questions which have been posed as I do so.

For now, I know there have been concerns from some noble Lords that, if functionalities are not labelled as harms in the legislation, they will not be addressed by providers, and I reassure your Lordships’ House that this is not the case. There is an important distinction between content and other risk factors such as an algorithm, which without content cannot cause harm to a child. That is why functionalities are not covered by the categories of primary priority and priority content that is harmful to children. The Bill sets out a comprehensive risk assessment process which will cover content or activity that poses a risk of harm to children, as well as other factors, such as functionality, which may increase the risk of harm. As such, the existing children’s risk assessment criteria already cover many of the changes proposed in this amendment. For example, the duties already require service providers to assess the risk of harm to children from their business model and governance. They also require providers to consider how a comprehensive range of functionalities affects risk, how the service is used and how the use of algorithms could increase the risks to children.

Turning to the examples of harmful content set out in the proposed new schedule, I am happy to reassure the noble Baroness and other noble Lords that the Government’s proposed list of primary priority and priority content covers a significant amount of this content. In her opening speech she asked about cumulative harm—that is, content sent many times, or content which is harmful due to the manner of its dissemination. We will look at that in detail on the next group as well, but I will respond now to the points she made earlier. The definition of harm in the Bill, under Clause 205, makes it clear that physical or psychological harm may arise from the fact or manner of dissemination of the content, not just its nature—content which is not harmful per se but which, if sent to a child many times, for example by an algorithm, would meet the Bill’s threshold for content that is harmful to children. Companies will have to consider this as a fundamental part of their risk assessment, including, for example, how the dissemination of content via algorithmic recommendations may increase the risk of harm, and they will need to put in place proportionate and age-appropriate measures to manage and mitigate the risks they identify. I followed the exchanges between the noble Baronesses, Lady Kidron and Lady Fox, and I make it clear that the approach set out by the Bill will mean that companies cannot avoid tackling the kind of awful content which Molly Russell saw and the harmful algorithms which pushed that content relentlessly at her.

This point on cumulative harm was picked up by my noble friend Lord Bethell. The Bill will address cumulative risk where it is the result of a combination of high-risk functionalities, such as live streaming, or rewards in service by way of payment or non-financial reward. This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risks in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children. The actions that companies will be required to take under their risk assessment duties in the Bill, and the safety measures they will be required to put in place to manage their services’ risks, will reflect this bigger-picture risk profile.

The amendments of the noble Baroness, Lady Kidron, would remove references to primary priority and priority content that is harmful to children from the child risk assessment duties, which we fear would undermine the effectiveness of the child safety duties as currently drafted. That includes the duty for user-to-user providers to prevent children encountering primary priority harms, such as pornography and content that promotes self-harm or suicide, as well as the duty to put in place age-appropriate measures to protect children from other harmful content and activity. As a result, we fear these amendments could remove the requirement for an age-appropriate approach to protecting children online and make the requirement to prevent children accessing primary priority content less clear.

The noble Baroness, Lady Kidron, asked in her opening remarks about emerging harms, which she was right to do. As noble Lords know, the Bill has been designed to respond as rapidly as possible to new and emerging harms. First, the primary priority and priority lists of content can be updated by the Secretary of State. Secondly, it is important to remember the function in the Bill of non-designated content that is harmful to children—that is, content that meets the threshold of content harmful to children but is not on the lists designated by the Government. Companies are required to understand and identify this kind of content and, crucially, report it to Ofcom. Thirdly, this will inform the actions of Ofcom itself in its review and report duties under Clause 56, where it is required to review the incidence of harmful content and the severity of harm experienced by children as a result of it. This is not limited to content that the Government have listed as being harmful, as it is intended to capture new and emerging harms. Ofcom will be required to report back to the Government with recommendations on changes to the primary priority and priority content lists.

I turn to the points that the noble Lord, Lord Knight of Weymouth, helpfully raised earlier about things that are in the amendments but not explicitly mentioned in the Bill. As he knows, the Bill has been designed to be tech-neutral, so that it is future-proof. That is why there is no explicit reference to the metaverse or virtual or augmented reality. However, the Bill will apply to service providers that enable users to share content online or interact with each other, as well as search services. That includes a broad range of services such as websites, applications, social media sites, video games and virtual reality spaces such as the metaverse; those are all captured. Any service that allows users to interact, as the metaverse does, will need to conduct a children’s access assessment and comply with the child safety duties if it is likely to be accessed by children.

Amendment 123 from the noble Baroness, Lady Kidron, seeks to amend Clause 48 to require Ofcom to create guidance for Part 3 service providers on this new schedule. For the reasons I have just set out, we do not think it would be workable to require Ofcom to produce guidance on this proposed schedule. For example, the duty requires Ofcom to provide guidance on the content, whereas the proposed schedule includes examples of risky functionality, such as the frequency and volume of recommendations.

I stress again that we are sympathetic to the aim of all these amendments. As I have set out, though, our analysis leads us to believe that the four Cs framework is simply not compatible with the existing architecture of the Bill. Fundamental concepts such as risk, harm and content would need to be reconsidered in the light of it, and that would inevitably have a knock-on effect on a large number of clauses, and on timing. The Bill has benefited from considerable scrutiny—pre-legislative and in many discussions over many years. The noble Baroness, Lady Kidron, has been a key part of that and of improving the Bill. The task is simply unfeasible at this stage in the Bill’s progress through Parliament and risks delaying it, as well as significantly slowing down Ofcom’s implementation of the child safety duties. We do not think that this slowing down is a risk worth taking, because we believe the Bill already achieves what is sought by these amendments.

Even so, I say to the Committee that we have listened to the noble Baroness, Lady Kidron, and others and have worked to identify changes which would further address these concerns. My noble friend Lady Harding posed a clear question: if not this, what would the Government do instead? I am pleased to say that, as a result of the discussions we have had, the Government have decided to make a significant change to the Bill. We will now place the categories of primary priority and priority content which is harmful to children on the face of the Bill, rather than leaving them to be designated in secondary legislation, so Parliament will have its say on them.

We hope that this change will reassure your Lordships that protecting children from the most harmful content is indeed the priority for the Bill. That change will be made on Report. We will continue to work closely with the noble Baroness, Lady Kidron, my noble friends and others, but I am not able to accept the amendments in the group before us today. With that, I hope that she will be willing to withdraw.

Baroness Kidron (Crossbench)

I thank all the speakers. There were some magnificent speeches and I do not really want to pick out any particular ones, but I cannot help but say that the right reverend Prelate described the world without the four Cs. For me, that is what everybody in the Box and on the Front Bench should go and listen to.

I am grateful and pleased that the Minister has said that the Government are moving in this direction, but there are a couple of things that I have to come back on. First, I have swiftly read the Clause 205 definition of harm and I do not think it says that you do not have to reach a barrier of harm; dissemination is quite enough. There is always the problem of what the end result of the harm is. The thing that the Government are not listening to is the relationship between the risk assessment and the harm: it is about making sure that we are clear that it is the functionality that can cause harm. I think we will come back to this at another point, but that is what I beg them to listen to. Secondly, I am not entirely sure that it is correct to say that the four Cs mean that you cannot have primary priority, priority and so on; those could sit within the schedule of content, so the two things are not mutually exclusive. I would be very happy to have a think about that.

What was not addressed in the Minister’s answer was the point made by the noble Lord, Lord Allan of Hallam, in supporting the proposal that we should have in the schedule: “This is what you’ve got to do; this is what you’ve got to look at; this is what we’re expecting of you; and this is what Parliament has delivered”. That is immensely important, and I was so grateful to the noble Lord, Lord Stevenson, for putting his marker down on this set of amendments. I am absolutely committed to working alongside him and to finding ways around this, but we need to find a way of stating it.

Ironically, that is my answer to both the noble Baronesses, Lady Ritchie and Lady Fox: we should have our arguments here and now, in this Chamber. I do not wish to leave it to the Secretary of State, whom I have great regard for, as it happens, but who knows: I have seen a lot of Secretaries of State. I do not even want to leave it to the Minister, because I have seen a lot of Ministers too—ditto Ofcom, and definitely not the tech sector. So here is the place, and we are the people, to work out the edges of this thing.

Not for the first time, my friend, the noble Baroness, Lady Harding, read out what would have been my answer to the noble Baroness, Lady Ritchie. I have gone round and round, and it is like a Marx Brothers movie: in the end, harm is defined by subsection (4)(c), but that says that harm will be defined by the Secretary of State. It goes around like that throughout the Bill.

There are three members of the pre-legislative committee in the Chamber. We were very clear about design features, and several members who are not present were even clearer. So I hear where we are with the Bill, but I have been following it for five years and have been saying the same thing, so if we are a little late to the party I do not think that is because of me. I do not want to delay the Bill but I want to stamp the authority of Parliament on the question of how harm happens, as well as what it is.

My last sentence has to be: let us remember our conversation about trying to measure illegal harm and then think about it at scale for children. We have to have something softer than that; we cannot do it for each piece of content. The saving grace of the Bill is its systems and processes: it will make the tsunami a trickle—that is what we want to do. It is not to say that young people should not have access to the internet. Although I spent quite a lot of time disagreeing with the noble Baroness, Lady Fox, today, I absolutely agree with her about evolving capacities, and I hope that we revisit that question later. With that, I beg leave to withdraw my amendment.

Amendment 20 withdrawn.

Amendment 21 not moved.