Amendment 2

Online Safety Bill – Committee (2nd Day) – in the House of Lords at 3:46 pm on 25 April 2023.

Moved by Baroness Kidron

2: Clause 3, page 3, line 14, at end insert—
“(d) an internet service, other than a regulated user-to-user service or search service, that meets the child user condition and enables or promotes harmful activity and content as set out in Schedule (Online harms to children).”

Member’s explanatory statement
This amendment would mean any service that meets the “child user condition” and enables or promotes harmful activity and content to children, as per a new Schedule, would be in scope of the regulation of the Bill.

Baroness Kidron (Crossbench)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services that are likely to be accessed by children and that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children” to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children” if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration the nature and content of the service and the way in which it is accessed”.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely given age ratings much lower than the minimum age in their own terms and conditions: for example, Amazon Shopping’s terms say 18, but it has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated four on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15 year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Conservative)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user” services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children” coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, the same app may carry different age ratings in different app stores. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.

A second research study, conducted by Internet Matters and TikTok, unambiguously shows that teenagers themselves would prefer having app store age assurance. Neither of those research projects suggests that age assurance at the app store should replace age assurance in the apps themselves. They view it as additive: a measure that would make things simpler for them and ensure that fewer children reach the point of downloading apps that they should not.

The third reason why this is necessary is that, as the noble Baroness, Lady Kidron, said, Google and Apple are already doing some of this. They are doing it differently and should be commended, to some extent, for the progress that they have made over the past five years. Google Family Link and the family functionality on the Apple store are better than they were five years ago. However, we should be troubled that this is currently not regulated. They are age-rating apps differently. Can you imagine, in the physical world, Sainsbury’s deciding that alcohol was suitable for 17 year-olds and above, Tesco deciding that it was suitable for 18 year-olds and above, and government not being able to intervene? That is the world which we are in with access to pornography today.

I am the mother of a 17 year-old girl. I went into her iPhone last night and searched on the Apple App Store. Pornography apps come up as age appropriate for 17+. This is the consequence of an unregulated app store world. Today, as I said, the vast majority of app downloads go through Google and Apple. On the day that the Government launch their digital competition Bill, we should hope that over time there will be further app stores. What is to say that those app stores will do anything to protect children as they try to compete with Google and Apple?

The final reason why we should do this is that a number of app developers, particularly small ones, have expressed to me a concern that app stores might abuse their power of age-gating the internet to block apps that compete with their own. That is exactly why we should regulate this space, rather than leaving it for Google and Apple to decide what an age gate should or should not look like. Self-regulation has failed to protect children online over the past 15 years. Many of us in the Chamber today have been working in this space for at least that long. There is no reason to believe that self-regulation would be any more successful for app stores than it has been for the rest of the internet.

I have tabled these amendments and ask my noble friend the Minister to recognise that I have done so in the spirit of starting the conversation on how we regulate app stores. It is unambiguously clear that we should regulate them. The last thing that I would want to do is have my amendment slow down the progress of this Bill. The last thing that I would want is to slow down Ofcom’s implementation of the Bill. However, we keep being told that this is a framework Bill to focus on systems and processes, and it is an essential part of that framework that app stores are included.

Very briefly, I will speak in support of the amendments tabled by the noble Baroness, Lady Kidron, by telling you a story. One of my first jobs in the retail world was as the commercial director for Woolworths—we are all old enough in this Chamber to remember Woolworths—which was the leading retailer of toys. One of my first category directors for the toy category had come from outside the toy industry. I will never forget the morning when he came to tell me that an own-label Woolworths toy had caused a near-fatal accident with a child. He was new to the industry and had not worked in toys before. He said, “It’s only one child; don’t worry, it’ll be okay”. I remember saying, “That is not how health and safety with children works. This is one incident; we need to delist the product immediately; we need to treat this incredibly seriously. Imagine if that was your child”. I do not begrudge his reaction; he had never worked in that sector before.

However, the reality is that if we do not look at the impact of the digital world on every child, then we are adopting a different standard in the digital world than we do in the physical world. That is why the “likely to be accessed by children” definition that has been tried and tested, not just in this House but in legislatures around the world, should be what is used in this Bill.

The Bishop of Oxford, 4:00, 25 April 2023

My Lords, it is a pleasure to follow the two noble Baronesses. I remind the Committee of my background as a board member of the Centre for Data Ethics and Innovation. I also declare an indirect interest, as my oldest son is the founder and studio head of Mediatonic, which is now part of Epic Games and is the maker of “Fall Guys”, which I am sure is familiar to your Lordships.

I speak today in support of Amendments 2 and 92 and the consequent amendments in this group. I also support the various app store amendments proposed by the noble Baroness, Lady Harding, but I will not address them directly in these remarks.

I was remarkably encouraged on Wednesday by the Minister’s reply to the debate on the purposes of the Bill, especially by the priority that he and the Government gave to the safety of children as its primary purpose. The Minister underlined this point in three different ways:

“The main purposes of the Bill are: to give the highest levels of protection to children … The Bill will require companies to take stringent measures to tackle illegal content and protect children, with the highest protections in the Bill devoted to protecting children … Children’s safety is prioritised throughout this Bill”.—[Official Report, 19/4/23; col. 724.]

The purpose of Amendments 2 and 92 and consequent amendments is to extend and deepen the provisions in the Bill to protect children against a range of harms. This is necessary for both the present and the future. It is necessary in the present because of the harms to which children are exposed through a broad range of services, many of which are not currently in the Bill’s scope. Amendment 2 expands the scope to include any internet service that meets the child user condition and enables or promotes harmful activity and content as set out in the schedule provided. Why would the Government not take this step, given the aims and purposes of the Bill to give the highest protection to children?

Every day, the diocese of Oxford educates some 60,000 children in our primary and secondary schools. Almost all of them have or will have access to a smartphone, either late in primary, hopefully, or early in secondary school. The smartphone is a wonderful tool to access educational content, entertainment and friendship networks, but it is also a potential gateway for companies, children and individuals to access children’s inner lives, in secret, in the dead of night and without robust regulation. It therefore exposes them to harm. Sometimes that harm is deliberate and sometimes unintentional. This power for harm will only increase in the coming years without these provisions.

The Committee needs to be alert to generational changes in technology. When I was 16 in secondary school in Halifax, I did a computer course in the sixth form. We had to take a long bus ride to the computer building at Huddersfield University. The computer filled several rooms in the basement. The class learned how to program using punch cards. The answers to our questions came back days later, on long screeds of printed paper.

When my own children were teenagers and my oldest was 16, we had one family computer in the main living room of the house. The family was able to monitor usage. Access to the internet was possible, but only through a dial-up modem. The oldest of my grandchildren is now seven, and many of his friends already have smartphones. In a few years, he will certainly carry a connected device in his pocket and, potentially, have access to the entire internet 24/7.

I want him and millions of other children to have the same protection online as he enjoys offline. That means recognising that harms come in a variety of shapes and sizes. Some are easy to spot, such as pornography. We know the terrible damage that porn inflicts on young lives. Some are more insidious and gradual: addictive behaviours, the promotion of gambling, the erosion of confidence, grooming, self-harm and suicidal thoughts, encouraging eating disorders, fostering addiction through algorithms and eroding the barriers of the person.

The NSPCC describes many harms to children on social networks that we are all now familiar with, but it also highlights online chat, comments on livestream sites, voice chat in games and private messaging among the vectors for harm. According to Ofcom, nine in 10 children in the UK play video games, and they do so on devices ranging from computers to mobile phones to consoles. Internet Matters says that most children’s first interaction with someone they do not know online is now more likely to be in a video game such as “Roblox” than anywhere else. It also found that parents underestimate the frequency with which their children are contacted by strangers online.

The Gambling Commission has estimated that 25,000 children in the UK aged between 11 and 16 are problem gamblers, with many of them introduced to betting via computer games and social media. Families have been left with bills, sometimes of more than £3,000, after uncontrolled spending on loot boxes.

Online companies, we know, design their products with psychological principles of engagement firmly in view, and then refine their products by scraping data from users. According to the Information Commissioner, more than 1 million underage children could have been allowed on to TikTok alone, with the platform collecting and using their personal data.

As the noble Baroness, Lady Kidron, has said, we already have robust and tested definitions of scope in the ICO’s age-appropriate design code—definitions increasingly taken up in other jurisdictions. To give the highest protection to children, we need to build on these secure definitions in this Bill and find the courage to extend robust protection across the internet now.

We also need to future-proof this Bill. These key amendments would ensure that any development, any new kind of service not yet imagined which meets the child user condition and enables or promotes harmful activity and content, would be in scope. This would give Ofcom the power to develop new guidance and accountabilities for the applications that are certain to come in the coming years.

We have an opportunity and a responsibility, as the Minister has said, to build the highest protection into this Bill. I support the key amendments standing in my name.

Baroness Stowell of Beeston (Chair, Communications and Digital Committee)

My Lords, first, I beg the indulgence of the Committee to speak briefly at this juncture. I know that no one from the Lib Dem or Labour Benches has spoken yet, but I need to dash over to the Moses Room to speak to some amendments I am moving on the Bill being considered there. Secondly, I also ask the Committee that, if I do not get back in time for the wind-ups, I be forgiven on this occasion.

I simply wanted to say something briefly in support of Amendments 19, 22, 298 and 299, to which I have added my name. My noble friend Lady Harding has already spoken to them comprehensively, so there is little I want to add; I just want to emphasise a couple of points. But first, if I may, I will pick up on something the right reverend Prelate said. I think I am right in saying that the most recent Ofcom research shows that 57% of 7 year-olds such as his grandchild have their own phone, and by the time children reach the age of 12 they pretty much all have their own phone. One can only imagine that the age at which children possess their own device is going to get lower.

Turning to app stores, with which these amendments are concerned, currently it is the responsibility of parents and developers to make sure that children are prevented from accessing inappropriate content. My noble friend’s amendments do not dilute in any way the responsibility that should be held by those two very important constituent groups. All we are seeking to do is ensure that app stores, which are currently completely unregulated, take their share of responsibility for making sure that those seeking to download and then use such apps are in the age group the apps are designed for.

As has already been very powerfully explained by my noble friend and by the noble Baroness, Lady Kidron, different age ratings are being given by the two different app stores right now. It is important for us to understand, in the context of the digital markets and competition Bill, which is being introduced to Parliament today—I cannot tell noble Lords how long we have waited for that legislation and how important it is, not least because it will open up competition, particularly in app stores—that the more competition there will be across app stores and the doorways through which children can go to purchase or download apps, the more important it is that there is consistency and some regulation. That is why I support my noble friend and was very happy to add my name to her amendments.

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health), 4:15, 25 April 2023

My Lords, it falls to me to inject some grit into what has so far been a very harmonious debate, as I will raise some concerns about Amendments 2 and 22.

I again declare my interest: I spent 10 years working for Facebook, doing the kind of work that we will regulate in this Bill. At this point noble Lords are probably thinking, “So it’s his fault”. I want to stress that, if I raise concerns about the way the regulation is going, it is not that I hold those views because I used to work for the industry; rather, I felt comfortable working in the industry because I always had those views, back to 2003 when we set up Ofcom. I checked the record, and I said things then that are remarkably consistent with how I feel today about how we need to strike the balance between the power of the state and the power of the citizen to use the internet.

I also should declare an interest in respect of Amendment 2, in that I run a blog called regulate.tech. I am not sure how many children are queueing up to read my thoughts about regulation of the tech industry, but they would be welcome to do so. The blog’s strapline is:

“How to regulate the internet without breaking it”.

It is very much in that spirit that I raise concerns about these two amendments.

I certainly understand the challenges for content that is outside of the user-to-user or search spaces. I understand entirely why the noble Baroness, Lady Kidron, feels that something needs to be done about that content. However, I am not sure that this Bill is the right vehicle to address that kind of content. There are principled and practical reasons why it might be a mistake to extend the remit here.

The principle is that the Bill’s fundamental purpose is to restrict access to speech by people in the United Kingdom. That is what legislation such as this does: it restricts speech. We have a framework in the Human Rights Act, which tells us that when we restrict speech we have to pass a rigorous test to show that those restrictions are necessary and proportionate to the objective we are trying to achieve. Clearly, when dealing with children, we weight very heavily in that test whether something is necessary and proportionate in favour of the interest of the welfare of the children, but we cannot do away with the test altogether.

It is clear that the Government have applied that test over the years that they have been preparing this Bill and determined that there is a rationale for intervention in the context of user-to-user services and search services. At the same time, we see in the Bill that the Government’s decision is that intervention is not justified in all sorts of other contexts. Email and SMS are excluded. First-party publisher content is excluded, so none of the media houses will be included. We have a Bill that is very tightly and specifically framed around dealing with intermediaries, whether that is user-to-user intermediaries who intermediate in user-generated content, or search as an intermediary, which scoops up content from across the internet and presents it to you.

This Bill is about regulating the regulators; it is not about regulating first-party speakers. A whole world of issues will come into play if we move into that space. It does not mean that it is not important, just that it is different. There is a common saying that people are now bandying around, which is that freedom of speech is not freedom of reach. To apply a twist to that, restrictions on reach are not the same as restrictions on speech. When we talk about restricting intermediaries, we are talking about restricting reach. If I have something I want to say and Facebook or Twitter will not let me say it, that is a problem and I will get upset, but it is not the same as being told that I cannot say it anywhere on the internet.

My concern about Amendment 2 is that it could lead us into a space where we are restricting speech across the internet. If we are going to do that—there may be a rationale for doing it—we will need to go back and look at our necessity and proportionality test. It may play out differently in that context from user-to-user or intermediary-based services.

From a practical point of view, we have a Bill that, we are told, will give Ofcom the responsibility of regulating some 25,000 different entities. They will all be asked to pay money to Ofcom and will all be given a bunch of guidance and duties that they have to fulfil. Again, those duties, as set out at painful length in the Bill, are very specifically about the kind of things that an intermediary should do to its users. If we were to be regulating blogs or people’s first-party speech, or publishers, or the Daily Telegraph, or whoever else, I think we would come up with a very different set of duties from the duties laid out in the Bill. I worry that, however well-motivated, Amendment 2 leads us into a space for which this Bill is not prepared.

I have a lot of sympathy with the views of the noble Baroness, Lady Harding, around the app stores. They are absolutely more like intermediaries, or search, but again the tools in the Bill are not necessarily dedicated to how one would deal with app stores. I was interested in the comments of the noble Baroness, Lady Stowell, on what will be happening to our competition authorities; a lot will be happening in that space. On app stores, I worry about what is in Amendment 22: we do not want app stores to think that it is their job to police the content of third-party services. That is Ofcom’s job. We do not want the app stores to get in the middle, not least because of these commercial considerations. We do not want Apple, for instance, thinking that, to comply with UK legislation, it might determine that WhatsApp is unsafe while iMessage is safe. We do not want Google, which operates Play Store, to think that it would have a legal rationale for determining that TikTok is unsafe while YouTube is safe. Again, I know that this is not the noble Baroness’s intention or aim, but clearly there is a risk that we open that up.

There is something to be done about app stores but I do not think that we can roll over the powers in the Bill. When we talk about intermediaries such as user-to-user services and search, we absolutely want them to block bad content. The whole thrust of the Bill is about forcing them to restrict bad content. When it comes to app stores, the noble Baroness set out some of her concerns, but I think we want something quite different. I hesitate to say this, as I know that my noble friend is supportive of it, but I think that it is important as we debate these issues that we hear some of those concerns.

Lord Knight of Weymouth (Labour)

Could it not be argued that the noble Lord is making a case for regulation of app stores? Let us take the example of Apple’s dispute with “Fortnite”, where Apple is deciding how it wants to police things. Perhaps if this became a more regulated space Ofcom could help make sure that there was freedom of access to some of those different products, regardless of the commercial interests of the people who own the app stores.

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health)

The noble Lord makes a good point. I certainly think we are heading into a world where there will be more regulation of app stores. Google and Apple are commercial competitors with some of the people who are present in their stores. A lot of the people in their stores are in dispute with them over things such as the fees that they have to pay. It is precisely for that reason that I do not think we should be throwing online safety into the mix.

There is a role for regulating app stores, which primarily focuses on these commercial considerations and their position in the market. There may be something to be done around age-rating; the noble Baroness made a very good point about how age-rating works in app stores. However, if we look at the range of responsibilities that we are describing in this Bill and the tools that we are giving to intermediaries, we see that they are the wrong, or inappropriate, set of tools.

Baroness Harding of Winscombe (Conservative)

Would the noble Lord acknowledge that app stores are already undertaking these age-rating and blocking decisions? Google has unilaterally decided that, if it assesses that you are under 18, it will not serve up over-18 apps. My concern is that this is already happening but it is happening indiscriminately. How would the noble Lord address that?

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health)

The noble Baroness makes a very good point; they are making efforts. There is a role for app stores to play but I hope she would accept that it is qualitatively different from that played by a search engine or a user-to-user service. If we were to decide, in both instances, that we want app stores to have a greater role in online safety and a framework that allows us to look at blogs and other forms of content, we should go ahead and do that. All I am arguing is that we have a Bill that is carefully constructed around two particular concepts, a user-to-user service and a search engine, and I am not sure it will stretch that far.

Baroness Kidron (Crossbench)

I want to reassure the noble Lord: I have his blog in front of me and he was quite right—there were not a lot of children on that site. It is a very good blog, which I read frequently.

I want to make two points. First, age-rating and age-gating are two different things, and I think the noble Lord has conflated them. There is a lot of age-rating going on, and it is false information. We need good information, and we have not managed to get it by asking nicely. Secondly, I slightly dispute his idea that we have a very structured Bill regarding user-to-user and so on. We have a very structured Bill from a harms perspective that describes the harms that must be prevented—and then we got to commercial porn, and we can also get to these other things.

I agree with the noble Lord’s point about freedom of speech, but we are talking about a fixed set of harms that will, I hope, be in the Bill by the end. We can then say that if, under this test, a service is likely to be accessed by children, and known harm is there, that is what we are looking at. We are certainly not looking at the noble Lord’s blog.

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health)

I appreciate the intervention by the noble Baroness; I hope through this grit we may conjure up a pearl of some sort. The original concept of the Bill, as championed by the noble Baroness, would have been a generalised set of duties of care which could have stretched much more broadly. It has evolved in a particular direction and become ever more specific and tailored to those three services: user-to-user, search, and pornography services. Having arrived at that point, it is difficult to then open it back up and stretch it to reach other forms of service.

My intention in intervening in this debate is to raise some of those concerns because I think they are legitimate. I may be at the more sceptical end of the political world, but I am at the more regulation-friendly end of the tech community. This is said in a spirit of trying to create a Bill that will actually work. I have done the work, and I know how hard Ofcom’s job will be. That sums up what I am trying to say: my concern is that we should not give Ofcom an impossible job. We have defined something quite tight—many people still object to it, think it is too loose and do not agree with it—but I think we have something reasonably workable. I am concerned that, however tempting it is, by re-opening Pandora’s box we may end up creating something less workable.

That does not mean we should forget about app stores and non-user-to-user content, but we need to think of a way of dealing with those which does not necessarily just roll over the mechanism we have created in the Online Safety Bill to other forms of application.

Baroness Healy of Primrose Hill (Deputy Chairman of Committees)

I strongly support the amendments in the name of the noble Baroness, Lady Kidron, because I want to see this Bill implemented but strengthened in order to fulfil the admirable intention that children must be safe wherever they are online. This will not be the case unless child safety duties are applicable in all digital environments likely to be accessed by children. This is not overly ambitious or unrealistic; the platforms need clarity as to these new responsibilities and Ofcom must be properly empowered to enforce the rules without worrying about endless legal challenges. These amendments will give that much-needed clarity in this complex area.

As the Joint Committee recommended, this regulatory alignment would simplify compliance for businesses while giving greater clarity to people who use the services and greater protection for children. It would give confidence to parents and children that they need not work out whether they are on a regulated or unregulated service while online. The Government promised that the onus for keeping young people safe online would sit squarely on the tech companies’ shoulders.

Without these amendments, there is a real danger that a loophole will remain whereby some services, even those that are known to harm, are exempt, leaving thousands of children exposed to harm. They would also help to future-proof the Bill. For example, some parts of the metaverse as yet undeveloped may be out of scope, but already specialist police units have raised concerns that abuse rooms, limited to one user, are being used to practise violence and sexual violence against women and girls.

We can and must make this good Bill even better and support all the amendments in this group.

Lord Russell of Liverpool (Deputy Chairman of Committees), 4:30, 25 April 2023

My Lords, as I listen to the words echoing around the Chamber, I try to put myself in the shoes of parents or children who, in one way or another, have suffered as a result of exposure to things happening online. Essentially, the world that we are talking about has been allowed to grow like Topsy, largely unregulated, at a global level and at a furious pace, and that is still happening as we do this. The horses have not just bolted the stable; they are out of sight and across the ocean. We are talking about controlling and understanding an environment that is moving so quickly that, however fast we move, we will be behind it. Whatever mousetraps we put in place to try to protect children, we know there are going to be loopholes, not least because children individually are probably smarter than we are collectively at knowing how to get around well-meaning safeguards.

There are ways of testing what is happening. Certain organisations have used what they term avatars. Essentially, you create mythical profiles of children, which are clearly stated as being children, and effectively let them loose in the online world in various directions on various platforms and observe what happens. The tests that have been done on this—we will go into this in more detail on Thursday when we talk about safety by design—are pretty eye-watering. The speed with which these avatars, despite being openly stated as being profiles of children, are deluged by a variety of content that should be nowhere near children is dramatic and incredibly effective.

I put it to the Minister and the Bill team that one of the challenges for Ofcom will be not to be so far behind the curve that it is always trying to catch up. It is like being a surfer: if you are going to keep going then you have to keep on the front side of the wave. The minute you fall behind it, you are never going to catch up. I fear that, however well-intentioned so much of the Bill is, unless and until His Majesty’s Government and Ofcom recognise that we are probably already slightly behind the crest of the wave, whatever we try to do and whatever safeguards we put in place are not necessarily going to work.

One way we can try to make what we do more effective is the clever, forensic use of approaches such as avatars, not least because I suspect their efficacy will be dramatically increased by the advent and use of AI.

Lord Bethell (Conservative)

Tim Cook, the CEO of Apple, put it very well:

“Kids are born digital, they’re digital kids now … And it is, I think, really important to set some hard rails around it”.

The truth is that in the area of app stores, Google and Apple, which, as we have heard, have a more than 95% share of the market, are just not voluntarily upholding their responsibilities in making the UK a safe place for children online. There is an air of exceptionalism about the way they behave that suggests they think the digital world is somehow different from the real world. I do not accept that, which is why I support the amendments in the name of my noble friend Lady Harding and others—Amendments 19, 22, 298, 299 and other connected amendments.

There are major holes in the app stores’ child safety measures, which mean that young teens can access adult apps that offer dating, random chats, casual sex and gambling, even when Apple and Google emphatically know that the user is a minor. I will give an example. Using an Apple ID for a simulated 14 year-old, the Tech Transparency Project looked at 80 apps in the App Store that are theoretically limited to 17 and older. It found that underage users could very easily evade age restrictions in the vast majority of cases. There is a dating app that opens directly into pornography before ever asking the user’s age; adult chat apps filled with explicit images that never ask the user’s age; and a gambling app that lets the minor’s account deposit and withdraw money.

What kind of apps are we talking about here? We are talking about apps such as UberHoney; Eros, the hook-up and adult chat app; Hahanono—Chat & Get Naughty; and Cash Clash Games: Win Money. The investigation found that Apple and the apps essentially pass the buck to each other when it comes to blocking underage users, making it easy for young teens to slip through the system. My day-to-day experience as a parent of four children completely echoes that investigation, and it is clear to me that Apple and Google just do not share age data with the apps in their app stores, or else children would not be able to download those apps.

There is a wilful blindness to minors tweaking their age. Parental controls on mobile phones are, to put it politely, a joke. It takes a child a matter of minutes to circumvent them—I know from my experience—and I have wasted many hours fruitlessly trying to control these arrangements. That is just not good enough for any business. It is not good enough because so many teenagers have mobile phones, as we discussed—two-thirds of children have a smartphone by the age of 10. Moreover, it is not good enough because they are accessing huge amounts of filthy content, dodgy services and predatory adults, things that would never be allowed in the real world. The Office of the Children’s Commissioner for England revealed that one in 10 children had viewed pornography by the time they were nine years old. The impact on their lives is profound: just read the testimony on the recent Mumsnet forums about the awful impact of pornography on their children’s lives.

To prevent minors from accessing adult-only apps, the most efficient measure would be, as my noble friend Lady Harding pointed out, to check users’ ages during the distribution step, which means directly in the app store or on the web browser, prior to the app store or the internet browser initiating the app or the platform download. This can be done without the developer knowing the user’s specific age. Developing a reliable age-verification regime applied at that “distribution layer” of the internet supply chain would significantly advance the UK’s objective of creating a safer online experience and set a precedent that Governments around the world could follow. It would apply real-world principles to the internet.

This would not absolve any developer, app or platform of their responsibilities under existing legislation—not at all: it would build on that. Instead, it would simply mandate that every player in the ecosystem, right from the app store distribution layer, was legally obliged to promote a safer experience online. That is completely consistent with the principles and aims of the Online Safety Bill.

These amendments would subject two of the biggest tech corporations to the same duties regarding their app stores as we do the wider digital ecosystem and the real world. It is all about age assurance and protecting children. To the noble Lord, Lord Allan, I say that I cannot understand why my corner shop requires proof of age to buy cigarettes, pornography or booze, but Apple and Google think it is okay to sell apps with inappropriate content and services without proper age-verification measures and with systems that are wilfully unreliable.

There is a tremendous amount that is very good about Tim Cook’s commitment to privacy and his objections to the data industrial complex; but in this matter of the app stores, the big tech companies have had a blind spot to child safety for decades and a feeling of exceptionalism that is just no longer relevant. These amendments are an important step in requiring that app store owners step up to their responsibilities and that we apply the same standards to shopkeepers in the digital world as we would to shopkeepers in the real world.

Lord Storey (Liberal Democrat Lords Spokesperson, Education)

My Lords, I enter this Committee debate with great trepidation. I do not have the knowledge and expertise of many of your Lordships, who I have listened to with great interest. What I do have is experience working with children, for over 40 years, and as a parent myself. I want to make what are perhaps some innocent remarks.

I was glad that the right reverend Prelate the Bishop of Oxford raised the issue of online gaming. I should perhaps declare an interest, in that I think Liverpool is the third-largest centre of online gaming in terms of developing those games. It is interesting to note that over 40% of the entertainment industry’s global revenue comes from gaming, and it is steadily growing year on year.

If I am an innocent or struggle with some of these issues, imagine how parents must feel when they try to cope every single day. I suppose that the only support they currently have, other than their own common sense of course, is age ratings and parental controls. Even the age ratings confuse them, because there are different ratings for different situations. We know that films are rated by the British Board of Film Classification, which also rates Netflix and now Amazon. But it does not rate Disney, which has its own ratings system.

We also know that the gaming industry has a different ratings system: the PEGI system, which has a number linked to an age. For example—if a parent knew this—a PEGI 16 rating is required when the depiction of violence or sexual activity reaches a stage where it looks realistic. The PEGI system also has pictures showing that.

Thanks to the Video Recordings Act 1984, the PEGI 12, PEGI 16 and PEGI 18 ratings became legally enforceable in the UK, meaning that retailers cannot sell those video games to those below the relevant ages. If a child or young person went into a shop, they could not be sold those games. However, the Video Recordings Act does not currently apply to online games, meaning that children’s safety in online gaming relies primarily on parents setting up parental controls.

I will listen with great interest to the tussles between various learned Lords, as all these issues show to me that perhaps the most important issue will come several Committee days down the path, when we talk about media literacy. That is because it is not just about enforcement, regulation or ratings; it is about making sure that parents have the understanding and the capacity. Let us not forget this about young people: noble Lords have talked about them all having a phone and wanting to go on pornographic sites, but I do not think that is the case at all. Often, young people, because of peer pressure and because of their innocence, are drawn into unwise situations. Then there are the risks that gaming can lead to: for example, gaming addiction was mentioned by the right reverend Prelate the Bishop of Oxford. There is also the health impact and maybe a link with violent behaviour. There are the risks from the interactive nature of video games: cyberbullying and the lack of a feeling of well-being. All these things can happen, which is why we need media literacy to ensure that young people know of those risks and how to cope with them.

The other thing that we perhaps need to look at is standardising some of the simple gateposts that we currently have, hence the amendment.

Baroness Wyld (Conservative)

My Lords, it is a pleasure to follow the noble Lord, Lord Storey. I support Amendments 19, 22 and so on in the name of my noble friend Lady Harding, on app stores. She set them out so comprehensively that I am not sure there is much I can add. I simply want to thank her for her patience as she led me through the technical arguments.

I support these amendments as I accept, reluctantly, that children are becoming more and more independent on the internet. I have ummed and ahhed about where parental responsibility starts and ends. I have a seven year-old, a 10 year-old and a 12 year-old. I do not see why any seven year-old, frankly, should have a smartphone. I do not know why any parent would think that is a good idea. It might make me unpopular, but there we are. I accept that a 12 year-old, realistically, has to have a smartphone in this day and age.

I said at Second Reading that Covid escalated digital engagement. It had to, because children had to go onto “Seesaw” and various other apps to access education. As a result, their social lives changed. They became faster and more digital. It seems to be customary to stand up and say that this Bill is very complicated, but at the end, when it passes after all this time, the Government will rightly want to go to parents and say, “We’ve done it; we’ve made this the safest place in the world to be online”.

Unless we support my noble friend’s amendments and can say to parents that we have been holistic about this and recognised a degree of parental responsibility but also the world that children will go into and how it may change—we have heard about the possibility of more app stores, creating a more confusing environment for parents and young people—I do not think we can confidently, hand on heart, say that we achieved what this Bill set out to achieve. On that note, I wholeheartedly support my noble friend’s amendments.

The Bishop of Guildford, 4:45, 25 April 2023

My Lords, one of our clergy in the diocese of Guildford has been campaigning for more than a decade, as have others in this Committee, on children’s access to online pornography. With her, I support the amendments in the names of the noble Baronesses, Lady Kidron and Lady Harding.

Her concerns eventually made their way to the floor of the General Synod of the Church of England in a powerful debate in July last year. The synod voted overwhelmingly in favour of a motion, which said that we

“acknowledge that our children and young people are suffering grave harm from free access to online pornography” and urged us to

“have in place age verification systems to prevent children from having access to those sites”.

It asked Her Majesty’s Government to use their best endeavours to secure the passage and coming into force of legislation requiring age-verification systems preventing access by people under the age of 18. It also recommended more social and educational programmes to increase awareness of the harms of pornography, including self-generated sexually explicit images.

Introducing the motion, my chaplain, Reverend Jo Winn-Smith, said that age verification

“ought to be a no-brainer … Exposure to sexualised material is more likely to lead to young people engaging in more sexualised behaviour and to feel social pressure to have sex”,

as well as normalising sexual violence against girls and women. A speech from the chaplain-general of the Prison Service towards the end of the debate highlighted just where such behaviours and pressures could lead in extreme circumstances.

One major theme that emerged during the debate is highlighted by the amendments this afternoon: that access to online pornography goes far beyond materials that fall into what the Bill defines as Part 5 services. Another is highlighted in a further group of amendments: age assurance needs to be both mandatory and effective beyond reasonable doubt.

There was also comment on how this whole area has taken such an age to get on to the statute book, given David Cameron’s proposals way back in 2013 and further legislation proposed in 2018 that was never enacted. Talk of secondary legislation to define harmful content in that regard is alarming, as a further amendment indicates, given the dragging of feet that has now been perpetuated for more than a decade. That is a whole generation of children and young people.

In an imaginative speech in the synod debate, the most reverend Primate the Archbishop of York, Archbishop Stephen, reminded us that the internet is not a platform; it is a public space, where all the rights and norms you would expect in public should apply. In the 1970s, he continued, we famously put fluoride in the water supply, because we knew it would be great for dental health; now is the opportunity to put some fluoride into the internet. I add only this: let us not water down the fluoride to a point where it becomes feeble and ineffective.

Baroness Benjamin (Liberal Democrat)

My Lords, I will speak in support of the amendments in this group in the names of the intrepid noble Baroness, Lady Kidron, the noble Baroness, Lady Harding, and my noble friend Lord Storey—we are kindred spirits.

As my noble friend said, the expectations of parents are clear: they expect the Bill to protect their children from all harm online, wherever it is encountered. The vast majority of parents do not distinguish between the different content types. To restrict regulation to user-to-user services, as in Part 3, would leave a great many websites and content providers, which are accessed by children, standing outside the scope of the Bill. This is a flagship piece of legislation; there cannot be any loopholes leaving any part of the internet unregulated. If there is a website, app, online game, educational platform or blog—indeed, any content that contains harmful material—it must be in the scope of the Bill.

The noble Baroness, Lady Kidron, seeks to amend the Bill to ensure that it aligns with the Information Commissioner’s age-appropriate design code—it is a welcome amendment. As the Bill is currently drafted, the threshold for risk assessment is too high. It is important that the greatest number of children and young people are protected from harmful content online. The amendments achieve that to a greater degree than the protection already in the Bill.

While the proposal to align with the age-appropriate design code is welcome, I have one reservation. Up until recently, it appears that the ICO was reluctant to take action against pornography platforms that process children’s data. It has perhaps been deemed that pornographic websites are unlikely to be accessed by children. Over the years, I have shared with this House the statistics of how children are accessing pornography and the harm it causes. The Children’s Commissioner also recently highlighted the issue and concerns. Pornography is being accessed by our children, and we must ensure that the provisions of the Bill are the most robust they can be to ensure that children are protected online.

I am concerned with ensuring two things: first, that any platform that contains harmful material falls under the scope of the Bill and is regulated to ensure that children are kept safe; and, secondly, that, as far as possible, what is harmful offline is regulated in the same way online. The amendments in the name of my noble friend Lord Storey raise the important question of online-offline equality. Amendments 33A and 217A seek to regulate online video games to ensure they meet the same BBFC ratings as would be expected offline, and I agree with that approach. Later in Committee, I will raise this issue in relation to pornographic content and how online content should be subject to the same BBFC guidance as content offline. I agree with what my noble friend proposes: namely, that this should extend to video game content as well. Video games can be violent and sexualised in nature, and controls should be in place to ensure that children are protected. The BBFC guidelines used offline appear to be the best way to regulate online as well.

Children must be kept safe wherever they are online. This Bill must have the widest scope possible to keep children safe, and ensuring online/offline alignment is crucial. The best way to keep children safe is to legislate for regulation that is as far-reaching as possible and consistently applied across the online and offline worlds. These are the reasons why I support the amendments in this group.

Baroness Berridge (Conservative)

My Lords, I will lend my support to Amendments 19 and 22. It is a pleasure to speak after the noble Baroness, Lady Benjamin. I may be one of those people in your Lordships’ House who relies significantly on the British Board of Film Classification for movie watching, as I am one of the faint-hearted.

In relation to app stores, it is not just children under 18 for whom parents need the age verification. If you are a parent of a child who has significant learning delay, the internet is a wonderful place where they can get access to material and have development that they might not ordinarily have had. But, of course, turning 17 or 18 is not the threshold for them. I have friends who have children with significant learning delay. Having that assurance, so they know which apps are which in the app store, goes well beyond 18 for them. Obviously it will not be a numerical equivalent for their child—now a young adult—but it is important to them to know that the content they get on a free app or an app purchased from the app store is suitable.

I just wanted to raise that with noble Lords, as children and some vulnerable adults—not all—would benefit from the kind of age verification that we have talked about. I appreciate the points that the noble Lord, Lord Allan, raised about where the Bill has ended up conceptually and the framework that Ofcom will rely on. Like him, I am a purist sometimes but, pragmatically, I think that the third concept raised by the noble Baroness, Lady Kidron, about protection and putting this in the app store and bringing it parallel with things such as classification for films and other video games is really important.

Lord Clement-Jones (Liberal Democrat Lords Spokesperson, Science, Innovation and Technology)

My Lords, this has been a really fascinating debate and I need to put a stake in the ground pretty early on by saying that, although my noble friend Lord Allan has raised some important points and stimulated an important debate, I absolutely agree with the vast majority of noble Lords who have spoken in favour of the amendment so cogently put forward by the noble Baronesses, Lady Kidron and Lady Harding.

Particularly as a result of the Bill being the subject of a Joint Committee, it has changed considerably over time in response to comment, pressure, discussion and debate, and I believe very much that during Committee we will be able to make further changes; I hope the Minister will be flexible enough. I do not believe that the framework of the Bill is set in concrete. There are many things we can do as we go through, particularly in the field of making children safer, if we take on board some of the amendments that have been put forward. In particular, the noble Baroness, Lady Kidron, set out why the current scope of the Bill will fail to protect children if it is kept to user-to-user and search services. She talked about blogs with limited functionalities and games without user-to-user functionality, and mentioned the whole immersive environment, which the noble Lord, Lord Russell, described as eye-watering. As she said, it is not fair to leave parents or children to work out whether they are on a regulated service. Children must be safe wherever they are online.

As someone who worked with the noble Baroness, Lady Kidron, on putting the age-appropriate design code in place in the original Data Protection Act, I am a fervent believer that it is perfectly appropriate to extend it in the way that is proposed today. I also support her second amendment, which would bring the Bill’s child user condition in line with the threshold of the age-appropriate design code. It is the expectation—I do not think it an unfair expectation—of parents, teachers and children themselves that the Bill will apply to children wherever they are online. Regulating only certain services will mean that emerging technologies that do not fit the rather narrow categories will not be subject to safety duties.

The noble Baroness talked about thousands of children being potentially at risk of not having the protection of the Bill. That is absolutely fair comment. Our Joint Committee report said:

“We recommend that the ‘likely to be accessed by children’ test in the draft Online Safety Bill should be the same as the test underpinning the Age Appropriate Design Code”.

The Government responded:

“The government considers that the approach taken in the Bill is aligned with the Age Appropriate Design Code and will ensure consistency for businesses. In addition, the status of the legislative test in the Online Safety Bill is binding in a way that the test in the Age Appropriate Design Code is not”.

Since both those statements are patently not borne out by the Bill as drafted, it is incumbent on the Government to change the Bill in the direction that the noble Baroness has asked for.

My noble friend stimulated a very important debate about the amendments of the noble Baroness, Lady Harding, in particular. That is another major potential omission in the Bill. The tech giants responsible for the distribution of nearly all apps connecting smartphone users to the internet are not currently covered by the scope of the Bill. She said that the online safety regime must look at this whole area much more broadly. App stores should be added to the list of service providers that will be mandated by the Bill to protect children and all users online. I am not going to go into all the arguments that have been made so well by noble Lords today, but of course the Google and Apple app stores have a monopoly on app distribution, yet they do not verify users’ ages. They have the technical ability to prevent minors accessing certain applications reserved for adults, as evidenced by the existing parental control functions on both smartphone operating systems and their corresponding app stores, and of course, as the noble Baroness, Lady Berridge, said, this applies not just to children but to vulnerable adults as well.

I thought the noble Lord, Lord Bethell, put it very well: other sectors of the economy have already implemented such controls in the distribution of goods and services in the offline world; the sale of alcohol provides a good example for understanding those issues. Why can Google and Apple not have duties that a corner store can adhere to? App stores do not have age assurance systems in place and do not seem to wish to take any responsibility for the part they can play in permitting harms. I say to my noble friend that the word “store” is the clue: these are products being sold through the app store, and there should be age-gating on those apps. The only way to improve safety is to make sure that app developers and the companies that distribute these apps do more to ensure that children and vulnerable adults are appropriately kept away from adult applications and content. That is an entirely reasonable duty to place on them; taking these sets of amendments on board is, I think, an essential part of the framework of the Bill.

The right reverend Prelate the Bishop of Oxford talked about the fact that harms will only increase in coming years, particularly, as he said, with ever younger children having access to mobile technology. Of course, I agree with my noble friend about the question of media literacy. This goes hand in hand with regulation, as we will discover when we talk about this later on. These amendments will not, in the words of my noble friend, break the internet: I think they will add substantially and beneficially to regulation.

I say to my noble friend Lord Storey that I support his amendments too; they are more like probing amendments. There is a genuine gap that I think many of us were not totally aware of. I assumed that, in some way, the PEGI classifications applied here, but if age ratings do not apply to online games, that is a major gap. We need to look at that very carefully, alongside these amendments, which I very much hope the Minister will accept.

Lord Knight of Weymouth (Labour) 5:00, 25 April 2023

My Lords, I echo the comments of the noble Lord, Lord Clement-Jones. This is an important group of amendments, and it has been a useful debate. I was slightly concerned when I heard the noble Baroness, Lady Harding, talk about using her daughter’s device to see whether it could access porn sites, given what that will do to her daughter’s algorithm and what it will now feed her. I will put that concern to one side, but any future report on that would be most welcome.

Amendments 2, 3 and 5, introduced so well by the noble Baroness, Lady Kidron, test what should be in scope to protect children. Clearly, we have a Bill that has evolved over some time, with many Ministers, to cover unambiguously social media, as user-to-user content, and search. I suspect that we will spend a lot more time discussing social media than search, but I get the rationale that those are perhaps the two main access points for a lot of the content we are concerned about. However, I would argue that apps are also main access points. I will come on to discuss the amendments in the name of the noble Baroness, Lady Harding, which I have also signed. If we are going to go with access points, it is worth probing and testing the Government’s intent in excluding some of these other things. The noble Lord, Lord Storey, raises in his amendments the issue of games, as others have done. Games are clearly a point of access for lots of children, as well as adults, and there is plenty of harm that can be created as a result of consuming them.

Along with some other noble Lords, some time ago I attended an all-party group which looked at the problems of incel harm online and how people are breadcrumbed from mainstream sites to quite small websites to access the really problematic, most hateful and most dangerous content. Those small websites, as far as I can see, are currently excluded from the regime in the Bill, but the amendments in the name of the noble Baroness, Lady Kidron, would potentially bring them into scope. That meeting also discussed cloud services and the supply chain of technical infrastructure that such sites, incel forums among them, rely on. Why are cloud services not included in some context, in terms of the harms that might be created?

Questions have been asked about large language model AIs such as ChatGPT. These are future technologies that have now arrived, which lots of people are talking about and variously freaking out about or getting excited by. There is an important need to bring those quite quickly into the scope of regulation by Ofcom. ChatGPT is a privately owned platform—a privately owned technology—that is offering up not only access to the range of knowledge that is online but, essentially, the range of human concepts that are online in interaction with that knowledge—privately owned versions of truth.

What is to stop any very rich individual deciding to start their own large language model with their own version of the truth, perhaps using their own platform? Former President Trump comes to mind as someone who could do that and I suggest that, if truth is now a privatised thing, we might want to have some regulation here.

The future-proofing issues are why we should be looking very seriously at the amendments in the name of the noble Baroness, Lady Kidron. I listened carefully to the noble Lord, Lord Allan, as always, and I have reflected a lot on his very useful car safety and plane safety regulation analogy from our previous day in Committee. The proportionality issue that he raised in his useful contribution this time is potentially addressed by the proposed new clause we discussed last time. If the Bill sets out quite clearly the aim of the legislation, that would set the frame for the regulator and for how it would regulate proportionately the range of internet services that might be brought into scope by this set of amendments.

I also support Amendment 92, on bringing in safety by design and the regime that has been so successful under the age-appropriate design code, with its test of the probability of access by children, rather than what is set out in the Bill.

I turn to Amendments 19, 22, 298 and 299 in the names of the noble Baronesses, Lady Harding and Lady Stowell, the noble Lord, Lord Clement-Jones, and myself. Others, too, have drawn the analogy between app stores and corner shops selling alcohol, and it makes sense to think about the distribution points in the system—the pinch points that all users go through—and to see whether there is a viable way of protecting people and regulating through those pinch points. The Bill seeks to protect us via the platforms that host and promote content having regulation imposed on them, and risk assessments and so on, but it makes a lot of sense to add app stores, given how we now consume the internet.

I remember, all those years ago, having CD drives—floppy disk drives, even—in computers, and going off to buy software from a retail store and having to install it. I do not go quite as far back as the right reverend Prelate the Bishop of Oxford, but I remember those days well. Nowadays as consumers almost all of us access our software through app stores, be it software for our phones or software for our laptops. That is the distribution point for mobiles and essentially it is, as others have said, a duopoly that we hope will be addressed by the Digital Markets, Competition and Consumers Bill.

As others have said, 50% of children under 10 in this country use smartphones and tablets. When you get to the 12 to 15 bracket, you find that 97% of them use mobile phones and tablets. We have, as noble Lords have also said, Google Family Link and the Apple Family Sharing function. That is something we use in my family. My stepdaughter is 11—she will be 12 in June—and I appear to be in most cases the regulator who has to give her the Family Link code to go on to Google Classroom when she does her homework, and who has to allow her to download an app or add another contact—there is a whole range of things on her phone for which I provide the gatekeeper function. But you have to be relatively technically competent and confident to do all those things, and to manage her screen time, and I would like to see more protection for those who do not have that confidence—and indeed for myself as well, because maybe I would not have to be bothered quite as often.

It is worth noting that the vast majority of children in this country who have smartphones—the last time I looked at the stats, it was around 80%—have iPhones; there must be a lot of old iPhones that have been recycled down the family. To have an iCloud account, if you are under 13, you have to go through a parent or other suitable adult. However, if you are over 13, you can get on with it; that raises a whole set of issues and potential harms for children over the age of 13.

I am less familiar with the user journey and how it works on Google Play—we are more of an Apple family—but my understanding is that, for both Google Play and the Apple App Store, in order to set up an account you need credit card billing information. That provides a degree of identity verification, and the assurance that many of us are looking for can then be provided as an additional safeguard for children. This is not something that anyone is arguing should replace the responsibilities set out in the Bill for internet service providers—for example, that they should carry out risk assessments and be regulated. This is about having additional safeguards at the point of distribution. We are not asking Apple and Google, in this case, to police the apps. We are asking them to ensure that the publishers of the applications set an age limit, and then to facilitate ensuring that that age limit is adhered to, according to everything that they know about the user of that device and their age. I am grateful to the noble Baroness, Lady Harding, for her amendments on this important issue.

Finally, let me say this in anticipation of the Minister perhaps suggesting that, while this might be a good idea, we are too far down the road with the Bill, Ofcom is ready to go and we want to get on with implementing it, so we should not do this now but perhaps in another piece of legislation. Personally, I am interested in having a conversation about the sequence of implementation. It might be that we can implement the regime that Ofcom is good to go on, but with the powers there in the Bill for it to cover app stores and some other wider internet services, according to a road map that it sets out and that we in Parliament can scrutinise. However, my general message is, as the noble Baroness, Lady Kidron, said, that we should get this right in this legislation and grab the opportunity, particularly with app stores, to bring other internet services in—given that we consume so much through applications—and to provide a safer environment for our children.

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport) 5:15, 25 April 2023

My Lords, I share noble Lords’ determination to deliver the strongest protections for children and to develop a robust and future-proofed regulatory regime. However, it will not be possible to solve every problem on the internet through this Bill, nor through any piece of legislation, flagship or otherwise. The Bill has been designed to confer duties on the services that pose the greatest risk of harm—user-to-user services and search services—and where there are proportionate measures that companies can take to protect their users.

As the noble Baroness, Lady Kidron, and others anticipated, I must say that these services act as a gateway for users to discover and access other online content through search results and links shared on social media. Conferring duties on these services will therefore significantly reduce the risk of users going on to access illegal or harmful content on non-regulated services, while keeping the scope of the Bill manageable and enforceable.

As noble Lords anticipated, there is also a practical consideration for Ofcom in all this. I know that many noble Lords are extremely keen to see this Bill implemented as swiftly as possible; so am I. However, as the noble Lord, Lord Allan, rightly pointed out, making major changes to the Bill’s scope at this stage would have significant implications for Ofcom’s implementation timelines. I say this at the outset because I want to make sure that noble Lords are aware of those implications as we look at these issues.

I turn first to Amendments 2, 3, 5, 92 and 193, tabled by the noble Baroness, Lady Kidron. These aim to expand the number of services covered by the Bill to incorporate a broader range of services accessed by children and a broader range of harms. I will cover the broader range of harms more fully in a separate debate when we come to Amendment 93, but I am very grateful to the noble Baroness for her constructive and detailed discussions on these issues over the past few weeks and months.

These amendments would bring new services into scope of the duties beyond user-to-user and search services. This could include services which enable or promote commercial harms, including consumer businesses such as online retailers. As I have just mentioned in relation to the previous amendments, bringing many more services into scope would delay the implementation of Ofcom’s priorities and risk detracting from its work overseeing existing regulated services where the greatest risk of harm exists—we are talking here about the services run by about 2.5 million businesses in the UK alone. I hope noble Lords will appreciate from the recent communications from Ofcom how challenging the implementation timelines already are, without adding further complication.

Amendment 92 seeks to change the child-user condition in the children’s access assessment to the test in the age-appropriate design code. The test in the Bill is already aligned with the test in that code, which determines whether a service is likely to be accessed by children, in order to ensure consistency for providers. The current child-user condition determines that a service is likely to be accessed by children where it has a significant number or proportion of child users, or where it is of a kind likely to attract a significant number or proportion of child users. This will already bring into scope services of the kind set out in this amendment, such as those which are designed or intended for use by children, or where children form a—

Baroness Kidron (Crossbench)

I am sorry to interrupt. Will the Minister take the opportunity to say what “significant” means, because that is not aligned with the ICO code, which has different criteria?

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantive” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

Lord Knight of Weymouth (Labour)

I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

They are when they form part of services which enable users to share content online and interact with each other, or in the context of search. In those contexts, the other duties set out in the Bill apply.

Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.

As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.

We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.

Lord Stevenson of Balmacara (Shadow Spokesperson, Science, Innovation and Technology)

The Minister just said something that was material to this debate. He said that Ofcom has existing powers to prevent app stores from supplying applications that would cause the problems the Bill is concerned with on the services to which they allow access. Can he confirm that?

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

Perhaps the noble Lord could clarify his question; I was too busy finishing my answer to the noble Lord, Lord Knight.

Lord Stevenson of Balmacara (Shadow Spokesperson, Science, Innovation and Technology)

It is a continuation of the point raised by the noble Baroness, Lady Harding, and it seems that it will go part of the way towards resolving the differences that remain between the Minister and the noble Baroness, which I hope can be bridged. Let me put it this way: is it the case that Ofcom either now has powers or will have powers, as a result of the Bill, to require app stores to stop supplying children with material that is deemed in breach of the law? That may be the basis for understanding how you can get through this. Is that right?

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

Services already have to comply with their duties to keep children safe. If they do not comply, Ofcom has the enforcement powers set out in the Bill, which allow it to require app stores to remove applications that are harmful to children. We think this already addresses the point, but I am happy to continue discussing it offline with the noble Lord, my noble friend and others who want to explore how. A more general duty here would risk distracting from Ofcom’s existing priorities.

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health)

My Lords, on that point, my reading of Clauses 131 to 135, where the Bill sets out the business disruption measures, is that they could be used precisely in that way. It would be helpful for the Minister responding later to clarify that Ofcom would use those business disruption measures, as the Government explicitly anticipate, were an app store, in a rogue way, to continue to list a service that Ofcom has said should not be made available to people in the United Kingdom.

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

I will be very happy to set that out in more detail.

Amendments 33A and 217A in the name of the noble Lord, Lord Storey, would place a new duty on user-to-user services that predominantly enable online gaming. Specifically, they would require them to have a classification certificate stating the age group for which they are suitable. We do not think that is necessary, given that there is already widespread voluntary uptake of age classification systems in online gaming.

The Government work closely with the industry and with the Video Standards Council to promote and encourage the displaying of Pan-European Games Information—PEGI—age ratings online. That has contributed to almost 3 million online games being given such ratings, including Roblox, mentioned by the right reverend Prelate the Bishop of Oxford. Most major online storefronts have made it mandatory for game developers supplying digital products on their platforms to obtain and display PEGI ratings. These include Google Play, Microsoft, PlayStation, Nintendo, Amazon Luna and Epic. Apple uses its own age ratings, rather than PEGI ratings, on all the video games available on its App Store.

Online games in the UK can obtain PEGI ratings by applying directly to the Video Standards Council or via the International Age Rating Coalition system, which provides ratings based on answers to a questionnaire when a game is uploaded. That system ensures that, with unprecedented volumes of online video games—and the noble Lord is right to point to the importance of our creative industries—all digital content across most major digital storefronts can carry a PEGI rating. These ratings are regularly reviewed by international regulators, including our own Video Standards Council, and adjusted within hours if found to be incorrect.

I hope that gives the noble Lord the reassurance that the points he is exploring through his amendments are covered. I invite him not to press them and, with a promise to continue discussions on the other amendments in this group, I invite their proposers to do the same.

Baroness Kidron (Crossbench) 5:30, 25 April 2023

I thank the Minister for an excellent debate; I will make two points. First, I think the Minister was perhaps answering on my original amendment, which I have narrowed considerably to services “likely to be accessed by children” and with proven harm, on the basis of the harms described by the Bill. It is an “and”, not an “or”, allowing Ofcom to go after places that have proven to be harmful.

Secondly, I am not sure the Government can have it both ways—that it is the same as the age-appropriate design code but different in these ways—because it is exactly in the ways that it is different that I am suggesting the Government might improve the Bill. We will come back to both those things.

Finally, what are we asking for here? We are asking for a risk assessment. The Government’s position amounts to saying: no risk assessment, no harm, no mitigation, nothing to do. This is a major principle of the conversations we will have going forward over a number of days. I also believe in proportionality. It is basic product safety: you have a look, you have standards, and if there is nothing to do, let us not make people do silly things. I think we will return to these issues, because they are clearly deeply felt and very practical, and my own feeling is that we cannot risk thousands of children not benefiting from all the work that Ofcom is going to do. With that, I beg leave to withdraw.

Amendment 2 withdrawn.

Amendment 3 not moved.