Amendment 192

Online Safety Bill – Committee (9th Day) (Continued) – in the House of Lords at 6:15 pm on 25 May 2023.



Moved by Baroness Morgan of Cotes

192: Schedule 11, page 216, line 30, after “service” insert “, including significant risk of harm,”

Member’s explanatory statement: There are some platforms which, whilst attracting small user numbers, are hubs for extreme hateful content and should be regulated as larger user-to-user services.

Baroness Morgan of Cotes (Conservative)

My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased because for the first time we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill for a while.

I am very sympathetic to the other amendments in this grouping: Amendments 192A and 194. I think there is a common theme running through them all, unsurprisingly, which I hope my noble friend the Minister will be able to address in his remarks. Some of these amendments come about because we do not know the exact categorisation of the services we are most concerned about in this House and beyond, and how that categorisation process is going to work and be kept under review. That is probably the reason behind this group of amendments.

As noble Lords will be aware, the Bill proposes two major categories of regulated company, category 1 and category 2, and there is another carve-out for search services. Much of the discussion about the Bill has focused on the regulatory requirements for category 1 companies, but—again, we have not seen the list—it is expected that the list of category 1 companies may number only a few dozen, while thousands and thousands of platforms and search engines may not meet that threshold. But some of those other platforms, while attracting small user numbers, are hubs for extremely hateful content. In a previous debate we heard about the vile racist abuse often aimed at particular groups. Some of these platforms are almost outside some of our own experiences. They are deliberately designed to host such hateful content and to try to remain under the radar, but they are undoubtedly deeply influential, particularly to those—often vulnerable—users who access them.

Platforms such as 8kun, 4chan and BitChute are perhaps becoming better known, whereas Odysee, Rumble and Minds remain somewhat obscure. There are numerous others, and all are easily accessible from anyone’s browser. What does the harm caused by these platforms look like? Some examples are in the public domain. For example, the mass shooting in Buffalo, in America, was carried out by a terrorist whose manifesto was inspired by 4chan’s board and who spoke of its influence on him. Later in this debate we are going to hear about specific content related to suicide, self-harm and eating disorders, which we have already debated in other contexts in these Committee proceedings.

The Center for Countering Digital Hate revealed that the four leading forums it analysed for incels—involuntary celibates—were filled with extreme hatred of women, glorification of violence and active discussion of paedophilia. On Gab, an “anti-Jewish meme repository”, grotesque anti-Semitic caricatures of Jews are shared from an account with an offensive name that seeks to deny the Holocaust. Holocaust denial material is similarly shared across BitChute, where it is also possible to find a video on the supposed

“Jewish Plan To Genocide The White Race” and, of course, 9/11 conspiracy theories. Meanwhile, on Odysee, other than discussion of the supposed “fake Holocaust” one can find discussion of the “Jewish problem”. On Minds, both President Zelensky and President Putin are condemned for having “kike”—an offensive term for Jews—inner circles, while other posts state that communism is Jewish control and the vessel to destroy our freedom.

The Government and many others know very well that these small, high-harm platforms are a problem. MPs in earlier debates on this Bill raised concerns repeatedly. The noble Lord, Lord Austin, raised this at Second Reading in your Lordships’ House and, nearly a year ago, the then Secretary of State issued a ministerial Statement indicating that, while the Government appreciated that small high-harm platforms do damage,

“more research is required before such platforms can be assigned to the category 1 designation for the online safety regime”.

This was despite Ofcom’s road map for online safety making it clear that it had already identified a number of small platforms that are clearly giving cause for concern.

So the case for action, as set out in my remarks and elsewhere, is proven. The Antisemitism Policy Trust has given evidence to the Joint Committee on the draft Bill and the Bill Committee in another place about this. The Community Security Trust, HOPE not hate and many others have data that demonstrates the level of hateful anti-Semitic and other racist and misogynistic abuse on these platforms. I know others will refer to the work of the Samaritans, the Mental Health Foundation and Beat in raising issues around suicide, self-harm and eating disorder content.

Extraordinarily, these are not platforms where this content is stumbled on or somehow hidden. They are set up deliberately to spread this content, to get people to look at it and to amplify this deeply harmful material. These sites act as feeders for hateful messages and activity on mainstream platforms, or as receptors for those directed away from those larger services to niche, hate-filled rabbit holes. We need to think about this as the Bill is implemented. Even as we hope that the larger platforms will take action and live up to the terms of service they say they have, without action this content will unfortunately migrate to smaller platforms, which will still be accessed and will still have impacts in the online and offline worlds. I hope my noble friend the Minister will say something about post-implementation in relation to these platforms.

Amendment 192 is a small, technical amendment. It does not compel Ofcom to add burdens to all small platforms but provides a specific recourse for the Secretary of State to consider the risks of harm as part of the process of categorisation. A small number of well-known, small high-harm sites would be required to add what will ultimately be minimal friction and other measures proportionate to their size. They will be required to deliver enhanced transparency. This can only be for the good, given that in some cases these sites are designed specifically to spread harm and radicalise users towards extreme and even terrorist behaviours.

The Government accept that there is a problem. Internet users broadly accept that there is a problem. It must be sensible, in deciding on categorisation, to look at the risk of harm caused by the platforms. I beg to move.

Lord Griffiths of Burry Port (Labour) 6:30 pm, 25 May 2023

My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.

We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The threshold conditions under which services will be determined to fall within category 1 are still to be set. We await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums, whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.

We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.

A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content is concentrated on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.

The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.

The Bill’s pre-legislative scrutiny committee recommended that the legislation

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.

The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.

This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.

This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.

Baroness Bull (Deputy Chairman of Committees)

My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.

The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review, which we have heard mentioned, confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm, and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.

I was going to say something about the suicide sites, but the point which comes out of the research has already been made: when a novel method of suicide becomes better known, it is not just that people intending to kill themselves switch from one method to another but that the prevalence of suicide increases. As the noble Lord said, this is not just about preventing individual tragedies; it is indeed a public health issue.

I very much welcome the steps being taken in the Bill to tackle the prevalence of damaging content, particularly as it applies to children. However, I believe that, as the Bill stands, smaller providers will fly under the radar and vulnerable adults will be harmed—the Bill is extremely light on protections for that category of people. Amendment 192A absolutely seeks to ensure that the Bill tackles content wherever it gives rise to a very high risk of harm, irrespective of the platform’s size. Arguments about regulatory burden on small sites should not apply when health, well-being and lives are at risk. The pre-legislative committee was absolutely alive to this, and its recommendations highlighted the risks here of the small, high-risk companies. As we heard, the previous Secretary of State announced a deferred power but that lapsed when the adult risk assessments were removed.

I fear that the current approach in the Bill will push people who promote this kind of content simply to create smaller platforms where they are beyond the arm of the law. It is not clear whether they would be caught instead by the Government’s new offence of encouraging or assisting serious self-harm. I know we have not debated that yet, but I cannot understand whether encouragement to starvation would be covered by that new offence. It is probably too early to ask the Minister to clarify that, but if he has the answer, I would like to understand it.

We have heard the term “rabbit hole”; there is indeed a rabbit hole, down which people intent on self-harm, or those who suffer from eating disorders, go from larger platforms to smaller and niche ones, where they encounter the very content that feeds their addiction or fuels and enables their desire to self-harm. As I said in a previous grouping, this cannot be the intention of the Bill, and I do not believe it is the intention of the Government. I hope that the Minister will listen to the arguments that the noble Baroness, Lady Morgan of Cotes, set out so effectively.

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health) 6:45 pm, 25 May 2023

My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.

We will be creating rules for categorising platforms. As I understand it, the rules will have a heavy emphasis on user numbers but will not be exclusively linked to user numbers. It would be helpful if the Minister could tease out a little more about how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules and to have a mechanism in place for that.

We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when services have high growth but still very little revenue and infrastructure in place to look after their users. This is a problem generally with stepped models, where you have these great jumps. In a sense, a sliding scale would be more rational, so that responsibilities increase over time, but clearly from a practical point of view that is hard to do, so we are going to end up with some kind of stepped model.

We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.

The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention to comply, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin. They will have no interest whatever. Some of the worst services will be like that. The advantage of us ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told they are categorised as 2B but given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely to work with them: services that are borderline but recognise that they want to have a viable business, and they do not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.

Finally, there is another group which is very mainstream but in the growth phase, busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into category 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to deliver, which is a safer internet for people in the UK.

Baroness Fox of Buckley (Non-affiliated)

I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than if you are large, you are harmful. The exception should be the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying this, but I stress again: do not assume that everybody agrees on what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.

I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. The fact that they are on 4chan, or wherever, is not necessarily the problem; it is much more complicated.

I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—

Lord Allan of Hallam (Liberal Democrat Lords Spokesperson, Health)

I just want to react on the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view. I hope it would be in the noble Baroness’s view as well.

Baroness Fox of Buckley (Non-affiliated)

I was suggesting that we have a bigger problem than such a manifesto appearing on a small site. It quotes from mainstream media, and it ends up being broadly disseminated, not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them; I was more making the point that it can be complicated. Would not the solution be that you can appeal for a small site to be treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.

I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is my only point.

Lord Clement-Jones (Liberal Democrat Lords Spokesperson, Science, Innovation and Technology)

My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.

As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.

We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.

We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say, “They would say that, wouldn’t they?”, but they were pretty convincing. We heard from HOPE not hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.

Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and the Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think hard about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is to the fore—the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in the Bill in how it deals with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.

Lord Stevenson of Balmacara (Shadow Spokesperson, Science, Innovation and Technology)

My Lords, this has been a good little debate with some excellent speeches, which I acknowledge. Like the noble Lord, Lord Clement-Jones, I was looking at the Joint Committee’s report. I concluded that one of the first big issues we discussed was how complicated the categorisation seemed in relation to the task that was being set for Ofcom. We comforted ourselves with the thought that, if you believe that this is basically a risk-assessment exercise and that all the work Ofcom will subsequently do is driven by its risk assessments and its constant reviewing of them, then the categorisation is bound to fall away, because the risks will reveal the things that need to happen.

As we have been through the process in our discussions in Committee, we keep coming across issues where proportionality comes around. The proportionality that I worry about is that which says, “If only a small number of people are affected by this, then obviously less needs to happen at Ofcom level”. We debated this earlier in relation to the amendments on children and seemed to come out in two different positions.

I believe that there should be zero tolerance of children accessing material which is illegal for them, but the Bill does not say that. It says that all Ofcom’s work has to be done in proportion to the impact, not only in the direct work of trying to mitigate harms or illegality that could occur but taking into account the economic size of the company and the impact that the work would have on its activities. I do not think we can square that off, so I appeal to the Minister, when he comes to respond, to look at it from the other end. Why is it not possible to have a structure which is driven by the risk? If the risk assessment reveals risks that require action, there should not be a constraint simply because the categorisation hurdle has not been met. The risk is what matters. Does he agree?

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport) 7:00 pm, 25 May 2023

I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially in a way which has given us a good debate on which to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.

As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.

I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen the category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following the removal of the “legal but harmful” provisions in another place, the Bill no longer includes the concept of risk of harm in category 1 designation. As we have set out, it would not be right for the Government to define what legal content it considers harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and to require them to carry out duties based on such a definition.

In addition, requiring all companies to comply with the full range of category 1 duties would place a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal content and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.

As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.

A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.

The Bill also limits the availability of such content by placing illegal content duties on search services, including harmful content which affects children or where this content is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.

Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.

Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.

The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than is proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.

I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.

Lord Stevenson of Balmacara (Shadow Spokesperson, Science, Innovation and Technology)

Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.

I am sure that the noble Baroness, Lady Morgan, will want to come in—but could he repeat that? Risk assessment drives us, but a company that will not be regarded as a category 1 provider because it does not meet the categorisation thresholds will not, even though it is higher risk than perhaps even some of the category 1 companies, be subject to the requirements to pick up the particular issues raised by the noble Baroness and the noble Lord. Their concerns about those issues, which are clearly social harms, will not really be considered on a par.

Lord Parkinson of Whitley Bay (Parliamentary Under-Secretary of State, Department for Culture, Media and Sport)

In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.

Baroness Morgan of Cotes (Conservative)

I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and no doubt will be again, and we note the impact of the absence of “legal but harmful” as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.

I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of a platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms and the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.

The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation, although we have not quite reached the end and I do not want to prejudge any further stages, in the sense that we are now thinking about how this would work. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.

Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future over a platform that is broadcasting or distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and of more detail on categorisation and on any review of the platforms categorised as category 1, category 2 and beyond, would be very helpful in due course. I beg leave to withdraw.

Amendment 192 withdrawn.

Amendments 192A and 193 not moved.

Schedule 11 agreed.

Clause 86 agreed.

Amendment 194 not moved.

Clauses 87 and 88 agreed.

Clause 89: OFCOM’s register of risks, and risk profiles, of Part 3 services

Amendments 194A to 197 not moved.

Clause 89 agreed.

Clause 90 agreed.

House resumed.