New Clause 1—Report on redress for individual complaints

Online Safety Bill – in the House of Commons at 5:23 pm on 17 January 2023.


‘(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under section 17 of this Act.

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services.

(3) The report must be laid before Parliament within six months of the commencement of section 17.’—(Alex Davies-Jones.)

Brought up, and read the First time.

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

With this it will be convenient to discuss the following:

New clause 2—Offence of failing to comply with a relevant duty—

‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—

(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or

(b) was a person purporting to act in such a capacity.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’

This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.

New clause 3—Child user empowerment duties—

‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.’

New clause 4—Safety duties protecting adults and society: minimum standards for terms of service—

‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).

(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.

(3) OFCOM must, at least once a year, conduct a review of—

(a) the extent to which providers are meeting the minimum standards, and

(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.

(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.

(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.

(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’

New clause 5—Harm to adults and society risk assessment duties—

‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).

(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep a harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.

(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;

(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;

(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.

(7) See also—

(a) section 19(2) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).’

New clause 6—Safety duties protecting adults and society—

‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.

(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).

(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content;

(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];

“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.

(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’

New clause 7—“Content that is harmful to adults and society” etc—

‘(1) This section applies for the purposes of this Part.

(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.

(3) “Content that is harmful to adults and society” means—

(a) priority content that is harmful to adults and society, or

(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.

(4) For the purposes of this section—

(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and

(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—

(i) the content’s potential financial impact,

(ii) the safety or quality of goods featured in the content, or

(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).

(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—

(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and

(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).

(6) Sections 55 and 56 contain further provision about regulations made under this section.’

Government amendments 1 to 4.

Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—

“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’

Amendment 82, page 10, line 25, at end insert—

‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’

This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.

Amendment 83, page 10, line 25, at end insert—

‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’

Amendment 84, page 10, line 25, at end insert—

‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’

Amendment 45, page 10, line 36, leave out paragraph (d) and insert—

‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.

Amendment 47, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to livestreaming features.”’

Amendment 46, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to private messaging features.”’

Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’

Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert

‘in operation by default for’.

Amendment 52, page 12, line 30, after ‘non-verified users’ insert

‘and to enable them to see whether another user is verified or non-verified.’

This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.

Amendment 49, page 12, line 30, at end insert—

‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’

Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.

This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.

Amendment 55, page 18, line 15, at end insert—

‘(4A) Content that is harmful to adults and society.’

Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—

‘(6) The following kinds of complaint are relevant for Category 1 services—

(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(i) section [adults and society online safety],

(ii) section 12 (user empowerment),

(iii) section 13 (content of democratic importance),

(iv) section 14 (news publisher content),

(v) section 15 (journalistic content), or

(vi) section 18(4), (6) or (7) (freedom of expression and privacy);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’

Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert

‘, 10 or [harm to adults and society risk assessment duties]’.

Amendment 58, page 22, line 37, at end insert—

‘(ba) section [adults and society online safety] (adults and society online safety),’

Government amendment 5.

Amendment 59, clause 44, page 44, line 11, at end insert

‘or

(ba) section [adults and society online safety] (adults and society online safety);’

Government amendment 6.

Amendment 60, clause 55, page 53, line 43, at end insert—

‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’

Amendment 61, page 53, line 45, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 62, page 54, line 8, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 63, page 54, line 9, leave out ‘are to children’ and insert

‘or adults are to children or adults and society’.

Government amendments 7 to 16.

Amendment 77, clause 94, page 85, line 42, after ‘10’ insert

‘, [Adults and society risk assessment duties]’.

Amendment 78, page 85, line 44, at end insert—

‘(iiia) section [Adults and society online safety] (adults and society online safety);’

Amendment 54, clause 119, page 102, line 22, at end insert—

‘Section [Safety duties protecting adults and society: minimum standards for terms of service]—Minimum standards for terms of service’

Amendment 79, page 102, line 22, at end insert—

‘Section [Harm to adults and society assessments]—Harm to adults and society risk assessments
Section [Adults and society online safety]—Adults and society online safety’

Government amendments 17 to 19.

Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.

Government amendments 20 to 23.

Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert

‘, [“Content that is harmful to adults and society” etc] and 55’.

Government amendments 24 to 42.

Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert

‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’

Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 70, page 208, line 2, leave out

‘or content that is harmful to children’ and insert

‘content that is harmful to children or priority content that is harmful to adults and society’.

Amendment 71, page 208, line 10, leave out

‘and content that is harmful to children’ and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 72, page 208, line 13, leave out

‘and content that is harmful to children’ and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 73, page 210, line 2, at end insert

‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 50, schedule 11, page 217, line 31, at end insert—

‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’

Amendment 74, page 218, line 24, leave out

‘and content that is harmful to children’ and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 75, page 219, line 6, leave out

‘and content that is harmful to children’ and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 76, page 221, line 24, at end insert—

‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 80, page 240, line 35, in schedule 17, at end insert—

‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

Once again, it is a privilege to be back in the Chamber opening this debate—the third Report stage debate in recent months—of this incredibly important and urgently needed piece of legislation. I speak on behalf of colleagues across the House when I say that the Bill is in a much worse position than when it was first introduced. Even so, it is vital that it is now able to progress to the other place. Although we are all pleased to see the Bill return today, the Government’s delays have been incredibly costly and we still have a long way to go until we see meaningful change for the better.

In December, during the last Report stage debate, we had the immense privilege to be joined in the Public Gallery by a number of the families who have all lost children in connection with online harms. It is these families whom we must keep in our mind when we seek to get the Bill over the line once and for all. As ever, I pay tribute to their incredible efforts in the most difficult of all circumstances.

Today’s debate is also very timely in that, earlier today, the End Violence Against Women and Girls coalition and Glitch, a charity committed to ending online abuse, handed in their petition, which calls on the Prime Minister to protect women and girls online. The petition has amassed more than 90,000 signatures, and that number is still rising, so we know there is strong support for improving internet safety across the board. I commend all those involved on their fantastic efforts in raising this important issue.

It would be remiss of me not to make a brief comment on the Government’s last-minute U-turns in their stance on criminal sanctions. The fact that we are seeing amendments withdrawn at the last minute goes to show that this Government have absolutely no idea where they truly stand on these issues and that they are ultimately too weak to stand up against vested interests, whereas Labour is on the side of the public and has consistently put safety at the forefront throughout the Bill’s passage.

More broadly, I made Labour’s feelings about the Government’s highly unusual decision to send part of this Bill back to Committee a second time very clear during the previous debate. I will spare colleagues by not repeating those frustrations here, but let me be clear: it is absolutely wrong that the Government chose to remove safety provisions relating to “legal but harmful” content in Committee. That is a major weakening, not strengthening, of the Bill; everyone online, including users and consumers, will be worse off without those provisions.

The Government’s alternative proposal, to introduce a toggle to filter out harmful content, is unworkable. Replacing the sections of this Bill that could have gone some way towards preventing harm with an emphasis on free speech instead undermines the very purpose of the Bill. It will embolden abusers, covid deniers, hoaxers and others, who will feel encouraged to thrive online.

In Committee, the Government also chose to remove important clauses from the Bill that were in place to keep adults safe online. Without the all-important risk assessments for adults, I must press the Minister on an important point: exactly how will this Bill do anything to keep adults safe online? The Government know all that, but have still pursued a course of action that will see the Bill watered down entirely.

Kim Leadbeater, Labour, Batley and Spen

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend Lucy Powell. This important addition to the Bill will go some way towards addressing the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available to individuals who wish to appeal the outcome of complaints. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.

We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.

Andrew Gwynne, Shadow Minister (Health and Social Care)

My hon. Friend is making an important point. She might not be aware of it, but I recently raised in the House the case of my constituents, whose 11-year-old daughter was groomed on the music streaming platform Spotify and was able to upload explicit photographs of herself there. Thankfully, her parents found out and made several complaints to Spotify, which did not immediately remove that content. Is that not why we need the ombudsman?

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which is proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.

Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.

We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.

During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement is now ultimately a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another one to add to a long list, but to be frank, very little surprises me when it comes to this Government’s approach to law-making.

John Hayes, Conservative, South Holland and The Deepings

I have to say to the hon. Lady that to describe it as a U-turn is not reasonable. The Government have interacted regularly with those who, like her, want to strengthen the Bill. There has been proper engagement and constructive conversation, and the Government have been persuaded by those who have made a similar case to the one she is making now. I think that warrants credit, rather than criticism.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I completely disagree with the right hon. Member, because we voted on this exact amendment before Christmas in the previous Report stage. It was tabled in the name of my right hon. Friend Dame Margaret Hodge, and it was turned down. It was word for word exactly the same amendment. If that is not a U-turn, what is it?

I am pleased to support a number of important amendments in the names of the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I draw colleagues’ attention to new clause 3, which would improve the child empowerment duties in the Bill. The Government may think they are talking a good game on child safety, but it is clear to us all that some alarming gaps remain. The new clause would go some way to ensuring that the systems and processes behind platforms will go further in keeping children safe online.

In addition, we are pleased, as I have mentioned, to support amendment 43, which calls for the so-called safety toggle feature to be turned on by default. When the Government removed the clause relating to legal but harmful content in Committee, they instead introduced a requirement for platforms to give users the tools to reduce the likelihood of certain content appearing on their feeds. We have serious concerns about whether this approach is even workable, but if it is the route that the Government wish to take, we feel that these tools should at least be turned on by default.

Debbie Abrahams, Labour, Oldham East and Saddleworth

Since my hon. Friend is on the point of safeguarding children, will she support Baroness Kidron as the Bill progresses to the other House in ensuring that coroners have access to data where they suspect that social media may have played a part in the death of children?

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I can confirm that we will be supporting Baroness Kidron in her efforts. We will support a number of amendments that will be tabled in the Lords in the hope of strengthening this Bill further, because we have reached the limit of what we can do in this place. I commend the work that Baroness Kidron and the 5Rights Foundation have been doing to support children and to make this Bill work to keep everybody online as safe as possible.

Supporting amendment 43 would send a strong signal that our Government want to put online safety at the forefront of all our experiences when using the internet. For that reason, I look forward to the Minister giving this amendment serious consideration. Scottish National party colleagues can be assured of our support, as I have previously outlined, should there be a vote on that.

More broadly, I highlight the series of amendments tabled in my name and that of my hon. Friend the Member for Manchester Central that ultimately aim to reverse out of the damaging avenue that the Government have chosen to go down in regulating so-called legal but harmful content. As I have already mentioned, the Government haphazardly chose to remove those important clauses in Committee. They have chopped and changed this Bill more times than any of us can remember, and we are now left with a piece of legislation that is even more difficult to follow and, importantly, implement than when it was first introduced. We can all recognise that there is a huge amount of work to be done in making the Bill fit for purpose. Labour has repeatedly worked to make meaningful improvements at every opportunity, and it will be on the Government’s hands if the Bill is subject to even more delay. The Minister knows that, and I sincerely hope that he will take these concerns seriously. After all, if he will not listen to me, he would do well to listen to the mounting concerns raised by Members on his own Benches instead.

Several hon. Members rose—

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

I have noticed that some people are standing who may not have applied earlier. If anybody is aware of that, can they let me know, and I can adjust timings accordingly? At the moment, my estimate is that if everybody takes no longer than seven minutes, and perhaps more like six, we can get everybody in comfortably without having to impose a time limit.

Priti Patel, Conservative, Witham

I rise to speak to new clause 2 on the offence of failing to comply with a relevant duty. I pay tribute to my right hon. and hon. Friends who have championed new clause 2 to strengthen protections for children by introducing criminal liability for senior managers.

We have discussed this issue already in this Chamber. I thank charities and campaigners such as the National Society for the Prevention of Cruelty to Children for raising awareness and for being constructive and assiduous. I also thank the families who, through voicing their own pain and suffering, have given impetus to this issue. I thank those on the Front Bench; it is fair to say that I have had constructive dialogue with the Minister and the Secretary of State. They listened to our concerns and accepted that this issue had to be addressed.

As we debate this new clause and other aspects of the Bill, we should begin as we did last time by thinking of those who face tragedy and distress as a result of accessing inappropriate content online. Children and vulnerable people have been failed by tech companies and regulation. We have the duty and responsibility to step up and tighten the law, and protect children from online harms, exploitation and inappropriate content. That must be at the heart of this legislation—not just this Bill, but the legislation that follows. Throughout the various debates, and at Committee stage, we have touched on the fact that technology is evolving and changing constantly. As it does, we must keep building on the insights we gain.

New clause 2 does simple and straightforward things. It makes senior managers liable and open to being prosecuted for failing to proactively promote and support the safety duties in clause 11. As it stands, the Bill’s criminal liability provisions fall short of what is expected or required. Criminal liability for failing to comply with an information notice from Ofcom is welcome. Ofcom has a very important role to play—I do not need to emphasise that any more. But the Bill does not go far enough, and Ministers have recognised that. We must ensure that all the flaws and failings are sanctionable and that the laws are changed in the right way. It is not just about the laws for the Government Department leading the Bill; it cuts across other Government Departments. We have touched on that many times before.

More than 80% of the public agree that senior tech managers should be held legally responsible for preventing harm to children on social media. That is a statement of the obvious, as we have seen such abhorrent and appalling harms take place. Around two thirds want managers to be prosecuted when failures result in serious harm. But harm can happen prior to an information notice being issued by Ofcom—again, we have discussed that.

The public need assurances that these companies will have the frameworks and safeguards to act responsibly and be held to account so that children and vulnerable individuals are protected. That means meaningful action, not warm words. We should be proactive in developing software, algorithms and technology that are responsive. We must ensure that measures are put in place to hold people to account, and that sanctions cover company law, accountability, health and safety and other areas. Ireland has been mentioned throughout the passage of this Bill, and that is important; my colleagues who will speak shortly have also touched on its similar provisions.

It is right that we put these measures in the Bill to address the serious failures to protect children. This is a topical issue. In fact, a number of colleagues met tech companies and techUK yesterday, as did I. We have an opportunity to raise the bar in the United Kingdom so that technology investment still comes forward and the sector continues to grow and flourish in the right way and for the right reasons. We want to see that.

Jamie Stone, Liberal Democrat Spokesperson (Armed Forces), Liberal Democrat Spokesperson (Digital, Culture, Media and Sport) 6:00, 17 January 2023

The issues of evolving technology and holding people to account are hugely important. May I make the general point that digital education could underpin all those safeguards? The teaching of digital literacy should be conducted in parallel with all the other good efforts made across our schools.

Priti Patel, Conservative, Witham

The hon. Member is absolutely right, and I do not think anyone in the House would disagree with that. We have to carry on learning in life, and that links to technology and other issues. That applies to all of us across the board, and we need people in positions of authority to ensure that the right kind of information is shared, to protect our young people.

I look forward to hearing from the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend Paul Scully, who has been so good in engaging on this issue, and I thank him for the proactive way in which he has spent time with all of us. Will we see the Government’s amendment prior to the Bill going to the other place for its Second Reading there? It is vital for all colleagues who support new clause 2 to have clear assurances that the provisions we support, which could have passed through this House, will not be diluted in the other place by Ministers. Furthermore—we should discuss this today—what steps are the Government and Ofcom taking to secure the agreement of tech companies to work to ensure that senior managers are committed and proactive in meeting their duties under clause 11?

I recognise that a lot of things will flow through secondary legislation, but on top of that, engagement with tech companies is vital, so that they can prepare, be ready and know what duties will be upon them. We also need to know what further guidance and regulation will come forward to secure the delivery of clause 11 duties and hold tech companies to account.

In the interests of time, I will shorten my remarks. I trust and hope that Ministers will give those details. It is important to give those assurances before the Bill moves to the House of Lords. We need to know that those protections will not be diluted. This is such a sensitive issue. We have come a long way, and that is thanks to colleagues on both sides of the House. It is important that we get the right outcomes, because all of us want to make sure that children are protected from the dreadful harms that we have seen online.

Margaret Hodge, Labour, Barking

This is a really important piece of legislation. As my hon. Friend Alex Davies-Jones said, it has taken far too long to get to this point. The Bill has been considered in a painstaking way by Members across the House. While today’s announcement that senior manager and director liability will be introduced is most welcome, the recent decisions to strip out vast chunks of the Bill—clauses that would have contributed to making the online world a safe place for us all—represent a tragic opportunity missed by the Government, and it will fall to a Labour Government to put things right. I know from the assurances given by those on our Front Bench that they will do just that.

I do not want to spend too much time on it, but in discussing the removal of provisions on “legal but harmful” content, I have to talk a little bit about the Jewish community. The hope that the Online Safety Bill would give us some respite from the torrent of antisemitic abuse that some of us have been subjected to has been thwarted. The Centre for Countering Digital Hate has conducted research in this area, and it found that nine out of 10 antisemitic posts on Facebook and Twitter stay there, despite requests to have them removed. Its analysis of 714 posts containing anti-Jewish hate found that they were viewed by more than 7.3 million people across the platforms, and that 80% of posts containing holocaust denial and 70% of those identified as neo-Nazi were not acted on, although they were in breach of the rules set by the platforms. People like me are left with a sense of bitterness that our suffering has to be tolerated because of some ideological, misplaced, flawed and ill-thought-out interpretation of freedom of speech.

I turn to new clause 2, tabled by Sir William Cash and Miriam Cates. I congratulate them on the work they have done in bringing this forward. I think they will probably agree with me that this issue should never have divided us as it did before Christmas, when I tabled a similar amendment. It is not a party political issue; it is a common-sense measure that best serves the national interest and will make online a safer place for children. I am pleased that the hon. Members for Stone and for Penistone and Stocksbridge have persuaded their colleagues of the justification and that the Government have listened to them—I am only sorry that I was not as successful.

This is an important measure. The business model that platforms operate encourages, not just passively but actively, the flourishing of abusive content online. They do not just fail to remove that content, but actively promote its inclusion through the algorithms that they employ. Sadly, people get a kick out of reading hateful, harmful and abusive content online, as the platform companies and their senior managers know. It is in their interest to encourage maximum traffic on their platforms, and if that means letting people post and see vile abuse, they will. The greater the traffic on such sites, the more attractive they become to advertisers and the more advertisers are willing to pay for the ads that they post on the sites. The platforms make money out of online abuse.

Originally, the Government wanted to deal with the problem by fining the companies, but companies would simply treat such fines as a cost to their business. It would not change their model or the platforms’ behaviour, although it might add to the charges for those who want to advertise on the platforms. Furthermore, we know that senior directors, owners and managers personally take decisions about the content that they allow to appear on their platforms and that their approach affects what people post.

Elon Musk’s controversial and aggressive takeover of Twitter, where he labelled the sensible moderation of content as a violation of freedom of speech, led to a 500% increase in the use of the N-word within 12 hours of his acquisition. Telegram, whose CEO is Pavel Durov, has become the app of choice of terror networks such as ISIS, according to research conducted by the Middle East Media Research Institute. When challenged about that, however, Durov refused to act on the intelligence to moderate content and said:

“You cannot make messaging technology secure for everybody except for terrorists.”

If senior managers have responsibility for the content on their platforms, they must be held to account, because we know that doing so will make online businesses safer places for our children.

We have to decide whose side we are on. Are we really putting our children’s wellbeing first, or are we putting the platforms’ interest first? Of course, everybody will claim that we are putting children’s interests first, but if we are, we have to put our money where our mouth is, which involves making the managers truly accountable for what appears on their platforms. We know that legislating for director liability works, because it has worked for health and safety on construction sites, in the Bribery Act 2010 and on tax evasion. I hope to move similar amendments when we consider the Economic Crime and Corporate Transparency Bill on Report next week.

This is not simply a punitive measure—in fact, the last thing we want to do is lock up a lot of platform owners—but a tool to transform behaviour. We will not be locking up the tech giants, but we will be ensuring that they moderate their content. Achieving this change shows the House truly working at its best, cross-party, and focusing on the merits of the argument rather than playing party politics with such a serious issue. I commend new clause 2 to the House.

Several hon. Members rose—

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

I remind hon. Members about the six-minute advisory time limit.

Caroline Dinenage, Conservative, Gosport

It is a great relief to see the Online Safety Bill finally reach this stage. It seems like a long time since my right hon. and learned Friend Sir Jeremy Wright kicked it off with the ambitious aim of making the UK the safest place in the world to be online. Although other countries around the world had picked at the edges of it, we were truly the first country in the world to set out comprehensive online safety legislation. Since then, other jurisdictions have started and, in some cases, concluded this work. As one of the relay of Ministers who have carried this particular baton of legislation on its very long journey, I know we are tantalisingly close to getting to the finish line. That is why we need to focus on that today, and I am really grateful to Alex Davies-Jones for confirming that the Opposition are going to support the Bill on Third Reading.

We know that the internet is magnificent and life-changing in so many ways, but its dark corners remain a serious concern, particularly with regard to children but also to scores of other vulnerable people. Of course, the priorities of this Bill must be to protect children, to root out illegal content, and to hold the online platforms to account and ensure they are actually doing what they say they are doing when it comes to the dangerous content on their sites. I warmly welcome the Minister’s and the Secretary of State’s engagement on these particular aspects of the Bill and how hard they have worked to strengthen it.

This legislation is so vital for our children. The National Society for the Prevention of Cruelty to Children has estimated that more than 21,000 online child sex crimes have been recorded by the police just in the time this legislation has been delayed since last summer.

Richard Graham, Conservative, Gloucester 6:15, 17 January 2023

Does my hon. Friend agree that the new crime of cyber-flashing is one instance of how this Bill has been improved? It should also help to reduce some of the violence against women and girls, which is a major issue of our time.

Caroline Dinenage, Conservative, Gosport

My hon. Friend is absolutely right to raise this, because we do need the Bill to be future-proofed to deal with some of the recently emerging threats to women and others that the online world has offered.

The potential threat of online harms is part of everyday life for most children in the modern world. Before Christmas, I received an email from my son’s school highlighting a TikTok challenge encouraging children to strangle each other until they passed out. This challenge probably did not start on TikTok, and it certainly is not exclusive to the platform, but when my children were born I never envisaged a day when I would have to sit them down and warn them about the potential dangers of allowing someone else to throttle them until they passed out. It is terrifying. Our children need this legislation.

I welcome the Government support for amendment 84 to clause 11, in the name of my hon. Friend Alicia Kearns, to ban content that advertises so-called conversion therapies for LGBTQ+ people. Someone’s sexuality and who they love is not something to be cured, and unscrupulous crooks should not be able to profit from pushing young people towards potentially sinister and harmful treatments.

I really sympathise with the aims behind new clause 2, on senior executive liability. It is vital that this regime has the teeth to protect children and hold companies to account. I know the maximum fine of 10% of annual global turnover is higher than some of the global comparators, and certainly having clear personal consequences for those responsible for enforcing the law is an incentive for them to do it properly, but there is clearly a balance to strike. We must make sure that sanctions are proportionate and targeted, and do not make the UK a less attractive place to build a digital business. I am really pleased to hear Ministers’ commitment to a final amendment that will strike that really important balance.

I am concerned about the removal of the measures on legal but harmful content. I understand the complexity of defining such content, but the other measures, including the so-called triple shield, do not offer the same protections for vulnerable adults or avoid the cliff edge when someone reaches the age of 18. That particularly concerns me for adults with special educational needs or disabilities. The key point here is that, if the tragic cases of Molly Russell and dozens of young people like her teach us anything, it is that dreadful, harmful online content cannot be defined strictly by what is illegal, because algorithms do not differentiate between harmful and harmless content. They see a pattern and they exploit it.

We often talk about the parallels between the online and offline world—we say that what is illegal online should be illegal offline, and vice versa—but in reality the two worlds are fundamentally different. In the real world, for a young person struggling with an eating disorder or at risk of radicalisation, their inner demons are not reinforced by everyone they meet on the street, but algorithms are echo chambers. They take our fears and our paranoia, and they surround us with unhealthy voices that normalise and validate them, however dangerous and however hateful, glamorising eating disorders, accelerating extremist, racist and antisemitic views and encouraging violent misogyny on incel sites.

That is why I worry that the opt-out option suggested in the Bill simply does not offer enough protection: the lines between what is legal and illegal are too opaque. Sadly, it feels as though this part of the Bill has become the lightning rod for those who think it will result in an overly censorious approach. However, we are where we are. As the Molly Rose Foundation said, the swift implementation of the Bill must now be the priority. Time is no longer on our side, and while we perfect this vast, complicated and inherently imperfect legislation, the most unspeakable content is allowed to proliferate in the online world every single day.

Finally, I put on record the exhaustive efforts made by the incredible team at the Department for Digital, Culture, Media and Sport and the Home Office, who brought this Bill to fruition. If there was ever an example of not letting the perfect be the enemy of the good, this is it, and right now we need to get this done. The stakes in human terms simply could not be any higher.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

I congratulate Dame Caroline Dinenage on what was one of the best speeches on this Bill—and we have heard quite a lot. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. Dame Margaret Hodge covered nicely the amendment on liability, and brought up the issue of hate, particularly when pointed towards the Jewish community. I thank her for consistently bringing that up. It is important to hear her voice and others on this issue.

Amendment 43, tabled by me and my hon. Friend John Nicolson, concerns a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default option. We should start from the position that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.

Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.

If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.

Amendment 44 is on habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend time on them. They make money by encouraging people to spend as much time on them as possible—that is the intention behind them. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider that important issue.

New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.

Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.

We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.

I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that there are moves by the UK Government to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that.

Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key thing, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, set up entirely to dox trans people and put their lives at risk—as category 1 platforms and require them to meet all the rules and risk assessments that apply to those platforms.

We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.

I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.

Jeremy Wright (Conservative, Kenilworth and Southam)

I want to say in passing that I support amendments 52 and 53, which stand in the name of my hon. Friend Siobhan Baillie and others. She will explain them fully so I do not need to, but they seem to be sensible clarifications that I hope the Government will consider favourably.

I want to focus on new clause 2. I have said before, and am happy to repeat it, that the individual criminal liability provided for in the Bill as it stands is too limited. Attaching it to information offences only means that, in effect, very bad behaviour cannot be penalised under the criminal law as long as the perpetrator is prepared to provide Ofcom with information about it. That cannot be sensible so there is a strong case for extending criminal liability, but new clause 2 goes too far. There are, fundamentally, two problems with new clause 2.

First, new clause 2 is drafted too broadly. It would potentially criminalise any breach of a safety duty under clause 11, the clause relating to children. We all, of course, think that keeping children safer online is a core mission of the Bill. I hope Ministers will consider favourably various other amendments that might achieve that, including the amendments in the name of the noble Baroness Kidron, which Alex Davies-Jones mentioned earlier, in relation to coroners and all services likely to be accessed by children. Clause 11 covers a variety of different duties, including duties to incorporate certain provisions in terms of service and to ensure that terms of service are clear and accessible. Those are important duties no doubt, but I am not convinced that any and all failures to fulfil them should result in criminal prosecution.

Bill Cash, Chair, European Scrutiny Committee 6:30, 17 January 2023

I thought I might mention to my right hon. and learned Friend that the written ministerial statement, which is now available to the public, makes it clear that useful and constructive discussions have taken place. Much of what he is saying is not necessarily applicable to the state of affairs we are now faced with.

Jeremy Wright (Conservative, Kenilworth and Southam)

I am grateful to my hon. Friend and I will come on to the written statement. I accept what he says. I think we are heading in the right direction, but since new clause 2 is before us at the moment, it seemed to me that I ought to address it, I hope in a helpful way.

There is nothing in the language of new clause 2 as it stands that requires a breach of the duties to be serious or even more than minimal. We should be more discriminating than that.

The second difficulty with new clause 2, which I hope the Government will pick up when they look at it again, is with prosecuting successfully the sorts of offences we may create. The more substantive and fundamental child safety duties in clause 11, which are to

“mitigate and manage the risks of harm” and to prevent children encountering harmful content, are expressed in terms of the use of “proportionate measures” or “proportionate systems and processes”. The word “proportionate” is important and describes the need for balanced judgments to be made, including by taking into account freedom of expression and privacy as required by clause 11 itself. Aside from the challenges of obtaining evidence of what individual managers knew, said or did, those balanced judgments could be very difficult for a prosecutor to assess and to demonstrate to a criminal court, to the required standard of proof, were deliberately or negligently wrong.

The consequences of that difficulty could either be that it becomes apparent that the cases are very hard to prosecute, and therefore criminal liability is not the deterrent we hoped for, or that wide criminal liability causes the sort of risk aversion and excessive take-down of material that I know worries my hon. Friend Sir William Cash and others who support new clause 2. We therefore need to calibrate criminal liability appropriately.

It is also worth saying that if we are to pursue an extension of criminal liability, I am not sure that I see the logic of limiting that further criminal liability only to breaches of the child safety duties; I can envisage some breaches of safety duties in relation to illegal content that may also be deserving of such liability.

That leads me on to consider, as has been said, exactly how we might extend criminal liability differently. I appreciate that the Government will now be doing just that. Perhaps they can consider doing so in relation to serious or persistent breaches of the safety duties, rather than in relation to all breaches of safety duties.

Alternatively, or additionally, they could look at individual criminal liability for a failure to comply with a confirmed notice of contravention from Ofcom. I welcome the direction of travel set out in the written ministerial statement, which suggests that that is where the Government may go. As the statement says, the recent Irish legislation that has been prayed in aid does something very similar, and it is an approach with several advantages: it is easier to prove, we will know whether Ofcom has issued a notice requiring action to remedy a deficient approach to the safety duties, and we will know whether Ofcom believes that it has not been responded to adequately.

As we design a new system of regulation in this new era of regulation, we should want open conversations to take place between the regulator and the regulated as to how best to counter harms. Anything that discourages platforms and their directors from doing so may make the system we are designing work less well in promoting safety online. The approach that I think the Government will now consider is unlikely to do that.

Let me say one final thing. As my hon. Friend Dame Caroline Dinenage said, I have been involved in the progress of this Bill almost from the start, and I am delighted to see present my right hon. Friend Mrs May, at whose instruction I started doing it. It has been tortuous progress, no doubt—to some extent that was inevitable because of the difficulty of the Bill and the territory in which we seek to legislate—but Kirsty Blackman, who speaks for the SNP and for whom I have a good deal of respect, was probably a little grudging in suggesting that as it stands the Bill does only slightly better than the status quo. It does a lot more than that.

If we send the Bill to the other place this evening, as I hope we do, and if the other place considers it again with some thoroughness and seeks to improve it further, as I know it will, we will make the internet not a safe place—I do not believe that is achievable—but a significantly safer place. If we can do that, it will be the most important thing that most of us in this place have ever done.

Several hon. Members:

rose—

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

Order. Things are not going quite according to plan, so colleagues might perhaps like to gear more towards five minutes as we move forward.

Luke Pollard, Shadow Minister (Defence)

I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.

Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, Priti Patel, visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.

In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend Dame Margaret Hodge spoke about incentivises hate to be viewed, shared and indulged in.

This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.

We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.

I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase the friction to reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes anti-Semitic, racist, homophobic and transphobic items layered on top of one another. However, without a “legal but harmful” provision, the Bill does nothing to force search engines to downrate harmful content. If it is to be online, it needs to be harder and harder to find.

I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work, because it could be evaded with a simple Google Chrome extension that auto-toggles, making it almost redundant immediately. It would be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.

However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.

Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Center for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users’ posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—though not pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

My right hon. and learned Friend Sir Jeremy Wright made an excellent speech about new clause 2, a clause with which I had some sympathy. Indeed, the Joint Committee that I chaired proposed that there should be criminal liability for failure to meet the safety duties set out in the Bill, and that that should apply not just to child safety measures, but to any such failure.

However, I agree with my right hon. and learned Friend that, as drafted, the new clause is too wide. If it is saying that the liability exists when the failure to meet the duties has occurred, who will determine that? Will it be determined when Ofcom has issued a notice, or when it has issued a fine? Will it be determined when guidance has been given and has not been followed? What we do not want to see is a parallel judicial system in which decisions are made that differ from those of the regulator on when the safety duties have not been met.

I think it is when there are persistent breaches of the safety duties, when companies have probably already been fined and issued with guidance, and when it has been demonstrated that they are clearly in breach of the codes of practice and are refusing to abide by them, that the criminal liability should come in. Similar provisions already exist in the GDPR legislation for companies that are in persistent breach of their duties and obligations. The Joint Committee recommended that this should be included in the Bill, and throughout the journey of this legislation the inclusion of criminal liability has been consistently strengthened. When the draft Bill was published there was no immediate commencement of any criminal liability, even for not complying with the information notices given by Ofcom, but that was included when the Bill was presented for Second Reading. I am pleased that the Government are now going to consider how we can correctly define what a failure to meet the safety duties would be and therefore what the committal sanction that goes with it would be. That would be an important measure for companies that are in serial breach of their duties and obligations and have no desire to comply.

This issue is also relevant and linked to the wider debate around legal but harmful that we have had today and in the recommittal Committee, because if we are going to have criminal sanctions for non-compliance, we need to be really clear what companies are supposed to do. It needs to be really clear to them what they have to do as well. That is why, when the Joint Committee produced its report, we recommended that the legal but harmful provisions in the Bill should be changed. They do not do what many people in the House have asserted they do, which is to set standards and requirements for companies to remove legal content. They were never there to do that. They provided risk assessment for a wider range of content, and that may have been helpful, but they did not require the removal of content that was neither a breach of the community standards of the platform nor a breach of the legal threshold.

The changes to the Bill help in some ways with the idea of having criminal liability because written on to the face of the Bill are the offences that are within scope, what the companies have to do and also the requirement to enforce their own terms of service where the safety standards are defined not by law but by the platform. Safety standards are important, and there is sometimes a danger in this debate that we pretend they do not really exist. That is understandable, because companies are not very good at enforcing them. They are not very good at doing the things they say they will do. As a former board member of the Centre for Countering Digital Hate, I am pleased to hear that organisation being cited so often in the debate. Its chief executive gave evidence to the Joint Committee, in which he said that if there was one thing it could do, it would be to ensure that companies enforced their own terms of service. He said that if there were a legal power to make them do that, many of the problems we are discussing would go away. That is a very important sanction.

On the point around smaller platforms, in reality Ofcom has the power to enforce safety standards at the level set in the Bill on any platform of any size. On the question of smaller platforms being out of scope, additional enforcement can be based only on a platform’s own terms of service, and platforms like that are likely to have very weak or practically non-existent terms of service. That is why having the minimum safety standards based in law is so important.

With regard to advertising, my hon. Friend and constituency neighbour Mrs Elphicke has an amendment relating to immigration offences that are promoted through advertising. The additional amendment that the Government are accepting, banning the advertising of conversion therapy, is also important.

Theresa May (Conservative, Maidenhead) 6:45, 17 January 2023

My hon. Friend has referenced the proposals from my hon. Friend Mrs Elphicke. I am grateful to the Minister and the Secretary of State for the discussions they have had with me on making modern slavery a specific priority offence, as well as illegal immigration. I think this is very important.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I agree with my right hon. Friend; that is exactly right, and it is also right that we look at including additional offences on the face of the Bill in schedule 7 as offences that will be considered as part of the legislation.

Where this touches on advertising, the Government have already accepted, following the recommendation of the Joint Committee, that the promotion of fraud should be regulated in the Bill, even if it is in advertising. There are other aspects of this, too, including modern slavery and immigration, where we need to move at pace to close the loophole created by leaving advertising to be considered outside the Bill through the online advertising review. The principle has already been accepted that illegal activity promoted through an advert on an online platform should be regulated just as if it were an organic posting. That general provision does not yet exist, however. Given that the Government have accepted these additional amendments, which was the right thing to do, they also need to look at the general presumption that any illegal activity that is a breach of the safety duties should be included and regulated, and that including it in an advert should not exempt it when it would be regulated in an organic posting.

Matt Rodda, Shadow Minister (Work and Pensions) (Pensions)

I would like to focus on new clause 1, dealing with redress, amendment 43, dealing with the toggle default, and new clause 4, on minimum standards. This Bill is a very important piece of legislation, but I am afraid that it has been seriously watered down by the Government. In particular, it has been seriously weakened by the removal of measures to tackle legal but harmful content. I acknowledge that some progress has been made recently, now that the Government have accepted the need for criminal sanctions for senior managers of tech companies. However, there are still many gaps in the Bill, and I want to deal with some of them in the time available to me tonight.

First, I pay tribute to the families who have lost children due to issues related to social media. Some of those families are in the Public Gallery tonight. In particular, I want to mention the Stephens family from my Reading East constituency. Thirteen-year-old Olly Stephens was murdered in an horrific attack following a plot hatched on social media. The two boys who attacked Olly had both shared dozens of images of knives online, and they used 11 different social media platforms to do so. Sadly, none of the platforms took down the content, which is why these matters are so important to all of us and our communities.

Following this awful case, I support a number of new clauses that I believe would lead to a significant change in the law to prevent a similar tragedy. I stress the importance of new clause 1, which would help parents to make complaints. As Olly’s dad, Stuart, often says, “You simply cannot contact the tech companies. You send an email and get no reply.” It is important to tackle this matter, and I believe that new clause 1 would go some way towards doing that.

As others have said, surely it makes sense for parents to know their children have some protection from harmful content. Amendment 43 would provide reassurance by introducing a default position of protecting children. I urge Members on both sides of the House to support it. Both children and vulnerable adults should be better protected from legal but harmful content, and further action should be taken. Amendment 43 would take clear steps in that direction.

I am aware of time, and I support many other important new clauses. I reiterate my support and backing for my Front-Bench colleague, my hon. Friend Alex Davies-Jones. Thank you, Madam Deputy Speaker, for the opportunity to contribute to this debate.

Andrea Leadsom (Conservative, South Northamptonshire)

It is a pleasure to follow Matt Rodda. I congratulate him on his moving tribute to his constituent’s son. It is a terrible story.

This Bill will be life changing for many, but I am sorry to say that it has taken far too long to get to this point. The Government promised in 2015 to end children’s exposure to harmful online material, and in 2017 they committed to making the UK the safest place for children to be online. This morning, as I waited in the freezing cold on the station platform for a train that was late, a fellow passenger spoke to me about the Bill. He told me how happy he is that action is, at last, under way to protect children from the dangers of the internet. As a father of three young children, he told me that the internet is one of his greatest concerns.

I am afraid that, at the moment, the internet is as lawless as the wild west, and children are viewing images of abuse, addiction and self-harm on a daily basis. As others have said, the stats are shocking. Around 3,500 online child sex offences are recorded by police each month, and each month more than a million UK children access online pornography. It has been said that, in the time it takes to make a cup of tea, a person who joins certain popular social media platforms will have been introduced to suicidal content such as, “Go on, just kill yourself. You know you want to.”

I am incredibly proud that our Government have introduced a Bill that will change lives for the better, and I hope and expect it will be a “best in class” example for other Governments to follow. I pay tribute to my right hon. Friend the Secretary of State for Digital, Culture, Media and Sport and her predecessors for their ruthless focus on making the online world a safer place. Ultimately, improving lives is what every MP is here to do, and on both sides of the House we should take great delight that, at last, this Bill will have its remaining Commons stages today.

I pay tribute to my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Stone (Sir William Cash) for their determination to give the Bill even more teeth, and I sincerely thank the Secretary of State for her willingness not only to listen but to take action.

New clause 2, tabled by my hon. Friends, will not be pressed because the Secretary of State has agreed to table a Government amendment when the Bill goes to the other place. New clause 2 sought to create a backstop so that, if a senior manager in a tech firm knowingly allows harm to be caused to a child that results in, for example, their abuse or suicide, the manager should be held accountable and a criminal prosecution, with up to two years in prison, should follow. I fully appreciate that many in the tech world say, first, that that will discourage people from taking on new senior roles and, secondly, that it will discourage inward investment in the UK tech sector. Those serious concerns deserve to be properly addressed.

First, with regard to the potential for senior tech staff to be unwilling to take on new roles where there is this accountability, my experience as City Minister in 2015 provides a good example of why that concern is unnecessary. We were seeking to address the aftermath of the 2008 financial crisis, and we established the possibility of criminal liability for senior financial services staff. It was argued at the time that that would be highly damaging to UK financial services and that people would be unwilling to take on directorships and risk roles. I think we can all see clearly that those concerns were unfounded. Some might even say, “Well, tech firms would say that, wouldn’t they?” The likelihood of a criminal prosecution will always be low, but the key difference is that in future, tech managers, instead of waking up each day thinking only about business targets, will wake up thinking, “Have I done enough to protect children, as I meet my business targets?” I am sure we can agree that that would be a very good thing.

Secondly, there are those who argue that inward investment to the UK’s tech sector would be killed off by this move, and that would indeed be a concern. The UK tech sector leads in Europe, and at the end of 2022 it retained its position as the main challenger to the US and China. Fast-growing UK tech companies have continued to raise near-record levels of investment—more than France and Germany combined. The sector employs 3 million people across the UK and continues to thrive. So it is absolutely right that Ministers take seriously the concerns of these major employers.

However, I think we can look to Ireland as a good example of a successful tech hub where investment has not stopped as a result of strong accountability laws. The Irish Online Safety and Media Regulation Act 2022 carries a similar criminal responsibility to the one proposed in new clause 2, yet Ireland remains a successful tech hub in the European Union.

Tim Loughton (Conservative, East Worthing and Shoreham)

My right hon. Friend is rightly dispelling all these scare stories we have heard. One brief we had warned that if new clause 2 were to go through, it would portend the use of upload filters, where the system sweeps in and removes content before it has been posted. That would be a good thing, would it not? We need social media companies to invest in more moderators in order to be more aware of the harmful stuff before it goes online and starts to do the damage. This should lead to more investment, but in the right place—in the employees of these social media companies. Facebook—Meta, as it now is—made $39 billion profit in 2021, so they are not short of money to do that, are they?

Andrea Leadsom (Conservative, South Northamptonshire)

My hon. Friend makes a good point. Of course, as I have said, tech managers who wake up trying to meet business targets will now look at meeting them in a way that also protects children. That is a good thing.

We will look back on this period since the real rise of social media and simply not be able to believe what millions of children have been subjected to every day. As the Government’s special adviser on early years, it seems to me that all the work we are doing to give every baby the best start for life will be in vain if we then subject them during their vulnerable childhood years to the daily onslaught of appalling vitriol, violence, abuse and sordid pornography that is happening right now. It is little wonder that the mental health of young people is so poor. So it is my hope that this Bill will truly support our attempts to build back better after the covid lockdown. The Government’s clear commitment to families and children, and the Prime Minister’s own personal commitment to the vision for “The Best Start for Life”, are apparent for all to see. Keeping children safe online will make a radical improvement to all their lives.

Several hon. Members:

rose—

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

In order to ensure that we get everybody in, I am going to introduce a five-minute time limit. I call Richard Burgon.

Richard Burgon (Labour, Leeds East)

I have listened with interest to all the powerful speeches that have been made today. As legislation moves through Parliament, it is meant to be improved, but the great pity with this Bill is that it has got worse, not better. It is a real tragedy that measures protecting adults from harmful but legal content have been watered down.

I rise to speak against the amendments that have come from the Government, including amendments 11 to 14 and 18 and 19, which relate to the removal of adult safety duties. I am also speaking in favour of new clause 4 from the Labour Front Bench team and amendment 43 from the SNP, which go at least some of the way to protect adults from harmful but legal content.

The reason I am keen to highlight these points today stems from a tragic case in my constituency, which I have raised in the House on more than one occasion. Joe Nihill, a popular former Army cadet, was aged 23 when he took his own life after accessing dangerous, suicide-related content online. As I have mentioned previously, his mother, Catherine, and his sister-in-law, Melanie, have run a courageous campaign to ensure that, when this legislation becomes law, what happened to Joe does not happen to others.

For much of the passage of the Bill, I have been heartened. In particular, in discussions with the previous Government Front-Bench team, it felt as though we were going in the right direction, albeit perhaps not as quickly as we would have liked. However, the Government amendments mean that we are now heading in the wrong direction. Joe’s mother and sister-in-law are heartbroken at the Government’s current direction of travel on the Bill in relation to protecting adults from harmful but legal content. I urge the Minister to think again, because the Government amendments have gutted the harmful but legal protections for adults. These reckless amendments mean that sites will not even have to consider the risk that harmful but legal content poses to adult users on their platform. As I have said, Bills are meant to get better as they go through Parliament. With the Government’s amendments, we have seen the opposite happen.

Research from the Samaritans shows that just 16% of people think that access to potentially harmful content on the internet should be restricted only for children. As I have said, my constituent, Joe, was 23. We all know that it is false to presume that people stop being vulnerable at the age of 18. There are so many vulnerable adults in our society, and there are also people who become vulnerable when they see these things online—when they are dragged down this online rabbit hole of dangerous, harmful content.

The importance of including harmful but legal content is clear. Content that is legal but undoubtedly harmful includes information, instructions and advice on methods of self-harm and suicide, and material that portrays self-harm and suicide as desirable. Crudely removing protections from harmful content at 18 years of age leaves vulnerable people exposed to potentially fatal content.

As we have heard today, individual filters are simply not enough to protect vulnerable people. The Government have set out that it is up to individuals to filter legal but harmful content, but, often, people experiencing suicidal thoughts will look for ways to take their own life and actively seek out harmful content.

In conclusion, the truth is that the Government have ignored the real-world expertise of groups such as the Samaritans in order to put first the interests of tech giants, and of those on the Tory Back Benches who put so-called freedom of speech ahead of the safety of people like Joe, my constituent, who took his own life at the age of 23.

I hope to see further work on this Bill in the other place to ensure that vulnerable adults are given the protection that they deserve. That was Joe’s parting wish in the letter that he left to his family—that what happened to him would not happen to others. Let us not lose this opportunity. Let us improve the Bill. The other place has a vital role to play in ensuring that the Bill improves and protects everybody.

John Hayes (Conservative, South Holland and The Deepings) 7:00, 17 January 2023

One of the most noticeable changes in my lifetime has been the disheartening debasement of public discourse. The internet—a place for posturing, preening and posing, but rarely for genuine discussion or measured debate—must take much of the blame for that transformative decline, but, while the coarsening of the national conversation is among the most obvious examples of the harm being done by the internet, it is merely the tip of a very dangerous iceberg.

Beyond every superficial banality lurks a growing crisis of depression, decay, misery and malaise, of self-doubt and self-harm, all facilitated by tech companies that profit from exploiting insecurities, doubts and fears. Such companies do not exist simply to facilitate communication; rather, they control and manipulate virtual interaction in ways that play on innate fears.

The social media conglomerates’ entire business model relies on ruthlessly exploiting vast quantities of data harvested from their users. Driven by nothing beyond profit and growth, they have abandoned any notion of a duty of care, because their business model depends on monetising information with little regard to how it is generated or how it is used, even when that puts children at deadly risk.

Perhaps that wilful ignorance is why social media consistently fails to police videos advertising and glamorising illegal channel crossings. The 1,400 minors among the nearly 50,000 people who made the crossing last year had their images placed on the internet as poster children for that despicable trade. I am delighted that the work done by my hon. Friend Mrs Elphicke, and her amendment 82, now wisely accepted by the Government, will begin to address that particular wickedness. The amendment will wipe such material from the internet, requiring social media companies to face up to their responsibility for their part in this evil trade.

If drafted correctly, this Bill is an opportunity for Britain to lead the way in curbing the specious, sinister, spiteful excesses of the internet age. For all their virtue signalling, the tech giants’ lack of action speaks louder than words. Whether it is facilitating the promotion of deadly channel crossings or the day-to-day damage done to the mental health of Britain’s young people, let us be under no illusion: those at the top know exactly the harm they wreak.

Whistleblowing leaks by Frances Haugen last year revealed Mr Zuckerberg’s Meta as a company fully aware of the damage it does to the mental health of young people. In the face of its inaction, new clause 2, tabled by my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates), whom I was pleased and proud to support, makes tech directors personally legally liable for breaches of their child safety duties. No longer will those senior managers be able to wash their hands of the harm they do, and no longer will they be able to perpetuate those sinister algorithms which, rather than merely reflecting harm, cause harm.

Strengthening the powers of Ofcom to enforce those duties will ensure that the buck stops with tech management. Like the American frontier of legend, the virtual world of the internet can be tamed—the beast can be caged—but, as GK Chesterton said:

“Unless a man becomes the enemy of an evil, he will not even become its slave but rather its champion.”

The greedy, careless tech conglomerates cannot be trusted to check themselves. This Bill is a welcome start, but in time to come, as the social media beast writhes and breathes, Parliament will need to take whatever action is necessary to protect our citizens by quenching its fearful fire.

Jim Shannon, Shadow DUP Spokesperson (Human Rights), Shadow DUP Spokesperson (Health)

First and foremost, as we approach the remaining stages of this Bill, we must remember its importance. As MPs, we hear stories of the dangers of online harms that some would not believe. I think it is fair to say that those of my generation were very fortunate to grow up in a world where social media did not exist; as I just said to my hon. Friend Paul Girvan a few minutes ago, I am really glad I did not have to go through that. Social media is so accessible nowadays and children are being socialised in that environment, so it is imperative that we do all we can to ensure that they are protected and looked after.

I will take a moment to discuss the importance of new clause 2. There are many ongoing discussions about where the responsibility lies when it comes to the regulation of online harms, but new clause 2 ultimately would make it an offence for service providers not to comply with their safety duties in protecting children.

Miriam Cates has described the world of social media as

“a modern Wild West, a lawless and predatory environment”—how true those words are. I put on record my thanks to her and to Sir William Cash for all their endeavours to deliver change—they have both been successful, and I say well done to them.

Some 3,500 online child sexual offences are recorded by the police every month. Every month, 1.4 million UK children access online porn, the majority of which is degrading, abusive and violent. As drafted, the Bill would not hold tech bosses individually liable for their own failure in child and public safety. New clause 2 must be supported, and I am very pleased that the Government are minded to accept it.

Fines are simply not enough. If we fail to address that in the Bill, this House will be liable, because senior tech bosses seem not to be. I am minded, as is my party, to support the official Opposition’s new clause 4, “Safety duties protecting adults and society: minimum standards for terms of service”.

New clause 8 is also important. Over the last couple of years, my office has received numerous stories from parents who have witnessed their children deal with the consequences of what an eating disorder can do. I have a very close friend whose 16-year-old daughter is experiencing that at the moment. It is very hard on the family. Social media pages are just brutal. I have heard of TikTok pages glorifying bulimia and anorexia, and Instagram pages providing tips for self-harm—that is horrendous. It is important that we do not pick and choose what forms of harm are written into the Bill. It is not fair that some forms of harm are addressed under the Bill or referred to Ofcom while others are just ignored.

Communication and engagement with third-party stakeholders is the way to tackle and deal with this matter. Let us take, for example, a social media page that was started to comment on eating disorders and is generally unsafe and unhelpful to young people who are struggling. Such a page should be flagged to healthcare professionals, including GPs and nurses, who know best. If we can do that through the Bill, it would be a step in the right direction. On balance, we argue that the definition of harmful content should be reserved for regulations, which should be informed by proper stakeholder engagement.

I will touch briefly on new clause 3, which would require providers to include features that child users may use or apply if they wish to increase their control over harmful content. Such features are currently restricted to adults. Although we understand the need to empower young people to be responsible for and knowledgeable about the decisions they make, we recognise the value of targeting such a duty at adults, many of whom hold their parental responsibilities very close to their hearts. More often than not, that is just as important as regulation.

To conclude, we have seen too many suicides and too much danger emerge from online and social media. Social media has the potential to be an educational and accessible space for all, including young people. However, there must be safety precautions for the sake of young people, who can very easily fall into traps, as we are all aware. In my constituency, we have had a spate of suicides among young people—it seems to be within a clique of friends, and that really worries me. This is all about regulation: ensuring that harmful content is dealt with and removed, and that properly informed individuals are making the decisions about what is and is not safe. I have faith that the Minister, the Government and the Bill will address the outstanding issues. The Bill will not stop every online evil, but it will, as Sir Jeremy Wright said, make being online safer. If the Bill does that, we can support it, because that would be truly good news.

Natalie Elphicke (Conservative, Dover)

Thank you, Mr Deputy Speaker—if I may say so, it is a pleasure to see my east Kent neighbour in the Chair.

I will speak to amendment 82, which was tabled in my name, and in support of new clause 2 and amendment 83. At the last Report stage I spoke at some length on an associated amendment, and I am conscious that many Members wish to speak, so I will keep my comments brief.

I am grateful to the many right hon. and hon. Friends who supported my amendment, whether or not their names appear next to it on the amendment paper. I thank in particular my right hon. Friend Sir John Hayes for his considerable assistance in securing changes.

Amendment 82 sets out a requirement to remove content that may result in serious harm or death to a child while crossing the English channel in small boats. The risk of harm or death from channel crossings is very real. Four children have drowned in the past 15 months, with many more harmed through exposure to petrol and saltwater burns and put in danger here and abroad by organised crime and people traffickers. Social media is playing a direct role in this criminal enterprise. It must be brought to book, and the videos and other content that encourage such activity must be taken down.

There is an obligation on us to protect children, especially lone children who find themselves not in the protection of social services, either here or abroad, but in the hands of evil people smugglers and people traffickers. I hope that whatever our differences may be across this House on how open or otherwise our borders and migration system should be, we should be united in compassion, concern and action for children and young people in the snare of this wicked criminal activity. That is what my amendment 82 seeks to ensure.

Turning briefly to other amendments, new clause 2 seeks to hold senior managers to account. I am grateful to my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates) for their excellent work on this. I was somewhat disappointed to read the comments, repeated today by Alex Davies-Jones, that it is some kind of weakness for Government to agree to amendments. I particularly wanted to comment on that in relation to new clause 2.

In deciding to support new clause 2, I was persuaded by the remarks of Dame Margaret Hodge in the previous Report stage. I am grateful to her for the strength of her comments and their persuasive nature. It is our job here in this House to make sure that we consider and make responsible amendments. That is what those of us on the Government Benches have sought to do. I am very pleased that the Government have moved in relation to new clause 2, and it is important to recognise that it shows the confidence and strength of leadership of the Prime Minister, his Ministers, the Culture Secretary and Ministers in her Department and the Home Office, as well as the Solicitor General, that they will work with us to ensure that the Bill is stronger yet.

Finally, I turn to amendment 83 in the same spirit. I was moved by the personal account and the comments made by my right hon. Friend Vicky Ford on Report, and that is why I lent my support to her amendment. She has made a powerful case that it is important to protect children, but also to recognise, as has been said, that as children turn 18 they may still be extremely vulnerable and in need of support. I thank her for that, and I know that a number of Members feel likewise.

In conclusion, I thank the Culture Secretary and the Minister, my hon. Friend Paul Scully, for their engagement to date and for the commitment made in the written ministerial statement to strengthen the Bill in relation to the prevention of modern slavery and illegal immigration, including for the protection of children. On that basis, I confirm that I will not be moving amendment 82 later today.

Photo of Bill Cash Bill Cash Chair, European Scrutiny Committee 7:15, 17 January 2023

In a nutshell, we must be able to threaten tech bosses with jail. There is precedent for that—jail sentences for senior managers are commonplace for breaches of duties across a great range of UK legislation. That is absolutely and completely clear, and as a former shadow Attorney General, I know exactly what the law is on this subject. I can say this: we must protect our children and grandchildren from predatory platforms operating for financial gain on the internet. Such predation is endemic throughout the world and in the UK, inducing suicide, self-harm and sexual abuse, and it is an assault on the minds of our young children and on those affected by it, including families and people such as Ian Russell. He has shown great courage in speaking out about the tragedy of his 14-year-old daughter taking her own life as a result of such activities, as the coroner made clear. It is unthinkable that we would not deal with that. We are dealing with it now, and I thank the Secretary of State and the Minister for responding with constructive dialogue in the short space of time since we got to grips with this issue.

The written ministerial statement is crystal clear. It says that the new offence will apply

“where senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment and fines, will be commensurate with similar offences.”

We can make a comparison, as Dame Margaret Hodge made clear, with financial penalties in the financial services sector, which is also international. There is also the construction industry, as my hon. Friend Miriam Cates just said. Those penalties are already on our statute book.

I do not care what the European Union is doing in its legislation. I am glad to know that the Irish legislation, which has been passed and is an Act, has been through different permutations and examinations. The Irish have come up with something that includes similar severe penalties. It can be done. But this is our legislation in this House. We will do it the way that we want to do it to protect our children and families. I am just about fed up with listening to the mealy-mouthed remarks from those who say, “You can’t do it. It’s not quite appropriate.” To hell with that. We are talking about our children.

On past record, which I just mentioned, in 1977-78, a great friend of mine, Cyril Townsend, the Member for Bexleyheath, introduced the first Protection of Children Bill. He asked me to help him, and I did. We got it through. That was incredibly difficult at the time. You have no idea, Mr Deputy Speaker, how much resistance was put up by certain Members of this House, including Ministers. I spoke to Jim Callaghan—I have been in this House so long that I was here with him after he had been Prime Minister—and asked, “How did you give us so much time to get the Bill through?” He said, “It’s very simple. I was sitting in bed with my wife in the flat upstairs at No. 10. She wasn’t talking to me. I said, ‘What’s wrong, darling?’ She replied, ‘If you don’t get that Protection of Children Bill through, I won’t speak to you for six months.’” And it went through, so there you go. There is a message there for all Secretaries of State, and even Prime Ministers.

I raised this issue with the Prime Minister in December in a question at the Liaison Committee. I invited him to consider it, and I am so glad that we have come to this point after very constructive discussion and dialogue. It needed that. It is a matter not of chariots of fire but of chariots on fire, because we have done all this in three weeks. I am extremely grateful to the 51 MPs who stood firm. I know the realities of this House, having been involved in one or two discussions in the past. As a rule, it is only when you have the numbers that the results start to come. I pay tribute to the Minister for the constructive dialogue.

The Irish legislation will provide a model, but this will be our legislation. It will be modelled on some of the things that have already been enacted there, but it is not simply a matter of their legislation being transformed into ours. It will be our legislation. In the European Parliament

Photo of Miriam Cates Miriam Cates Conservative, Penistone and Stocksbridge

I too rise to speak to new clause 2, which seeks to introduce senior manager criminal liability to the Bill. As my hon. Friend Sir William Cash set out, we will not push it to a vote as a result of the very welcome commitments that the Minister has made to introduce a similar amendment in the other place.

Protecting children is not just the role of parents but the responsibility of the whole of society, including our institutions and businesses that wish to trade here. That is the primary aim of this Bill, which I wholeheartedly support: to keep children safe online from horrendous and unspeakable harms, many of which were mentioned by my right hon. Friend Dame Andrea Leadsom.

We look back in horror at children being forced to work down mines or neglected in Victorian orphanages, but I believe we will look back with similar outrage at online harms. What greater violation could there be of childhood than to entice a child to collaborate in their own sexual abuse in the privacy and supposed safety of their own bedroom? Yet this is one of the many crimes that are occurring on an industrial scale every day. Past horrors such as children down mines were tackled by robust legislation, and the Online Safety Bill must continue our Parliament’s proud tradition of taking on vested interests to defend the welfare of children.

The Bill must succeed in its mission, but in its present form, it does not have sufficient teeth to drive the determination that is needed in tech boardrooms to tackle the systemic issue of the malevolent algorithms that drive this sickening content to our children. There is no doubt that the potential fines in the Bill are significant, but many of these companies have deep pockets, and the only criminal sanctions are for failure to share data with Ofcom. The inquest following the tragic death of Molly Russell was an example of this, as no one could be held personally responsible for what happened to her. I pay tribute to Ian Russell, Molly’s father, whose courage in the face of such personal tragedy has made an enormous difference in bringing to light the extent of online harms.

Only personal criminal liability will drive proactive change, and we have seen this in other areas such as the financial services industry and the construction industry. I am delighted that the Government have recognised the necessity of senior manager liability for tech bosses, after much campaigning across the House, and committed to introducing it in the other place. I thank the Secretary of State and her team for the very constructive and positive way in which they have engaged with supporters of this measure.

Photo of Bill Cash Bill Cash Chair, European Scrutiny Committee

Would my hon. Friend not also like to say that the NSPCC has been magnificent in supporting us?

Photo of Miriam Cates Miriam Cates Conservative, Penistone and Stocksbridge

I was coming on to that—absolutely.

The advantage of introducing this measure in the other place is that we can widen the scope to all appropriate child safety duties beyond clause 11 and perhaps tackle pornography and child sexual abuse material as well. We will have a groundbreaking Bill that will hold to account powerful executives who knowingly allow our children to be harmed.

There are those who say—not least the tech companies—that we should not be seeking to criminalise tech directors. There are those who worry that this will reduce tech investment, but that has not happened in Ireland. There are those who say that the senior manager liability amendment will put a great burden on tech companies to comply, to which I say, “Great!” There are those who are worried that this will set an international precedent, to which I say, “Even better!”

Nothing should cause greater outrage in our society than the harming of innocent children. In a just society founded on the rule of law, those who harm children or allow children to be harmed should expect to be punished by the law. That is what new clause 2 seeks to do, and I look forward to working with the Secretary of State and others to bring forward a suitable amendment in the other place.

I offer my sincere thanks to the NSPCC, especially Rich Collard, and the outstanding Charles Hymas of The Telegraph, who have so effectively supported this campaign. I also pay tribute to my hon. Friend Sir William Cash; without his determination, knowledge and experience, it would not have been possible to achieve this change. He has been known as Mr Brexit, but as he said, even before he was Mr Brexit, he was Mr Child Protection, having been involved with the Protection of Children Act 1978. It is certainly advantageous in negotiations to work with someone who knows vastly more about legislation than pretty much anyone else involved. He sat through the debate in December on the amendment tabled by Dame Margaret Hodge, and while the vote was taking place, he said, “I think we can do this.” He spent the next week in the Public Bill Office and most of his recess buried in legislation. I pay tribute to him for his outstanding work. Once again, I thank the Secretary of State for her commitment to this, and I think this will continue our Parliament’s proud history of protecting children.

Photo of Lia Nici Lia Nici Conservative, Great Grimsby

I fully support the Bill and pay tribute to the work that Members have done over months and years to get us to where we are. I support the amendments tabled by my hon. Friends the Members for Dover (Mrs Elphicke), for Penistone and Stocksbridge (Miriam Cates) and for Stone (Sir William Cash), because these are the right things to do. We cannot have—effectively—illegal advertising for illegal activities on platforms. We would not allow it on television, so why would we allow it on other easily accessible platforms? With regard to content that is harmful to children, why should we not focus the minds of senior managers in those hugely rich organisations on the thought, “If I do not do my job properly and protect children, I may go to prison”? I think that threat will focus those individuals’ minds.

Photo of Bill Cash Bill Cash Chair, European Scrutiny Committee 7:30, 17 January 2023

Does my hon. Friend agree that it is an assault not just on the physical person, but on their minds? That is what is going on, and it is destroying them.

Photo of Lia Nici Lia Nici Conservative, Great Grimsby

My hon. Friend is correct. Often, senior managers are high-profile individuals with PR budgets that are probably larger than those of many countries. If we think about fines, they would just put those fines into their business plans, so fines would not effect a cultural change, as my hon. Friend the Member for Penistone and Stocksbridge has said on many occasions. We need cultural change to ensure that companies say, “What are we doing to make sure that children are being protected?” That is why I wholeheartedly support the new clause.

I also thank the Secretary of State, Ministers and officials, who have talked through issues with Back Benchers and taken them seriously. That means that we are where we need to be, which is fantastic. As a child of the 1970s and a parent, I never envisaged that we would have to be having these kinds of conversations with our children about what they are coming across: “Mum, what is this? Should I go and find a needle to inject this into myself?” That is the kind of horrifying content that parents and teachers come across. Schools do a fantastic job with their digital footprint training to ensure that we can start to have such conversations.

Photo of John Hayes John Hayes Conservative, South Holland and The Deepings

The opponents of our cause claim that we are curbing freedom, but in fact, it is not freedom that these people offer. They turn their addicts into the slaves of cruel, callous conglomerates.

Photo of Lia Nici Lia Nici Conservative, Great Grimsby

I absolutely agree with my right hon. Friend. If freedom means that our children become collateral damage for harmful and dangerous people, we need to have some real conversations about what freedom is all about.

Thankfully, as a child of the 1970s, my only experience was of three television channels. My hon. Friends the Members for Stone and for Penistone and Stocksbridge are like Zorro and Tonto coming to save the villagers in a wild west town where all the baddies are waiting to annihilate them. I thank them for that and I look forward to supporting the Bill all the way.

Photo of Vicky Ford Vicky Ford Conservative, Chelmsford

Legislating in an online world is incredibly complex and full of pitfalls, because the digital world moves so fast that it is difficult to make effective and future-proof legislation. I do not want to wind up my hon. Friend Sir William Cash by mentioning Europe, but I am proud to have worked alongside other British MEPs to introduce the GDPR, which the tech companies hated—especially the penalties.

The GDPR is not perfect legislation, but it fundamentally transformed how online actors think about the need to protect personal data, confidentiality and privacy. The Bill can do exactly the same and totally transform how online safety is treated, especially for children. I have been a proud champion of the Internet Watch Foundation for more than a decade and I have worked with it to tackle the hideous sexual abuse of children online. As a children’s Minister during the Bill’s passage, I am aware of the serious harms that the online world can and does pose, and I am proud that Ministers have put protecting children at the front of the Bill.

Along with other hon. Members, I have signed new clause 2. If, God forbid, hospital staff were constantly and repeatedly causing harm to children and the hospital boss was aware of it but turned a blind eye and condoned it, we would all expect that hospital boss to end up in the courts and, if necessary, in prison. Tech bosses should face the same. I thank the Government for saying that they will go along with Irish-style legislation here, and I look forward to their doing so.

My amendments—amendment 83 and new clause 8, which was not in scope—relate to eating disorders. Amendment 83 is intended to make it very clear that eating disorders should be treated as seriously as other forms of self-harm. I would like to thank everybody in the Chamber who spoke to me so kindly after I spoke in the last debate about my own experience as a former anorexic and all those outside the Chamber who have since contacted me.

Anorexia is the biggest killer of all mental illnesses. It is a sickness that has a slow and long-burning fuse, but all too often that fuse is deadly. There has been a terrifying rise in the number of cases, and it is very clear that social media posts that glamorise eating disorders are helping to fuel this epidemic. I am talking not about content that advertises a diet, but about egregious content that encourages viewers to starve themselves, in some cases—too many cases—to death. Content promoting eating disorders is no less dangerous than other content promoting other forms of self-harm; in fact, given the huge number of people suffering from eating disorders—about 1.25 million in this country—it may be considered the most dangerous. It is dangerous not only for children, but for vulnerable adults.

My amendment, as I have said, endeavours to make it clear that content promoting eating disorders should be treated in the same way and as seriously as content promoting other forms of self-harm. I thank all those who signed it, including former Health Ministers and Digital Ministers, the current Chair of the Health and Social Care Committee, my hon. Friend Steve Brine, and the current and former Chairs of the Women and Equalities Committee, my right hon. Friends the Members for Romsey and Southampton North (Caroline Nokes) and for Basingstoke (Dame Maria Miller). I hope the fact that MPs of such experience have signed this amendment sends a clear message to those in the other place that we treat this issue very seriously.

My amendment 83 is not the clearest legal way in which to manage the issue, so I do not intend to press it today. I thank the Secretary of State, the Minister responsible for the Bill and the Minister of State, Ministry of Justice, my right hon. Friend Edward Argar, who I know want to move on this, for meeting me earlier today and agreeing that we will find a way to help protect vulnerable adults as well as children from being constantly subjected to this type of killing content. I look forward to continuing to work with Ministers and Members of the other place to find the best legally watertight way forward.

Photo of Marcus Fysh Marcus Fysh Conservative, Yeovil

It is a pleasure to follow my right hon. Friend Vicky Ford, who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as being of the same scale of harm as other things in the Bill.

I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where it is now and what the effect is. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doomscrolling TikTok videos rather than talking to his friends, which I just thought was a really tragic indication of where we have got to.

Digital platforms are also critical sources of information and our public discourse. Across the country, people gather up to 80% of information from such sources, but we should not have trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media, and these algorithmic models are the problem. Trust in these platforms is wrong: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.

It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.

What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them.

Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with their cynical accumulation and sale of the data of their addicted users and the manipulation of their views.

The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent in public and real time about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.

Photo of Rachel Maclean Rachel Maclean The Minister of State, Home Department

I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.

I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading and demeaning extreme forms of pornography. The evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy, equal relationships. For boys, that type of material is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.

During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.

I really welcome the plans to introduce measures to strengthen individual criminal liability for directors of tech companies. The concerns raised by some that this will deter investment in the UK or result in a tech exodus are absolute nonsense. That has not happened in Ireland, which still hosts many leading industry headquarters. I am a former tech entrepreneur, part of the founding team of one of the UK’s largest software publishers, and I assure the House that this forward-leaning legislation and regulation, which requires innovation to solve compliance and user problems, is exactly what drives engineers to do what they do best: to solve problems and develop solutions, in the interests of their customers, that are valuable across a host of industries and sectors.

The new measures will cement the UK’s role as a world leader in this space and underpin our ability to continue to play a leading role in the software industry. Where we lead, others will follow. They will also give our thought leaders the opportunity to develop bespoke solutions as well as ensure that children’s ages are verified robustly and that disgusting child sex abuse material is removed and does not proliferate.

On violence against women and girls, sensible and workable plans have been set out to make coercive and controlling behaviour a priority offence and to make platforms take that stuff down without women and girls having to contact them every single time. I welcome the work on creating codes of practice.

Considering the challenges that surround this area, the Bill does a really good job of protecting and upholding the freedom of speech that we hold dear in our democracy. As a feminist, I need to be able to express my view, protected under the Equality Act, that biological sex is immutable. I should not be hounded off the internet or threatened with violence for stating that view. At the same time, we should all seek to support and improve the experiences of transgender people. We can do both at the same time. We must have a nuanced, balanced and compassionate debate.

We are in an era where our discussion forums have become polarised. We are crossing new frontiers but we cannot accept the status quo. Our democracy depends on this.

Photo of Dean Russell Dean Russell Chair, Speaker's Advisory Committee on Works of Art 7:45, 17 January 2023

I rise to talk broadly about new clause 2, which I am pleased that the Government are engaging on. My right hon. and hon. Friends have done incredible work to make that happen. I share their elation. As—I think—the only Member who was on the Joint Committee under the fantastic Chair, my hon. Friend Damian Collins, and on both Committees, I have seen the Bill’s passage over the past year or so and been happy with how the Government have engaged with it. That includes on Zach’s law, which will ensure that trolls cannot send flashing images to people with epilepsy. I shared my colleagues’ elation with my hon. Friend Suzanne Webb when we were successful in convincing the Government to make that happen.

May I reiterate the lessons from the Joint Committee and from the Bill Committee earlier last year? When we took evidence from the tech giants—they are giants—it was clear that, as giants do, they could not see the damage underfoot and the harm that they were doing because they are so big. They were also blind to the damage they were doing because they chose not to see it. I remember challenging a witness from one of the big tech giants about whether they had followed the Committee’s work on the harms that they were causing to vulnerable children and adults. I was fascinated by how the witnesses just did not care. Their responses were, “Well, we are doing enough already. We are already trying. We are putting billions of pounds into supporting people who are being harmed.” They did not see the reality on the ground of young people being damaged.

When I interviewed my namesake, Ian Russell, I was heartbroken because we had children of a similar age. I just could not imagine having the conversations he must have had with his family and friends throughout that terrible tragedy.

Photo of Bill Cash Bill Cash Chair, European Scrutiny Committee

Is my hon. Friend aware that Ian Russell has pointed out that 26% of young people who present at hospital with self-harm and suicide attempts have accessed such predatory, irresponsible and wilful online content?

Photo of Dean Russell Dean Russell Chair, Speaker's Advisory Committee on Works of Art

My hon. Friend is absolutely right. One of the real horrors is that, as I understand it, Facebook was not going to release—I do not want to break any rules here—the content that his daughter had been viewing, to help with the process of healing.

If I may, I want to touch on another point that has not been raised today, which is the role of a future Committee. I appreciate that it is not part of the Bill, but I feel strongly that this House should have a separate new Committee for the Online Safety Bill. The internet and the world of social media are changing dramatically. The metaverse is approaching very rapidly, and we are seeing the rise of virtual reality and augmented reality. Artificial intelligence is even changing the way we believe what we see online, and at a rate that we cannot imagine. I have a few predictions. I anticipate that in the next few years we will probably have the first No. 1 book and song written by AI. We can already hear fake voices online and AI impersonations of people. We will have songs and so on created in ways that fool us, and fool children even more. I have no doubt that in the coming months and years we will see the rise of children suing their parents for sharing content of them when they were younger without permission. We will see a changing dynamic in the way that young people engage with new content and what they anticipate from it.

Photo of John Hayes John Hayes Conservative, South Holland and The Deepings

My hon. Friend is making a valuable contribution to the debate, as I expected he would having discussed it with him from the very beginning. What he describes is not only the combination of heartlessness and carelessness on the part of the tech companies, but the curious marriage of an anarchic future coupled with the tyranny of their control of that future. He is absolutely right that if we are to do anything about that in this place, we need an ongoing role for a Committee of the kind he recommends.

Photo of Dean Russell Dean Russell Chair, Speaker's Advisory Committee on Works of Art

I thank my right hon. Friend for those comments. I will wrap up shortly, Mr Deputy Speaker. On that point, I have said before that the use of algorithms on platforms is in my mind very similar to addictive drugs: they get people addicted and get them to change their behaviours. They get them to cut off from their friends and family, and then they direct them in ways that we would not allow if we could wrap our arms around them and stop it. But they are doing that in their own bedrooms, classrooms and playgrounds.

I applaud the work on the Bill. Yes, there are ways it could be improved, and a committee that looks at ways to improve it as the dynamics of social media change will be essential. However, letting the Bill go to the other place will be a major step forward in protecting our young people, both now and in the future.

Photo of Siobhan Baillie Siobhan Baillie Conservative, Stroud

Thank you, Mr Deputy Speaker.

I rise to speak to amendments 52 and 53. As you know, Mr Deputy Speaker, I have been campaigning to tackle anonymous abuse for many years now. I have been working with the fantastic Clean Up The Internet organisation, Stroud residents and the brilliant Department for Digital, Culture, Media and Sport team. We have been focused on practical measures that will empower social media users to protect themselves from anonymous abuse. I am pleased to say that the Government accepted our campaign proposals to introduce verification options. They give people the option to be followed and to follow only verified accounts if that is what they choose, and to ensure that they know who is and who is not verified. That will also assist in ensuring that the positive parts of anonymity can continue online, as there are many. I respectfully think that that work is even more important now that we have seen the removal of the “legal but harmful” clauses, because we know what will be viewed by children and vulnerable adults who want to be protected online.

We are not resting on that campaign win, however. We want to see the verification measures really work in the real world and for social media companies to adopt them quickly without any confusion about their duties. Separately, clarity is the order of the day, because the regulator Ofcom is going to have an awful lot to do thanks to the excellent clauses throughout the legislation.

This issue is urgent. We must not forget that anonymous social media accounts are spewing out hateful bile every single minute of the day. Children and vulnerable adults are left terrified: comments about suicide, self-harm and bullying, and messages from anorexia pushers, are far more frightening when the recipient does not know who is behind them.

Financial scammers tend to hide behind anonymity. Faceless bots cause mayhem and start nasty pile-ons. Perverts know that when they send a cyber-flashing dick pic to an unsuspecting woman, it is very unlikely, if it comes from an anonymous account, that it will be traced back to them. It is really powerful and important for people to have the tools to not see unverified nonsense or abuse, to be able to switch that off and to know that the people they follow are real.

I am keen for the Minister and the Government to adopt amendments 52 and 53. They are by no means the most sexy and jazzy amendments before the House; they are more tweaks than amendments. They would change the wording to bring the legislation up to date in the light of recent changes. They would also ensure that it is obvious if people are verified—blue ticks are a really good example of that—which was part of my campaign in the first place. I understand from discussions that the Government are considering adopting my amendments. I thank colleagues for calling them sensible and backing them. They are really important.

Finally, I have made the case many times that the public expect us to act and to be strong in this policy area, but they also expect things to happen very quickly. We have waited a very long time. It is incredibly important to give people the power and tools to protect themselves, whether by sliding a button or switching something off. My great hope from the campaigning that I have done is that young people and adults will think about only following unverified accounts through an active choice.

Photo of Kirsty Blackman Kirsty Blackman Shadow SNP Spokesperson (Cabinet Office)

On that specific point, does the hon. Lady realise that the empowerment duties in respect of verified and non-verified users apply only to adult users? Children will not have the option to toggle off unverified users, because the user empowerment duties do not allow that to happen.

Photo of Siobhan Baillie Siobhan Baillie Conservative, Stroud

The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.

We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.

Photo of Paul Scully Paul Scully The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from Alex Davies-Jones, my right hon. Friend Priti Patel and Dame Margaret Hodge—I have to put that right, having not mentioned her last time—as well as from my hon. Friend Dame Caroline Dinenage; Kirsty Blackman; the former Secretary of State, my right hon. and learned Friend Sir Jeremy Wright; and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).

I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples, and it was good to hear the tributes to them and to hear their names mentioned in this place in connection with the changes in the legislation.

We had great contributions from my right hon. Friend Dame Andrea Leadsom, Jim Shannon and my hon. Friend Mrs Elphicke. I am glad that my hon. Friend Sir William Cash gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.

There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend Vicky Ford and from my hon. Friend Mr Fysh. The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend Sir John Hayes, whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).

I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.

We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aim of holding people accountable for their actions in a way that is targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.

We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with a notice to end a contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment and fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.

As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.

Photo of Sajid Javid Sajid Javid Conservative, Bromsgrove

My hon. Friend has rightly prioritised the protection of children. He will recall that throughout the debate, a number of Members have asked the Government to consider the amendment that will be tabled by Baroness Kidron, which will require coroners to have access to data in cases in which the tragic death of a child may be related to social media and other online activities. Is my hon. Friend able to give a commitment from the Dispatch Box that the Government will look favourably on that amendment?

Photo of Paul Scully Paul Scully The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Coroners already have some powers in this area, but we are aware of instances, raised by my right hon. Friend and others, in which that has not been the case. We will happily work with Baroness Kidron and others, and we will look favourably on changes where they are necessary.

Photo of Debbie Abrahams Debbie Abrahams Labour, Oldham East and Saddleworth

I entirely agree that our focus has been on protecting children, but is the Minister as concerned as I am about the spread of information and misinformation, and about the societal impacts on our democracy, not just in this country but elsewhere? The hon. Member for Watford suggested a Committee that could monitor such impacts. Is that something the Minister will reconsider?

Photo of Paul Scully Paul Scully The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.

Photo of Damian Collins Damian Collins Chair, Draft Online Safety Bill (Joint Committee)

May I follow up my hon. Friend’s response to our right hon. Friend Sajid Javid? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?

Photo of Paul Scully Paul Scully The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—

Photo of Priti Patel Priti Patel Conservative, Witham

I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.

Photo of Paul Scully Paul Scully The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.

My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.

As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.

However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.

We will also add section 24 of the Immigration Act to the priority offences list in schedule 7. Although the offences in section 24 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the Channel that show the activity in a positive light could be an offence that is committed online, and could therefore fall within priority illegal content. The result would be that platforms would have to proactively remove that content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.

We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend Alicia Kearns on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.

This is a complex area, and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress by inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.

I am afraid I have only three minutes, so I am not able to give way.

The Government cannot accept the Labour amendments that would re-add the adult safety duties and the concept of content that is harmful to adults. These duties and the definition of harmful content were removed from the Bill in Committee to protect free speech and to ensure that the Bill does not incentivise tech companies to censor legal content. It is not appropriate for the Government to decide whether legal content is harmful to adult users, and then to require companies to risk assess and set terms for such content. Many stakeholders and parliamentarians are justifiably concerned about the consequences of doing so, and I share those concerns. However, the Government recognise the importance of giving users the tools and information they need to keep themselves safe online, which is why we have introduced to the Bill a fairer, simpler approach for adults—the triple shield.

Members have talked a little about user empowerment. I will not have time to cover all of that, but the Government believe we have struck the right balance of empowering adult users on the content they see and engage with online while upholding the right to free expression. For those reasons, I am not able to accept these amendments, and I hope the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson) will not press them to a vote.

The Government amendments are consequential on removing the “legal but harmful” sections, which were debated extensively in Committee.

The Government recognise the concern of my hon. Friend the Member for Stroud about anonymous online abuse, and I applaud her important campaigning in this area. We expect Ofcom to recommend effective tools for compliance, with the requirement that these tools can be applied by users who wish to filter out non-verified users. I agree that the issue covered by amendment 52 is important, and I am happy to continue working with her to deliver her objectives in this area.

My right hon. Friend the Member for Chelmsford spoke powerfully, and we take the issue incredibly seriously. We are committed to introducing a new communications offence of intentional encouragement and assistance of self-harm, which will apply whether the victim is a child or an adult.

I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.

Photo of Alex Davies-Jones Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I beg to ask leave to withdraw the clause.

Clause, by leave, withdrawn.

Proceedings interrupted (Programme Order, 5 December, and Standing Order No. 24(7)),

The Deputy Speaker put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).