New Clause 19 - Duties to protect news publisher content

Online Safety Bill – in the House of Commons at 12:34 pm on 12 July 2022.


(1) This section sets out the duties to protect news publisher content which apply in relation to Category 1 services.

(2) Subject to subsections (4), (5) and (8), a duty, in relation to a service, to take the steps set out in subsection (3) before—

(a) taking action in relation to content present on the service that is news publisher content, or

(b) taking action against a user who is a recognised news publisher.

(3) The steps referred to in subsection (2) are—

(a) to give the recognised news publisher in question a notification which—

(i) specifies the action that the provider is considering taking,

(ii) gives reasons for that proposed action by reference to each relevant provision of the terms of service,

(iii) where the proposed action relates to news publisher content that is also journalistic content, explains how the provider took the importance of the free expression of journalistic content into account when deciding on the proposed action, and

(iv) specifies a reasonable period within which the recognised news publisher may make representations,

(b) to consider any representations that are made, and

(c) to notify the recognised news publisher of the decision and the reasons for it (addressing any representations made).

(4) If a provider of a service reasonably considers that the provider would incur criminal or civil liability in relation to news publisher content present on the service if it were not taken down swiftly, the provider may take down that content without having taken the steps set out in subsection (3).

(5) A provider of a service may also take down news publisher content present on the service without having taken the steps set out in subsection (3) if that content amounts to a relevant offence (see section 52 and also subsection (10) of this section).

(6) Subject to subsection (8), if a provider takes action in relation to news publisher content or against a recognised news publisher without having taken the steps set out in subsection (3), a duty to take the steps set out in subsection (7).

(7) The steps referred to in subsection (6) are—

(a) to swiftly notify the recognised news publisher in question of the action taken, giving the provider’s justification for not having first taken the steps set out in subsection (3),

(b) to specify a reasonable period within which the recognised news publisher may request that the action is reversed, and

(c) if a request is made as mentioned in paragraph (b)—

(i) to consider the request and whether the steps set out in subsection (3) should have been taken prior to the action being taken,

(ii) if the provider concludes that those steps should have been taken, to swiftly reverse the action, and

(iii) to notify the recognised news publisher of the decision and the reasons for it (addressing any reasons accompanying the request for reversal of the action).

(8) If a recognised news publisher has been banned from using a service (and the ban is still in force), the provider of the service may take action in relation to news publisher content present on the service which was generated or originally published or broadcast by the recognised news publisher without complying with the duties set out in this section.

(9) For the purposes of subsection (2)(a), a provider is not to be regarded as taking action in relation to news publisher content in the following circumstances—

(a) a provider takes action in relation to content which is not news publisher content, that action affects related news publisher content, the grounds for the action only relate to the content which is not news publisher content, and it is not technically feasible for the action only to relate to the content which is not news publisher content;

(b) a provider takes action against a user, and that action affects news publisher content that has been uploaded to or shared on the service by the user.

(10) Section (Providers’ judgements about the status of content) (providers’ judgements about the status of content) applies in relation to judgements by providers about whether news publisher content amounts to a relevant offence as it applies in relation to judgements about whether content is illegal content.

(11) OFCOM’s guidance under section (Guidance about illegal content judgements) (guidance about illegal content judgements) must include guidance about the matters dealt with in section (Providers’ judgements about the status of content) as that section applies by reason of subsection (10).

(12) Any provision of the terms of service has effect subject to this section.

(13) In this section—

(a) references to “news publisher content” are to content that is news publisher content in relation to the service in question;

(b) references to “taking action” in relation to content are to—

(i) taking down content,

(ii) restricting users’ access to content, or

(iii) taking other action in relation to content (for example, adding warning labels to content);

(c) references to “taking action” against a person are to giving a warning to a person, or suspending or banning a person from using a service, or in any way restricting a person’s ability to use a service.

(14) Taking any step set out in subsection (3) or (7) does not count as “taking action” for the purposes of this section.

(15) See—

section 16 for the meaning of “journalistic content”;

section 49 for the meaning of “news publisher content”;

section 50 for the meaning of “recognised news publisher”.”—(Damian Collins.)

Member’s explanatory statement

This new clause requires providers to notify a recognised news publisher and provide a right to make representations before taking action in relation to news publisher content or against the publisher (except in certain circumstances), and to notify a recognised news publisher after action is taken without that process being followed and provide an opportunity for the publisher to request that the action is reversed.

Brought up, and read the First time.

Lindsay Hoyle, Speaker of the House of Commons, Chair, Speaker's Committee for the Independent Parliamentary Standards Authority, Chair, House of Commons Commission, Chair, Speaker's Committee on the Electoral Commission

With this it will be convenient to discuss the following:

New clause 2—Secretary of State’s powers to suggest modifications to a code of practice—

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

New clause 3—Priority illegal content: violence against women and girls—

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women and girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).”

This new clause applies the provisions that apply to priority illegal content to any content which constitutes, encourages or promotes violence against women and girls.

New clause 4—Duty about content advertising or facilitating prostitution: Category 1 and Category 2B services—

“(1) A provider of a Category 1 or Category 2B service must operate the service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of a Category 1 or Category 2B service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one Category 1 or Category 2B service, the duties set out in this section apply in relation to each such service.

(4) The duties set out in this section extend only to the design, operation and use of a Category 1 or Category 2B service in the United Kingdom.

(5) For the meaning of ‘Category 1 service’ and ‘Category 2B service’, see section 81 (register of categories of services).

(6) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 5—Duty about content advertising or facilitating prostitution: Category 2A services—

“(1) A provider of a Category 2A service must operate that service so as to minimise the risk of individuals encountering content which advertises or facilitates prostitution in or via search results of the service.

(2) A provider of a Category 2A service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The reference to encountering content which advertises or facilitates prostitution “in or via search results” of a search service does not include a reference to encountering such content as a result of any subsequent interactions with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a Category 2A service in the United Kingdom.

(6) For the meaning of ‘Category 2A service’, see section 81 (register of categories of services).

(7) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 6—Duty about content advertising or facilitating prostitution: internet services providing pornographic content—

“(1) A provider of an internet service within the scope of section 67 of this Act must operate that service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of an internet service under this section must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one internet service under this section, the duties set out in this section apply in relation to each such service.

(4) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 8—Duties about advertisements for cosmetic procedures—

“(1) A provider of a regulated service must operate the service using systems and processes designed to—

(a) prevent individuals from encountering advertisements for cosmetic procedures that do not meet the conditions specified in subsection (3);

(b) minimise the length of time for which any such advertisement is present;

(c) where the provider is alerted by a person to the presence of such an advertisement, or becomes aware of it in any other way, swiftly take it down.

(2) A provider of a regulated service must include clear and accessible provisions in the terms of service giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The conditions under subsection (1)(a) are that the advertisement—

(a) contains a disclaimer as to the health risks of the cosmetic procedure, and

(b) includes a certified service quality indicator.

(4) If a person is the provider of more than one regulated service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a regulated service in the United Kingdom.

(6) For the meaning of ‘regulated service’, see section 3 (‘Regulated service’, ‘Part 3 service’ etc).”

This new clause would place a duty on all internet service providers regulated by the Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain a disclaimer as to the health risks of the procedure nor include a certified service quality indicator.

New clause 9—Content harmful to adults risk assessment duties: regulated search services—

“(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.

(2) A duty to carry out a suitable and sufficient priority adults risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adult risk assessment relating to the impacts of that proposed change.

(5) An ‘adults risk assessment’ of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the level of risk of individuals who are users of the service encountering each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;

(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);

(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section, references to risk profiles are to the risk profiles for the time being published under section 84 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) See also—section 20(2) (records of risk assessments), and Schedule 3 (timing of providers’ assessments).”

New clause 10—Safety Duties Protecting Adults: regulated search services—

“(1) This section sets out the duties about protecting adults which apply in relation to all regulated search services.

(2) A duty to summarise in the policies of the search service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the search service policies specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features relating to the search engine,

(c) functionalities allowing users to control the content they encounter in search results,

(d) content prioritisation and ranking,

(e) user support measures, and

(f) staff policies and practices.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the policies included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the policies in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) A duty to ensure that the provisions of the publicly available statement referred to in subsections (5) and (7) are clear and accessible.

(9) In this section—

‘adults’ risk assessment’ has the meaning given by section 12;

‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”

New clause 18—Child user empowerment duties—

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section ‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.”

New clause 24—Category 1 services: duty not to discriminate, harass or victimise against service users—

“(1) The following duties apply to all providers of Category 1 services.

(2) A duty not to discriminate, on the grounds of a protected characteristic, against a person wishing to use the service by not providing the service, if the result of not providing the service is to cause harm to that person.

(3) A duty not to discriminate, on the grounds of a protected characteristic, against any user of the service in a way that causes harm to the user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(4) A duty not to harass, on the grounds of a protected characteristic, a user of the service in a way that causes harm to the user.

(5) A duty not to victimise because of a protected characteristic a person wishing to use the service by not providing the user with the service, if the result of not providing the service is to cause harm to that person.

(6) A duty not to victimise a service user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(7) In this section— references to harassing, discriminating or victimising have the same meaning as set out in Part 2 of the Equality Act 2010;

‘protected characteristic’ means a characteristic listed in section 4 of the Equality Act 2010.”

This new clause would place a duty, regulated by Ofcom, on Category 1 service providers not to discriminate, harass or victimise users of their services on the basis of a protected characteristic if doing so would result in them being caused harm. Discrimination, harassment and victimisation, and protected characteristics, have the same meaning as in the Equality Act 2010.

New clause 25—Report on duties that apply to all internet services likely to be accessed by children—

“(1) Within 12 months of this Act receiving Royal Assent, the Secretary of State must commission an independent evaluation of the matters under subsection (2) and must lay the report of the evaluation before Parliament.

(2) The evaluation under subsection (1) must consider whether the following duties should be imposed on all providers of services on the internet that are likely to be accessed by children, other than services regulated by this Act—

(a) duties similar to those imposed on regulated services by sections 10 and 25 of this Act to carry out a children’s risk assessment, and

(b) duties similar to those imposed on regulated services by sections 11 and 26 of this Act to protect children’s online safety.”

This new clause would require the Secretary of State to commission an independent evaluation on whether all providers of internet services likely to be accessed by children should be subject to child safety duties and must conduct a children’s risk assessment.

New clause 26—Safety by design—

“(1) In exercising their functions under this Act—

(a) The Secretary of State, and

(b) OFCOM must have due regard to the principles in subsections (2)-(3).

(2) The first principle is that providers of regulated services should design those services to prevent harmful content from being disseminated widely, and that this is preferable in the first instance to both—

(a) removing harmful content after it has already been disseminated widely, and

(b) restricting which users can access the service or part of it on the basis that harmful content is likely to disseminate widely on that service.

(3) The second principle is that providers of regulated services should safeguard freedom of expression and participation, including the freedom of expression and participation of children.”

This new clause requires the Secretary of State and Ofcom to have due regard to the principle that internet services should be safe by design.

New clause 27—Publication of risk assessments—

“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must publish the risk assessment on the service’s website.”

New clause 38—End-to-end encryption—

“Nothing in this Act shall prevent providers of user-to-user services protecting their users’ privacy through end-to-end encryption.”

Government amendment 57.

Amendment 202, in clause 6, page 5, line 11, at end insert—

“(ba) the duty about pornographic content set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that user-to-user services must meet the new duties set out in NS1.

Government amendments 163, 58, 59 and 60.

Amendment 17, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Amendment 15, in clause 8, page 7, line 14, at end insert—

“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.

Government amendments 61 and 62.

Amendment 18, page 7, line 30 [Clause 9], at end insert—

“(none) ‘, including by being directed while on the service towards priority illegal content hosted by a different service;’

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 16, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 19, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content.”

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Government amendments 63 to 67.

Amendment 190, page 10, line 11, in clause 11, at end insert “, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Government amendments 68 and 69.

Amendment 42, page 11, line 16, in clause 11, at end insert—

“(c) the benefits of the service to children’s well-being.”

Amendment 151, page 12, line 43, leave out Clause 13.

This amendment seeks to remove Clause 13 from the Bill.

Government amendment 70.

Amendment 48, page 13, line 5, in clause 13, leave out “is to be treated” and insert

“the provider decides to treat”

This amendment would mean that providers would be free to decide how to treat content that has been designated ‘legal but harmful’ to adults.

Amendment 49, page 13, line 11, in clause 13, at end insert—

‘(ca) taking no action;”

This amendment provides that providers would be free to take no action in response to content referred to in subsection (3).

Government amendments 71 and 72.

Amendment 157, page 14, line 11, in clause 14, leave out subsections (6) and (7).

This amendment is consequential to Amendment 156, which would require all users of Category 1 services to be verified.

Government amendments 73, 164, 74 and 165.

Amendment 10, page 16, line 16, in clause 16, leave out from “or” until the end of line 17.

Government amendments 166 and 167.

Amendment 50, page 20, line 21, in clause 19, at end insert—

“(6A) A duty to include clear provision in the terms of service that the provider will not take down, or restrict access to, content generated, uploaded or shared by a user save where it reasonably concludes that—

(a) the provider is required to do so pursuant to the provisions of this Act, or

(b) it is otherwise reasonable and proportionate to do so.”

This amendment sets out a duty for providers to include in terms of service a commitment not to take down or restrict access to content generated, uploaded or shared by a user except in particular circumstances.

Government amendment 168.

Amendment 51, page 20, line 37, in clause 19, at end insert—

“(10) In any claim for breach of contract brought in relation to the provisions referred to in subsection (7), where the breach is established, the court may make such award by way of compensation as it considers appropriate for the removal of, or restriction of access to, the content in question.”

This amendment means that where a claim is made for a breach of the terms of service resulting from Amendment 50, the court has the power to award such compensation as it considers appropriate.

Government amendment 169.

Amendment 47, page 22, line 10, in clause 21, at end insert—

“(ba) the duties about adults’ risk assessments in section (Content harmful to adults risk assessment duties: regulated search services),

(bb) the safety duties protecting adults in section (Safety duties protecting adults: regulated search services).”

Government amendments 75 to 82.

Amendment 162, page 31, line 19, in clause 31, leave out “significant”

This amendment removes the requirement for there to be a “significant” number of child users, and replaces it with “a number” of child users.

Government amendments 85 to 87.

Amendment 192, page 36, line 31, in clause 37, at end insert—

“(ha) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 about codes of practice.

Amendment 44, page 37, line 25, in clause 39, leave out from beginning to the second “the” in line 26.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 45, page 38, line 8, leave out Clause 40.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 13, page 38, line 12, in clause 40, leave out paragraph (a).

Amendment 46, page 39, line 30, leave out Clause 41.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 14, page 39, line 33, in clause 41, leave out subsection (2).

Amendment 21, page 40, line 29, in clause 43, leave out “may require” and insert “may make representations to”

Amendment 22, page 40, line 33, in clause 43, at end insert—

‘(2A) OFCOM must have due regard to representations by the Secretary of State under subsection (2).”

Government amendments 88 to 89 and 170 to 172.

Amendment 161, page 45, line 23, in clause 49, leave out paragraph (d).

This amendment removes the exemption for one-to-one live aural communications.

Amendment 188, page 45, line 24, in clause 49, leave out paragraph (e).

This amendment removes the exemption for comments and reviews on provider content.

Government amendments 90 and 173.

Amendment 197, page 47, line 12, in clause 50, after “material” insert

“or special interest news material”.

Amendment 11, page 47, line 19, in clause 50, after “has” insert “suitable and sufficient”.

Amendment 198, page 47, line 37, in clause 50, leave out the first “is” and insert

“and special interest news material are”.

Amendment 199, page 48, line 3, in clause 50, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

Amendment 12, page 48, line 7, in clause 50, after “a” insert “suitable and sufficient”.

Government amendments 91 to 94.

Amendment 52, page 49, line 13, in clause 52, leave out paragraph (d).

This amendment limits the list of relevant offences to those specifically specified.

Government amendments 95 to 100.

Amendment 20, page 51, line 3, in clause 54, at end insert—

‘(2A) Priority content designated under subsection (2) must include—

(a) content that contains public health related misinformation or disinformation, and

(b) misinformation or disinformation that is promulgated by a foreign state.”

This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

Amendment 53, page 51, line 47, in clause 55, after “State” insert “reasonably”.

This amendment, together with Amendment 54, would mean that the Secretary of State must reasonably consider the risk of harm to each one of an appreciable number of adults before specifying a description of the content.

Amendment 54, page 52, line 1, in clause 55, after “to” insert “each of”.

This amendment is linked to Amendment 53.

Amendment 55, page 52, line 12, in clause 55, after “OFCOM” insert

“, Parliament and members of the public in a manner the Secretary of State considers appropriate”.

This amendment requires the Secretary of State to consult Parliament and the public, as well as Ofcom, in a manner the Secretary of State considers appropriate before making regulations about harmful content.

Government amendments 147 to 149.

Amendment 43, page 177, line 23, in schedule 4, after “ages” insert

“, including the benefits of the service to their well-being,”

Amendment 196, page 180, line 9, in schedule 4, at end insert—

Amendment 187, page 186, line 32, in schedule 7, at end insert—

“Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment includes Human Trafficking as a priority offence.

Amendment 211, page 187, line 23, in schedule 7, at end insert—

Government new clause 14.

Government new clause 15.

Government amendments 83 to 84.

Amendment 156, page 53, line 7, in clause 57, leave out subsections (1) and (2) and insert—

‘(1) A provider of a Category 1 service must require all adult users of the service to verify their identity in order to access the service.

(2) The verification process—

(a) may be of any kind (and in particular, it need not require documentation to be provided),

(b) must—

(i) be carried out by a third party on behalf of the provider of the Category 1 service,

(ii) ensure that all anonymous users of the Category 1 service cannot be identified by other users, apart from where provided for by section (Duty to ensure anonymity of users).”

This amendment would require all users of Category 1 services to be verified. The verification process would have to be carried out by a third party and to ensure the anonymity of users.

Government amendment 101.

Amendment 193, page 58, line 33, in clause 65, at end insert—

“(ea) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 in respect of guidance about transparency reports.

Amendment 203, page 60, line 33, in clause 68, at end insert—

‘(2B) A duty to meet the conditions set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that commercial pornographic websites must meet the new duties set out in NS1.

Government amendments 141, 177 to 184, 142 to 145, 185 to 186 and 146.

New schedule 1—Additional duties on pornographic content

“30 All user-to-user services and an internet service which provides regulated provider pornographic content must meet the following conditions for pornographic content and content that includes sexual photographs and films (“relevant content”).

The conditions are—

(a) the service must not contain any prohibited material,

(b) the service must review all relevant content before publication.

31 In this Schedule—

“photographs and films” has the same meaning as in section 34 of the Criminal Justice and Courts Act 2015 (meaning of “disclose” and “photograph or film”)

“prohibited material” has the same meaning as section 368E(3) of the Communications Act 2003 (harmful material).”

The new schedule sets out additional duties for pornographic content which apply to user-to-user services under Part 3 and commercial pornographic websites under Part 5.

Government amendments 150 and 174.

Amendment 191, page 94, line 24, in clause 12, at end insert—

“Section [Category 1 services: duty not to discriminate against, harass or victimise service users] Duty not to discriminate against, harass or victimise

This amendment makes NC24 an enforceable requirement.

Government amendment 131.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend Mr Davis, I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Jeremy Wright, Conservative, Kenilworth and Southam

I warmly welcome my hon. Friend to his position. He will understand that those of us who have followed the Bill in some detail since its inception had some nervousness as to who might be standing at that Dispatch Box today, but we could not be more relieved that it is him. May I pick up on his point about the point of order from our right hon. Friend Mr Davis? Does he agree that an additional point to add to his list is that, unusually, this legislation has a remarkable amount of cross-party consensus behind its principles? That distinguishes it from some of the other legislation that perhaps we should not consider in these two weeks. I accept there is plenty of detail to be examined but, in principle, this Bill has a lot of support in this place.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend Chris Philp, who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend Sir Jeremy Wright, Dame Margaret Hodge, my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend Julian Knight, who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

Maria Miller, Conservative, Basingstoke

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend Chris Philp, who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address. [This section has been corrected on 21 July 2022, column 13MC]

Joanna Cherry, Shadow SNP Spokesperson (Justice and Home Affairs)

Does the Minister agree that, in moderating content, category 1 service providers such as Twitter should be bound by the duties under our domestic law not to discriminate against anyone on the grounds of a protected characteristic? Will he take a look at the amendments I have brought forward today on that point, which I had the opportunity of discussing with his predecessor, who I think was sympathetic?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

The hon. and learned Lady makes a very important point. The legislation sets regulatory thresholds at the criminal law level based on existing offences in law. Many of the points she made are covered by existing public law offences, particularly in regards to discriminating against people based on their protected characteristics. As she well knows, the internet is a reserved matter, so the legal threshold is set at where UK law stands, but where law may differ in Scotland, the police authorities in Scotland can still take action against individuals in breach of the law.

Joanna Cherry, Shadow SNP Spokesperson (Justice and Home Affairs)

The difficulty is that Twitter claims it is not covered by the Equality Act 2010. I have seen legal correspondence to that effect. I am not talking about the criminal law here. I am talking about Twitter’s duty not to discriminate against women, for example, or those who hold gender critical beliefs in its moderation of content. That is the purpose of my amendment today—it would ensure that Twitter and other service providers providing a service in the United Kingdom abide by our domestic law. It is not really a reserved or devolved matter.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

The hon. and learned Lady is right. There are priority offences where the companies, regardless of their terms of service, have to meet their obligations. If something is illegal offline, it is illegal online as well. There are priority areas where the company must proactively look for that. There are also non-priority areas where the company should take action against anything that is an offence in law and meets the criminal threshold online. The job of the regulator is to hold them to account for that. They also have to be transparent in their terms of service as category 1 companies. If they have clear policies against discrimination, which they on the whole all do, they will have to set out what they would do, and the regulator can hold them to account to make sure they do what they say. The regulator cannot make them take down speech that is legal or below a criminal threshold, but they can hold them to account publicly for the decisions they make.

One of the most important aspects of this Bill with regard to the category 1 companies is transparency. At the moment, the platforms make decisions about curating their content—who to take down, who to suppress, who to leave up—but those are their decisions. There is no external scrutiny of what they do or even whether they do what they say they will do. As a point of basic consumer protection law, if companies say in their terms of service that they will do something, they should be held to account for it. What is put on the label also needs to be in the tin and that is what the Bill will do for the internet.

I now want to talk about journalism and the role of the news media in the online world, which is a very important part of this Bill. The Government are committed to defending the invaluable role of a free media. Online safety legislation must protect the vital role of the press in providing people with reliable and accurate sources of information. Companies must therefore put in place protections for journalistic content. User-to-user services will not have to apply their safety duties in part 3 of the Bill to news publishers’ content shared on their services. News publishers’ content on their own sites will also not be in scope of regulation.

New clause 19 and associated amendments introduce a further requirement on category 1 services to notify a recognised news publisher and offer a right of appeal before removing or moderating its content or taking any action against its account. This new provision will reduce the risk of major online platforms taking over-zealous, arbitrary or accidental moderation decisions against news publisher content, which plays an invaluable role in UK democracy and society.

We recognise that there are cases where platforms must be able to remove content without having to provide an appeal, and the new clause has been drafted to ensure that platforms will not be required to provide an appeal before removing content that would give rise to civil or criminal liability to the service itself, or where it amounts to a relevant offence as defined by the Bill. This means that platforms can take down without an appeal content that would count as illegal content under the Bill.

Moreover, in response to some of the concerns raised, in particular by my right hon. and learned Friend the Member for Kenilworth and Southam as well as by other Members, about the danger of creating an inadvertent loophole for bad actors, we have committed to further tightening the definition of “recognised news provider” in the House of Lords to ensure that sanctioned entities, such as RT, cannot benefit from these protections.

As the legislation comes into force, the Government are committed to ensuring that protections for journalism and news publisher content effectively safeguard users’ access to such content. We have therefore tabled amendments 167 and 168 to require category 1 companies to assess the impact of their safety duties on how news publisher and journalistic content are treated when hosted on the service. They must then demonstrate the steps they are taking to mitigate any impact.

In addition, a series of amendments, including new clause 20, will require Ofcom to produce a report assessing the impact of the Online Safety Bill on the availability and treatment of news publisher content and journalistic content on category 1 services. This will include consideration of the impact of new clause 19, and Ofcom must do this within two years of the relevant provisions being commenced.

The Bill already excludes comments sections on news publishers’ sites from the Bill’s safety duties. These comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability of the news media. We have tabled a series of amendments to strengthen these protections, reflecting the Government’s commitment to media freedom. The amendments will create a higher bar for removing the protections in place for comments sections on recognised news publishers’ sites by ensuring that these can only be brought into the scope of regulation via primary legislation.

Government amendments 70 and 71 clarify the policy intention of the clause 13 adult safety duties to improve transparency about how providers treat harmful content, rather than incentivise its removal. The changes respond to concerns raised by stakeholders that the drafting did not make it sufficiently clear that providers could choose simply to allow any form of legal content, rather than promote, restrict or remove it, regardless of the harm to users.

This is a really important point that has sometimes been missed in the discussion on the Bill. There are very clear duties relating to illegal harm that companies must proactively identify and mitigate. The transparency requirements for other harmful content are very clear that companies must set out what their policies are. Enforcement action can be taken by the regulator for breach of their policies, but the primary objective is that companies make clear what their policies are. It is not a requirement for companies to remove legal speech if their policies do not allow that.

Margaret Hodge, Labour, Barking 1:00 pm, 12 July 2022

I welcome the Minister to his position, and it is wonderful to have somebody else who—like the previous Minister, Chris Philp—knows what he is talking about. On this issue, which is pretty key, I think it would work if minimum standards were set on the risk assessments that platforms have to make to judge what is legal but harmful content, but at the moment such minimum standards are not in the Bill. Could the Minister comment on that? Otherwise, there is a danger that platforms will set a risk assessment that allows really vile harmful but legal content to carry on appearing on their platform.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

The right hon. Lady makes a very important point. There have to be minimum safety standards, and I think that was also reflected in the report of the Joint Committee, which I chaired. Those minimum legal standards are set where the criminal law is set for these priority legal offences. A company may have higher terms of service—it may operate at a higher level—in which case it will be judged on the operation of its terms of service. However, for priority illegal content, it cannot have a code of practice that is below the legal threshold, and it would be in breach of the provisions if it did. For priority illegal offences, the minimum threshold is set by the law.

Margaret Hodge, Labour, Barking

I understand that in relation to illegal harmful content, but I am talking about legal but harmful content. I understand that the Joint Committee that the hon. Member chaired recommended that for legal but harmful content, there should be minimum standards against which the platforms would be judged. I may have missed it, but I cannot see that in the Bill.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline, and to regulate such things online based on the legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime is created, the House will insert more regulations and laws into schedule 7 as priority offences in law. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look for evidence of that offence proactively, it still has to act if it is made aware of the offence. I think the law gives us a very wide range of offences, clearly defined against offences in law, where there are clearly understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp Conservative, Croydon South

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

Debbie Abrahams Labour, Oldham East and Saddleworth

I congratulate the Minister on his promotion and on his excellent chairmanship of the prelegislative scrutiny Committee, which I also served on. Is he satisfied with the Bill in relation to disinformation? It was concerning that there was only one clause on disinformation, and we know the impact—particularly the democratic impact—that that has on our society at large. Is he satisfied that the Bill will address that?

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

It was a pleasure to serve alongside the hon. Lady on the Joint Committee. There are clear new offences relating to knowingly false information that will cause harm. As she will know, that was a Law Commission recommendation; it was not in the draft Bill but it is now in the Bill. The Government have also said that as a consequence of the new National Security Bill, which is going through Parliament, we will bring in a new priority offence relating to disinformation spread by hostile foreign states. As she knows, one of the most common areas for organised disinformation has been at state level. As a consequence of the new national security legislation, that will also be reflected in schedule 7 of this Bill, and that is a welcome change.

The Bill requires all services to take robust action to tackle the spread of illegal content and activity. Providers must proactively reduce the risk on their services of illegal activity and the sharing of illegal content, and they must identify and remove illegal content once it appears on their services. That is a proactive responsibility. We have tabled several interrelated amendments to reinforce the principle that companies must take a safety-by-design approach to managing the risk of illegal content and activity on their services. These amendments require platforms to assess the risk of their services being used to commit, or to facilitate the commission of, a priority offence and then to design and operate their services to mitigate that risk. This will ensure that companies put in place preventive measures to mitigate a broad spectrum of factors that enable illegal activity, rather than focusing solely on the removal of illegal content once it appears.

Henry Smith Conservative, Crawley

I congratulate my hon. Friend on his appointment to his position. On harmful content, there are all too many appalling examples of animal abuse on the internet. What are the Government’s thoughts on how we can mitigate such harmful content, which is facilitating wildlife crime? Might similar online protections be provided for animals to the ones that clause 53 sets out for children?

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any activity that is illegal offline and could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend Julian Knight, the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman Shadow SNP Spokesperson (Work and Pensions)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that, as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

Caroline Dinenage Conservative, Gosport 1:15, 12 July 2022

I congratulate my hon. Friend on taking his new position; we rarely have a new Minister so capable of hitting the ground running. He makes a crucial point about clarity and transparency for both users and the social media providers and other platforms, because it is important that we make sure they are 100% clear about what is expected of them and the penalties for not fulfilling their commitments. Does he agree that opaqueness—a veil of secrecy—has been one of the obstacles, and that a whole raft of content has been taken down for the wrong reasons while other content has been left to proliferate because of the lack of clarity?

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

That is entirely right, and in closing I say that the Bill does what we have always asked for it to do: it gives absolute clarity that illegal things offline must be illegal online as well, and be regulated online. It establishes clear responsibilities and liabilities for the platforms to do that proactively. It enables a regulator to hold the platforms to account on their ability to tackle those priority illegal harms and provide transparency on other areas of harmful content. At present we simply do not know about the policy decisions that companies choose to make: we have no say in it; it is not transparent; we do not know whether they do it. The Bill will deliver in those important regards. If we are serious about tackling issues such as fraud and abuse online, and other criminal offences, we require a regulatory system to do that and proper legal accountability and liability for the companies. That is what the Bill and the further amendments deliver.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

It is an honour to respond on the first group of amendments on behalf of the Opposition.

For those of us who have been working on this Bill for some time now, it has been extremely frustrating to see the Government take such a siloed approach in navigating this complex legislation. I remind colleagues that in Committee Labour tabled a number of hugely important amendments that sought to make the online space safer for us all, but the Government responded by voting against each and every one of them. I certainly hope the new Minister—I very much welcome him to his post—has a more open-minded approach than his predecessor and indeed the Secretary of State; I look forward to what I hope will be a more collaborative approach to getting this legislation right.

With that in mind, it must be said that time and again this Government claim that the legislation is world-leading but that is far from the truth. Instead, once again the Government have proposed hugely significant and contentious amendments only after line-by-line scrutiny in Committee; it is not the first time this has happened in this Parliament, and it is extremely frustrating for those of us who have debated this Bill for more than 50 hours over the past month.

I will begin by touching on Labour’s broader concerns around the Bill. As the Minister will be aware, we believe that the Government have made a fundamental mistake in their approach to categorisation, which undermines the very structure of the Bill. We are not alone in this view and have the backing of many advocacy and campaign groups including the Carnegie UK Trust, Hope Not Hate and the Antisemitism Policy Trust. Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.

We all know that smaller platforms such as 4chan and BitChute have significant numbers of users who are highly motivated to promote very dangerous content. Their aim is to promote radicalisation and to spread hate and harm.

Debbie Abrahams Labour, Oldham East and Saddleworth

Not only that: people migrate from one platform to another, a fact that just has not been reflected on by the Government.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

My hon. Friend is absolutely right, and has touched on elements that I will address later in my speech. I will look at cross-platform harm and breadcrumbing; the Government have taken action to address that issue, but they need to go further.

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I am sorry to intervene so early in the hon. Lady’s speech, and thank her for her kind words. I personally agree that the question of categorisation needs to be looked at again, and the Government have agreed to do so. We will hopefully discuss it next week during consideration of the third group of amendments.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I welcome the Minister’s commitment, which is something that the previous Minister, Chris Philp, also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will be a persistent problem unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. Having less onerous applications of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occurs on one single platform or app, as mentioned by my hon. Friend Debbie Abrahams. In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, who they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad of powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary levels of interference and threats to the independence of Ofcom that arise from the powers of direction to Ofcom in its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

Chris Philp Conservative, Croydon South

I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.

I must put on record my support for amendments 11 and 12, tabled by Sir Jeremy Wright. In Committee, we heard multiple examples of racist, extremist and other harmful publishers, from holocaust deniers to white supremacists, who would stand to benefit from the recognised news publisher exemption as it stands, either overnight or by making minor administrative changes. As long as the exemption protects antisemites and extremists, it is not fit for purpose. That much should be clear to all of us. In Committee, in response to an amendment tabled by my hon. Friend Kim Leadbeater, the then Minister promised a concession so that Russia Today would be excluded from the recognised news publisher exemption. I welcome the Minister’s comments at the Dispatch Box today to confirm that. I am pleased that the Government have promised to exclude sanctioned news bodies such as Russia Today, but their approach does not go far enough. Disinformation outlets rarely have the profile of Russia Today.

Andrew Percy Conservative, Brigg and Goole 1:30, 12 July 2022

While the shadow Minister is on the subject of exemptions for antisemites, will she say where the Opposition are on the issue of search? Search platforms and search engines provide some of the most appalling racist, Islamophobic and antisemitic content.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Such disinformation outlets often operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Maria Miller Conservative, Basingstoke

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I think we would all agree that when we look at the priority harms set out in the Bill, women and girls are disproportionately the victims of those offences. The groups in society that the Bill will most help are women and girls in our community. I am happy to work with the hon. Lady and all hon. Members to look at what more we can do on this point, both during the passage of the Bill and in future, but as it stands the Bill is the biggest step forward in protecting women and girls, and all users online, that we have ever seen.

Alex Davies-Jones Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I am grateful to the Minister for the offer to work on that further, but we have an opportunity now to make real and lasting change. We talk about how we tackle this issue going forward. How can we solve the problem of violence against women and girls in our community? Three women a week are murdered at the hands of men in this country—that is shocking. How can we truly begin to tackle a culture change? This is how it starts. We have had enough of words. We have had enough of Ministers standing at the Dispatch Box saying, “This is how we are going to tackle violence against women and girls; this is our new plan to do it.” They have an opportunity to create a new law that makes it a priority harm, and that makes women and girls feel like they are being listened to, finally. I urge the Minister and Members in all parts of the House, who know that this is a chance for us finally to take that first step, to vote for new clause 3 today and make women and girls a priority by showing understanding that they receive a disproportionate level of abuse and harm online, and by making them a key component of the Bill.

David Davis Conservative, Haltemprice and Howden

I join everybody else in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend Damian Collins, to the Front Bench. He is astonishingly unusual in that he is both well-intentioned and well-informed, a combination we do not always find among Ministers.

I will speak to my amendments to the Bill. I am perfectly willing to be in a minority of one—one of my normal positions in this House. To be in a minority of one on the issue of free speech is an honourable place to be. I will start by saying that I think the Bill is fundamentally mis-designed. It should have been several Bills, not one. It is so complex that it is very difficult to forecast the consequences of what it sets out to do. It has the most fabulously virtuous aims, but unfortunately the way things will be done under it, with the use of Government organisations to make decisions that, properly, should be taken on the Floor of the House, is in my view misconceived.

We all want the internet to be safe. Right now, there are too many dangers online—we have been hearing about some of them from Alex Davies-Jones, who made a fabulous speech from the Opposition Front Bench—from videos propagating terror to posts promoting self-harm and suicide. But in its well-intentioned attempts to address those very real threats, the Bill could actually end up being the biggest accidental curtailment of free speech in modern history.

There are many reasons to be concerned about the Bill. Not all of them are to be dealt with in this part of the Report stage—some will be dealt with later—and I do not have time to mention them all. I will make one criticism of the handling of the Bill at this point. I have seen much smaller Bills have five days on Report in the past. This Bill demands more than two days. That was part of what I said in my point of order at the beginning.

One of the biggest problems is the “duties of care” that the Bill seeks to impose on social media firms to protect users from harmful content. That is a more subtle issue than the tabloid press have suggested. My hon. Friend Chris Philp, the previous Minister, made that point and I have some sympathy with him. I have spoken to representatives of many of the big social media firms, some of which cancelled me after speeches that I made at the Conservative party conference on vaccine passports. I was cancelled for 24 hours, which was an amusing process, and they put me back up as soon as they found out what they had done. Nevertheless, that demonstrated how delicate and sensitive this issue is. That was a clear suppression of free speech without any of the pressures that are addressed in the Bill.

When I spoke to the firms, they made it plain that they did not want the role of online policemen, and I sympathise with them, but that is what the Government are making them do. With the threat of huge fines and even prison sentences if they consistently fail to abide by any of the duties in the Bill—I am using words from the Bill—they will inevitably err on the side of censorship whenever they are in doubt. That is the side they will fall on.

Worryingly, the Bill targets not only illegal content, which we all want to tackle—indeed, some of the practice raised by the Opposition Front Bencher, the hon. Member for Pontypridd should simply be illegal full stop—but so-called “legal but harmful” content. Through clause 13, the Bill imposes duties on companies with respect to legal content that is “harmful to adults”. It is true that the Government have avoided using the phrase “legal but harmful” in the Bill, preferring “priority content”, but we should be clear about what that is.

The Bill’s factsheet, which is still on the Government’s website, states on page 1:

“The largest, highest-risk platforms will have to address named categories of legal but harmful material”.

This is not just a question of transparency—they will “have to” address that. It is simply unacceptable to target lawful speech in this way. The “Legal to Say, Legal to Type” campaign, led by Index on Censorship, sums up this point: it is both perverse and dangerous to allow speech in print but not online.

Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

As I said, a company may be asked to address this, which means that it has to set out what its policies are, how it would deal with that content and its terms of service. The Bill does not require a company to remove legal speech that it has no desire to remove. The regulator cannot insist on that, nor can the Government or the Bill. There is nothing to make legal speech online illegal.

David Davis Conservative, Haltemprice and Howden

That is exactly what the Minister said earlier and, indeed, said to me yesterday when we spoke about this issue. I do not deny that, but this line of argument ignores the unintended consequences that the Bill may have. Its stated aim is to achieve reductions in online harm, not just illegal content. Page 106 of the Government’s impact assessment lists a reduction in the prevalence of legal but harmful content as a “key evaluation” question. The Bill aims to reduce that—the Government say that both in the online guide and the impact assessment. The impact assessment states that an increase in “content moderation” is expected because of the Bill.

A further concern is that the large service providers already have terms and conditions that address so-called legal but harmful content. A duty to state those clearly and enforce them consistently risks legitimising and strengthening the application of those terms and conditions, possibly through automated scanning and removal. That is precisely what happened to me before the Bill was even dreamed of. That was done under an automated system, backed up by somebody in Florida, Manila or somewhere who decided that they did not like what I said. We have to bear in mind how cautious the companies will be. That is especially worrying because, as I said, providers will be under significant pressure from outside organisations to include restrictive terms and conditions. I say this to Conservative Members, and we have some very well-intentioned and very well-informed Members on these Benches: beware of the gamesmanship that will go on in future years in relation to this.

Ofcom and the Department see these measures as transparency measures—that is the line. Lord Michael Grade, who is an old friend of mine, came to see me and he talked about this not as a pressure, but as a transparency measure. However, these are actually pressure measures. If people are made to announce things and talk about them publicly, that is what they become.

It is worth noting that several free speech and privacy groups have expressed scepticism about the provisions, yet they were not called to give oral evidence in Committee. A lot of other people were, including pressure groups on the other side and the tech companies, which we cannot ignore, but free speech advocates were not.

The clause is also part of the Bill where real democratic scrutiny is missing. Without being too pious about this, the simple truth is that the comparative power of Parliament has diminished over the past decade or two, with respect to Government, and this is another example. The decision on what counts as

“priority content that is harmful to adults” will initially be made by the Secretary of State and then be subject to the draft affirmative procedure, in a whipped Statutory Instrument Committee. I have never, ever been asked to serve on an SI Committee and the Whips’ Office has never sought me to volunteer to do that—I wonder why. I hasten to add that I am not volunteering to do so now either, but the simple truth is this will be considered in a whipped, selected Committee. When we talked earlier about constraints on the power, we heard comments such as, “We will only do this in the case of security, and so on.” Heavens above, we have been through two years of ferocious controversy on matters of public health security. This is not something that should somehow be protected from free speech, I’m afraid.

We cannot allow such significant curtailments of free expression to take place without proper parliamentary debate or amendment. These questions need to be discussed and decided in the Chamber, if need be, annually. When I first came into the House of Commons, we had an annual Companies Act, because companies law and accounting law were going through change. They were not changing anything like as fast as the internet. The challenges were not coming up anything like as fast as they do with the internet, so why do we not have an annual Bill on this matter? I would be perfectly happy to see that, so that we can make decisions here. If we do not do that, we could do this on an ad hoc basis as the issues arise, including some that the hon. Member for Pontypridd raised. We could have been dealing with that before now on a simpler basis than that of the Bill.

If a category of speech is important enough to be censored, which is what we are really asking for, it is important enough to be debated in this Chamber and by the whole of Parliament—the Commons and the Lords. Otherwise, the Government’s claim that the Bill will protect free speech will appear absurd. My right hon. and learned Friend Sir Jeremy Wright has tabled some amendments relating to the press. Even there, it is incredibly difficult to get this right because of the sheer complexity of the Bill and the size of the problem that we are trying to address. That is why I have tabled amendment 151, which seeks to remove clause 13 entirely, because it introduces the authoritarian concept of “legal but harmful” content—decided by the Government. “Legal but harmful” is a perfectly reasonable concept, but if it is decided by the Government alone, that is authoritarian. This is described as “priority content”, but everybody knows what it actually means. The Government have run away from using “legal but harmful” in a public context, but they use it everywhere else.

My amendments are designed to protect free speech while making the internet a safer place for everyone. I do not want to see content relating to suicide, self-harm or abuse of women, or whatever it may be, and I tabled two amendments to make them explicitly illegal, and the House can decide on those. That is what we should do. That is where the power of the House and the proper judgment lies.

The Bill has significantly improved since it was in draft form and the new Minister has a very honourable history in that reform. I compliment and commend him on that and thank him for those actions. I also welcome the measures taken against such things as cyber-flashing, but more needs to be done. The Bill falls far short of what it needs to be, and it would be remiss of us, in our duty as MPs, to let it pass without serious alteration.

I say this to the Whip on the Front Bench, and I hope that I have his attention: the Bill needs many more days on Report. I hope that he will reflect that back to the Chief Whip at the end of this business, because only with more days can we get it right. This is probably one of the most important Bills to go through this House in this decade, and we have not quite got it right yet.

John Nicolson Shadow SNP Spokesperson (Digital, Culture, Media and Sport) 1:45, 12 July 2022

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to Chris Philp, who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.

But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.

The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing gifs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.

The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.

The reduction of ministerial powers under the Bill was advised by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.

Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:

“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”

The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.

We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can

“modify a draft of a code of practice”.

That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and by further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.

The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.

When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.

Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.

In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and said that the illegal immigration and prostitution priority offences might cover that already. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.

Chris Philp Conservative, Croydon South

It is a pleasure to follow John Nicolson.

Let me begin by repeating my earlier congratulations to my hon. Friend Damian Collins on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.

It would be strange if I did not broadly support the Government amendments, given that I have spent most of the last three or four months concocting them. I will touch on one or two of them, and then mention some areas in which I think the House might consider going further when the Bill proceeds to the other end of the building. I certainly welcome new clause 19, which gives specific protection to content generated by news media publishers by ensuring that there is a right of appeal before it can be removed. I take the view—and I think the Government do as well—that protecting freedom of the press is critical, but as we grant news media publishers this special protection, it is important for us to ensure that we are granting it to organisations that actually deserve it.

That, I think, is the purpose of amendments 11 and 12, tabled by my right hon. and learned Friend Sir Jeremy Wright. The amendments apply to clause 50, which defines the term “recognised news publisher”. During the evidence sessions in Committee, some concern was expressed that the definition was too wide, and that some organisations—“bad actors”, as the Minister put it—might manage to organise themselves in such a way that they would benefit from this exemption. My right hon. and learned Friend’s amendments are designed to tighten that definition a little bit. There is some concern that the drafting of the amendments might effectively give rise to back-door press regulation, because determining whether news publishers’ terms and conditions are “suitable and sufficient” constitutes a value judgment, but I certainly agree that clause 50 needs tightening up.

I welcome—unsurprisingly—the reference in the written ministerial statement to tabling an amendment in the House of Lords providing that sanctioned organisations cannot benefit from this exemption. I suggest, however, that their lordships might like to consider going even further, for example by saying that where content amounts to a foreign interference offence as defined by the National Security Bill, introduced by my hon. Friend Mr Jayawardena—the Under-Secretary of State for International Trade, who is in his place on the Front Bench—the organisation propagating it should not be able to benefit from the “recognised news publisher” exemption. Their lordships may wish to consider that, along with any other ideas for tightening the definition in clause 50.

Let me now say a word about free speech. It has been widely misreported that the Bill mandates censorship of speech that is legal but harmful. As I said in my intervention on the Minister earlier, that is categorically untrue. While the large social media platforms will have to address such content as part of their terms and conditions, they are not compelled to take any particular action in relation to it; they simply have to risk-assess it, adopt a policy—what that policy is will be up to them—and then apply that policy consistently. They are not obliged to take any action, and they are certainly not obliged to remove the content entirely. Lest there should be any doubt about that, Government amendment 71 to clause 13 makes it explicit that it is reasonable to take no action if the platform sees fit.

Joanna Cherry (Shadow SNP Spokesperson, Justice and Home Affairs) 2:00, 12 July 2022

I hear what the hon. Gentleman is saying, but he will have heard the speech made by his colleague, Mr Davis. Does he not accept that it is correct to say that there is a risk of an increase in content moderation, and does he therefore see the force of my amendment, which we have previously discussed privately and which is intended to ensure that Twitter and other online service providers are subject to anti-discrimination law in the United Kingdom under the Equality Act 2010?

Chris Philp (Conservative, Croydon South)

I did of course hear what was said by my right hon. Friend Mr Davis. To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins (Chair, Draft Online Safety Bill (Joint Committee); Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp (Conservative, Croydon South)

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins (Chair, Draft Online Safety Bill (Joint Committee); Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp (Conservative, Croydon South)

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, Alex Davies-Jones, about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged unreasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally shortly after my departure—that there will be amendments in the Lords to restrict the exercise of those powers by the Secretary of State to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well-defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas that I have just mentioned. I think that the narrowing is reasonable, for the reasons that I have set out.

Julian Knight (Chair, Culture, Media and Sport Committee; Chair, Culture, Media and Sport Sub-committee on Online Harms and Disinformation)

Those areas are still incredibly broad and open to interpretation. Would it not be easier just to remove the Secretary of State from the process and allow this place to take directly from Ofcom the code of standards that we are talking about so that it can be debated fully in the House?

Chris Philp (Conservative, Croydon South)

I understand my hon. Friend’s point. As Chairman of the Select Committee, he has done fantastic work in scrutinising the Bill. There might be circumstances where one needed to move quickly, which would make the parliamentary intervention he describes a little more difficult, but he makes his point well.

Chris Philp (Conservative, Croydon South)

The Government are often in possession of information—for example, security information relating to the UK intelligence community—that Ofcom, as the proposer of a code or a revised code, may not be in possession of. So the ability of the Secretary of State to propose amendments in those narrow fields, based on information that only the Government have access to, is not wholly unreasonable. My hon. Friend will obviously comment further on this in his speech, and no doubt the other place will give anxious scrutiny to the question as well.

I welcome the architecture in new clause 14 in so far as it relates to the definition of illegal content; that is a helpful clarification. I would also like to draw the House’s attention to amendment 16 to clause 9, which makes it clear that acts that are concerned with the commission of a criminal offence or the facilitation of a criminal offence will also trigger the definitions. That is a very welcome widening.

I do not want to try the House’s patience by making too long a speech, given how much the House has heard from me already on this topic, but there are two areas where, as far as I can see, there are no amendments down but which others who scrutinise this later, particularly in the other place, might want to consider. These are areas that I was minded to look at a bit more over the summer. No doubt it will be a relief to some people that I will not be around to do so. The first of the two areas that might bear more thought is clause 137, which talks about giving academic researchers access to social media platforms. I was struck by Frances Haugen’s evidence on this. The current approach in the Bill is for Ofcom to do a report that will take two years, and I wonder if there could be a way of speeding that up slightly.

The second area concerns the operation of algorithms promoting harmful content. There is of course a duty to consider how that operates, but when it comes to algorithms promoting harmful content, I wonder whether we could be a bit firmer in the way we treat that. I do not think that would restrain free speech, because the right of free speech is the right to say something; it is not the right to have an algorithm automatically promoting it. Again, Frances Haugen had some interesting comments on that.

Jeremy Wright (Conservative, Kenilworth and Southam)

I agree that there is scope for more to be done to enable those in academia and in broader civil society to understand more clearly what the harm landscape looks like. Does my hon. Friend agree that if they had access to the sort of information he is describing, we would be able to use their help to understand more fully and more clearly what we can do about those harms?

Chris Philp (Conservative, Croydon South)

My right hon. and learned Friend is right, as always. We can only expect Ofcom to do so much, and I think inviting expert academic researchers to look at this material would be welcome. There is already a mechanism in clause 137 to produce a report, but on reflection it might be possible to speed that up. Others who scrutinise the Bill may also reach that conclusion. It is important to think particularly about the operation of algorithmic promotion of harmful content, perhaps in a more prescriptive way than we do already. As I have said, Frances Haugen’s evidence to our Committee in this area was particularly compelling.

Damian Collins (Chair, Draft Online Safety Bill (Joint Committee); Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport) 2:15, 12 July 2022

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp (Conservative, Croydon South)

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins (Chair, Draft Online Safety Bill (Joint Committee); Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp (Conservative, Croydon South)

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but their policies can say whatever they like, as we discussed earlier. A policy could include actively promoting content that is harmful through algorithms, for commercial purposes. At the moment, the Bill as constructed gives them that freedom. I wonder whether that is an area that we can think about making slightly more prescriptive. Giving them the option to leave the content up there relates to the free speech point, and I accept that, but choosing to algorithmically promote it is slightly different. At the moment, they have the freedom to choose to algorithmically promote content that is toxic but falls just on the right side of legality. If they want to do that, that freedom is there, and I just wonder whether it should be. It is a difficult and complicated topic and we are not going to make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments but I am sure that, as the Bill completes its stages, in the other place as well, there will be opportunities to slightly fine-tune it that all of us can make a contribution to.

Margaret Hodge (Labour, Barking)

First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, Damian Collins. I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank Chris Philp. When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.

I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by Sir Jeremy Wright. I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, bringing forward direct liability so there is not a two-year wait, and epilepsy trolling—my hon. Friend Kim Leadbeater promoted that amendment.

I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by Andrew Percy and the right hon. and learned Member for Kenilworth and Southam—I think those amendments are on the amendment paper but it is difficult to tell. It is important that the onus on platforms to be subject to regulation should be based not on size and functionality but on risk of harm. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.

I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.

Equally, freedom of speech is very important, as Mr Davis said, but freedom of speech has never meant freedom to harm, which is not a right this House should promote. It is difficult to define, and it is difficult to get the parameters correct, but we should not think that freedom of speech is an absolute right without constraints.

Joanna Cherry (Shadow SNP Spokesperson, Justice and Home Affairs)

I agree with the right hon. Lady that freedom of speech is not absolute. As set out in article 10 of the European convention on human rights, there have to be checks and balances. Nevertheless, does she agree that freedom of speech is an important right that this House should promote, with the checks and balances set out in article 10 of the ECHR?

Margaret Hodge (Labour, Barking)

Absolutely. I very much welcome the hon. and learned Lady’s amendment, which clarifies the parameters under which freedom of speech can be protected and promoted.

Equally, freedom of speech does not mean freedom from consequences. The police and other enforcement agencies can pursue unlawful abuse, assuming they have the resources, which we have not discussed this afternoon. I know the platforms have committed to providing the finance for such resources, but I still question whether the resources are there.

The problem with the Bill and the Government amendments, particularly Government amendment 70, is that they weaken the platforms’ duty on legal but harmful abuse. Such abuse is mainly anonymous and the abusers are clever. They do not break the law; they avoid the law with the language they use. It might be best if I give an example. People do not say, in an antisemitic way, “I am going to kill all Jews.” We will not necessarily find that online, but we might find, “I am going to harm all globalists.” That is legal but harmful and has the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech that I am afraid the right hon. Member for Haltemprice and Howden is promoting, otherwise we will find that the Bill does not meet the purposes we all want.

Much of the abuse is anonymous. We do not know how much, but much of it is. When there was racist abuse at the Euros, Twitter claimed that 99% of postings of racist abuse were identifiable. Like the Minister, I wrote to Twitter to challenge that claim and found that Twitter was not willing to share its data with me, claiming GDPR constraints.

It is interesting that, in recent days, the papers have said that one reason Elon Musk has given for pulling out of his takeover is that he doubts Twitter’s claim that fake and spam accounts represent less than 5% of users. There is a lack of understanding and knowledge of the extent of anonymous abuse.

In the case I have shared with the Minister on other occasions, I received 90,000 posts in the two months from the publication of the Equality and Human Rights Commission report to the shenanigans about the position of the previous leader of the Labour party—from October to Christmas. The posts were monitored for me by the Community Security Trust. When I asked how many of the posts were anonymous, I was told that it had been unable to do that analysis. I wish there were the resources to do so, but I think most of the posts were anonymous and abusive.

There is certainly public support for trying to tackle abusive posts. A June 2021 YouGov poll found that 78% of the public are in favour of revealing the identity of those who post online, and we should bear that in mind. If people feel strongly about this, and the poll suggests that they do, we should respond and not put it to one side.

The Government have tried to tackle this with a compromise following the very good work by Siobhan Baillie. The Bill places a duty on the platforms to give users the option to verify their identity. If a user chooses to remain unverified, they may not be able to interact with verified accounts. Although I support the motives behind that amendment, I have concerns.

First, the platform itself would have to verify who holds the account, which gives the platforms unprecedented access to personal details. Following Cambridge Analytica, we know how such data can be abused. Data on 87 million identities was stolen, and we know it was used to influence the Trump election in 2016, and it may have been a factor in the Brexit referendum.

Secondly, the police have been very clear on how I should deal with anonymous online abuse. They say that the last thing I should do is remove it, as they need it to be able to judge whether there is a real threat within the abuse that they should take seriously. So individuals having that right does not diminish the real harm they could face if the online abuse is removed.

Thirdly, one of the problems with a lot of online abuse is not just that it is horrible or can be dangerous in particular circumstances, but that it prevents democracy. It inhibits freedom of speech by inhibiting engagement in free, democratic discourse. Online abuse is used to undermine an individual’s credibility. A lot of the abuse I receive seeks to undermine my credibility. It says that I am a bad woman, that I abuse children, that I break tax law and that I do this, that and the other. Building that picture of me as someone who cannot be believed undermines my ability to enter into legitimate democratic debate on issues I care about. Simply removing anonymous online abuse from my account does not stop the circulation of abusive, misleading content that undermines my democratic right to free speech. Therefore, in its own way, it undermines free speech.

Amendments 156 and 157, in my name and in the name of other colleagues, are based on a strong commitment to protecting anonymity, especially for vulnerable groups. We seek to tackle anonymous abuse not by denying anonymity but by ensuring traceability. It is quite simple. The Government recognise the feasibility and importance of that with age verification; they have now accepted the argument on age verification, and I urge them to take it further. Although I have heard that various groups are hostile to what we are suggesting, in a meeting I held last week with HOPE not hate there was agreement that what we are proposing made sense, and therefore we and the Government should pursue it.

Under our proposed scheme, any individual who chooses to go on a platform would have to have their identity verified, not by the platform but by a third party. We would thus remove the platform’s ability to access the individual’s data, which it could use in an inappropriate way. Such a scheme is perfectly feasible, particularly now that the Government have introduced the age verification mechanism. More than 99% of us have bank accounts, so there is a simple way of verifying someone’s identity through a third-party mechanism without giving platforms the powers I have described. Everybody would be able to enter any platform and have total anonymity, and only if and when an individual posts something that breaks the law will they lose their right to anonymity.

To go back to a point I made in an intervention on the Minister, that would also involve having minimum standards on harmful but legal abuse. Under a minimum standards platform, only if someone posted abuse that is harmful—this would mainly be illegal abuse, but it would also be harmful but legal abuse—would they lose their right to anonymity. I think that is good, because one could name and shame. Most importantly, this would be the most effective tool for preventing a lot of online abuse from happening in the first place, and we should all be focusing our energies on doing so.

My hon. Friend Alex Davies-Jones, our Front Bencher, has talked about women who are particularly vulnerable, and I think our measure would be very important—my experience justifies that. It would be a powerful deterrent. I hope that our Front-Bench team will support the proposition we are putting before the House. I will not press it to a vote if they do not, although I would regret the fact that they did not support it.

I regret that the Government do not feel able to support our proposition, but I think its time will come. A lot of the stuff that we are doing in this Bill is innovative, and we are not sure where everything will land. We are likely to get some things wrong and others right. I say to all Members, from across this House, that if we really want to reduce the amount of harmful abuse online, tackling anonymous abuse, rather than anonymity, must be central to our concerns. I urge my Front-Bench team and the Government to think carefully about this.

Nicholas Fletcher (Conservative, Don Valley) 2:30, 12 July 2022

I rise to speak on amendments 50, 51 and 55, and I share the free speech concerns that I think lie behind amendment 151. As I said in Committee to the previous Minister, my hon. Friend Chris Philp, who knew this Bill inside out—it was amazing to watch him do it—I have deep concerns about how the duty on “legal but harmful” content will affect freedom of speech. I do not want people to be prevented from saying what they think. I am known for saying what I think, and I believe others should be allowed the same freedom, offline and online. What is harmful can be a subjective question, and many of us in this House might have different answers. When we start talking about restricting content that is perfectly legal, we should be very careful.

This Bill is very complex and detailed, as I know full well, having been on the Committee. I support the Bill—it is needed—but when it comes to legal but harmful content, we need to make sure that free speech is given enough protection. We have to get the right balance, but clause 19 does not do that. It says only that social media companies have

“a duty to have regard to the importance of protecting users’ right to freedom of expression within the law.”

There is no duty to do anything about freedom of speech; it just says, “You have to think about the importance of it”. That is not enough.

I know that the Bill does not state that social media companies have to restrict content—I understand that—but in the real world that is what will happen. If the Government define certain content as harmful, no social media company will want to be associated with it. The likes of Meta will want to be seen to get tough on legally defined harmful content, so of course it will be taken down or restricted. We have to counterbalance that instinct by putting stronger free speech duties in the Bill if we insist on it covering legal but harmful.

The Government have said that we cannot have stronger free speech obligations on private companies, and, in general, I agree with that. However, this Bill puts all sorts of other obligations on Facebook, Twitter and Instagram, because they are not like other private companies. These companies and their chief executive officers are household names all around the world, and their power and influence are incredible. In 2021, Facebook’s revenue was $117 billion, which is higher than the GDP—

Andrew Percy (Conservative, Brigg and Goole)

Is that not exactly why there has to be action on legal but harmful content? The cross-boundary, cross-national powers of these organisations mean that we have to insist that they take action against harm, whether lawful or unlawful. We are simply asking those organisations to risk assess and ensure that appropriate warnings are provided, just as they are in respect of lots of harms in society; the Government require corporations and individuals to risk assess those harms and warn about them. The fact that these organisations are so transnational and huge is absolutely why we must require them to risk assess legal but harmful content.

Nicholas Fletcher (Conservative, Don Valley)

I understand what my hon. Friend is saying, but the list of what is legal but harmful will be set by the Secretary of State, not by Parliament. All we ask is for that to be discussed on the Floor of the House before we place those duties on the companies. That is all I am asking us to do.

Facebook has about 3 billion active users globally. That is more than double the population of China, the world’s most populous nation, and it is well over half the number of internet users in the entire world. These companies are unlike any others we have seen in history. For hundreds of millions of people around the world, they are the public square, which is how the companies have described themselves. Twitter founder Jack Dorsey said in 2018:

“We believe many people use Twitter as a digital public square. They gather from all around the world to see what’s happening, and have a conversation about what they see.”

In 2019, Mark Zuckerberg said:

“Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”

Someone who is blocked from these platforms is blocked from the public square, as we saw when the former President of the United States was blocked. Whatever we might think about Donald Trump, it cannot be right that he was banned from Twitter. We have to have stronger protection for free speech in the digital public square than clause 19 gives. The Bill gives the Secretary of State the power to define what is legal but harmful by regulations. As I have said, this is an area where free speech could easily be affected—

Adam Afriyie (Conservative, Windsor)

I commend my hon. Friend for the powerful speech he is making. It seems to many of us here that if anyone is going to be setting the law or a regulation, it should really be done in the Chamber of this House. I would be very happy if we had annual debates on what may be harmful but is currently lawful, in order to make it illegal. I very much concur with what he is saying.

Nicholas Fletcher (Conservative, Don Valley)

I thank my hon. Friend for his contribution, which deals with what I was going to finish with. It is not enough for the Secretary of State to have to consult Ofcom; there should be public consultation too. I support amendment 55, which my hon. Friend has tabled.

Anna McMorrin (Shadow Minister, Justice)

Not too long ago, the tech industry was widely looked up to and the internet was regarded as the way forward for democracy and freedoms. Today that is not the case. Every day we read headlines about data leaks, racist algorithms, online abuse, and social media platforms promoting, and becoming swamped in, misinformation, misogyny and hate. These problems are not simply the fault of those platforms and tech companies; they are the result of a failure to govern technology properly. That has resulted from years of muddled thinking and a failure to bring forward this Bill, and now, a failure to ensure that the Bill is robust enough.

Ministers have talked up the Bill, and I welcome the improvements that were made in Committee. Nevertheless, Ministers had over a decade in which to bring forward proposals, and in that time online crime exploded. Child sexual abuse online has become rife; the dark web provides a location for criminals to run rampant and scams are widespread.

Delay has also allowed disinformation to spread, including state-sponsored propaganda and disinformation, such as from Russia’s current regime. False claims and fake fact checks are going viral. That encourages other groups to adopt such tactics, in an attempt to undermine democracy, from covid deniers to climate change deniers—it is rampant.

Today I shall speak in support of new clause 3, to put violence against women and girls on the face of the Bill. As a female MP, I, along with my colleagues, have faced a torrent of abuse online, attacking me personally and professionally. I have been sent images such as that of a person with a noose around their neck, as well as numerous messages containing antisemitic and misogynistic abuse directed towards both me and my children. It is deeply disturbing, but also unsurprising, that one in five women across the country have been subjected to abuse; I would guess that that figure is actually much higher.

Joanna Cherry (Shadow SNP Spokesperson, Justice and Home Affairs)

I am really sorry to hear about the abuse that the hon. Lady and her family have received. Many women inside and without this Chamber, such as myself, receive terrible abuse on Twitter, including repeated threats to shoot us if we do not shut the f-u-c-k up. Twitter refuses to take down memes of a real human hand pointing a gun at me and other feminists and lesbians, telling us to shut the f-u-c-k up. Does she see the force of my amendment to ensure that Twitter apply its moderation policy evenly across society with regard to all protected characteristics, including sex?

Anna McMorrin (Shadow Minister, Justice)

The hon. and learned Lady makes a very good point, and that illustrates what I am talking about in my speech—the abuse that women face online. We need this legislation to ensure that tech companies take action.

There is a very dark side to the internet, deeply rooted in misogyny. The End Violence Against Women organisation released statistics last year, stating that 85% of women who experienced online abuse from a partner or ex-partner also experienced abuse offline. According to the latest Office for National Statistics figures, 92% of women who were killed in the year ending March 2021 were killed by men. Just yesterday, a woman was stabbed in the back by a male cyclist in east London, near to where Zara Aleena was murdered just two weeks ago. In 2021, nearly 41,000 women were victims of sexual assault—and those were just the ones who reported it. We know that the actual figure was very much higher. That was the highest number of sexual offences ever recorded within a 12-month period. It is highly unlikely that any of those women will ever see their perpetrator brought to justice, because of the current 1.3% prosecution rate for rape cases. Need I continue?

Violence against women and girls is an ever-growing epidemic, and time is running out. This Government are more concerned with piecemeal actions that fail to tackle the root causes of the issue. Although the introduction of new criminal offences such as cyber-flashing and rape threats is a welcome first step, there are significant concerns about their enforceability. The cyber-flashing offence requires the police to prove a perpetrator’s intent to cause harm, which is incredibly difficult to evidence. That is the loophole through which perpetrators avoid consequences.

I doubt there are many women who have not been sent unsolicited images of male genitals online. There are accounts of women being airdropped images on public transport while on their way to work. What does that leave them feeling? Violated—scared, not knowing who in their train carriage or on their bus has sent those unsolicited images. The online dating platform Bumble conducted research on cyber-flashing and found that of its users, nearly half of women aged 18 to 24 had received a sexual photo that they did not ask for in the last year alone.

So, considering the scale of this issue and the Government’s appalling record on prosecuting sexual assault offences, why would this new offence be any different? Acts of violence towards women are not merely isolated incidents. We know that, unfortunately, there is systemic misogyny within our society that results in a shocking number of women losing their lives, but we refuse to see it. Failing to name violence against women and girls on the face of the Bill is putting the lives of countless women at risk, and will leave behind a dangerous and damning legacy, even by this Government’s standards.

I welcome the new Minister to his place and hope that he will look at this issue in a new light. I hope that the Government can put politics to one side for just a moment, match their words with deeds and commit to protecting women across the country by supporting new clause 3.

Several hon. Members:

rose—

Eleanor Laing (Deputy Speaker and Chairman of Ways and Means; Chair, Standing Orders Committee (Commons))

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.

Jeremy Wright (Conservative, Kenilworth and Southam)

I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.

On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.

As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding

“the benefits to children’s well-being” of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.

Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.

It is worth having look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have

“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”

So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states that:

“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is” either

“news publisher content in relation to that service”— the definition of which I will return to—

“or…regulated user-generated content in relation to that service”.

That is the crucial point. The content also has to be

“generated for the purposes of journalism” and be linked to the UK.

The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.

That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.

I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens the question whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.

The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice

“for reasons of public policy” is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.

I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend Chris Philp. First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase

“for reasons of public policy” with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend Julian Knight that that is still too broad. The proposed list comprises

“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]

The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that there needs to be more to further constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.

Madam Deputy Speaker, may I also say something very briefly about new clause 14? This is the Government’s additional new clause, which is designed to assist platforms in understanding some of the judgments that they have to make and how to make them, particularly in relation to illegal content. When people first look at this Bill, they will assume that everyone knows what illegal content is and therefore it should be easy to identify and take it down, or take the appropriate action to avoid its promotion. But, as new clause 14 makes clear, what the platform has to do is not just identify content but have reasonable grounds to infer that all elements of an offence, including the mental elements, are present or satisfied, and, indeed, that the platform does not have reasonable grounds to infer that the defence to the offence may be successfully relied upon. That is right, of course, because criminal offences very often are not committed just by the fact of a piece of content; they may also require an intent, or a particular mental state, and they may require that the individual accused of that offence does not have a proper defence to it. The question of course is how on earth a platform is supposed to know either of those two things in each case. This is helpful guidance, but the Government will have to think carefully about what further guidance they will need to give—or Ofcom will need to give—in order to help a platform to make those very difficult judgments.

Julian Knight (Chair, Culture, Media and Sport Committee; Chair, Culture, Media and Sport Sub-committee on Online Harms and Disinformation) 3:00, 12 July 2022

Although this is not contained within these measures, it pertains to them. Does my right hon. and learned Friend agree that, down the line, Ofcom will want to look at a regime of compliance officers in order to give the guidance that he seeks?

Jeremy Wright (Conservative, Kenilworth and Southam)

Yes, that is a possible way forward. Ofcom will need to produce a code of practice in this area. I am sure my hon. Friend on the Front Bench will say that that is a suitable way to deal with the problem that I have identified. It may well be, but at this stage, it is right for the House to recognise that the drafting of the Bill at the moment seeks to offer support to platforms, for which I am sure they will be grateful, but it will need to offer some more in order to allow these judgments to be made.

I restate the point that I have made in previous debates on this subject: there is little point in this House passing legislation aimed at making the internet a safer place if the legislation does not work as intended. If our regime does not work, we will keep not a single person any safer. It is important, therefore, that we think about this Bill not in its overarching statements and principles but, particularly at this stage of consideration, in terms of how it will actually work.

You will not find a bigger supporter of the Bill in this House than me, Madam Deputy Speaker, but I want to see it work well and be effective. That means that some of the problems that I am highlighting must be addressed. Because humility is a good way to approach debates on something as ground-breaking and complex as this, I do not pretend that I have all the right answers. These amendments have been tabled because the Bill as it stands does not quite yet do the job that we want it to do. It is a good Bill—it needs to pass—but it can be better, and I very much hope that this process will improve it.

Joanna Cherry (Shadow SNP Spokesperson, Justice and Home Affairs)

I rise to speak to new clause 24 and amendments 193 and 191 tabled in my name. I also want to specifically give my support to new clause 6 and amendments 33 and 34 in the name of Dame Diana Johnson.

The purpose of my amendments, as I have indicated in a number of interventions, is to ensure that, when moderating content, category 1 service providers such as Twitter abide by the anti-discrimination law of our domestic legal systems—that is to say the duties set out in the Equality Act 2010 not to discriminate against, harass or victimise their users on the grounds of a protected characteristic.

I quickly want to say a preliminary word about the Bill. Like all responsible MPs, I recognise the growing concern about online harms, and the need to protect service users, especially children, from harmful and illegal content online. That said, the House of Lords’ Communications and Digital Committee was correct to note that the internet is not currently the unregulated Wild West that some people say it is, and that civil and criminal law already applies to activities online as well as offline.

The duty of care, which the Bill seeks to impose on online services, will be a significant departure from existing legislation regulating online content. It will allow for a more preventative approach to regulating illegal online content and will form part of a unified regulatory framework applying to a wide range of online services. I welcome the benefits that this would represent, especially with respect to preventing the proliferation of child sexual and emotional abuse online.

Before I became an MP, I worked for a number of years as a specialist sex crimes prosecutor, so I am all too aware of how children are targeted online. Sadly, there are far too many people in our society, often hiding in plain sight, who seek to exploit children. I must emphasise that child safeguarding should be a No. 1 priority for any Government. In so far as this Bill does that, I applaud it. However, I do have some concerns that there is a significant risk that the Bill will lead to censorship of legal speech by online platforms. For the reasons that were set out by Mr Davis, I am also a bit worried that it will give the Government unacceptable controls over what we can and cannot say online, so I am keen to support any amendments that would ameliorate those aspects of the Bill. I say this to those Members around the Chamber who might be looking puzzled: make no mistake, when the Bill gives greater power to online service providers to regulate content, there is a very real risk that they will be lobbied by certain groups to regulate what is actually legal free speech by other groups. That is partly what my amendment is designed to avoid.

Photo of Jeremy Wright Jeremy Wright Conservative, Kenilworth and Southam

What the hon. and learned Lady says is sensible, but does she accept—this is a point the Minister made earlier—that, at the moment, the platforms have almost unfettered control over what they take down and what they leave up? What this Bill does is present a framework for the balancing exercise that they ought to apply in making those decisions.

Photo of Joanna Cherry Joanna Cherry Shadow SNP Spokesperson (Justice and Home Affairs)

That is why I am giving the Bill a cautious welcome, but I still stand by my very legitimate concerns about the chilling effect of aspects of this Bill. I will give some examples in a moment about the problems that have arisen when organisations such as Twitter are left to their own devices on their moderation of content policy.

As all hon. Members will be aware, under the Equality Act there are a number of protected characteristics. These include: age; gender reassignment; being married or in a civil partnership; being pregnant or on maternity leave; disability; race, including colour, nationality, ethnic or national origin; religion or belief; sex and sexual orientation. It is against the law to discriminate, victimise or harass anyone because of any of those protected characteristics, but Twitter does discriminate against some of the protected characteristics. It often discriminates against women in the way that I described in an intervention earlier. It takes down expressions of feminist belief, but refuses to take down expressions of the utmost violent intent against women. It also discriminates against women who hold gender-critical beliefs. I remind hon. Members that, in terms of the Employment Appeal Tribunal’s decision in the case of Maya Forstater, the belief that sex matters is worthy of respect in a democratic society and, under the Equality Act, people cannot lawfully discriminate against women, or indeed men, who hold those views.

Twitter also sometimes discriminates against lesbians, gay men and bisexual people who assert that their sexual orientation is on the basis of sex, not gender, despite the fact that same-sex orientation, such as I hold, is a protected characteristic under the Equality Act.

At present, Twitter claims not to be covered by the Equality Act. I have seen correspondence from its lawyers that sets out the purported basis for that claim, partly under reference to schedule 25 to the Equality Act, and partly because it says:

“Twitter UK is included in an Irish Company and is incorporated in the Republic of Ireland. It does pursue economic activity through a fixed establishment in the UK but that relates to income through sales and marketing with the main activity being routed through Ireland.”

I very much doubt whether that would stand up in court, since Twitter is clearly providing a service in the United Kingdom, but it would be good if we took the opportunity of this Bill to clarify that the Equality Act applies to Twitter, so that when it applies moderation of content under the Bill, it will not discriminate against any of the protected characteristics.

The Joint Committee on Human Rights, of which I am currently the acting Chair, looked at this three years ago. We had a Twitter executive before our Committee and I questioned her at length about some of the content that Twitter was content to support in relation to violent threats against women and girls and, on the other hand, some of the content that Twitter took down because it did not like the expression of certain beliefs by feminists or lesbians.

We discovered on the Joint Committee on Human Rights that Twitter’s hateful conduct policy does not include sex as a protected characteristic. It does not reflect the domestic law of the United Kingdom in relation to anti-discrimination law. Back in October 2019, in the Committee’s report on democracy, freedom of expression and freedom of association, we recommended that Twitter should include sex as a protected characteristic in its hateful conduct policy, but Twitter has not done that. It seems Twitter thinks it is above the domestic law of the United Kingdom when it comes to anti-discrimination.

At that Committee, the Twitter executive assured me that certain violent memes that often appear on Twitter directed against women such as me and against many feminists in the United Kingdom, threatening us with death by shooting, should be removed. However, just in the past 48 hours I have seen an example of Twitter’s refusing to remove that meme. Colleagues should be assured that there is a problem here, and I would like us to direct our minds to it, as the Bill gives us an opportunity to do.

Whether or not Twitter is correctly praying in aid the loophole it says there is in the Equality Act—I think that is questionable—the Bill gives us the perfect opportunity to clarify matters. Clause 3 clearly brings Twitter and other online service providers within the regulatory scheme of the Bill as a service with

“a significant number of United Kingdom users”.

The Bill squarely recognises that Twitter provides a service in the United Kingdom to UK users, so it is only a very small step to amend the Bill to make it absolutely clear that when it does so it should be subject to the Equality Act. That is what my new clause 24 seeks to do.

I have also tabled amendments 193 and 191 to ensure that Twitter and other online platforms obey non-discrimination law regarding Ofcom’s production of codes of practice and guidance. The purpose of those amendments is to ensure that Ofcom consults with persons who have expertise in the Equality Act before producing those codes of practice and guidance.

I will not push my new clause and amendments to a vote. I had a very productive meeting with the Minister’s predecessor, Chris Philp, who expressed a great deal of sympathy when I explained the position to him. I have been encouraged by the cross-party support for these proposals, both in discussions before today with Members from all parties and in some of the comments made by various hon. Members today.

I am really hoping that the Government will take my new clauses away and give them very serious consideration, that they will look at the Joint Committee’s report from October 2019 and that either they will adopt these amendments or perhaps somebody else will take them forward in the other place.

Photo of Damian Collins Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I can assure the hon. and learned Lady that I am happy to carry on the dialogue that she had with my predecessor and meet her to discuss this at a further date.

Photo of Joanna Cherry Joanna Cherry Shadow SNP Spokesperson (Justice and Home Affairs)

I am delighted to hear that. I must tell the Minister that I have had a huge number of approaches from women, from lesbians and from gay men across the United Kingdom who are suffering as a result of Twitter’s moderation policy. There is a lot of support for new clause 24.

Of course, it is important to remember that the Equality Act protects everyone. Gender reassignment is there with the protected characteristics of sex and sexual orientation. It is really not acceptable for a company such as Twitter, which provides a service in the United Kingdom, to seek to flout and ignore the provisions of our domestic law on anti-discrimination. I am grateful to the Minister for the interest he has shown and for his undertaking to meet me, and I will leave it at that for now.

Photo of Julian Knight Julian Knight Chair, Culture, Media and Sport Committee, Chair, Culture, Media and Sport Sub-committee on Online Harms and Disinformation 3:15, 12 July 2022

We live in the strangest of times, and the evidence of that is that my hon. Friend Damian Collins, who has knowledge second to none in this area, has ended up in charge of it. I have rarely seen such an occurrence. I hope he is able to have a long and happy tenure and that the blob does not discover that he knows what he is doing.

I backed the Bill on Second Reading and I will continue to back it. I support most of the content within it and, before I move on to speak to the amendments I have tabled, I want to thank the Government for listening to the recommendations of the Digital, Culture, Media and Sport Committee, which I chair. The Government have accepted eight of the Committee’s key recommendations, demonstrating that the Committee is best placed to provide Parliamentary scrutiny of DCMS Bills as they pass through this House and after they are enacted.

I also pay tribute to the work of the Joint Committee on the draft Bill, which my hon. Friend the Member for Folkestone and Hythe chaired, and the Public Bill Committee, which has improved this piece of legislation during its consideration. The Government have rightfully listened to the Select Committee’s established view that it would be inappropriate to establish a permanent joint committee on digital regulation. I also welcome the news that the Government are set to bring forward amendments in the House of Lords to legislate for a new criminal offence for epilepsy trolling, which was recommended by both the Joint Committee and the Select Committee.

That said, the Digital, Culture, Media and Sport Committee continues to have concerns around some aspects of the Bill, particularly the lack of provision for funding digital literacy, a key area where we are falling behind and need to make some progress. However, my primary concern and that of my colleagues on the Committee relates to the powers within this Bill that would, in effect, give the Secretary of State the opportunity to interfere with Ofcom’s role in the issuing of codes of practice to service providers.

It is for that reason that I speak to amendments 44 to 46 standing in my name on the amendment paper. Clause 40, in my view, gives the Secretary of State unprecedented powers and would bring into question the future integrity of Ofcom itself. Removing the ability to exercise those powers in clause 39 would mean we could lose clauses 40 and 41, which outline the powers granted and how they would be sent to the House for consideration.

Presently, Ofcom sets out codes of practice under which,

“companies can compete fairly, and businesses and customers benefit from the choice of a broad range of services”.

Under this Bill Ofcom, which, I remind the House, is an independent media regulator, will be required to issue codes of practice to service providers, for example codes outlining measures that would enable services to comply with duties to mitigate the presence of harmful content.

Currently, codes of practice from Ofcom are presented to the House for consideration “as soon as practicable”, something I support. My concern is the powers given in this Bill that allow the Secretary of State to reject the draft codes of practice and to send them back to Ofcom before this House knows the recommendations exist, let alone having a chance to consider or debate them.

I listened with interest to my hon. Friend Chris Philp, who is not in his place but who was a very fine Minister during his time in the Department. To answer his query on the written ministerial statement and the letter written to my Committee on this matter, I say to him and to those on the Front Bench that if the Government disagree with what Ofcom is saying, they can bring the matter to the House and explain that disagreement. That would allow things to be entirely transparent and open, allow greater scrutiny rather than less, and allow for less delay than would be the case if there is forever that ping-pong between the Secretary of State and Ofcom until it gets its work right.

I want to make it clear that the DCMS Committee and I believe that this is nothing more than a power grab by the Executive. I am proud that in western Europe we have a free press without any interference from Government, and I believe that the Bill, if constituted in this particular form, has the potential to damage that relationship—I say potential, because I do not believe that is the intention of what is being proposed here, but there is the potential for the Bill to jeopardise that relationship in the long term. That is why I hope that Members will consider supporting my amendments, and I will outline why they should do so.

As William Perrin, a trustee of the Carnegie Trust UK, made clear in evidence to my Committee,

“the underpinning convention of regulation of media in Western Europe is that there is an independent regulator and the Executive does not interfere in their day to day decision-making for very good reason.” Likewise, Dr Edina Harbinja, a senior lecturer at Aston University, raised concerns that the Bill made her

“fear that Ofcom’s independence may be compromised” and that

“similar powers are creeping into other law reform pieces and proposals, such as…data protection”.

My amendments seek to cut red tape, bureaucracy and endless recurring loops that in some cases may result in significant delays in Ofcom managing to get some codes of practice approved. The amendments will allow the codes to come directly to this House for consideration by Members without another level of direct interference from the Secretary of State. Let me make it very clear that this is not a comment on any Secretary of State, at any time in the past, but in some of these cases I expect that Ofcom will require a speedy turnaround to get these codes of practice approved—for instance, measures that it wishes to bring forward to better safeguard children online. In addition, the Secretary of State has continually made it clear in our Select Committee hearings that she is a great supporter of more parliamentary scrutiny. I therefore hope that the Government will support my amendment so that we do not end up in a position where future Secretaries of State could potentially prevent draft codes coming before the House due to endless delays and recurring loops.

I also want to make it abundantly clear that my amendment does not seek to prevent the Secretary of State from having any involvement in the formulation of new codes of practice from Ofcom. Indeed, as Ofcom has rightly pointed out, the Secretary of State is already a statutory consultee when Ofcom wishes to draft new codes of practice or amend those that already exist. She can also, every three years, set out guidelines that Ofcom would have to follow when creating such codes of practice. The Government therefore already play a crucial role in influencing the genesis and the direction of travel in this area.

On Friday the Secretary of State wrote to my office outlining some of the concerns shared by Members of this House and providing steps on how her Department would address those concerns. In her letter, she recognises that the unprecedented powers awarded to the Secretary of State are of great concern to Members and goes on to state that

“regulatory independence is vital to the success of the framework”.

I have been informed that in order to appease some of these concerned Members, the Government intend to bring forward amendments around the definitions of “exceptional circumstances” and “public policy”, as referenced earlier. These definitions, including “economic policy” and “business interests”, are so broad that I cannot think of anything that would not be covered by these exceptional circumstances.

If the Secretary of State accepts our legitimate concerns, surely Ministers should accept my amendments becoming part of the Bill today, leaving a cleaner process rather than an increasingly complex system of unscrutinised ministerial interference with the regulator. The DCMS Committee and I are very clear that clause 40 represents a power grab by the Government that potentially threatens the independence of Ofcom, which is a fundamental principle of ensuring freedom of speech and what should be a key component of this legislation. The Government must maintain their approach to ensuring independent, effective, and trustworthy regulation.

I will not press my amendments to a vote, but I hope my concerns will spark not just thoughts and further engagement from Ministers but legislative action in another place as the Bill progresses, because I really do think that this could hole the Bill under the waterline and has the potential for real harm to our democratic way of life going forward as we tackle this whole new area.

Photo of Kevan Jones Kevan Jones Labour, North Durham

I rise to speak to my new clause 8, which would place a duty on all internet site providers regulated by this Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain disclaimers as to health risks of the procedure or include certified service quality indicators.

I have been campaigning for a number of years for better regulation of the non-surgical and cosmetic surgery industry, which is frankly a wild west in terms of lack of regulation, only made worse by the internet. I pay tribute to my constituent Dawn Knight, who has been a fierce campaigner in this area. We are slowly making progress. I thank the former Health Minister, Edward Argar, for his work in bringing amendments on licensing to the Bill that became the Health and Care Act 2022. That is now out for consultation. It is a first, welcome step in legislation to tame the wild west that is the cosmetic surgery sector. My amendment would enhance and run parallel to that piece of legislation.

Back in 2013, Sir Bruce Keogh first raised the issue of advertising in his recommendations on regulation of the cosmetic surgery industry, saying that cosmetic and aesthetic procedures adverts should be provided with a disclaimer or kitemark in a manner similar to that around alcohol or gambling regulation. Years ago, adverts were in newspapers and magazines. Now, increasingly, the sector’s main source of advertising revenue is the internet.

People will say, “Why does this matter?” Well, it links to some of the other things that have been raised in this debate. The first is safety. We do not have any data, for which I have been calling for a while, on how many surgical and non-surgical aesthetic procedures in the UK go wrong, but I know who picks up the tab for it—it is us as taxpayers, as the NHS has to put a lot of those procedures right. The horrendous cases that I have seen over the years provide just cause for why people need to be in full control of the facts before they undertake these procedures.

This is a boom industry. It is one where decisions on whether to go ahead with a procedure are not usually made with full information on the potential risks. It is sold, certainly online, as something similar to buying any other service. As we all know, any medical procedure has health risks connected to it, and people should be made aware of them in the adverts that are now online. I have tried writing to Facebook and others to warn them about some of the more spurious claims that some of the providers are making, but have never got a reply from Facebook. This is about patient safety. My amendment would ensure that these adverts at least raise in people’s minds the fact that there is a health risk to these procedures.

Again, people will say, “Why does this matter?” Well, the target for this sector is young people. As I said, a few years ago these adverts were in newspapers and magazines; now they are on Facebook, Twitter, Instagram and so on, and we know what they are selling: they are bombarding young people with the perfect body image.

We only have to look at the Mental Health Foundation’s report on this subject to see the effect the industry is having on young people, with 37% feeling upset and 31% feeling ashamed of their own body image. That is causing anxiety and mental health problems, but it is also forcing some people to go down the route of cosmetic surgery—both surgical and non-surgical—when there is nothing wrong with their body. It is the images, often photoshopped and sadly promoted by certain celebrities, that force them down that route.

Someone has asked me before, “Do you want to close down the cosmetic surgery industry?” I am clear that I do not; what I want is for anyone going forward for these procedures to be in full control of the facts. Personally, if I had a blank sheet of paper, I would say that people should have mental health assessments before they undertake these procedures. If we had a kitemark on adverts, as Sir Bruce Keogh recommended, or something that actually said, “This is not like buying any other service. This is a medical procedure that could go wrong”, people would be in full awareness of the facts before they went forward.

It is a modest proposal for the Bill, but it could have a major impact on the industry out there at the moment, which for many years has been completely unregulated. I do not propose pressing my new clause to a vote, but will the Minister work with his Department of Health and Social Care colleagues? Following the Health and Care Act 2022, there is a consultation on the regulations, and we could make a real difference for those I am worried about and concerned for—the more and more young people who are being bombarded with these adverts. In some cases, dangerous and potentially life-threatening procedures are being sold to them as if they are just like any other service, and they are not.

Photo of Damian Collins Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport 3:30, 12 July 2022

The right hon. Gentleman makes a very important point and, as he knows, there is a wider ongoing Government review related to advertising online, which is a very serious issue. I assure him that we will follow up with colleagues in the Department of Health and Social Care to discuss the points he has raised.

Photo of Kevan Jones Kevan Jones Labour, North Durham

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

Photo of Damian Hinds Damian Hinds Conservative, East Hampshire

May I join others in welcoming my hon. Friend Damian Collins to his place on the Front Bench? He brings a considerable amount of expertise. I also, although it is a shame he is not here to hear me say nice things about him, pay tribute, as others have, to my hon. Friend Chris Philp. I had the opportunity to work with him, his wonderful team of officials and wonderful officials at the Home Office on some aspects of this Bill, and it was a great pleasure to do so. As we saw again today, his passion for this subject is matched only by his grasp of its fine detail.

I particularly echo what my hon. Friend said about algorithmic promotion, because if we address that, alongside what the Government have rightly done on ID verification options and user empowerment, we would address some of the core wiring and underpinnings at an even more elemental level of online harm.

I want to talk about two subjects briefly. One is fraud, and the other is disinformation. Opposition amendment 20 refers to disinformation, but that amendment is not necessary because of the amendments that the Government are bringing to the National Security Bill to address state-sponsored disinformation. I refer the House in particular to Government amendment 9 to that Bill. That in turn amends this Bill—it is the link, or so-called bridge, between the two. Disinformation is a core part of state threat activity and it is one of the most disturbing, because it can be done at huge volume and at very low cost, and it can be quite hard to detect. When someone has learned how to change the way people think, that makes that part of their weaponry look incredibly valuable to them.

We often talk about this in the context of elections. I think we are actually pretty good—when I say “we”, I mean our country, some other countries and even the platforms themselves—at addressing disinformation in the context of the elections themselves: the process of voting, eligibility to vote and so on. However, first, that is often not the purpose of disinformation at election time and, secondly, most disinformation occurs outside election times. Although our focus on interference with the democratic process is naturally heightened coming up to big democratic events, it is actually a 365-day-a-year activity.

There are multiple reasons and multiple modes for foreign states to engage in that activity. In fact, in many ways, the word “disinformation” is a bit unsatisfactory because a much wider set of things comes under the heading of information operations. That can range from simple untruths to trying to sow many different versions of an event, particularly a foreign policy or wartime event, to confuse the audience, who are left thinking, “Oh well, whatever story I’m being told by the BBC, my newspaper, or whatever it is, they are all much of a muchness.” Those states are competing for truth, even though in reality, of course, there is one truth. Sometimes the aim is to big up their own country, or to undermine faith in a democracy like ours, or the effectiveness of free societies.

Probably the biggest category of information operations is when there is not a particular line to push at all, but rather the disinformer is seeking to sow division or deepen division in our society, often by telling people things that they already believe, but more loudly and more aggressively to try to make them dislike some other group in society more. The purpose, ultimately, is to destabilise a free and open society such as ours and that has a cancerous effect. We talk sometimes of disinformation being spread by foreign states. Actually, it is not spread by foreign states; it is seeded by foreign states and then spread usually by people here. So they create these fake personas to plant ideas and then other people, seeing those messages and personas, unwittingly pick them up and pass them on themselves. It is incredibly important that we tackle that for the health of our democracy and our society.

The other point I want to mention briefly relates to fraud and the SNP amendments in the following group, but also Government new clause 14 in this group. I strongly support what the Government have done, during the shaping of the Bill, on fraud; there have been three key changes on fraud. The first was to bring user-generated content fraud into the scope of the Bill. That is very important for a particularly wicked form of fraud known as romance fraud. The second was to bring fraudulent advertising into scope, which is particularly important for categories of fraud such as investment fraud and e-commerce. The third big change was to make fraud a priority offence in the Bill, meaning that it is the responsibility of the platforms not just to remove that content when they are made aware of it, but to make strenuous efforts to try to stop it appearing in front of their users in the first place. Those are three big changes that I greatly welcome.

There are three further things I think the Government will need to do on fraud. First, there is a lot of fraudulent content beyond categories 1 and 2A as defined in the Online Safety Bill, so we are going to have to find ways—proportionate ways—to make sure that that fraudulent content is suppressed when it appears elsewhere, but without putting great burdens on the operators of all manner of community websites, village newsletters and so on. That is where the DCMS online advertising programme has an incredibly important part to play.

The second thing is about the huge variety of channels and products. Telecommunications are obviously important, alongside online content, but even within online, as the so-called metaverse develops further, with the internet of things and the massive potential for defrauding people through deep fakes and so on, we need to be one step ahead of these technologies. I hope that in DCMS my hon. Friends will look to create a future threats unit that seeks to do that.

Thirdly, we need to make sure everybody’s incentives are aligned on fraud. At present, the banks reimburse people who are defrauded and I hope that rate of reimbursement will shortly be increasing. They are not the only ones involved in the chain that leads to people being defrauded and often they are not the primary part of that chain. It is only right and fair, as well as economically efficient, to make sure the other parts of the chain that are involved share in that responsibility. The Bill makes sure their incentives are aligned because they have to take proportionate steps to stop fraudulent content appearing in front of customers, but we need to look at how we can sharpen that up to make sure everybody’s incentives are absolutely as one.

This is an incredibly important Bill. It has been a long time coming and I congratulate everybody, starting with my right hon. and learned Friend Sir Jeremy Wright, my hon. Friend Chris Philp and others who have been closely involved in creating it. I wish my hon. Friend the Minister the best of luck.

Several hon. Members:

rose—

Photo of Nigel Evans Nigel Evans Deputy Speaker (Second Deputy Chairman of Ways and Means)

We will now introduce a six-minute limit on speeches. It may come down but, if Members can take less than six minutes, please do so. I intend to call the Minister at 4.20 pm.

Photo of Jamie Stone Jamie Stone Liberal Democrat Spokesperson (Armed Forces), Liberal Democrat Spokesperson (Digital, Culture, Media and Sport)

May I, on behalf of my party, welcome the Minister to his place?

I have been reflecting on the contributions made so far and why we are here. I am here because I know of a female parliamentary candidate who pulled out of that process because of the online abuse. I also know of somebody not in my party—it would be unfair to name her or her party—who stood down from public life in Scotland mostly because of online abuse. This is something that threatens democracy, which we surely hold most dear.

Most of us are in favour of the Bill. It is high time that we had legislation that keeps users safe online, tackles illegal content and seeks to protect freedom of speech, while also enforcing the regulation of online spaces. It is clear to me from the myriad amendments that the Bill as it currently stands is not complete and does not go far enough. That is self-evident. It is a little vague on some issues.

I have tabled two amendments, one of which has already been mentioned and is on media literacy. My party and I believe Ofcom should have a duty to promote and improve the media literacy of the public in relation to regulated user-to-user services and search services. That was originally in the Bill but it has gone. Media literacy is mentioned only in the context of risk assessments. There is no active requirement for internet companies to promote media literacy.

The pandemic proved that a level of skill is needed to navigate the online world. I offer myself as an example. The people who help me out in my office here and in my constituency are repeatedly telling me what I can and cannot do and keeping me right. I am of a certain age, but that shows where education is necessary.

My second amendment is on end-to-end encryption. I do not want anything in this Bill to prevent providers of online services from protecting their users’ privacy through end-to-end encryption. It does provide protection to individuals, and if it is circumvented or broken, criminals and hostile foreign states can breach security. Privacy means security.

There are also concerns about the use of the word “harm” in the Bill. It remains vague and threatens to capture a lot of unintended content. I look forward to seeing what comes forward from the Government on that front. It focuses too much on content as opposed to activity and system design. Regulation of social media must respect the rights to privacy and free expression of those who use it. However, as Dame Margaret Hodge said, that does not mean a laissez-faire approach: bullying and abuse prevent people from expressing themselves and must at all costs be stamped out, not least because of the two examples I mentioned at the start of my contribution.

As I have said before, the provisions on press exemption are poorly drafted. Under the current plans, the Russian propaganda channel Russia Today, on which I have said quite a bit in this place in the past, would qualify as a recognised news publisher and would therefore be exempt from regulation. That cannot be right. It is the same news channel that had its licence revoked by Ofcom.

I will help you by being reasonably brief, Mr Deputy Speaker, and conclude by saying that as many Members have said, the nature of the Bill means that the Secretary of State will have unprecedented powers to decide crucial legislation later. I speak—I will say it again—as a former chair of the Scottish Parliament’s statutory instruments committee, so I know from my own experience that all too often, instruments that have far-reaching effects are not given the consideration in this place that they should receive. Such instruments should be debated by the rest of us in the Commons.

As I said at the beginning of my speech, the myriad amendments to the Bill make it clear that the rest of us are not willing to allow it to remain so inherently undemocratic. We are going in the right direction, but a lot can be done to improve it. I wait with great interest to see how the Minister responds and what is forthcoming in the period ahead.

Photo of Andrew Percy Andrew Percy Conservative, Brigg and Goole 3:45, 12 July 2022

This has been an interesting debate on a Bill I have followed closely. I have been particularly struck by some of the arguments that claim the Bill is an attack on freedom of speech. I always listen intently to my right hon. Friend Mr Davis and to Joanna Cherry, but I think they are wrong in the conclusions they have reached about legal but harmful content. Indeed, many of the criticisms that the hon. and learned Member for Edinburgh South West made of the various platforms were criticisms of the present situation, and that is exactly why I think this legislation will improve the position. However, those Members raised important points that I am sure will be responded to. I have also been a strong advocate of the inclusion of small but high-harm platforms, as the Minister and the shadow Minister, Alex Davies-Jones, both know—we have all had those discussions.

In the time I have, I want to focus principally on the issue of search and on new clauses 9 and 10, which stand in my name. As the shadow Minister has highlighted, last week we were—like many people in this place, perhaps—sent the most remarkable online prompt, which was to simply search Google for the words “desk ornament”. The top images displayed in response to that very mundane and boring search were of swastikas, SS bolts and other Nazi memorabilia presented as desk ornaments. Despite there having been awareness of that fact since, I believe, the previous weekend, and even though Google is making millions of pounds in seconds from advertising, images promoting Nazism were still available for all to see as a result of those searches.

When he gave evidence to the Bill Committee recently, Danny Stone, the Antisemitism Policy Trust’s very capable chief executive, pointed out that Amazon’s Alexa had used just one comment posted by one individual on Amazon’s website to inform potentially millions of users who cared to ask that George Soros was responsible for all of the world’s evils, and that Alexa had used a comment from another website to inform those who searched for it that the humanitarian group the White Helmets was an illicit operation founded by a British spy.

As we have seen throughout the covid pandemic, similar results come up in response to other searches, such as those around vaccines and covid. The Antisemitism Policy Trust has previously demonstrated that Microsoft Bing, the platform that lies behind Alexa, was directing users to hateful searches such as “Jews are bastards” through autocompletes, as well as pointing people to homophobic stories. We even had the sickening situation of Google’s image carousel highlighting Jewish baby strollers in response to people searching for portable barbecues.

Our own Alexa searches highlighted the issue some time ago. Users who asked Alexa “Do Jews control the media?” were responded to with a quote from a website called Jew Watch—that should tell Members all they need to know about the nature of the platform—saying that Jews control not only the media, but the financial system too. The same problem manifests itself across search platforms in other languages, as we highlighted not so long ago with Siri in Spanish. When asked, “Do the Jews control the media?” she responds with an article that states that Jews do indeed control international media. This goes on and on, irrespective of whether the search is voice or text-based.

The largest search companies in the world are falling at the first hurdle when it comes to risk assessing for harms on their platform. That is the key point when we ask for lawful but harmful content to be responded to. It is about risk assessment—requiring companies that do not respect borders, operate globally and are in many ways more powerful than Governments to risk assess and warn about lawful but deeply harmful content that all of us in the House would be disgusted by.

At present, large traditional search services including Google and Microsoft Bing, and voice search assistants including Alexa and Siri, will be exempted from having to risk assess their systems and address harm to adults, despite the fact that other large user-to-user services will have to do so. How can it be possible that Google does not have to act, when Meta—Facebook—and Twitter do? That does not seem consistent with the aims of the Bill.

There is a lot more that I would like to have said on the Bill. I welcome the written ministerial statement last week in relation to small but high-harm platforms. I hope that as the Bill progresses to the other place, we can look again at search. Some of the content generated is truly appalling, even though it may very well be considered lawful.

Photo of Feryal Clark Feryal Clark Shadow Minister (Health and Social Care)

I join everyone else in the House in welcoming the Minister to his place.

I rise to speak in support of amendments 15 and 16. At the core of this issue is the first duty of any Government: to keep people safe. Too often in debates, which can become highly technical, we lose sight of that fact. We are not just talking about technology and regulation; we are talking about real lives and real people. It is therefore incumbent on all of us in this place to have that at the forefront of our minds when discussing such legislation.

Labelling social media as the wild west of today is hardly controversial—that is plain and obvious for all to see. There has been a total failure on the part of social media companies to make their platforms safe for everyone to use, and that needs to change. Regulation is not a dirty word, but a crucial part of ensuring that as the internet plays a bigger role in every generation’s lives, it meets the key duty of keeping people safe. It has been a decade since we first heard of this Bill, and almost four years since the Government committed to it, so I am afraid that there is nothing even slightly groundbreaking about the Bill as it is today. We have seen progress being made in this area around the world, and the UK is falling further and further behind.

Of particular concern to me is the impact on children and young people. As a mother, I worry for the world that my young daughter will grow up in, and I will do all I can in this place to ensure that children’s welfare is at the absolute forefront. I can see no other system or institution that children are allowed to engage with that has such a lack of safeguards and regulation. If there was a faulty slide in a playground, it would be closed off and fixed. If a sports field was covered with glass or litter, that would be reported and dealt with. Whether we like it or not, social media has become the streets our children hang out in, the world they grow up in and the playground they use. It is about time we started treating it with the same care and attention.

There are far too many holes in the Bill that allow for the continued exploitation of children. Labour’s amendments 15 and 16 tackle the deeply troubling issue of “breadcrumbing”. That is where child abusers use social networks to lay trails to illegal content elsewhere online and share videos of abuse edited to fall within content moderation guidelines. The amendments would give the regulators powers to tackle that disgusting practice and ensure that there is a proactive response to it. They would bring into regulatory scope the millions of interactions with accounts that actively enable child abuse. Perhaps most importantly, they would ensure that social media companies tackled child abuse at the earliest possible stage.

In its current form, even with Government amendment 14, the Bill merely reinforces companies’ current focus only on material that explicitly reaches the criminal threshold. That is simply not good enough. Rather than acknowledging that issue, Government amendments 71 and 72 let social media companies off the hook. They remove the requirement for companies to apply their terms and conditions “consistently”. That was addressed very eloquently by Chris Philp and Sir Jeremy Wright, who highlighted that Government amendment 14 simply does not go far enough.

Photo of Damian Collins Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

On the amendments that the former Minister, my hon. Friend Chris Philp, spoke to, the word “consistently” has not been removed from the text. There is new language that follows the use of “consistently”, but the use of that word will still apply in the context of the companies’ duties to act against illegal content. [This section has been corrected on 21 July 2022, column 13MC.]

Photo of Feryal Clark Feryal Clark Shadow Minister (Health and Social Care)

I welcome the Minister’s clarification and look forward to the amendments being made to the Bill. Other than tying one of our hands behind our back in relation to trying to keep children safe, however, the proposals as they stand do not achieve very much. This will undermine the entire regulatory system, practically rendering it completely ineffective.

Although I welcome the Bill and some of the Government amendments, it still lacks a focus on ensuring that tech companies have the proper systems in place to fulfil their duty of care and keep our children safe. The children of this country deserve better. That is why I wholeheartedly welcome the amendments tabled by my hon. Friend Alex Davies-Jones and urge Government Members to support them.

Several hon. Members:

rose—

Photo of Nigel Evans Nigel Evans Deputy Speaker (Second Deputy Chairman of Ways and Means)

Order. We will stick with a time limit of six minutes, but I put everybody on notice that we may have to move that down to five.

Photo of Adam Afriyie Adam Afriyie Conservative, Windsor

I very much welcome the Bill, which has been a long time in the making. It has travelled from my right hon. and learned Friend Sir Jeremy Wright to my hon. Friend Chris Philp and now to my hon. Friend Damian Collins; I say a huge thank you to them for their work. The Bill required time because this is a very complex matter. There are huge dangers and challenges in terms of committing offences against freedom of speech. I am glad that Ministers have recognised that and that we are very close to an outcome.

The Bill is really about protection—it is about protecting our children and our society from serious harms—and nobody here would disagree that we want to protect children from harm online. That is what 70% to 80% of the Bill achieves. Nobody would disagree that we need to prevent acts of terror and incitement to violence. We are all on the same page on that across the House. What we are talking about today, and what we have been talking about over the past several months, are nips and tucks to try to improve elements of the Bill. The framework appears to be generally correct. We need to drill down into some of the details to ensure that the areas that each of us is concerned about are dealt with in the Bill we finally produce, as it becomes an Act of Parliament.

There are several amendments tabled in my name and those of other right hon. and hon. Members. I can only canter through them cursorily in the four minutes and 30 seconds remaining to me, but I will put these points on the record in the hope that the Minister will respond positively to many of them.

Amendments 48 and 49 would ensure that providers can decide to keep user-generated content online, taking no action if that content is not harmful. In effect, the Government have accepted those amendments by tabling amendment 71, so I thank the Minister for that.

My amendment 50 says that the presumption should be tipped further in favour of freedom of expression and debate by ensuring that under their contractual terms of service, except in particular circumstances, providers are obliged to leave content online. I emphasise that I am not talking about harmful or illegal content; amendment 50 seeks purely to address content that may be controversial but does not cross the line.

I turn to amendment 51. It appears that the Bill protects the media, journalists, Governments and us politicians, while providers have some protections against being fined unjustly. In many ways, the only people who are not protected are the public—the users with user-generated legal content. It seems to me that we need to increase the powers for citizens to get an outcome if their content is taken down inaccurately, incorrectly or inappropriately. We would do well to look at ensuring, as amendment 51 would, that citizens can also seek compensation. Tens of millions of pounds, if not hundreds of millions, are about to go to the regulator and to the Government in a form of quasi-taxation, so there needs to be a mechanism for a judge to decide that if somebody has been harmed by the inappropriate removal of legal content, they can get some redress.

Government amendment 94 is quite interesting. I can certainly see the reason for it and the purpose that it seeks to achieve, but it will require providers to take into account the entire criminal code. Effectively, they will have to act as a policeman, policing all internet content against all legislation. I am sure that that is not the intent behind amendment 94. I simply urge the Government to take a look at my amendment 52, which would ensure that relevant offences include only those specified, so that providers do not need to understand the entire criminal code.

The primary area of concern, which many other hon. Members have voiced, is that it looks as if the Secretary of State will be given the power to specify priority harms without that decision necessarily being passed on the Floor of the House. It seems to me that it is Parliament that should primarily be making regulations and legislation, so I really urge the Government to take another look and ensure that if a Secretary of State seeks to modify the priority harms or specify certain content as harmful or illegal, it is debated in the Chamber of the House of Commons. That is the primary function of this place.

Technology moves very quickly, so personally I would welcome an annual debate on areas that may need improvement. Now that we are outside the European Union and have autonomy, those are the kinds of things that we must decide in this Chamber.

Photo of Munira Wilson Munira Wilson Liberal Democrat Spokesperson (Education) 4:00, 12 July 2022

I rise to speak to new clauses 25 and 26 in my name. The Government rightly seek to make the UK the safest place in the world to go online, especially for our children, and some of their amendments will start to address previous gaps in the Bill. However, I believe that the Bill still falls short in its aim not only to protect children from harm and abuse, but, importantly, to empower and enable young people to make the most of the online world.

I welcome the comments that Sir Jeremy Wright made about how we achieve the balance between rights and protecting children from harm. I also welcome his amendments on children’s wellbeing, which seek to achieve that balance.

With one in five internet users being a child, keeping children safe online is more difficult but more important than ever. I speak not only as the mother of two very young children who are growing up with iPads in their hands, but as—like everyone else in the Chamber—a constituency Member of Parliament who speaks regularly to school staff and parents who are concerned about the harms caused by social media in particular, but also those caused by games and other services to which children have access.

The Bill proffers a broad and vague definition of content that is legal yet harmful. As many have already said, it should not be the responsibility of the Secretary of State, in secondary legislation, to make decisions about how and where to draw the line; Parliament should set clear laws that address specific, well-defined harms, based on strong evidence. The clear difficulty that the Government have in defining what content is harmful could have been eased had the Bill focused less on removing harmful content and more on why service providers allow harmful content to spread so quickly and widely. Last year, the 5Rights Foundation conducted an experiment in which it created several fake Instagram profiles for children aged between 14 and 17. When the accounts were searched for the term “skinny”, while a warning pop-up message appeared, among the top results were

“accounts promoting eating disorders and diets, as well as pages advertising appetite-suppressant gummy bears.”

Ultimately, the business models of these services profit from the spread of such content. New clause 26 requires the Government and Ofcom to focus on ensuring that internet services are safe by design. They should not be using algorithms that give prominence to harmful content. The Bill should focus on harmful systems rather than on harmful content.

Photo of Damian Collins Damian Collins Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

It does focus on systems as well as content. We often talk about content because it is the exemplar for the failure of the systems, but the systems are entirely within the scope of the Bill.

Photo of Munira Wilson Munira Wilson Liberal Democrat Spokesperson (Education)

I thank the Minister for that clarification, but there are still many organisations out there, not least the Children’s Charities Coalition, that feel that the Bill does not go far enough on safety by design. Concerns have rightly been expressed about freedom of expression, but if we focus on design rather than content, we can protect freedom of expression while keeping children safe at the same time. New clause 26 is about tackling harms downstream, safeguarding our freedoms and, crucially, expanding participation among children and young people. I fear that we will always be on the back foot when trying to tackle harmful content. I fear that regulators or service providers will become over-zealous in taking down what they consider to be harmful content, removing legal content from their platforms just in case it is harmful, or introducing age gates that deny children access to services outright.

Of course, some internet services are clearly inappropriate for children, and illegal content should be removed—I think we all agree on that—but let us not lock children out of the digital world or let their voices be silenced. Forty-three per cent. of girls hold back their opinions on social media for fear of criticism. Children need a way to exercise their rights. Even the Children’s Commissioner for England has said that heavy-handed parental controls that lock children out of the digital world are not the solution.

I tabled new clause 25 because the Bill’s scope, focusing on user-to-user and search services, is too narrow and not sufficiently future-proof. It should cover all digital technology that is likely to be accessed by children. The term

“likely to be accessed by children” appears in the age-appropriate design code to ensure that the privacy of children’s data is protected. However, that more expansive definition is not included in the Bill, which imposes duties on only a subset of services to keep children safe. Given rapidly expanding technologies such as the metaverse—which is still in its infancy—and augmented reality, as well as addictive apps and games that promote loot boxes and gambling-type behaviour, we need a much more expansive definition.

Photo of Nigel Evans Nigel Evans Deputy Speaker (Second Deputy Chairman of Ways and Means)

I am grateful to Dame Diana Johnson for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.

Photo of Kim Leadbeater Kim Leadbeater Labour, Batley and Spen

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of Sir Jeremy Wright.

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exemption is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Nigel Evans, Deputy Speaker (Second Deputy Chairman of Ways and Means)

You will resume your seat no later than 4.20 pm. We will therefore not put the clock on you.

Kirsty Blackman, Shadow SNP Spokesperson (Work and Pensions) 4:15, 12 July 2022

I will try to avoid too much preamble, but I thank the former Minister, Chris Philp, for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago. The Mix is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot do that, I ask that the NSPCC have a meeting with him.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

We have had a wide-ranging debate of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject, and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, where Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.

I wish to start by addressing remarks made by Alex Davies-Jones, the shadow Minister, and by Anna McMorrin about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of harm that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 was approved, as tabled, it would be a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—

Anna McMorrin, Shadow Minister (Justice)

The Minister is not giving accurate information there. Violence against women and girls is defined by article 3 of the Council of Europe convention on preventing violence against women and domestic violence—the Istanbul convention. So there is that definition and it would be valid to put that in the Bill to ensure that all of that is covered.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I was referring to the amendment’s requirement to list that as part of the priority illegal harms. The priority illegal harms set out in the Bill are all based on existing UK Acts of Parliament where there is a clear established criminal threshold—that is the difference. The spirit of what that convention seeks to achieve, which we would support, is reflected in the harm-based offences written into the Bill. The big change in the structure of the Bill since the draft Bill was published—the Joint Committee on the Draft Online Safety Bill and I pushed for this at the time—is that far more of these offences have been clearly written into the Bill so that it is absolutely clear what they apply to. The new offences proposed by the Law Commission, particularly those relating to self-harm and suicide, are another really important addition. We know what the harms are. We know what we want this Bill to do. The breadth of offences that the hon. Lady and her colleagues have set out is covered in the Bill. But of course as law changes and new offences are put in place, the structure of the Bill, through the inclusion of new schedule 7 on priority offences, gives us the mechanism in the future, through instruments of this House, to add new offences to those primary illegal harms as they occur. I expect that that is what would happen. I believe that the spirit of new clause 3 is reflected in the offences that are written into the Bill.

The hon. Member for Pontypridd mentioned Government new clause 14. It is not true that the Government came up with it out of nowhere. There has been extensive consultation with Ofcom and others. The concern is that some social media companies, and some users of services, may have sought to interpret the criminal threshold as being based on whether a court of law has found that an offence has been committed, and only then might they act. Actually, we want them to pre-empt that, based on a clear understanding of where the legal threshold lies. That is how the regulatory codes work. So it is an attempt not to weaken the provision but to bring clarity to the companies and the regulator over its application.

John Nicolson raised an important point with regard to the Modern Slavery Act. As the Bill has gone along, we have included existing migration offences and trafficking offences. I would be happy to meet him further to discuss that aspect. Serious offences that exist in law should have an application, either as priority harms or as non-priority legal harms, and we should consider how we do that. I do not know whether he intends to press the amendment, but either way, I would be happy to meet him and to discuss this further.

My hon. Friend the Member for Solihull, the Chair of the Digital, Culture, Media and Sport Committee, raised an important matter with regard to the power of the Secretary of State, which was a common theme among several other Members. The hon. Member for Ochil and South Perthshire rightly quoted me, or my Committee’s report, back to me—always a chilling prospect for a politician. I think we have seen significant improvement in the Bill since the draft Bill was published. There was a time when changes to the codes could be made by the negative procedure; now they have to be made by a positive vote of both Houses. The Government have recognised that they need to define the exceptional circumstances in which that provision might be used, and to define specifically the areas that are set out. I accept from the Chair of the Select Committee and my right hon. and learned Friend the Member for Kenilworth and Southam that those things could be interpreted quite broadly—maybe more broadly than people would like—but I believe that progress has been made in setting out those powers.

I would also say that this applies only to the period when the codes of practice are being agreed, before they are laid before Parliament. This is not a general provision. I think sometimes there has been a sense that the Secretary of State can at any time pick up the phone to Ofcom and have it amend the codes. Once the codes are approved by the House they are fixed. The codes do not relate to the duties. The duties are set out in the legislation. This is just the guidance that is given to companies on how they comply. There may well be circumstances in which the Secretary of State might look at those draft codes and say, “Actually, we think Ofcom has given the tech companies too easy a ride here. We expected the legislation to push them further.” Therefore it is understandable that in the draft form the Secretary of State might wish to have the power to raise that question, and not dictate to Ofcom but ask it to come back with amendments.

I take on board the spirit of what Members have said and the interest that the Select Committee has shown. I am happy to continue that dialogue, and obviously the Government will take forward the issues that they set out in the letter that was sent round last week to Members, showing how we seek to bring in that definition.

A number of Members raised the issue of freedom of speech provisions, particularly my hon. Friend Adam Afriyie at the end of his excellent speech. We have sought to bring, in the Government amendments, additional clarity to the way the legislation works, so that it is absolutely clear what the priority legal offences are. Where we have transparency requirements, it is absolutely clear what they apply to. The amendment that the Government tabled reflects the work that he and his colleagues have done, setting out that if we are discussing the terms of service of tech companies, it should be perfectly possible for them to say that this is not an area where they intend to take enforcement action and the Bill does not require them to do so. [This section has been corrected on 21 July 2022, column 14MC.]

Kim Leadbeater mentioned Zach’s law. The hon. Member for Ochil and South Perthshire raised that before the Joint Committee. So, too, did my hon. Friend Dean Russell; he and the hon. Member for Ochil and South Perthshire are great advocates on that. It is a good example of how a clear offence, something that we all agree to be wrong, can be tackled through this legislation; in this case, a new offence will be created, to prevent the pernicious targeting of people with epilepsy with flashing images.

Finally, in response to the speech by Kirsty Blackman, I certainly will continue dialogue with the NSPCC on the serious issues that she has raised. Obviously, child protection is foremost in our mind as we consider the legislation. She made some important points about the ability to scan for encrypted images. The Government have recently made further announcements on that, to be reflected as the Bill progresses through the House.

Nigel Evans, Deputy Speaker (Second Deputy Chairman of Ways and Means)

To assist the House, I anticipate two votes on this first section and one vote immediately on the next, because it has already been moved and debated.

Proceedings interrupted (Programme Order, this day).

The Deputy Speaker put forthwith the Question already proposed from the Chair (Standing Order No. 83E), That the clause be read a Second time.

Question agreed to.

New clause 19 accordingly read a Second time, and added to the Bill.

The Deputy Speaker then put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).