New Clause 7 - Duties regarding user-generated pornographic content: regulated services

Online Safety Bill – in the House of Commons at 5:15 pm on 12 July 2022.

“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.

(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.

(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.

(4) For the meaning of ‘pornographic content’, see section 66(2).

(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, which may be encountered by another user, or other users, of the service.

(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)

Brought up, and read the First time.

Lindsay Hoyle (Speaker of the House of Commons)

With this it will be convenient to discuss the following:

New clause 33—Meaning of “pornographic content”—

“(1) In this Act ‘pornographic content’ means any of the following—

(a) a video work in respect of which the video works authority has issued an R18 certificate;

(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;

(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;

(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;

(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;

(f) any other content if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;

(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—

(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and

(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;

(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the video works authority made that determination;

(i) any other content if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.

(2) In this section—

‘18 certificate’ means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);

‘content’ means—

(a) a series of visual images shown as a moving picture, with or without sound;

(b) a still image or series of still images, with or without sound; or

(c) sound;

‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;

‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”

This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.

Amendment 205, in clause 34, page 33, line 23, at end insert—

“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”

Amendment 206, page 33, line 30, after “has” insert

“or may reasonably be expected to have”.

Amendment 207, in clause 36, page 35, line 12, at end insert—

“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”

Amendment 208, page 35, line 18, after “(3)” insert “, (3A)”.

Amendment 209, page 35, line 20, after “(3)” insert “, (3A)”.

Amendment 210, page 35, line 23, after “(3)” insert “, (3A)”.

Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert

“has the same meaning as section [meaning of pornographic content]”.

This amendment defines pornographic content for the purposes of Part 5. It is consequential on NC33.

Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”.

This amendment would require that content is considered as a whole before being defined as pornographic content.

Amendment 33, in clause 68, page 60, line 33, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.

This amendment is consequential on Amendment 33.

Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—

“(a) a draft of the instrument has been laid before each House of Parliament,

(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and

(c) the draft instrument has been approved by a resolution of each House of Parliament.”

This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).

Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.

This amendment clarifies that the list of types of content in clause 192 is not exhaustive.

Diana R. Johnson (Chair, Home Affairs Committee)

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, Chris Philp, argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.

Adam Afriyie (Conservative, Windsor)

I echo the concerns expressed by Dame Diana Johnson. Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.

I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.

Barbara Keeley (Shadow Minister, Cabinet Office; Shadow Minister, Culture, Media and Sport)

I rise to speak to this small group of amendments on behalf of the Opposition. Despite everything that is going on at the moment, we must remember that this Bill has the potential to change lives for the better. It is an important piece of legislation, and we cannot miss the opportunity to get it right. I would like to join my hon. Friend Alex Davies-Jones in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, Damian Collins, to his role. His work as Chair of the Joint Committee on this Bill was an important part of the pre-legislative scrutiny process, and I look forward to working in collaboration with him to ensure that this legislation does as it should in keeping us all safe online. I welcome the support of the former Minister, Chris Philp, on giving access to data to academic researchers and on looking at the changes needed to deal with the harm caused by the way in which algorithmic prompts work. It was a pity he was not persuaded by the amendments in Committee, but better late than never.

Earlier we debated new clause 14, which will reduce the amount of illegal content and fraudulent advertising that is identified and acted upon. In our view, this new clause undermines and weakens the safety mechanisms that members of the Joint Committee and the Public Bill Committee worked so hard to get right. I hope the Government will reconsider this part of the Bill when it goes through its stages in the House of Lords. Even without new clause 14, though, there are problems with the provisions around fraudulent advertising. Having said that, we were pleased that the Government conceded to our calls in Committee to ensure that major search engines and social media sites would be subject to the same duties to prevent fraudulent advertising from appearing on their sites.

However, there are other changes that we need to see if the Bill is to be successful in reducing the soaring rates of online fraud and changing the UK’s reputation as the

“scam capital of the world”,

according to Which? The Government voted against other amendments tabled in Committee by me and my hon. Friend the Member for Pontypridd that would have tackled the reasons why people become subject to online fraud. Our amendments would have ensured that customers had better protection against scams and a clearer understanding of which search results were paid-for ads. In rejecting our amendments, the Government have missed an opportunity to tackle the many forms of scamming that people experience online.

One of those forms of scamming is in the world of online ticketing. In my role as shadow Minister for the Arts and Civil Society, I have worked on this issue and been informed by the expertise of my hon. Friend Mrs Hodgson, who chairs the all-party parliamentary group on ticket abuse. I would like to thank her and those who have worked with the APPG on the anti-ticket touting campaign for their insights. Ticket reselling websites have a well-documented history of breaching consumer protection laws. These breaches include cases of fraud such as the sale of non-existent tickets. If our amendment had been passed, secondary ticketing websites such as Viagogo would have had to be members of a regulatory body responsible for secondary ticketing such as the Society of Ticket Agents and Retailers, and they would have had to comply with established standards.

I have used ticket touting as an example, but the repercussions of this change go wider to include scamming by holiday websites, debt services and fraudulent passport renewal companies. Our amendments, together with amendments 205 to 210, which were tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman), would improve protection against scams and close loopholes in the definitions of fraudulent advertising. I hope the Minister recognises how many more scams these clauses would prevent if the amendments were accepted.

Part 5 of the Bill includes provisions that relate to pornographic content, which we have already heard about in this debate. For too long, we have seen a proliferation of websites with illegal and harmful content rife with representations of sexual violence, incest, rape and exploitation, and I thank my right hon. Friend Dame Diana Johnson for the examples she has just given us. We welcomed the important changes made to the Bill before the Committee stage, which meant that all pornographic content, not just user-generated content, would now be included within the duties in part 5.

Other Members have tabled important amendments to this part of the Bill. New clause 33 and new schedule 1, tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North, will ensure parity between online and offline content standards for pornography. New clause 33 is important in specifying that content that fails to obtain an R18 certificate has to be removed, just as happens in the offline world under the Video Recordings Act 1984. My right hon. Friend the Member for Kingston upon Hull North tabled amendment 33 and new clause 7, which place new duties on user-generated and commercial pornography sites to verify the age of, and obtain consent from, people featured in pornographic content, and to remove content should that consent be withdrawn. These are safeguards that should have been put in place by pornography platforms from the very start.

I would like to raise our concern about how quickly these duties can be brought into force. Clause 196 lays out that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be those covering the definitions—clauses 66 and 67(4)—and not those covering the duties. Children cannot wait another three years for protections from harm, having been promised this five years ago under part 3 of the Digital Economy Act 2017, which was never implemented. I hope the Minister appreciates the need for speed in regulating this particularly high harm part of the internet.

Part 11 clarifies companies’ liability and outlines the type of information offences contained in the Bill. It is important that liability is at the heart of discussions about the practical applications of the Bill, because we know that big internet companies have got away with doing nothing for far too long. However, the current focus on information offences means that criminal liability for repeated and systematic failures resulting in serious harm to users remains a crucial omission from the Bill.

My hon. Friend the Member for Pontypridd was vocal in making the point, but it needs to be made again, that we are very concerned about the volume of last-minute amendments tabled by the Government, and particularly their last-ditch attempt at power grabbing through amendment 144. The Secretary of State should not have the ability to decide what constitutes a priority offence without appropriate scrutiny, and our amendments would bring appropriate parliamentary oversight.

Amendment 31, in my name and in the name of my hon. Friend, would require that any changes to clauses 53 or 54, on harmful content, are debated on the Floor of the House rather than in a Delegated Legislation Committee. Without this change, the Secretary of State of the day will have the power to make decisions about priority content quietly through secondary legislation, which could have real-life consequences. Any changes to priority content are worthy of proper debate. If the Minister is serious about proper scrutiny of the online safety regime, he should carefully consider amendment 31. I urge hon. Members to support the amendment.

Finally, part 12 includes clarifications and definitions. The hon. Members for Ochil and South Perthshire and for Aberdeen North tabled amendment 158, which would expand the definition of content in the Bill. This is an important future-proofing measure.

As I mentioned, we are concerned about the delays to the implementation of certain duties set out in part 12. We are now in a situation in which many children who need protection will no longer even be children by the time this legislation and its protections come into effect. Current uncertainty about the running of Government will compound the concerns of many charities and children’s advocacy groups. I hope the Minister will agree that we cannot risk further delays.

At its core, the Online Safety Bill should be about reducing harm, and we are all aligned on that aim. I am disappointed that the Government have reversed some of the effectiveness of the scrutiny in Committee by now amending the Bill to such a degree. I hope the Minister considers our amendments in the collaborative spirit in which they are intended, and recognises their potential to make this Bill stronger and more effective for all.

Jeremy Wright (Conservative, Kenilworth and Southam), 5:30 pm, 12 July 2022

I think it is extraordinarily important that this Bill does what Barbara Keeley has just described. As the Bill moves from this place to the other place, we must debate what the right balance is between what the Secretary of State must do—in the previous group of amendments, we heard that many of us believe that is too extensive as the Bill stands—what the regulator, Ofcom, must do and what Parliament must do. There is an important judgment call for this House to make on whether we have that balance right in the Bill as it stands.

These amendments are very interesting. I am not convinced that the amendments addressed by the hon. Lady get the balance exactly right either, but there is cause for further discussion about where we in this House believe the correct boundary is between what an independent regulator should be given authority to do under this legislative and regulatory structure and what we wish to retain to ourselves as a legislature.

Adam Afriyie (Conservative, Windsor)

My right hon. and learned Friend is highlighting, and I completely agree, that there is a very sensitive balance between different power bases and between different approaches to achieving the same outcome. Does he agree that as even more modifications are made—the nipping and tucking I described earlier—this debate and future debates, and these amendments, will contribute to those improvements over the weeks and months ahead?

Jeremy Wright (Conservative, Kenilworth and Southam)

Yes, I agree with my hon. Friend about that. I hope it is some comfort to the hon. Member for Worsley and Eccles South when I say that if the House does not support her amendment, it should not be taken that she has not made a good point that needs further discussion—probably in the other place, I fear. We are going to have to think carefully about that balance. It is also important that we do not retain to ourselves as a legislature those things that the regulator ought to have in its own armoury. If we want Ofcom to be an effective and independent regulator in this space, we must give it sufficient authority to fulfil that role. She makes interesting points, although I am not sure I can go as far as supporting her amendments. I know that is disappointing, but I do think that what she has done is prompt a further debate on exactly this balance between Secretary of State, Executive, legislature and regulator, which is exactly where we need to be.

I have two other things to mention. The first relates to new clause 7 and amendment 33, which Dame Diana Johnson tabled. She speaks powerfully to a clear need to ensure that this area is properly covered. My question, however, is about practicalities. I am happy to take an intervention if she can answer it immediately. If not, I am happy to discuss it with her another time. She has heard me speak many times about making sure that this Bill is workable. The challenge in what she has described in her amendments may be that a platform needs to know how it is to determine and “verify”—that is the word she has used—that a participant in a pornographic video is an adult and a willing participant. It is clearly desirable that the platform should know both of those things, but the question that will have to be answered is: by what mechanism will it establish that? Will it ask the maker of the pornographic video and be prepared to accept the assurances it is given? If not, by what other mechanism should it do this? For example, there may be a discussion to be had on what technology is available to establish whether someone is an adult or is not—that bleeds into the discussion we have had about age assurance. It may be hard for a platform to establish whether someone is a willing participant.

Jess Phillips (Shadow Minister, Home Office)

This has been quite topical this week. When we have things on any platform that is on our television, people absolutely have to have signed forms to say that they are a willing participant. It is completely regular within all other broadcast media that people sign consent forms and that people’s images are not allowed to be used without their consent.

Jeremy Wright (Conservative, Kenilworth and Southam)

Yes, I am grateful to the hon. Lady for that useful addition to this debate, but it tends to clarify the point I was seeking to clarify, which is whether or not what the right hon. Member for Kingston upon Hull North has in mind is to ensure that a platform would be expected to make use of those mechanisms that already exist in order to satisfy itself of the things that she rightly asks it to be satisfied of or whether something beyond that would be required to meet her threshold. If it is the former, that is manageable for platforms and perfectly reasonable for us to expect of them. If it is the latter, we need to understand a little more clearly how she expects a platform to achieve that greater assurance. If it is that, she makes an interesting point.

Finally, let me come to amendment 56, tabled by my hon. Friend Adam Afriyie. Again, I have a practical concern. He seeks to ensure that the pornographic content is “taken as a whole”, but I think it is worth remembering why we have included pornographic content in the context of this Bill. We have done it to ensure that children are not exposed to this content online and that where platforms are capable of preventing that from happening, that is exactly what they do. There is a risk that if we take this content as a whole, it is perfectly conceivable that there may be content online that is four hours long, only 10 minutes of which is pornographic in nature. It does not seem to me that that in any way diminishes our requirement of a platform to ensure that children do not see those 10 minutes of pornographic content.

Adam Afriyie (Conservative, Windsor)

I am very sympathetic to that view. I am merely flagging up for the Minister that if we get the opportunity, we need to have a look at it again in the Lords, to be absolutely certain that we are not ruling out certain types of art, and certain types of community sites that we would all think were perfectly acceptable, that are probably not accessible to children, just to ensure that we are not creating further problems down the road that we would have to correct.

Jeremy Wright (Conservative, Kenilworth and Southam), 5:45 pm, 12 July 2022

I follow that point. I will channel, with some effort, Jess Phillips, who I suspect would say that these things are already up for debate and discussed in other contexts—the ability to distinguish between art and pornography is something that we have wrestled with in other media. Actually, in relation to the Bill, I think that one of our guiding principles ought to be that we do not reinvent the wheel where we do not have to, and that we seek to apply to the online world the principles and approaches that we would expect in all other environments. That is probably the answer to my hon. Friend’s point.

I think it is very important that we recognise the need for platforms to do all they can to ensure that the wrong type of material does not reach vulnerable users, even if that material is a brief part of a fairly long piece. Those, of course, are exactly the principles that we apply to the classification of films and television. It may well be that a small portion of a programme constitutes material that is unsuitable for a child, but we would still seek to put it the wrong side of the 9 o’clock watershed or use whatever methods we think the regulator ought to adopt to ensure that children do not see it.

Good points are being made. The practicalities are important; it may be that because of a lack of available time and effort in this place, we have to resolve those elsewhere.

John Nicolson (Shadow SNP Spokesperson, Digital, Culture, Media and Sport)

I wish to speak to new clause 33, my proposed new schedule 1 and amendments 201 to 203. I notice that the Secretary of State is off again. I place on record my thanks to Naomi Miles of CEASE—the Centre to End All Sexual Exploitation—and Ceri Finnegan of Barnardo’s for their support.

The UK Government have taken some steps to strengthen protections on pornography and I welcome the fact that young teenagers will no longer be able to access pornography online. However, huge quantities of extreme and harmful pornography remain online, and we need to address the damage that it does. New clause 33 would seek to create parity between online and offline content—consistent legal standards for pornography. It includes a comprehensive definition of pornography and puts a duty on websites not to host content that would fail to attain the British Board of Film Classification standard for R18 classification.

The point of the Bill, as the Minister has repeatedly said, is to make the online world a safer place, by doing what we all agree must be done—making what is illegal offline, illegal online. That is why so many Members think that the lack of regulation around pornography is a major omission in the Bill.

The new clause stipulates age and consent checks for anyone featured in pornographic content. It addresses the proliferation of pornographic content that is both illegal and harmful, protecting women, children and minorities on both sides of the camera.

The Bill presents an opportunity to end the proliferation of illegal and harmful content on the internet. Representations of sexual violence, animal abuse, incest, rape, coercion, abuse and exploitation—particularly directed towards women and children—are rife. Such content can normalise dangerous and abusive acts and attitudes, leading to real-world harm. As my hon. Friend Alex Davies-Jones said in her eloquent speech earlier, we are seeing an epidemic of violence against women and girls online. When bile and hatred is so prolific online, it bleeds into the offline space. There are real-world harms that flow from that.

The Minister has said how much of a priority tackling violence against women and girls is for him. Knowing that, and knowing him, he will understand that pornography is always harmful to children, and certain kinds of pornographic content are also potentially harmful to adults. Under the Video Recordings Act 1984, the BBFC has responsibility for classifying pornographic content to ensure that it is not illegal, and that it does not promote an interest in abusive relationships, such as incest. Nor can it promote acts likely to cause serious physical harm, such as breath restriction or strangulation. In the United Kingdom, it is against the law to supply pornographic material that does not meet this established BBFC classification standard, but there is no equivalent standard in the online world because the internet evolved without equivalent regulatory oversight.

I know too that the Minister is determined to tackle some of the abusive and dangerous pornographic content online. The Bill does include a definition of pornography, in clause 66(2), but that definition is inadequate; it is too brief and narrow in scope. In my amendment, I propose a tighter and more comprehensive definition, based on that in part 3 of the Digital Economy Act 2017, which was debated in this place and passed into law. The amendment will remove ambiguity and prevent confusion, ensuring that all websites know where they stand with regard to the law.

The new duty on pornographic websites aligns with the UK Government’s 2020 legislation regulating UK-established video-sharing platforms and video-on-demand services, both of which appeal to the BBFC’s R18 classification standards. The same “high standard of rules in place to protect audiences”, as the 2020 legislation put it, and “certain content standards” should apply equally to online pornography and offline pornography, UK-established video-sharing platforms and video-on-demand services.

Let me give some examples sent to me by Barnardo’s, the children’s charity, which, with CEASE, has done incredibly important work in this area. The names have been changed in these examples, for obvious reasons.

“There are also children who view pornography to try to understand their own sexual abuse. Unfortunately, what these children find is content that normalises the most abhorrent and illegal behaviours, such as 15-year-old Elizabeth, who has been sexually abused by a much older relative for a number of years. The content she found on pornography sites depicted older relatives having sex with young girls and the girls enjoying it. It wasn’t until she disclosed her abuse that she realised that it was not normal.

Carrie is a 16-year-old who was being sexually abused by her stepfather. She thought this was not unusual due to the significant amount of content she had seen on pornography sites showing sexual relationships within stepfamilies.”

That is deeply disturbing evidence from Barnardo’s.

Although in theory the Bill will prevent under-18s from accessing such content, the Minister knows that under-18s will be able to bypass regulation through technology like VPNs, as the DCMS Committee and the Bill Committee—I served on both—were told by experts in various evidence sessions. The amendment does not create a new law; it merely moves existing laws into the online space. There is good cause to regulate and sometimes prohibit certain damaging offline content; I believe it is now our duty to provide consistency with legislation in the online world.

Kirsty Blackman (Shadow SNP Spokesperson, Work and Pensions)

I want to talk about several things, but particularly new clause 7. I am really pleased that the new clause has come back on Report, as we discussed it in the Bill Committee but unfortunately did not get enough support for it there—as was the case with everything we proposed—so I thank Dame Diana Johnson for tabling it. I also thank my hon. Friend Ronnie Cowan for his lobbying and for providing us with lots of background information. I agree that it is incredibly important that new clause 7 is agreed, particularly the provisions on consent and making sure that participants are of an appropriate age to be taking part. We have heard so many stories of so many people whose videos are online—whose bodies are online—and there is nothing they can do about it because of the lack of regulation. My hon. Friend John Nicolson has covered new clause 33 in an awful lot of detail—very good detail—so I will not comment on that.

Sir Jeremy Wright mentioned how we need to get the balance right, and specifically talked about the role of the regulator. In many ways, this Bill has failed to get the balance right in its attempts to protect children online. Many people who have been involved in writing this Bill, talking about this Bill, scrutinising this Bill and taking part in every piece of work that we have done around it do not understand how children use the internet. Some people do, absolutely, but far too many of the people who have had any involvement in this Bill do not. They do not understand the massive benefits to children of using the internet, the immense amount of fun they can have playing Fortnite, Fall Guys, Minecraft, or whatever it is they happen to be playing online and how important that is to them in today’s crazy world with all of the social media pressures. Children need to decompress. This is a great place for children to have fun—to have a wonderful time—but they need to be protected, just as we would protect them going out to play in the park, just the same as we would protect them in all other areas of life. We have a legal age for smoking, for example. We need to make sure that the protections are in place, and the protections that are in place need to be stronger than the ones that are currently in the Bill.

I did not have a chance earlier—or I do not think I did—to support the clause about violence against women and girls. As I said in Committee, I absolutely support that being in the Bill. The Government may say, “Oh we don’t need to have this in the Bill because it runs through everything,” but having that written in the Bill would make it clear to internet service providers—to all those people providing services online and having user-generated content on their sites—how important this is and how much of a scourge it is. Young women who spend their time on social media are more likely to have lower outcomes in life as a result of problematic social media use, as a result of the pain and suffering that is caused. We should be putting such a measure in the Bill, and I will continue to argue for that.

We have talked a lot about pornographic content in this section. There is not enough futureproofing in the Bill. My hon. Friend the Member for Ochil and South Perthshire and I tabled amendment 158 because we are concerned about that lack of futureproofing. The amendment edits the definition of “content”. The current definition of “content” says basically anything online, and it includes a list of stuff. We have suggested that it should say “including but not limited to”, on the basis that we do not know what the internet will look like in two years’ time, let alone what it will look like in 20 years’ time. If this Bill is to stand the test of time, it needs to be clear that that list is not exhaustive. It needs to be clear that, when we are getting into virtual reality metaverses where people are meeting each other, that counts as well. It needs to be clear that the sex dungeon that exists in the child’s game Roblox is an issue—that that content is an issue no matter whether it fits the definition of “content” or whether it fits the fact that it is written communication, images or whatever. It does not need to fit any of that. If it is anything harmful that children can find on the internet, it should be included in that definition of “content”, no matter whether it fits any of those specific categories. We just do not know what the internet is going to look like.

I have one other specific thing in relation to the issues of content and pornography. One of the biggest concerns that we heard is the massive increase in the amount of self-generated child sexual abuse images. A significant number of new images of child sexual abuse are self-generated. Everybody has a camera phone these days. Kids have camera phones these days. They have much more potential to get themselves into really uncomfortable and difficult situations than when most of us were younger. There is so much potential for that to be manipulated unless we get this right.

I have concerns about the age assurance that was mentioned. If it is livestreamed content—content that is being generated right now, at this moment—scanning for those child sexual abuse images will be very difficult. There will not be a hash for it. It has not been discovered before. It is not an image that will have been looked at, categorised and worked out before. It is something that evil people are convincing and forcing children to do today.
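The scanning referred to here works by comparing a fingerprint, or hash, of an uploaded image against a database of hashes of previously identified abuse images. A minimal sketch of the idea follows; the hash list and function name are illustrative assumptions, and real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the plain cryptographic hash shown:

```python
import hashlib

# Illustrative database of hashes of previously identified illegal images.
# In deployment this would be a curated industry hash list, and the hashes
# would be perceptual (robust to re-encoding), not exact SHA-256 digests.
KNOWN_IMAGE_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this exact file matches a previously catalogued image."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

# A livestreamed frame is generated at this moment, so it cannot already be
# in any catalogue: no hash lookup can flag it, which is the limitation the
# speech describes.
print(is_known_image(b"foo"))  # True only because the sample hash is sha256(b"foo")
```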

That is another place where the Bill fails to recognise how young people use the internet today. It fails to talk specifically about things such as livestreaming and how duties on providing services for children on the internet should reduce things such as the ability to livestream and to have private conversations with potential abusers. All those things are not in the Bill in the way I would like. I understand that the codes of practice will come through and there will be guidance on the risk assessments, but I have not seen enough so far to convince me that people know what they are doing when they are writing those codes of practice.

From what I have heard from Ofcom, it has generally been pretty sensible, but nearly every person I have encountered talking about this Bill who has had or continues to have any say over it does not understand how children actually use the internet. I have been online for nearly 30 years, since I was younger than my children are now. I grew up on the internet. I spend a lot of time on the internet. I have spoken to so many people who I did not know online. I have had so many ridiculous, harmful conversations that I would be aghast and devastated if my children were having now. I do not want that to be happening to tomorrow’s generation of children.

No matter what we put in place, there will always be loopholes and bad actors and there will always be issues, but we want the Bill to be genuinely the best possible. The biggest failing is that lack of understanding of how children interact with the internet. We want it to be a safe place for them. We want them to be able to have great experiences online and to enjoy themselves, and we want to put up those protections in the same way that we put up crossing patrols and so on to protect children, and we are just not there yet.

I appreciate that the Minister and the previous Minister, Chris Philp, have brought forward a significant number of amendments, although it is unfortunate that they have come at such short notice that we have not had enough time to look at them properly, and I appreciate that they will bring forward more, but there is still more they need to do. Even with all the amendments and all the commitments we have seen, I am still not comfortable enough that my children and my children’s children will be as safe on the internet as they should be.

Lloyd Russell-Moyle (Labour/Co-operative, Brighton, Kemptown), 6:00 pm, 12 July 2022

I rise to support new clauses 7 and 33 in particular. I support them sometimes from a different angle from my hon. Friends, but fundamentally from the same angle: consent. I am not afraid to say that I have a different perspective from some hon. Members in this House in that I view sex work as a legitimate form of work under regulated and protected conditions, and pornography as part of that. What I do have a problem with is the lack of consent that occurs far too often not only in the industry—that may be too broad a term—but in particular content that we see online at the moment.

That is true particularly for those sex workers who might have produced content with consent at the time, as adults, but who later in life realise that they do not wish that material to be available any more—not just because they may be embarrassed about it, but perhaps because they just do not want that material commercially available and people making profits off their bodies any more. They are struggling to get content taken down because they are told, “You gave consent at the time and that can’t now be removed. You have to allow your body to be used.” We would not allow any other form of worker or artist to suffer that. In any other form of music or production, if they wished to remove their consent for it to be played, it would be taken down, but in pornography there seems to be a free-for-all where, even if people remove their consent, it still proliferates in copies of copies that are put all over the internet. That is not even to mention people who never gave their consent at all and experience revenge porn or their phones being hacked and the devastation that that can cause.

I might come from a different position on some of this, but I think we can be united in saying that of course we need better action on under-18s, which is very important, but even for those who have supposedly given their consent at one point or another, the removal of consent must be put into the Bill and platforms must have a strict responsibility to remove that content. Without that being in the Bill, there is a danger that platforms will continue to play loophole after loophole and the content will still be there when it should not be.

Ronnie Cowan (Shadow SNP Spokesperson, Infrastructure)

I was not planning to speak, but we have a couple of minutes so I will abuse that position.

I just want to say that I do not want new clause 7 to be lost in this debate and become part of the flotsam and jetsam of the tide of opinion that goes back and forth in this place, because new clause 7 is about consent. We are trying very hard to teach young men all about consent, and if we cannot do it from this place, then when can we do it? We can work out the details of the technology in time, as we always do. It is out there. Other people are way ahead of us in this matter. In fact, the people who produce this pornography are way ahead of us in this matter.

Diana R. Johnson (Chair, Home Affairs Committee)

While we have been having this debate, Iain Corby, executive director at the Age Verification Providers Association, has sent me an email in which he said that the House may be interested to know that one of the members of that organisation offers adult sites a service that facilitates age verification and the obtaining and maintaining of records of consent. So it is possible to do this if the will is there.
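As a sketch of what obtaining and maintaining records of consent might look like in practice: assume a simple data model in which every participant in an item of content holds a revocable consent record, and withdrawal by any one participant means the content may no longer remain published, as new clause 7 would require. The types and function names here are hypothetical assumptions, not any provider's actual service:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    participant_id: str
    age_verified: bool       # e.g. checked via an age-assurance provider
    consented: bool = True   # permission to publish; revocable at any time

@dataclass
class ContentItem:
    content_id: str
    participants: list[ConsentRecord] = field(default_factory=list)

    def may_remain_published(self) -> bool:
        # Publishable only while every participant is a verified adult
        # and none has withdrawn consent.
        return bool(self.participants) and all(
            p.age_verified and p.consented for p in self.participants
        )

    def withdraw_consent(self, participant_id: str) -> None:
        for p in self.participants:
            if p.participant_id == participant_id:
                p.consented = False

video = ContentItem("vid-001", [ConsentRecord("performer-1", age_verified=True)])
assert video.may_remain_published()
video.withdraw_consent("performer-1")
assert not video.may_remain_published()  # the duty to remove is now triggered
```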

Ronnie Cowan (Shadow SNP Spokesperson, Infrastructure)

I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

We have had an important debate raising a series of extremely important topics. While the Government may not agree with the amendments that have been tabled, that is not because of a lack of seriousness of concern about the issues that have been raised.

Dame Diana Johnson spoke very powerfully. I have also met Leigh Nicol, the lady she cited, and she discussed with me the experience that she had. Sadly, it was during lockdown and it was a virtual meeting rather than face to face. There are many young women, in particular, who have experienced the horror of having intimate images shared online without their knowledge or consent and then gone through the difficult experience of trying to get them removed, even when it is absolutely clear that they should be removed and are there without their consent. That is the responsibility of the companies and the platforms to act on.

Thinking about where we are now, before the Bill passes, the requirement to deal with illegal content, even the worst illegal content, on the platforms is still largely based on the reporting of that content, without the ability for us to know how effective they are at actually removing it. That is largely based on old legislation. The Bill will move on significantly by creating proactive responsibilities not just to discover illegal content but to act to mitigate it and to be audited to see how effectively it is done. Under the Bill, that now includes not just content that would be considered to be an abuse of children. A child cannot give consent to have sex or to appear in pornographic content. Companies need to make sure that what they are doing is sufficient to meet that need.

It should be for the regulator, Ofcom, as part of putting together the codes of practice, to understand, even on more extreme content, what systems companies have in place to ensure that they are complying with the law and certainly not knowingly hosting content that has been flagged to them as being non-consensual pornography or child abuse images, which is effectively what pornography with minors would be; and to understand what systems they have in place to make sure that they are complying with the law and, as hon. Members have said, making sure that they are using available technologies in order to deliver that.

Jess Phillips (Shadow Minister, Home Office)

We have an opportunity here today to make sure that the companies are doing it. I am not entirely sure why we would not take that opportunity to legislate to make sure that they are. With the greatest of respect to the Minister, who is back in a position of authority, it sounds an awful lot like the triumph of hope over experience.

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

It is because of the danger of such a sentiment that this Bill is so important. It not just sets the targets and requirements of companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do it, that they are using appropriate technology and that they apply the principle that their system should be effective at addressing this issue. If they are defective, that is a failure on the company’s part. It cannot be good enough that the company says, “It is too difficult to do”, when they are not using technologies that would readily solve that problem. We believe that the technologies that the companies have and the powers of the regulator to have proper codes of practice in place and to order the companies to make sure they are doing it will be sufficient to address the concern that the hon. Lady raises.

Diana R. Johnson (Chair, Home Affairs Committee)

I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend Jess Phillips was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, Barbara Keeley. On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems the company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it limits advertisers from non-accredited companies advertising on its platform. That makes it far less likely that fraud will be discovered because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.
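A minimal sketch of the kind of system the Minister describes: gating financial ads on an allowlist of accredited advertisers at submission time, rather than relying on removal after user reports. The register contents and function name are illustrative assumptions, not Google's or the FCA's actual interface:

```python
# Illustrative allowlist of accredited financial advertisers; in the deal
# described, accreditation would come from the FCA's register of authorised
# firms. The names and the function below are assumptions for the sketch.
ACCREDITED_ADVERTISERS = {"Example Insurance Ltd", "Sample Bank plc"}

def may_serve_financial_ad(advertiser: str) -> bool:
    """Accept a financial ad only from an accredited advertiser."""
    return advertiser in ACCREDITED_ADVERTISERS

for advertiser in ("Example Insurance Ltd", "Totally Real Crypto Returns"):
    verdict = "accepted" if may_serve_financial_ad(advertiser) else "rejected"
    print(f"{advertiser}: {verdict}")
```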

Barbara Keeley (Shadow Minister, Cabinet Office; Shadow Minister, Culture, Media and Sport)

The Minister is quoting a case that I quoted in Committee, and the former Minister, Chris Philp, would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.

Jess Phillips (Shadow Minister, Home Office)

Just to push on this point, images of me have appeared on pornographic sites. They were not necessarily illegal images of anything bad happening to me, but other Members of Parliament in this House and I have suffered from that. Is the Minister telling me that this Bill will allow me to get in touch with that site and have an assurance that that image will be taken down and that it would be breaking the law if it did not do so?

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

The Bill absolutely addresses the sharing of non-consensual images in that way, so that would be something the regulator should take enforcement action against—[This section has been corrected on 21 July 2022, column 14MC.]

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

Well, the regulator is required, and has the power, to take enforcement action against companies for failing to do so. That is what the legislation sets out, and we will be in a very different place from where we are now. That is why the Bill constitutes a very significant reform.

Lloyd Russell-Moyle (Labour/Co-operative, Brighton, Kemptown)

Could the Minister give me a reassurance about when consent is withdrawn? The image may initially have been there “consensually”—I would put that in inverted commas—so the platform is okay to put it there. However, if someone contacts the platform saying that they now want to change their consent—they may want to take a role in public life, having previously had a different role; I am not saying that about my hon. Friend Jess Phillips—my understanding is that there is no ability legally to enforce that content coming down. Can the Minister correct me, and if not, why is he not supporting new clause 7?

Damian Collins (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport), 6:15 pm, 12 July 2022

With people who have appeared in pornographic films consensually and signed contracts to do so, that would be a very different matter from the question of intimate images being shared without consent. When someone has not consented for such images to be there, that would be a very different matter. I am saying that the Bill sets out very clearly—it did not do so in draft form—that non-consensual sexual images and extreme pornography are within the scope of the regulator’s power. The regulator should be taking action not just on what a company does to take such content down when it is discovered after the event, but on what systems the company has in place and whether it deploys all available technology to make sure that such content is never there in the first place.

Before closing, I want to touch briefly on the point raised about the Secretary of State’s powers to designate priority areas of harm. This is now under the affirmative procedure in the Bill, and it requires the approval of both Houses of Parliament. The priority illegal harms will be based on offences that already exist in law, and we are writing those priority offences into the Bill. The other priorities will be areas where the regulator will seek to test whether companies adhere to their terms of service. The new transparency requirements will set that out, and the Government have said that we will set out in more detail which of those priority areas of harm such transparency will apply to. There is still more work to be done on that, but we have given an indicative example. However, when it comes to adding a new priority illegal offence to the Bill, the premise is that it will already be an offence that Parliament has created, and writing it into the Bill will be done with the positive consent of Parliament. I think that is a substantial improvement on where the Bill was before. I am conscious that I have filled my time.

Question put, That the clause be read a Second time.

Division number 38 Online Safety Bill Report Stage: New Clause 7

Aye: 219 MPs

No: 282 MPs

Abstained: 1 MP

The House divided: Ayes 220, Noes 285.

Question accordingly negatived.