I congratulate Jeremy Wright on his introduction and on all that he said. In my intervention I referred to the need for a social media regulator, and, as Elliot Colburn has just said, we need a regulator with teeth. We need a regulator that actually does what it says it is going to do. That is important.
The Conservative manifesto of 2015 was very clear that it pertained not to social media platforms but to pornographic websites, and it committed to protecting children from them through the provision of statutory age verification. Part 3 of the Digital Economy Act 2017 made provision for that and it should have been implemented over a year ago. I respectfully express my dismay and concern that that has not happened.
The non-implementation of part 3 of the Act is a disaster for children, as it needlessly exposes them to commercial pornographic websites when this House has already made provision for their protection from such sites. Perhaps the Minister could explain why the Government’s detailed defence in the judicial review for not proceeding with implementation seems to rest on the protection under paragraph 19, which states:
I have great concerns about that.
I am also troubled by the way in which the Government have moved from the language of requiring age verification for pornographic websites, as referred to in their manifesto, to the very different language of expectation. The Government have said:
“This includes age verification tools and we expect them to continue to play a key role in protecting children online.”
They also said:
“Our proposals will introduce higher levels of protection for children. We will expect companies to use a proportionate range of tools including age assurance and age verification technologies to prevent children from accessing age-inappropriate or harmful content.”
In their initial response to the online harms White Paper consultation, the Government also said:
“we expect companies to use a proportionate range of tools, including age assurance and age verification technologies to prevent children accessing age-inappropriate content such as online pornography and to protect them from harms.”
Quite simply, that is not enough. That should not be an expectation; it should be a requirement. We have to have that in place.
The NSPCC has highlighted some worrying statistics. Instagram removed 75% fewer suicide and self-harm images between July and September 2020; industry compliance in taking down child abuse images fell by 89%; and 50% of recorded online grooming cases between April and June this year took place on Facebook-owned platforms. What conversations have the Government had to ensure that Facebook and others design and deliver platforms that put child protection front and centre, as they should?