Digital, Culture, Media and Sport Committee

Part of the debate – in the House of Commons at 11:50 am on 27th January 2022.


Julian Knight, Chair, Digital, Culture, Media and Sport Committee and Chair, Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation 11:50 am, 27th January 2022

Thank you, Mr Speaker. With your words of endorsement ringing in my ears, I will ensure that I am as brief as the subject allows.

I am grateful to have been granted this statement to discuss the DCMS Committee’s report on the draft online safety Bill. This is an important piece of legislation that, if done right, will prevent a tremendous amount of harm to so many in our society. The ultimate aim for all of us involved in the production of the Bill is to make user-to-user and search service providers more accountable for decisions they make when designing their platforms and the systems and processes that govern them. The Committee I chair has a crucial role in ensuring that that is the ultimate outcome of this work. While I welcome large parts of the Bill’s content in draft form, there are some elements that do need work so that we do not miss the opportunity to make the internet a safer space for all, while protecting freedom of expression.

One such area of particular concern to the Committee is that the Bill in its current form lacks clarity on what falls within the parameters of illegal content and in its treatment of legal but nevertheless terribly harmful content. For example, the Committee was alarmed to hear in evidence so many examples of online abuse towards women and girls that would not be adequately covered by the Bill in its current form. We are all aware of frankly appalling images being shared online without the consent of those pictured, some of whom are underage. Many of these would be covered by the Bill, but not all.

Furthermore, the internet is awash with images that are often edited to cause harm and are clearly not within the scope of the Bill. My Committee’s report seeks to tackle this. We also have concerns about the less immediately obvious examples of abuse such as breadcrumbing—leading someone on virtually with a series of digital breadcrumbs on the way to illegal and harmful material. In such instances, the context of these communications is key. Some examples of online abuse that we have heard in our investigations are insidious—inch by inch, step by step, allowing people, often children and teenagers, to be lured in. In such instances, no one message, picture or like is technically illegal, but they none the less form part of a sequence of online child sexual exploitation and abuse. The Bill can and must stop this. For this reason, we propose reframing the definition of “illegal content” to include context.

The Committee was truly shocked by the repeated examples of cyber-flashing and deliberate manipulation of images, such as tech-enabled nudifying of women and deepfake pornography, which currently go unchecked. The deliberate manipulation of images to circumvent content moderators is egregious in its own right. It is also a key hallmark of potential child exploitation. This Bill, if crafted correctly, can and must protect children from such acts and such tactics. In its current form, it does not adequately cover these examples of truly harmful content. As such, we propose that they should be included in the Bill and covered by the duties of care in it.

Another area that rightly concerns many Members deeply is the many examples of inherently harmful activity that are not illegal. We support the Joint Committee's view on harmful actions such as cyber-flashing, and on people with photosensitive epilepsy being targeted by trolls who send malicious flashing images with the deliberate intent of triggering a seizure: these offences, in every sense that we would understand them, must be included in the Bill.

Finally, I come to the issue of scrutiny. The current provisions in the Bill to provide Ofcom with a suite of powers to address such actions are unclear and impractical. We urge the Government to bake in best practice by providing greater clarity in the Bill on when and how these powers should be used to ensure that they are both practical and proportionate. We recommend that there should be compliance officers in the social media companies, paid for by those companies, baking in that best practice. That will, hopefully, also lead to the ending, or at least reduction, of unwarranted take-downs.

The present situation is deeply unsatisfactory. Effectively, social media companies are editors-in-chief of the content on their sites. There is no say, and no transparency. They act according to their terms and conditions, which they decide. That can lead—and has led in the past—to unwarranted take-downs, and the people who suffer those take-downs then have to appeal to the social media companies. This is not right. It is against freedom of speech. We need proper systems so that transparency and know-how on the ground can ensure that any such issues of take-down are set against clear parameters. That can, I believe, be regulated in the same way as financial services are effectively regulated—through a strong compliance regime.

We specifically recommend that the Government reframe the language relating to freedom-of-expression considerations to incorporate a “must balance” test, to enable Ofcom, and the compliance officers whose introduction we propose, to assess whether providers have duly balanced their freedom-of-expression obligations with their decision making, thereby preventing unjustified take-downs of material.

Our Committee has made clear that it strongly disagrees with the recommendation of the now defunct Joint Committee—which did amazing work in this area—that a permanent Joint Committee be established as

“a solution to the lack of transparency and…oversight”.

We disagree with that proposal for a range of reasons, not least because it would set a precedent that could be written into any other Bill and could then effectively circumvent the Select Committee system. I think the Select Committee system is the jewel in the crown of this House, and I say that not just because I have a personal interest in it. This, I think, is something we can do ourselves. If there is a need for pre-legislative scrutiny, Select Committees should be able to deal with it, but in any event the Government are free to set up a framework of pre-legislative scrutiny on a one-off or ad hoc basis. That has happened before in the case of other Acts that have passed through this place.

I welcome wholeheartedly the aims of this Bill and much of its content. I hope and expect the Department to be in listening mode—I know that the Minister personally is absolutely committed to that—so that we can all work together to ensure that the aim and the reality of the Bill are aligned, and we can make the internet a safer and a better place that is more in tune with what I would describe as the health of our society.