Joint Committee on the Draft Online Safety Bill

In the House of Commons at 12:47 pm on 16 December 2021.


Select Committee statement

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

We now come to the Select Committee statement. Damian Collins will speak for up to 10 minutes, during which no interventions may be taken. At the conclusion of his statement, I will call Members to put questions on the subject of the statement and call him to respond to those in turn. Front Benchers may take part in questioning as well. I call the Chair of the Joint Committee on the draft Online Safety Bill.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee), 12:52, 16 December 2021

May I take this opportunity to wish you, Madam Deputy Speaker, all Members of the House and all members of House staff a very merry Christmas?

Following the publication of the Joint Committee’s report on the draft Online Safety Bill on Tuesday, I want to take this opportunity to inform the House of its publication and the key themes that we have addressed. The Joint Committee was formed as a pre-legislative scrutiny Committee by order of this House and the House of Lords on the last sitting day before the summer recess. Anyone who has been involved in a Joint Committee of this nature knows that, because it has a clear deadline to hit, it is inevitably a race against time. Although we had the summer recess to plan and prepare for our hearings in September, we effectively had around 11 to 12 sitting weeks, including some of the conference recess, to produce our report. The report was concluded on the last day of the Joint Committee’s existence last Friday and then published on Tuesday this week.

Before addressing the report, I thank the members of the Joint Committee, who worked so hard throughout the inquiry and produced a unanimous report. It was genuinely a very collaborative process to which all members of the Joint Committee contributed. To have completed the work and produced a unanimous report, without division among the members, shows the strength of feeling on these issues and the value of working so closely together throughout.

I also thank the staff of the Joint Committee, particularly the Commons Clerk, David Slater, who led a very impressive team of Clerks and advisers. Without their herculean efforts, we would not have completed the project within the timeframe that we were given by the Government. The Joint Committee held oral evidence sessions with 50 witnesses, received 200 written evidence submissions and produced a report of 192 pages, totalling around 60,000 words, so it was a huge effort to produce what I think is an important report.

The draft Bill has been of considerable interest to Members of the House. We organised a roundtable to enable Members to contribute directly to the work of the Joint Committee, as well as other roundtables with the University of Cambridge and the London School of Economics. The high number of written evidence submissions also demonstrates the high level of interest in this issue. For those of us who have been following the debate closely over a number of years, the Bill feels like it has been a long time coming. I think that is because it is anticipated and wanted, but we should still remember that this Parliament will be the first in the world to introduce such a comprehensive piece of legislation to regulate the online world. Other Parliaments around the world are discussing such regulation, as is the European Union, but we have gone further. When the Bill is introduced before the end of this Session, as I believe the Government intend, it will be the first such comprehensive Bill in the world to seek parliamentary approval.

In addition to my thanks to the members and the staff of the Joint Committee, I thank the ministerial team and the Secretary of State at the Department for Digital, Culture, Media and Sport, as well as the Bill team officials, with whom we had a very constructive and open dialogue throughout the inquiry. It was good to see them stand by the commitments Ministers made that they wanted the scrutiny process to be open and genuine. The Bill was by no means locked down when it was given to us and the Secretary of State herself has gone on the record to say that she expects the Bill to change as a consequence of the work of the Joint Committee. That is good to hear and important.

The reason the Bill has been so anticipated is that the online world has become central to our lives. It is where we work. It is where we stay in touch with our family and friends. It is where people play games. It is where people get their news and information. It has become our public square. But people are rightly asking, “What kind of place is that public square?” For too many people, it is increasingly the forum in which they are abused. It is the forum in which their vulnerabilities are targeted and exploited. It is the forum through which hate speech has become normalised. We are seeing a disturbing trend of that affecting offline behaviour, too. People are more likely to be subject to attacks because of their race, sexual orientation or gender. People are more likely to become victims of scams and frauds through the internet. People are more likely to receive egregious disinformation that could damage public health, or interfere with and undermine the integrity of elections. We see that taking place around the world, but we experience it at home as well. As Members of Parliament, we are often subject to abuse. We often have constituents who come to us having been the victims of abuse. They say, “What can be done about this? What can the social media companies do?”

There is a presumption that the law applies equally in all areas, but I think we all know that applying the law online has become very difficult. It is difficult to get social media companies to take responsibility for the systems they have created and the activity of the users on their platforms. We have to recognise that the Bill does not just address content moderation. We are not just looking at harmful and abusive content that has no place on the internet; we are looking at the systems that create an audience for that content, too. The bigger harm is done by the amplification of content on these platforms. If abuse were being directed by someone shouting in the street, that person would ultimately probably be arrested and moved on, but it is difficult to control abuse when that voice is being amplified to millions of people. That is what the systems of social media companies do, and they should be accountable for those systems. They have designed and built those systems to hold the engagement of users, because the more often users visit the site, the longer they are on it and the more engaged they are, the more valuable they are to the platforms and the more advertising the platforms can sell.

Too often, the platforms work on the assumption that all engagement is good and that engagement is in itself a positive metric, because people would not go on the platforms if they were not enjoying them. But we all know that the nature of addiction is that people return to things they know are harmful and damaging to them. It was interesting to hear Frances Haugen, the Facebook whistleblower who gave evidence to the Joint Committee, cite research from within Facebook showing that vulnerable teenage girls were suffering heightened anxiety and depression as a consequence of their experience of using Instagram, but felt, at the same time, that they could not stop using the platform because all their friends were on it and they did not want to miss out. It is disturbing not just to see those problems discussed in cold research documents, but to know that the companies themselves know this and are still not doing enough to act on it.

That is why we now have to move to a regulatory regime for social media companies, big search engines and other big online firms, where it is the laws passed in this Parliament that apply and terms of service written in Silicon Valley are not the guiding principle for regulation. The Joint Committee’s central recommendation for the online safety Bill is that Ofcom, as the independent regulator, should set mandatory codes of practice, based on existing laws in this country, that will deal with the worst kinds of illegal content, such as child abuse and content that promotes and glamorises terrorism. We should also bring into force the equalities legislation—people expect to be respected and not to be abused because of their race, sexual orientation or gender, and that should apply online as well. The regulator’s job should be to set the standards for the companies and explain to them what they are expected to do.

We greatly welcome the work of the Law Commission in suggesting specific new offences, particularly in respect of knowingly sharing false information on social media platforms with the intention of causing physical harm or severe psychological harm to other users. The commission suggests making the promotion of self-harm, which is a particular problem among vulnerable younger users of social media platforms, a specific new offence. We should also create new offences around cyber-flashing. The law needs to keep up to speed with new technology, and people who use new technology to abuse others should know that the law will come for them.

The report also addresses the issue of anonymity, about which many Members have spoken. Anonymity can play an important role in helping victims of abuse and people who speak out against oppressive regimes to speak truth to power when they might be fearful of doing so in their own name, but it is also used by some as a shield to abuse others, in the belief that anonymity will protect them and allow them to commit acts for which they would otherwise be charged and face prosecution. The Committee believes that in such circumstances people should be traceable: we should be able to identify people who abuse others, and a request from law enforcement for that information should be complied with readily and speedily. There should be traceability, and people should know that, even if they do not post in their own name, they can be traced if they abuse others and break the law.

Age assurance is another important issue that the Committee considered. We are particularly concerned that children can be vulnerable and can all too easily access content on the internet—particularly adult content—to which they should not have access. Companies are not doing enough to address that, so we say that they should have effective age-assurance policies in place.

Finally, the key principle that underpins the Bill as it stands and that we think is very important is that the regulator has the power to inspect and audit the companies. We will not be reliant on self-declared information and reports from those companies but will have the ability to get for ourselves information that is too often supplied to the outside world only by brave whistleblowers and investigators who speak out about it. We should have access to that information and know on what basis the companies make decisions, and the companies should be liable for big fines if they do not comply with the legislation. We agree with the Secretary of State that individual named directors should also have liability if the companies are in flagrant breach, and there should be redress for individual users.

I encourage all Members to add the report to their Christmas reading list.

Dean Russell, Chair, Speaker's Advisory Committee on Works of Art

First, as a member of the Joint Committee on the draft Online Safety Bill, I thank my hon. Friend for his words.

Does my hon. Friend agree that the Bill will have a profound impact on real people—especially vulnerable people and young children—like Zach Eagling? He has cerebral palsy and epilepsy and was targeted with cruel flashing images, as many people with epilepsy have been on social media, intended to trigger seizures, cause harm to their lives and potentially even risk death. The Bill will, in its own small way, not only help those people but, through our support of the proposed Zach’s law, make sure that people like Zach are supported. It will also make sure that tech firms are held to account for the harm that they may do.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I am grateful to my hon. Friend, who has spoken out strongly on the targeting of people with epilepsy both in Committee and elsewhere. That is a clear example of how some people can be badly abused: people who are known to have epilepsy are deliberately targeted with flashing images that the sender knows will trigger a seizure and cause them physical harm. This practice should have no place on the internet and the companies should be working to stop it and close it down, and they should make sure that action is taken against the accounts that do it. That is one of the clear recommendations in our report and I completely agree with my hon. Friend on that.

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

Just a little reminder: Members should ask questions.

Diana R. Johnson, Chair, Home Affairs Committee

I congratulate the hon. Gentleman on the report, along with all who served on the Committee. I will certainly add it to my reading list for the Christmas recess.

I have a specific question on whether the Committee was able to look at the issue of pimping websites, on which individuals, often trafficked, are advertised for sex. They make large amounts of money for websites such as Vivastreet. Did the Committee feel able to make any recommendation about how that should be covered in the draft Online Safety Bill?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I am grateful to the right hon. Lady, and I congratulate her on her election as Chair of the Home Affairs Committee.

The Joint Committee received evidence on this important issue, and we discussed this and other issues with Interpol. We believe that the general principle behind the online safety regime should be that illegality should not exist in these online spaces and communities. If links to such sites are being shared by special interest groups on broader social media platforms, the companies should have a responsibility to address that. The basic principle is that encouraging illegality should not have a place on social media.

David Johnston, Conservative, Wantage

I thank my hon. Friend and his Committee for their work on this issue. Does he agree that, as big tech companies can analyse our search histories to suggest purchases and can even read our messages to suggest a reply, it is ludicrous for them to say that the sorts of things we want them to do to keep people safer are too difficult?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

My hon. Friend is absolutely right. These companies are engaging in mass surveillance and data gathering to target their users with ads. The same technology and data that they use to target people with ads and other information, and to recommend content through recommendation tools, could be used to stop bad things happening. We discussed this throughout the inquiry and in the report. He is right that it is not that it cannot be done; it is that there has not been a requirement in law for it to be done, and that there has been no regulator with proper investigatory powers checking whether the companies are actually doing it.

Fleur Anderson, Shadow Paymaster General

I welcome the Committee’s work. I am not a member of the Committee, but I have followed its work closely. I know how important it is, and I will certainly make this report my Christmas reading. I am glad the Committee looked into age assurance, but how convinced is it that the social media platforms can put age assurance measures in place? How much will it have to be enforced? Is it possible to stop the appalling increase in the amount of porn being seen by such young children?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I agree with the hon. Lady. We address this issue in the report, which goes further than the Bill currently does by saying that the companies do not seem to have an effective policy. They have to deliver on this, and there are various technologies they can use. The Bill does not mandate a particular technology, but it says there should be an effective system, and it will be the regulator’s job to check.

The regulator can also insist on much better research on a platform’s audiences. Ofcom says that 50% of 10-year-olds are on social media. I believe the companies know this and can identify it. Indeed, reports have shown that some advertising is targeted at people who are known to be under 13. Again, the regulator should have access to such data and information to bring more light to this problem.

Alicia Kearns, Conservative, Rutland and Melton

One of my gravest concerns is about the way in which disinformation is being used as a weapons system by our adversaries, and particularly hostile states. How much did the Committee consider whether this Bill is an appropriate legislative vehicle to tackle such activity by hostile states? I am not convinced the hostile states Bill will allow us to tackle it adequately, because we are under threat every single day in the online arena due to disinformation from hostile states, which is a major concern.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

My hon. Friend makes an important point. We are familiar with disinformation from Russian agencies targeting voters during elections in countries around the world. It is an offence under UK electoral law for a foreign entity to buy adverts targeting UK voters, and the report says that that offence should apply online too. The platforms should not accept such ads, and they should take them down once it is identified that they have been placed by a foreign state with hostile intent.

The regulator also has the role of applying a company’s own terms of service to its systems. A lot of the activity my hon. Friend describes is being done by networks of inauthentic accounts. These accounts should not exist on some platforms, and therefore they should be taken down. The regulator should use its powers to identify fake accounts and networks of fake accounts.

We took evidence on this from another Facebook whistleblower, Sophie Zhang, whose job was to identify such foreign state interference and such networks of inauthentic accounts, which again have no place on platforms such as Facebook.

Afzal Khan, Shadow Minister (Justice)

I welcome the Committee’s work. Like many Muslims, I face Islamophobic racist abuse online, which has skyrocketed during the pandemic. Did the Committee consider the definition of Islamophobia suggested by the all-party parliamentary group on British Muslims?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

The hon. Gentleman makes an important point. In the report, we address the issues of Islamophobia, antisemitism and all forms of religious hatred, and we say that such hatred should be considered one of the harms on which the regulator can take enforcement action against the companies.

Bob Blackman, Conservative, Harrow East

I congratulate my hon. Friend on the report. One of the challenges relates to where these social media companies are based, where their servers are and where international accounts are held. What account has the Committee taken of how we can control the international aspects, as well as the national aspects, of harmful social media content?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

My hon. Friend makes an important point. The rules apply to content that is accessed by users in the UK; it does not matter where in the world it comes from. For example, we have recommended in the report that frauds and scams should be within scope, including when they appear in adverts as well as in organic postings. Google is already working with the Financial Conduct Authority to prevent people from advertising unless they are FCA-accredited, but what about organisations elsewhere in the world that are not accredited? They should clearly be in scope as well. We are asking the companies to take responsibility for content that is accessed by users in the UK, and they will have to comply with UK law if we set that law. We can see how this is already being done in legislation elsewhere in the world, and we can set laws, even for global companies, that have to be applied for users in the UK.

Jim Shannon, Shadow DUP Spokesperson (Human Rights), Shadow DUP Spokesperson (Health)

With reports that children as young as nine years old have smartphones, that the internet is essential to their learning and that their homework is almost all done online from the age of six, can the hon. Gentleman tell the House what will be done to filter out the trash to ensure that those smartphones do not turn into a tool to disrupt our children’s healthy development?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

The hon. Gentleman makes an important point about the impact on children. Important work on this has already been done, and this Government have passed legislation on the design of services, known as the age-appropriate design code. In our report and in the Bill, we stress the importance of risk assessment by the regulator of the different services that are offered, and of the principles of safety by design, particularly in regard to services that are accessed by children and products that are designed for and used by children. I spoke earlier about the regulator’s power to seek data and information from companies about younger users, and to challenge companies whose policy is that those under 13 cannot access their platforms: do they have research showing that they know people under that age are using them, yet allow those accounts to stay open anyway? Keeping children off systems that are not designed for them, and from which they are supposed to be deliberately excluded, could be an important role for the regulator to take on.

Richard Thomson, Shadow SNP Deputy Spokesperson (Treasury - Financial Secretary), Shadow SNP Spokesperson (Wales), Shadow SNP Spokesperson (Northern Ireland)

I add my own party’s grateful thanks to the Committee for the diligent and thorough way in which it has gone about compiling the report, and we hope to see that feed through into the legislation that eventually comes forward. Does the hon. Gentleman agree that, given the enhanced role envisaged for Ofcom, it is all the more important that, whoever heads it, Ofcom can act as a genuinely independent regulator?

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I thank the hon. Gentleman for his question. We are also grateful to John Nicolson, a member of the Committee, who is not in his place today. The question of the next chair of Ofcom was not one that the Committee was asked to consider. The Government will run a process, and the DCMS Committee will hold a hearing for the pre-appointment scrutiny of the candidates. The hon. Gentleman is right to say that online safety will be a big job for Ofcom. The world will be watching, and we have to get the legislation right and ensure that Ofcom has the resources it needs to do the job. Ofcom believes that it has those resources, and the powers, to do the job, but it should be an ongoing role for this House to scrutinise that process and ensure that it is being run effectively.

Rosie Winterton, Deputy Speaker (First Deputy Chairman of Ways and Means)

I thank the Chair of the Joint Committee for that statement. We now come to the Backbench debate on matters to be raised before the forthcoming Adjournment.