Clause 15 - Duties to protect content of democratic importance

Online Safety Bill – in a Public Bill Committee at 4:13 pm on 7th June 2022.


Kim Leadbeater (Labour, Batley and Spen) 4:13 pm, 7th June 2022

I beg to move amendment 105, in clause 15, page 14, line 33, after “ensure” insert “the safety of people involved in UK elections and”.

Christina Rees (Labour/Co-operative, Neath)

With this it will be convenient to discuss amendment 106, in clause 37, page 25, line 31, at end insert—

“(2A) OFCOM must prepare and issue a code of practice for providers of Category 1 and 2A services describing measures recommended for the purpose of compliance with duties set out in section 15 concerning the safety of people taking part in elections.”

Kim Leadbeater (Labour, Batley and Spen)

I rise to speak to amendments 105 and 106, in my name, on protecting democracy and democratic debate.

Within the Bill, there are significant clauses intended to prevent the spread of harm online, to protect women and girls against violence and to help prevent child sexual exploitation, while at the same time protecting the right of journalists to do their jobs. Although those clauses are not perfect, I welcome them.

The Bill is wide-ranging. The Minister talked on Second Reading about the power in clause 150 to protect another group—those with epilepsy—from being trolled with flashing images. That subject is close to my heart due to the campaign for Zach’s law—Zach is a young boy in my constituency. I know we will return to that important issue later in the Committee, and I thank the Minister for his work on it.

In protecting against online harm while preserving fundamental rights and values, we must also address the threats posed to those involved in the democratic process. Let me be clear: this is not self-serving. It is about not just MPs but all political candidates locally and nationally and those whose jobs facilitate the execution of our democratic process and political life: the people working on elections or for those elected to public office at all levels across the UK. These people must be defended from harm not only for their own protection, but to protect our democracy itself and, with it, the right of all our citizens to a political system capable of delivering on their priorities free from threats and intimidation.

Many other groups in society are also subjected to a disproportionate amount of targeted abuse, but those working in and around politics sadly receive more than almost anyone else in this country, with an associated specific set of risks and harms. I do not mean messages that gently, or even firmly, urge us to vote one way or another—a staple of democratic debate—but messages of hate, abuse and threats intended to scare people in public office, grind them down, unfairly influence their voting intentions or do them physical and psychological harm. That simply cannot be an acceptable part of political life.

As I say, we are not looking for sympathy, but we have a duty to our democracy to try to stamp that out from our political discourse. Amendment 105 would not deny anybody the right to tell us firmly where we are going wrong—quite right, too—but it is an opportunity to draw the essential distinction between legitimately holding people in public life to account and illegitimate intimidation and harm.

The statistics regarding the scale of online abuse that MPs receive are shocking. In 2020, a University of Salford study found that MPs received over 7,000 abusive or hate-filled tweets a month. Seven thousand separate messages of harm a month on Twitter alone directed at MPs is far too many, but who in this room does not believe that the figure is almost certainly much higher today? Amnesty conducted a separate study in 2017 looking at the disproportionate amount of abuse that women and BAME MPs faced online, finding that my right hon. Friend Ms Abbott was the recipient of almost a third of all the abusive tweets analysed, as alluded to already by the hon. Member for Edinburgh—

Kim Leadbeater (Labour, Batley and Spen)

I knew that. [Laughter.]

Five years later, we continue to see significant volumes of racist, sexist and homophobic hate-filled abuse and threats online to politicians of all parties. That is unacceptable in itself, but we must ask whether this toxic environment helps to keep decent people in politics or, indeed, attracts good people into politics, so that our democracy can prosper into the future across the political spectrum. The reality we face is that our democracy is under attack online each and every day, and every day we delay acting is another day on which abuse becomes increasingly normalised or is just seen as part of the job for those who have put themselves forward for public service. This form of abuse harms society as a whole, so it deserves specific consideration in the Bill.

While elected Members and officials are not a special group of people deserving of more legal protections than anyone else, we must be honest that the abuse they face is distinct and specific to those roles and directly affects our democracy itself. It can lead to the most serious physical harm, with two Members of Parliament having been murdered in the last six years, and many others face death threats or threats of sexual or other violence on a daily basis. However, this is not just about harm to elected representatives; online threats are often seen first, and sometimes only, by their members of staff. They may not be the intended target, but they are often the people harmed most. I am sure we all agree that that is unacceptable and cannot continue.

All of us have probably reported messages and threats to social media platforms and the police, with varying degrees of success in terms of having them removed or the individuals prosecuted. Indeed, we sadly heard examples of that from my hon. Friend the shadow Minister. Often we are told that nothing can be done. Currently, the platforms look at their own rules to determine what constitutes freedom of speech or expression and what is hateful speech or harm. That fine line moves. There is no consistency across platforms, and we therefore urgently need more clarity and a legal duty in place to remove that content quickly.

Amendment 105 would explicitly include in the Bill protection and consideration for those involved in UK elections, whether candidates or staff. Amendment 106 would go further and place an obligation on Ofcom to produce a code of practice, to be issued to the platforms. It would define what steps platforms must take to protect those involved in elections and set out what content is acceptable or unacceptable to be directed at them.

While I am cautious about heaping responsibility on Ofcom and I remain nervous about the Government’s willingness to leave more and more contentious issues for it to deal with, I believe that that is a reasonable step. It would allow Ofcom to outline what steps a platform must take to protect democratic debate and to set out acceptable and unacceptable content in the context of our ever-changing political landscape. That form of nuance would need to be regularly updated, so it clearly would not be practical to put it in the Bill.

Let us be honest: will this amendment solve the issue entirely? No. However, does more need to be done to protect our democracy? Yes. I am in constant conversation with people and organisations in this sector about what else could be brought forward to assist the police and the Crown Prosecution Service in prosecuting those who wish to harm those elected to public office—both online and offline. Directly addressing the duty of platforms to review content, remove harmful speech and report those who wish to do harm would, I believe, be a positive first step towards protecting our democratic debate and defending those who work to make it effective on behalf of the people of the United Kingdom.

Kirsty Blackman (Shadow SNP Spokesperson for Work and Pensions) 4:30 pm, 7th June 2022

I want to make a few comments on the amendment. As a younger female parliamentarian, I am often asked to speak to young people about becoming an MP or getting involved in politics. I find it difficult to say to young women, “Yes, you should do this,” and most of the reason for that is what people are faced with online. A female MP cannot have a Twitter account without facing abuse. I am sure male MPs face abuse as well, but it tends to be worse for women.

We cannot engage democratically with constituents on social media platforms without receiving abuse and sometimes threats as well. Much of it does not necessarily meet the threshold for illegality, but it is a pretty foul and toxic place to be. There have been times when I have deleted Twitter from my phone because I just needed to get away from the vile abuse being directed towards me. I want, in good conscience, to be able to tell people that this is a brilliant job, and that it is brilliant to represent constituents and to make a difference on their behalf at whatever level of elected politics, but right now I do not feel that I am able to do that.

When my footballing colleague, the hon. Member for Batley and Spen, mentions “UK elections” in the amendment, I assume she means that in the widest possible way—elections at all levels.

Kirsty Blackman (Shadow SNP Spokesperson for Work and Pensions)

Sometimes we overlook the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we would have a level of security protection and access that is much higher than that of anybody challenging us as candidates, or of anybody standing in a council or Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember—I assume this is why the amendment is drawn in a pretty broad way—that everybody standing for any sort of elected office faces a significant risk of harm, whether or not that meets the threshold for illegality.

Specific harms have already been singled out; as has been said, epilepsy is one area where the Bill recognises that specific harm occurs. Given the importance of democracy, which is absolutely vital, we need a democratic system in which people are able to stand in elections and make their case. That is why we have election addresses and a system whereby each candidate’s election address is delivered through every single person’s door. There is an understanding and acceptance among those who design our democratic processes that the message of all candidates needs to get out there. If it cannot, because some people face significant levels of abuse online, then democracy is not acting in the way that it should. These amendments are fair and make a huge amount of sense: they protect the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.

In relation to the technicality of these amendments, what they ask for is already in the Bill, but in different places. This clause is about protecting content of “democratic importance” and is concerned with stopping social media firms deleting content through over-zealous takedown. What the hon. Members are talking about is different: the abuse and illegal activity, such as rape threats, that people—particularly female MPs, as they both pointed out—receive on social media. I can point to two other places in the Bill where what they are asking for is delivered.

First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.

So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.

Kirsty Blackman (Shadow SNP Spokesperson for Work and Pensions)

But clause 150(5) says that if a message

“is, or is intended to be, a contribution to a matter of public interest”, people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.

Kirsty Blackman (Shadow SNP Spokesperson for Work and Pensions)

I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:

“but that does not determine the point”.

Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.

Kim Leadbeater (Labour, Batley and Spen)

The Minister makes a really valid point and is right about the impact on the individual. The point I am trying to make with the amendments is that this is about the impact on the democratic process, which is why I think it fits in with clause 15. It is not about how individuals feel; it is about the impact that that has on behaviours, and about putting the emphasis and onus on platforms to decide what is of democratic importance. In the evidence we had two weeks ago, the witnesses certainly did not feel comfortable with putting the onus on platforms. If we were to have a code of practice, we would at least give them something to work with on the issue of what is of democratic importance. It is about the impact on democracy, not just the harm to the individual involved.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

Clearly, if a communication is sufficiently offensive that it meets the criminal threshold, it is covered, and it would obviously harm the democratic process as well. Equally, if a communication breached the harmful communications offence in clause 150, it would by definition harm the democratic process. Communications that are damaging to democracy would therefore axiomatically be caught by one provision or the other. I find it difficult to imagine a communication that could be considered damaging to democracy yet met neither criterion—one that was not illegal and did not meet the definition of a harmful communication.

My main point is that the existing provisions in the Bill address the kinds of behaviours that were described in those two speeches—the illegal content provisions, and the new harmful communication offence in clause 150. On that basis, I hope the hon. Member for Batley and Spen will withdraw the amendment, safe in the knowledge that the Bill addresses the issue that she rightly and reasonably raises.

Question put, That the amendment be made.

Division number 13 Online Safety Bill — Clause 15 - Duties to protect content of democratic importance


The Committee divided: Ayes 6, Noes 9.

Question accordingly negatived.

Question proposed, That the clause stand part of the Bill.

Christina Rees (Labour/Co-operative, Neath)

With this it will be convenient to discuss the following:

Clause 16 stand part.

New clause 7—Report on duties to protect content of democratic importance and journalistic content—

“(1) The Secretary of State must publish a report which—

(a) reviews the extent to which Category 1 services have fulfilled their duties under—

(i) Clause 15; and

(ii) Clause 16;

(b) analyses the effectiveness of Clauses 15 and 16 in protecting against—

(i) foreign state actors;

(ii) extremist groups and individuals; and

(iii) sources of misinformation and disinformation.

(2) The report must be laid before Parliament within one year of this Act being passed.”

This new clause would require the Secretary of State to publish a report reviewing the effectiveness of Clauses 15 and 16.

Alex Davies-Jones (Shadow Minister for Digital, Culture, Media and Sport) 4:45 pm, 7th June 2022

I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.

As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role that they are completely unsuited to and, from what I can gather, do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing—and that could apply to almost anything. In the evidence sessions, we also heard the concerns—expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos—that the definitions as they stand in the Bill are broad and vague. However, we will come on to those matters later.

Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could enable a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to claim the right to use all social media platforms. That might allow far-right figures to be re-platformed on to social media sites where they would be free to continue spreading hate.

The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have been de-platformed for hate speech already must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potential—dangerous—loophole.

As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to perpetuate is hate sparked by bad actors using simple loopholes to avoid any consequences.

On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures, and I pay tribute to its excellent work. Many of those figures self-define as journalists and could seek to exploit this loophole in the Bill to propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies”, such as Rebel Media and Urban Scoop, which replicate mainstream news publishers but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand to return—and have that demand allowed.

New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and journalistic content are working.

Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a Holocaust denier.

Secondly, we have Robert Stacy McCain, who has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Times, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then there is Richard B. Spencer, a far-right journalist and former editor, only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.

Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16 on content of democratic and journalistic importance are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on and consult on that. It will not just publish it, but will go through a proper process.

The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.

I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), it says “taken into account”: the duty is to ensure that matters concerning the importance of freedom of expression relating to content of democratic importance are taken into account when making decisions. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.

If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court if it ever got judicially reviewed—would weigh those things up and find that taking into account content of democratic importance would not be sufficient to outweigh considerations around vile racism, antisemitism or misogyny.

Alex Davies-Jones (Shadow Minister for Digital, Culture, Media and Sport)

The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.

To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but it was just racist abuse, I am not sure how even that would count as democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not quite as black and white as that. The main point is that it is only to be taken into account; it is not determinative.

Alex Davies-Jones (Shadow Minister for Digital, Culture, Media and Sport)

I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.

Chris Philp (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, there is, as we speak, a new criminal offence of foreign interference being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that would then create a new category of illegal content, that would flow through into this Bill. That would not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, and not determinative, and the new foreign interference offence being created in the National Security Bill, will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.

I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing one, but there is already a review mechanism in the Bill, in clause 149, and it will of course cover the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against the firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.

Question put and agreed to.

Clause 15 accordingly ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Adjourned till Thursday 9 June at half-past Eleven o’clock.

Written evidence to be reported to the House

OSB39 LV=General Insurance

OSB40 Epilepsy Society

OSB41 Free Speech Union

OSB42 Graham Smith

OSB43 Center for Data Innovation

OSB44 Samaritans

OSB45 End Violence Against Women coalition, Glitch, Refuge, Carnegie UK, 5Rights, NSPCC and Professors Lorna Woods and Clare McGlynn (joint submission)

OSB46 Sky

OSB47 Peter Wright, Editor Emeritus, DMG Media

OSB48 Graham Smith (further submission)

OSB49 CARE (Christian Action Research and Education)

OSB50 Age Verification Providers Association (supplementary submission)

OSB51 Legal Advice Centre at Queen Mary, University of London and Mishcon de Reya LLP (joint submission)

OSB52 Google UK (supplementary submission)

OSB53 Refuge (supplementary submission)

OSB54 Reset (supplementary submission)

OSB55 Public Service Broadcasters (BBC, Channel 4, and Channel 5)

OSB56 Which?

OSB57 Professor Corinne Fowler, School of Museum Studies, University of Leicester

OSB58 Independent Media Association

OSB59 Hacked Off Campaign

OSB60 Center for Countering Digital Hate