Examination of Witness

National Security Bill – in a Public Bill Committee at 3:21 pm on 7 July 2022.


Poppy Wood gave evidence.

Rushanara Ali (Labour, Bethnal Green and Bow) 3:52 pm, 7 July 2022

Q We are now going to hear from Poppy Wood, UK director of Reset.tech. We have until around 4.20 pm for this session. Could you introduce yourself for the record?

Poppy Wood:

Good afternoon, everyone. My name is Poppy Wood, and I lead on UK public policy for an organisation called Reset. We are a philanthropic organisation that focuses on digital threats to democracy. We have a particular interest in disinformation. I was a civil servant about 10 years ago, and have worked in tech and, at times, in cyber-security over the past decade. I am pleased to be here today to talk about some of our work as it relates to the Bill, particularly our research on disinformation and state actors.

Scott Mann, Assistant Whip

Q Thank you, Poppy, for being here this afternoon. Do you agree that the Bill strengthens our protections against co-ordinated, state-backed disinformation?

Poppy Wood:

That is a good question, and one I hope is being asked every time that we are looking at new versions and new clauses of the Bill. When the consultation came out last year, those of us who had worked in state-backed disinformation for a while were delighted to see some of the questions being asked, at least in the first instance, about the role of state actors and about foreign interference.

When Ken McCallum said last year in his annual threat report that our adversaries are really good at using co-ordinated behaviour to probe UK vulnerabilities, and that we in return really need a holistic response to that—that was about a year ago—a lot of us thought, “But we’re not. It’s great that they are, but we certainly aren’t. No one is really gripping this.” That echoed language from the ISC report in 2020—the Russia report, which said that co-ordinated disinformation and state-backed interference are a really hot potato. No one wants to grip it—not GCHQ, not DCMS, not the other security services. It is too difficult, so we were really relieved to see the Bill come forward, and the consultation late last year.

We were even more relieved earlier this week to see that there will be a link between this Bill and the Online Safety Bill. I have not yet seen that amendment brought forward by the Government; I am hoping that is happening now, because we expected to see it yesterday—I hear the Government have been quite busy this week. That is really about saying that the Home Office and DCMS recognise the role of social media in pushing these co-ordinated campaigns, that electoral interference and foreign state interference are a priority, and that we are seeing platforms being weaponised in order to push the sort of disinformation you mentioned in your question.

We have seen that time and again. In the Scottish referendum in 2014, the Free Scotland 2014 campaign turned out to be backed by Russian and Iranian actors. They were massively weaponising social media by putting up inauthentic accounts and Facebook pages, with mocked-up pictures of the royal family, saying they wanted to take all the money from Scotland and buy new houses. It was complete nonsense, the aim of which was to destabilise the Union.

The Free Scotland 2014 campaign was called out by Twitter and Facebook in 2018. So four years later they said, “Hey, we’ve just found all these accounts that were trying to destabilise the Union four years ago”, and we were going, “But what did you do about that four years ago?” I think we are going to see that again in Northern Ireland. We saw it in the US elections in 2016 and 2020, when the US Senate said that Russia was targeting African-American electors as a priority, to drive division in the States, and we will see it in any election we have in the UK.

I am really pleased to see that the Government are trying to link the two Bills. I think there are three words missing from both the Bills, and they are “co-ordinated inauthentic behaviour”. This Bill and the Online Safety Bill might be getting towards those words, but one of them has to say them, because we are talking about individuals and organisations in this Bill and social media in the Online Safety Bill, but the examples I have just given are absolutely about co-ordination.

It will be hard to find one person. The extra-territoriality provisions in this Bill are good, but we should not be measuring the success of this Bill by the number of people we put in prison. This is all about troll armies abroad, so the link is important, but I think it needs to go further on specifically calling out co-ordinated inauthentic behaviour in either or both of these pieces of legislation.

There are some questions about case law linked to the Online Safety Bill and the National Security Bill. In the amendments, we are expecting, hopefully today, that foreign interference will be listed as a priority harm in the Online Safety Bill. The question arises of how social media platforms, which will now effectively be given the power to police these kinds of things, will catch foreign interference when, as the Online Safety Bill says, the

“content amounts to an offence”.

How can a social media platform judge how content would amount to a criminal offence?

We need to think about some of the language around how people identify that criminal offence. I think Carnegie UK, or another group, has suggested something along the lines of illegal content meaning content that the provider has “reasonable grounds to believe” amounts to a relevant offence. I do not think that “amounts to” has precedent behind it, and it is going to be hard, particularly in content law, to catch that.

The other thing about the Online Safety Bill and the National Security Bill is that we may end up seeing the case law being made in the civil courts, because we will see Ofcom taking a case against a platform, that platform appealing and the case being handled in the civil court, even if it involves foreign interference and a criminal offence. That needs to be thought about. I certainly do not have a solution, but I just want to flag it as a risk of linking these two Bills but not thinking about how they are fully linked.

However, going back to my first point, we were delighted to see that the Government are taking this really seriously.

Scott Mann, Assistant Whip

Q You mentioned some of the cyber-threats to elections. Could you expand on the kind of cyber-threats that are posed to national security in the wider sense?

Poppy Wood:

Obviously, you have heard from much greater experts than me about hack-and-leak operations et cetera, and I refer you to their remarks about that. In terms of co-ordinated disinformation campaigns, as I said we have seen that in the US election, with really targeted approaches to particular groups that people wanted to divide. When I mentioned that the US Senate said that African-American electors were being targeted, it was clear that the Russians wanted to stir up tensions within that group and between that group and white police. They would really push Ku Klux Klan narratives, false images and all sorts to make sure that those groups were infighting. I would absolutely expect to see that here as well.

Political ads are also a really big issue. I cannot work out whether they are dealt with in the Bill, but they are certainly not dealt with in the Online Safety Bill. The Cabinet Office seems to own the political ads regime, but we are seeing shell companies buying these ads purely to stoke division and tension, and we would expect to see that again. One of the problems with not having a grip of the issue, particularly as we could go into an election period in the UK at any point, is that we need someone to comprehensively pull this all together.

The Russians and the Iranians often leave quite a lot of fingerprints on their work, sometimes intentionally. I know that Ken McCallum, who is director general of MI5, and the FBI discussed the threat from China yesterday. They did not mention disinformation, which I thought was interesting, but the Chinese have historically been much better at not leaving their fingerprints on things, so I cannot really speak to some of their activity. However, we have seen it time and time again.

It is probably best not to talk about the Brexit referendum, but we all know what happened there with the engagement from foreign actors. We should not be surprised to see disinformation. We are vulnerable in the UK because of our role in supporting Ukraine, and we have to pull it all together. If the Online Safety Bill, combined with the National Security Bill, does not do so, I do not know what will.

Holly Lynch, Shadow Minister (Home Office)

Q We have heard in some of the previous contributions that hostile states’ use of disinformation does not always cross the thresholds that we are talking about, and that sometimes it is about the amplification of uncomfortable truths. You used the example of pitting different elements of society against each other in the US elections. To what extent do you think we need to improve some of our definitions and understanding, so that we can start looking at how we disrupt disinformation?

Poppy Wood:

We have to be careful not to try to define disinformation. There is some language in the Bill about misrepresentation, and the idea of intentionally misrepresenting is important. We will never get a grip on exactly what disinformation is, because it is a shapeshifter.

On the first part of your question, it is about the system of amplifying and the ease with which people with malicious intent can manipulate systems by creating fake accounts, not verifying IDs and exploiting the recommender algorithms so that they hook you with one piece of content. We see this time and time again. One piece of bad content is not the problem, but they hook you on it, which then leads you down a rabbit hole to something much darker and more radical. It does not even have to be radical; it can be the sort of stuff that we were talking about with the Scotland referendum. It can be innocuous, such as stories about what the royal family are doing. It is about sowing seeds and exploiting cognitive dissonance, which bad actors are very good at and which social media is absolutely weaponised to make the most of, because of the pace and amplification of the content.

The Online Safety Bill goes part of the way there; it is imperfect, partly because it is so hard to define disinformation. There is very little in the Online Safety Bill on disinformation. There is an advisory committee that is years down the road. It is ironic that the National Security Bill is about trying to rein in certain types of transparency. Transparency is a really big part of all this, so it is about trying to find out who is behind things and what the data patterns really look like, and building in researchers. I think that was something Ken McCallum said last year. A holistic approach is a cross-Government approach, but it also involves industry, civil society, journalists and researchers. Everyone has to focus on this. Both Bills could go further on systems and, as I say, the co-ordinated inauthentic behaviour language just is not there either.

Holly Lynch, Shadow Minister (Home Office)

Q We will be tabling an amendment that would require the Government to commission an independent review every year on the prevalence of disinformation and the impact that it has on elections. Who would you imagine would be most suited to undertake that report?

Poppy Wood:

That is a brilliant idea. It goes back to the point about grip. We are seeing really good work being done by the Home Department and the Department for Digital, Culture, Media and Sport. I think the DCMS counter-disinformation unit is an important tool, but it is very small, as is DCMS, and it is lacking the transparency that such interventions require. It should probably be a body like the Intelligence and Security Committee—some kind of cross-party body, quasi-independent of Government, thinking about the issues, with input from expertise in the relevant services and relevant Departments. I know that the Home Department and DCMS work together closely on this, and I think the Cabinet Office also has a role to play. Instinctively, I feel that something like the ISC would be the best place for it, but I am sure that is to be worked out.

One of the issues with a lot of this stuff is the role of the Executive, and making sure that the body is sufficiently removed from political interference.

Damian Hinds, Minister of State (Home Office) (Security)

Q Hello. Earlier, you queried why something that happened in 2014 might only have been called out by Facebook in 2018. Isn’t it quite obvious that what happened in between was 2016, and all the brouhaha that followed from the American elections and the congressional inquiry, and all the rest of it? It turned out that when Facebook and others went looking, it was amazing what they could find.

Poppy Wood:

Absolutely. If you are suggesting that they respond to PR crises, I would agree with you on that one. Of course, this is about brands. We have seen from Frances Haugen’s revelations that Facebook is not understaffed but is just not focusing its people in the right direction on this stuff. There are only handfuls of people focusing on co-ordinated disinformation for the whole world within these big technology companies. It should be dozens, especially if they are hiring 10,000 engineers for the metaverse in Europe. They can put some of them on elections and tracking. They say that they go far, but they could go much further. When there is pressure on them, they respond, and so far that pressure has been PR, because there has not been regulation.

Damian Hinds, Minister of State (Home Office) (Security)

Q Would it be fair to say that they have at least got better? If you take the 2020 American election, there does not seem to have been the same volume of attempted disruption as in the 2016 election, or at least not in the places where we are now looking, like Facebook?

Poppy Wood:

We do not know, because we have not got the transparency. They may seem to have got better but, as a percentage of what, we cannot know. They will say that it has got better and that they have caught this many thousand as opposed to that many thousand last time, and those accounts have been taken down, but we have no idea what that is a percentage of. That is why people such as Frances Haugen, who have come forward as whistleblowers to say, “They are telling you this, but the data says that,” show that we should not be relying on the platforms’ own accounts. I am sure we will come on to the whistleblowers, but there have to be touchpoints much earlier on, from civil society, from Government, from researchers, to say, “Hey, actually, the scale is much larger,” or, “You’re not even looking at this stuff.”

London is one of the most linguistically diverse cities in the world. When we are talking about counter-terrorism speech, one of Frances’s revelations was that 75% of counter-terrorism speech was wrongly identified by the AI as terrorism speech, and so taken down. We think about the UK as an English monolith, but there is plenty of linguistic diversity that puts us at risk when those platforms are weaponised in elections, focusing on diaspora communities and so on.

I would hope that the platforms have got better, and I would like to give them the benefit of the doubt, but the truth is that we just do not know.

Damian Hinds, Minister of State (Home Office) (Security)

Q You mentioned that there is not transparency, but there is at least one type of transparency with Facebook—main Facebook—in that you can see what is on it. I wonder what you think of the role of channels that you cannot see, such as private messaging, including the private parts of Facebook and WhatsApp, and what they call copypasta—copying and pasting SMS messages—and so on. How much do we know about that?

Poppy Wood:

I would challenge that first assumption—that you can see what is on Facebook. They still view that as private information. Researchers cannot get access to it unless they kind of beg, borrow and steal. I understand the question—

Damian Hinds, Minister of State (Home Office) (Security)

But you can see public postings on Facebook. That is my point.

Poppy Wood:

On your page, you can, but researchers cannot.

Damian Hinds, Minister of State (Home Office) (Security)

But that is still more than you can see on WhatsApp, where you cannot see a post at all.

Poppy Wood:

That’s true. I suppose I would say they could do much more about transparency just about the public posts—that is my first point. Secondly, on encryption, there are concerns about some of the amendments in the Online Safety Bill and what that really means for encryption. I know we are not here to talk about that Bill, but encryption is an important tool. We know that those spaces are misused, but we need to be really clear about some of the benefits that encryption offers to lots of people, particularly the security services, for sharing information safely. We need to be careful.

Damian Hinds, Minister of State (Home Office) (Security)

Q I was not trying to start an argument or even a discussion or analysis of end-to-end encryption. I was just asking, relatively speaking, how much do we know? There is a hypothesis that the reason why there was apparently less material in recent American elections on Facebook than in 2016 is that large parts of it have moved to other channels where we just cannot see it. We just do not know what is there.

Poppy Wood:

Let me give you a good example on Russia Today. We do a lot of work and analysis around Russia and Ukraine. Obviously, Russia Today was taken down from most national broadcast networks. It has been resurrected multiple times on social media. This week, we saw it resurrected with another name, like “Discovery Dig” or something, on YouTube, where lots of the comments, imagery and language were directing people to Telegram channels where they are actively mobilising.

What we see in the active mobilisation on Telegram channels is the outing of national security agents, the putting up of email addresses of politicians and saying, “Target them and say they are on the wrong side of the debate,” or, “Write to this national newspaper.” In all three of those examples, it is predominantly in the UK. They are telling them it is all fabricated. They are absolutely weaponising those private spaces. As you say, it is quite hard to get into them—but actually, it is not that hard. They are pretty open channels, with thousands and millions of engagements and followers. That is the scarier bit. They are private, but you are getting tens of millions of people and engagements on them. I am not sure that is the true definition of private, but it is certainly in an encrypted space.

Jess Phillips, Shadow Minister (Home Office), Shadow Minister (Domestic Violence and Safeguarding)

Q I want to touch on the whistleblower issue you raised. There have been some concerns that the Bill might not sufficiently target those with malicious intent. Is there a risk that it potentially criminalises whistleblowers?

Poppy Wood:

The role of whistleblowers in society is really important. I know the Government understand that. There are some good recommendations from the ISC about whistleblowers that I do not think have been adopted in this version of the Bill. They are about at least giving some clarity on where the thresholds for a disclosure offence lie, and giving whistleblowers a public interest defence so they can say, “These are the reasons why.” My understanding is that at the moment it sits with juries and it is on a case-by-case basis. I would certainly commend to you the recommendations from the ISC.

I would also say—this was a recommendation from the Law Commission and also, I think, from the ISC—that lots of people have to blow the whistle because they feel that they do not have anywhere else to go. There could be formal procedures: an independent person, body or office to go to when you are in the intelligence agencies, or in government in general, or anywhere. One of the reasons why Frances Haugen came forward—she has been public about this—is that she did not really know where else to go. There were no placards saying, “Call the Information Commissioner in the UK if you have concerns about data.” People do not know where to go.

Getting touchpoints earlier down the chain so that people do not respond in desperation in the way we have seen in the past would be a good recommendation to take forward. Whistleblowers play an important part in our society and in societies all round the world. Those tests on a public interest defence would give some clarity, which would be really welcome. Building a system around them—I know the US intelligence services do that; they have a kind of whistleblower programme within the CIA and the Department of Defence that allows people to go to someone, somewhere, earlier on, to raise concerns—is the sort of thing you might be looking at. I think a whistleblower programme is an ISC recommendation, but it is certainly a Law Commission recommendation.

Sally-Ann Hart (Conservative, Hastings and Rye)

Q On malign activity, is there a risk that through clauses 13 and 14 on foreign interference, the Bill could affect free speech, including political speech and journalism? If you think it could, what additional safeguards can be put in place to ensure that only malign activity is captured?

Poppy Wood:

I have certainly read and heard concerns about journalism, about the “foreign power” test on civil society, and about Government money being quite a blunt measure of whether or not you might fall foul of these offences. On journalism, I think that is why you should never try to define disinformation: those kinds of shape-shifting forms are very hard to pin down, particularly with questions like “What is journalism?”, “What is a mistruth?”, “What is a mis-speak?” and so on. We need to be careful about that.

On your specific question, I refer you to Article 19 and others who have really thought through the impact on journalism and free speech. I am sure it would be an unintended consequence but, again, we are seeing Russia using its co-ordinated armies on Telegram and other channels to target Ukrainian journalists. They are saying, “Complain to the platforms that the journalist is not who they say they are or is saying something false, so they are breaking the terms of service. Bombard the platforms so that that journalist gets taken down and cannot post live from Ukraine for a handful of days.”

That is just another example of how these systems are weaponised. This is where you can go much further on systems through the Online Safety Bill and the National Security Bill without worrying too much about speech. But I refer the Committee to other experts, such as Article 19, that have looked really deeply at the journalism issue. I think Index on Censorship may have done some work as well.

Sally-Ann Hart (Conservative, Hastings and Rye)

Q You have mentioned disinformation. In this Bill, the Online Safety Bill, and perhaps the review that Ms Lynch mentioned, which you thought was a good idea, what more do you want to see the Government do to address dis-, mis- or malinformation and malign foreign influence online?

Poppy Wood:

I think that where we are now is much better than where we were last year, but my concern is whether this will all be law when we have an election. If not, what are the backstops that the Government have in place to focus on this stuff? It will get tested only when we have an election, really. If that is before March next year or whenever these laws get Royal Assent, there will be a genuine question of crisis management: if this is not law, what are we doing? I would ask that question of the Government and the civil service.

As I said, the disinformation committee in the Online Safety Bill is years down the line. Bring that forward—there is no need not to bring it forward—and please make sure that it is not chaired by someone from a tech platform. I would write that into the Bill, because otherwise there is a risk that that will happen.

Why should the committee on disinformation not be chaired by someone from a tech platform? They have a vested interest in this stuff, so I would get an academic or someone from civil society—someone at arm’s length who can take a holistic view. These platforms will want to protect their interests on this stuff, so I would warn against that.

I would like to see the transparency provisions in the Online Safety Bill go much further. This is a bit in the weeds of the Online Safety Bill, if you will forgive me, but there is a very good clause in that Bill, clause 136, which says that Ofcom should ask whether researchers should be given access to data. It is an important clause, but it says, “Ask the question,” and it gives Ofcom two years to do it. I do not think it needs two years; I think we know that the answer is “Yes, researchers desperately need access to data.”

Almost all the stuff that is caught about malign information operations is caught via Twitter’s API. Twitter makes 10% of all the tweets public, and researchers use that to run analysis, so if you ever want to do research on disinformation, you always use the Twitter API. In many cases, that is mapped over to Facebook to identify the same operations on Facebook, but they are always caught in the first instance because of open data. I think that the Online Safety Bill, if this Committee and this Bill want to back it up, could bring that forward and say, “Either do the report in six months or don’t even ask the question.”

By the way, the European legislation that is equivalent to the Online Safety Bill makes that happen as of Tuesday this week, so researchers should, in theory, be able to access data. I would bring the transparency provisions forward, and I would really want the Bill to call out co-ordinated inauthentic behaviour.

Rushanara Ali (Labour, Bethnal Green and Bow)

That brings us to the end of this panel. On behalf of the Committee, I thank our witness for taking the time to give evidence.