
Queen’s Speech - Debate (4th Day)

Part of the debate – in the House of Lords at 2:37 pm on 17th October 2019.


Baroness O'Neill of Bengarve Crossbench 2:37 pm, 17th October 2019

My Lords, it is not easy to debate or even to discern the cumulative implications of the measures mentioned in the gracious Speech. Many serious measures that urgently require new legislation received no mention, while many of those that are mentioned are dealt with by gesturing towards indeterminate action. For example, and this has been widely discussed, nothing is said about housing, which is surely an urgent matter, or about ensuring that future elections and referenda are fair—again, an urgent matter. I have decided, perhaps rashly, to discuss the latter theme in today’s debate because I believe that the business models underlying certain uses of electoral campaigning are doing great damage.

The Government have repeatedly said that they regard the protection of elections and referenda as urgent. For example, on 9 May—this was only one of a number of Answers given by Ministers in that month—in reply to a Question asked by the noble Lord, Lord Tyler, the Minister stated:

“The Government are committed to protecting electoral and democratic processes from foreign interference into the future”,—[Official Report, 9/5/19; col. 1301.]

and then claimed that the Government are consulting, and that in the event of another referendum there would be time for legislation. Nothing has been done, however, and repeated questions have elicited no more definite answers. The bitterness and tremendous distrust that Brexit has produced will not be surmounted if democratic processes are widely believed to have been corrupted.

I hope that other, more knowledgeable noble Lords will speak on housing, but I will say a bit about what is needed if future elections are to earn the respect of the electorate; that is, even of those whose preferred outcome does not receive a majority of the votes. I decided to speak on it today, rather than in the debate on constitutional issues on Monday, because I believe that the dangers arise in very considerable part from the business models that currently support the distribution and targeting of online content with political aims.

Many who buy or supply targeted online content with political aims or effects are not regulated by and cannot be regulated by the Electoral Commission. Whereas campaigning expenditure by political parties during election and referendum campaigns is tightly regulated, campaigning expenditure by others—whether other political or commercial groups, foreign states or rich individuals—is unregulated. Moreover—this is the crucial matter—it is protected by a cloak of anonymity, despite the harm it can do to democratic process and, indeed, to democracy.

The harms that I am concerned about are not the well-known private harms based on the misuse and abuse of digital technologies, which are usually initiated by users of social media. There is a great deal of concern and expertise in your Lordships’ House about those harms and, in many ways, the report on online harms from the Department for Digital, Culture, Media and Sport addresses them. They range across the many forms of online abuse familiar to us, from fraud to cyberbullying, extreme porn to defamation, and many others. I agree with other noble Lords that these harms can be very serious.

However, the online harms White Paper does not deal with the other harms—those which concern me today. These are public harms, in the economist’s sense of the term: harms that damage not individuals or individual interests but social institutions and processes—communication, culture, serious journalism and, above all, democracy. The phrase “disinformation campaigning”, which is new to me, is now being used to refer to these harms, and it is the subject of a very recent report by the Oxford Internet Institute, titled The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation—I wanted to say “disorder”. The report provides an inventory of the use of algorithms, automation and big data to shape, and indeed distort, public life. It comments on the tools, strategies and resources employed by those whom it refers to as global “cyber troops”, including state and corporate agencies, and sometimes rich individuals, that use covert means to shape and distort public opinion.

Social media are of course used, or perhaps I should say misused, in these disinformation campaigns, but they are the conduit not the source of the misuse. Those who use social media are not the customers of their service providers, since they do not pay money for the service they receive, but merely provide their data in order to receive it. For that reason, consumer protection legislation does not come into the picture. In return for doing this, content will be directed to them by their service providers at the behest of others, who remain anonymous. The service providers are focused on selling opportunities for their actual customers —those who purchase their services—to target their service users. It is that distinction between customers and users that is commercially different from other parts of the economy. That content might of course merely amount to advertisements, but it may consist of political and other messages, including disinformation. Targeted disinformation can damage democracy at its roots.

That recent report from the Oxford Internet Institute notes that the dangers to democracy are growing rapidly. We are all aware of a few past disinformation campaigns, such as the Cambridge Analytica scandal, on which the Select Committee for Digital, Culture, Media and Sport in the other place did an excellent report some months ago. Many are also aware of some regrettable disinformation campaigning before the referendum, involving, for example, inaccurate claims about the cost of the UK’s membership of the EU and the imminence of Turkish accession and mass migration. However, this misleading campaigning was not, or at least not entirely, anonymous and, as is often pointed out, evidence for its effectiveness remains incomplete.

Since then, things have moved on and become more dangerous. The report estimates that organised social media manipulation has more than doubled since 2017 and that 70 states are now using computational propaganda to manipulate public opinion. It also notes that politicians and political parties in 45 democracies are using it. They are using these tools to, for example, amass fake followers, spread manipulated content and secure voter support. The report also notes that, in authoritarian states, government entities have used these methods of information control to suppress public opinion and press freedom, discredit criticism and oppositional voices, and drown out political dissent. It estimates that 25 states are working with private companies or strategic communications firms that provide computational propaganda as a service. It seems to me highly unlikely that democracy can survive—