Social Media: News - Motion to Take Note

Part of the debate – in the House of Lords at 4:42 pm on 11 January 2018.


Viscount Waverley (Crossbench)

My Lords, the manipulation of social media and control over differing value sets present regulatory and ethical challenges in today’s world. Manipulation can undermine political and social life, shaping Governments and governance, colouring decision-making and facilitating economic espionage. Thirty countries are said to use online tactics to manipulate outcomes, yet Governments currently have limited or no control over this environment. Identifying perpetrators with certainty is difficult. So what is to be done, by whom, moving forward?

The short answer is that the social media platforms could and should step up to the plate and publish their own analyses. Distinguishing one threat group from another is possible when sufficient information, analytical know-how and technology tools combine. Cyber intruders leave digital footprints with links that enable computer forensic analysts to separate one intrusion from another. Major platforms, most particularly Twitter and Facebook, retain the vital data to pinpoint state-sponsored accounts operating on their platforms, but they are not willing to share it. They say that their own systems work internally to find and shut down bot and misinformation accounts. But whenever they delete an account, or the account holder deletes it, the information is lost; trolls simply create new accounts and revive the process.

Foreign influence necessitates a plan to counter interference. Should the Government reject calls for censorship and regulation, and instead trigger a process to enshrine protections and penalties in domestic legislation, so reining in, controlling and protecting through the rule of law?

What realistically could be done? We could devise support programmes of fact-checking, verification and digital forensic initiatives capable of exposing the falsehoods and false claims of authority that underpin fake and propaganda pieces, and ensure that platforms crack down on automated amplification networks that impersonate humans: botnets. Social media networks could develop and administer algorithms for identifying and removing fake news by marshalling the same engines that spread fake news in the first place. They should identify repeat disinformation offenders and demote them, if not take them offline.

Government should also invest in media literacy and education programmes. Emotional targeting is the central tactic of disinformation, and people have to be taught how to recognise it. When it is used as a direct tool of the state, we must expose it, not ban or censor it. We must work with the social platforms and civil society groups, not against them, to close such loopholes as anonymous accounts and the use of hyperpartisan rhetoric. Platforms should make verification both necessary and easier. Traditional media must also responsibly verify the social media accounts that they cite.

HMG can lead internationally by devising and promoting a new global treaty to nail this issue, and here at home by creating an independent commissioner with oversight, accountable to Parliament. Self-regulation is to be supported but scrutinised. There will always be loopholes, but signals from the major platforms are encouraging and consequently should be applauded. Co-operation, not complacency, must win the day.