Data Protection Bill [HL] - Second Reading

Part of the debate – in the House of Lords at 6:02 pm on 10th October 2017.

Baroness Lane-Fox of Soho (Crossbench) 6:02 pm, 10th October 2017

My Lords, happy Ada Lovelace Day. How prescient of the Whips and the Minister to pick today for Second Reading. To remind colleagues who might be wondering: she was one of the great innovators of computing in the 19th century. She worked with Charles Babbage on his computational engine, she was the first to recognise that the machine had applications beyond pure calculation, and, in fact, she probably created the first algorithm intended to be carried out by that machine. As part of that, she is often regarded as the first to recognise the full potential of computing, so it could hardly be more apt to pick today for this Second Reading debate, in which we are probably looking at the consequences of the work that she started all those years ago.

The Government’s ambition is to,

“make Britain the best place to start and run a digital business; and … the safest place in the world to be online”,

as detailed in the Conservative manifesto. This Bill is intended to,

“ensure that our data protection framework is suitable for our new digital age, and cement the UK’s position at the forefront of technological innovation, international data sharing and protection of personal data”.

This aspiration to be the best, to make the UK a world leader and set a precedent for good regulation of our digital worlds, is admirable, but that means that the Bill must set the bar high. It must be the very best it can be, especially as we head towards Brexit, where having the highest standards around the collection and use of data will be vital not just to digital businesses but to our continued ability to trade. This Bill must be the foundation for that. There is much that is good in the Bill, but I do not believe that it is yet the best that it can be.

I must start with a confession. Despite the kind references today to my career and supposed expertise, I found this Bill incredibly hard to read and even harder to understand. I fear that we will not do enough to stop the notion, referred to by the noble Lord, Lord McNally, that we are sleepwalking into a dystopian future if we do not work hard to simplify the Bill and make it accessible to more people, the people to whom I feel sure the Government must want to give power in this updated legislation. Let us ensure that the Bill is a step forward for individual power in the rapidly changing landscape in which we sit, a power that people understand and, importantly, use. Let us make it an indicator to the world that the UK balances the importance of tech start-ups, innovation, foreign investment and big businesses with consumer and citizen rights.

The Government should be commended for getting ahead of movements that are growing all over the world to free our data from the tech giants of our age. As data becomes one of our most valuable resources—as we have heard, the new oil—individuals have begun to want a stake in determining for themselves when, how and to what extent information about them is held and communicated to others. So I welcome the clear data frameworks, which are important not only for the best digital economy but for the best digital society.

I agree with much that has been said today but want to make three specific points on the Bill. First, from any perspective, the GDPR is difficult to comprehend, comprising sweeping regulations with 99 articles and 173 recitals. The Bill contains some wonderful provisions, of which my favourite is:

“Chapter 2 of this Part applies for the purposes of the applied GDPR as it applies for the purposes of the GDPR … In this Chapter, “the applied Chapter 2” means Chapter 2 of this Part as applied by this Chapter”.

Giving people rights is meaningful only if they know that they have them, what they mean, how to exercise them, what infringement looks like and how to seek redress for it. There are questions about the practical workability of a lot of these rights. For example, on the right to portability, how would the average person know what to do with their ported data? How would they get it? Where would they keep it? There was a funny example in a newspaper recently where a journalist asked Facebook to send them all the data that it had collected over the previous eight years and received a printed copy of 800 pages of data—extremely useful, as I think you will agree. What about your right to erase your social media history? I should declare my interest as a director of Twitter at this point. How can you remove content featuring you that you did not post and in which people may have mentioned you? What happens as the complexity of the algorithm becomes so sophisticated that it is hard to separate out your data? How does the immense amount of machine learning deployed already affect your rights, let alone in the future?

Awareness among the public about the GDPR is very low—the Open Data Institute has done a lot of work on this which is soon to be published. It is very unlikely that ordinary people understand this legislation. They will have no understanding of how their rights affect them. A lot of education work needs to be done.

For businesses, too, the learning curve is steep, especially for foreign investors in European companies. Some are betting that the sheer scope of the GDPR means that the European regulators will struggle to enforce it. When the GDPR came up at a recent industry start-up event, one industry source said that none of the people to whom they had spoken could confidently say that they had a plan. Every online publisher and advertiser should ensure that they do, but none of them is taking steps to prepare.

So much has been done by this Government on building a strong digital economy that it is important to ensure that small and start-up businesses do not feel overwhelmed by the changes. What substantial help could be planned and what education offered? What help is there with compliance? By way of example, under Clause 13, companies have 21 days to show bias from algorithms, but what does this mean for a small AI start-up which may be using anonymised intelligence data to build a new transport or health app? What do they have to think about to make good legal decisions? As my noble friend Lord Jay so brilliantly argued, how can we ensure post-Brexit legislative certainty for them in building globally successful businesses?

This brings me to my second question: why has the right of civil groups to take action on behalf of individuals been removed from the UK context for the GDPR? Instead, the Bill places a huge onus on individuals, who may lack the know-how and the ability to fight for their rights. As has been mentioned, article 80(1) of the GDPR allows representative bodies—for example, consumer groups—to bring complaints at the initiation of data subjects. Article 80(2) allows those groups to bring complaints where they see infringements of data rights without an individual having to bring the case themselves. These provisions give consumers power, supporting their rights without their having to understand that the rights exist or how to exercise them. Unfortunately, article 80(2) is optional and the UK has omitted it. This omission is worrying, given how stretched the ICO’s resources are and the impact this could have on its support for the public. Granting rights over data to individuals is meaningless if they lack the understanding to exercise those rights and there is no infrastructure within civic society to help them do so. However, we have many organisations in this country—Citizens Advice, Which?—which have these kinds of rights of free-standing action in relation to other regulations. There seems to be no good reason why the UK has chosen not to take up the option in EU law to allow consumer privacy groups to lodge independent data protection complaints, as they can currently do under consumer rights laws.

Citizens face complex data trails. It is impossible for the average person to know which organisations hold their personal data. Enabling privacy groups to take independent action will ensure that these rights are enforced. As it stands, the ICO is the main recourse under the Bill.

Resourcing the ICO, covered in Part 5 of the Bill, is essential and my third main area of interest. The ICO has considerable responsibilities and duties under the Bill towards both business and individuals: upholding rights, investigating reactively, informing and educating to improve standards, educating people and consumer groups, and maintaining international relationships. I feel exhausted thinking about it. The ICO’s workload is vast and increasing, yet it currently lacks sufficient resources. In March 2017, the Information Commissioner asked Parliament if it could recruit 200 more staff, but the salaries it offers are significantly below those offered by the private sector for roles requiring extremely high levels of skill and experience. These staff will become ever more important, and ever more difficult to recruit, in the future.

The ICO currently funds its data protection work by charging fees to data controllers, and it receives ring-fenced funding from the Government for its freedom of information work. This income can increase only as the number of data controllers increases: it is not tied to the volume or complexity of the work, and certainly not to that created by the Bill. Perhaps it is time for another method of funding, such as statutory funding.

Finally, I would like briefly to add my thoughts on how the Bill affects children. As many noble Lords have said, the YouGov poll does indeed say that 80% of the public support raising the age to 18—currently it is 13, as detailed by the Government. However, there are many other surveys, particularly one by the Children’s Society, which show that 80% of 13 year-olds currently have a social media account and 80% of people under 13 have lied about or twisted their age in order to establish one. This is the realpolitik in the war of understanding the internet with our children. I respectfully disagree with the noble Baroness, Lady Howe, and others in the Chamber: I feel strongly that it is wrong to place policing at the heart of how we deal with relationships between children and the internet. We need to take a systems-based approach. I have seen my godchildren set up fake accounts and whizz around the internet at a speed I find alarming. We have to deal on their terms. We have to help educators, parents and people supporting children, not use the long arm of the law.

There are many anomalies, as has already been detailed, as well as discrepancies with Scotland, differences between parental oversight and explicit giving of consent, problems with data collection and how the digital charter will work, and so on, and those are all important. However, I am optimistic too—I always am—and there is much to welcome in the Bill. I am particularly optimistic if we can work in tandem on the wider digital understanding of our society, as so brilliantly detailed by the noble Baroness, Lady Jay. I wish I could discuss the important themes in the Bill with Ada Lovelace, but in her absence I will have many good discussions with people in this Chamber so that we can all work hard to ensure that citizens and consumers reap the benefits of the Bill.