Data Protection Bill [HL] - Committee (3rd Day) (Continued)

Part of the debate – in the House of Lords at 6:30 pm on 13th November 2017.


Lord Stevenson of Balmacara, Opposition Whip (Lords), 6:30 pm, 13th November 2017

My Lords, we have a number of amendments in this group and I want to associate myself with many of the points made on the other amendments by the noble Lord, Lord Clement-Jones. I was only sorry that we did not get round to signing up to more of them in time to get some of the glory, because he has picked up a lot of very interesting points.

We will come to later groups of amendments that deal with a broader concern about the effects and moral issues raised by this Bill. It has been growing on me for a number of weeks now that one of the most irritating things about the Bill, apart from the fact that it does not contain the main clauses that one wants to discuss, is that every now and again we come up against a brick wall where there is suddenly a big intellectual jump between where we have got to and where we might want to get to through the Bill, and this is one of them.

This whole idea of automated data and how it operates is very interesting indeed. One of the people with whom I have been having conversations around this suggested that, in processing this Bill, we are in danger of not learning from history in your Lordships’ House and indeed Parliament as a whole, in relation to other areas in which deep moral issues are raised. The point was made, and it is a good one, that when Parliament was looking at the Human Fertilisation and Embryology Act 1990 there had been four or five years, perhaps slightly longer, of pre-discussion in which all the big issues had been thrashed out both in public and in private—in both Houses and in the newspapers, and in private Bills. There were loads of attempts to try to get to the heart of the issue. We are missing that now, in a way that suggests that it will become a lot clearer when we have discussions later about a data ethics body. I am sure that they will be good and appropriate discussions.

Having said that, the issue here is extremely worrying. We are at the very start of a rich and very interesting development in how computers operate and how machines take away from us a lot of the work that we currently regard as being human work. It is already happening: in the world Go championship, a computer played the human Go champion and beat them easily, and Deep Blue, the IBM computer, beat Garry Kasparov, the chess player, a few years ago. The point is not so much that these things were happening, but that nobody could understand what the machines were doing in relation to the results they were achieving. It is that apparent ability to exceed human understanding that is the great worry behind what people say. Of course, it is quite a narrow area and not one that we need to be too concerned about in terms of a broader approach. But in a world where people say with a resigned shrug that the computer has said no to a request they have made to some website, it is a baleful reflection of the helplessness we all feel when we do not understand what computers are doing to us. Automated processing is one facet of that, and we have to be careful.

We have to think of people’s fears. If they have fears, they will not engage. If they will not engage, the benefits that should flow from this terrific new initiative, new thinking and new way of doing things will not materialise: we will not get the productivity or the changes that will help society as we move forward. We have to think of future circumstances in a reflective way. In a deliberative way we have to think about technical development and public attitudes. It harks back to the work that was done by Mary Warnock and her team when they were trying to introduce the HFEA. She said, importantly, that reason and sentiment are not necessarily opposed to each other. It is that issue we are trying to grapple with today. The amendments that have been so well introduced by the noble Lord, Lord Clement-Jones, cover that.

The regulatory and legal framework may not be sufficient. Companies naturally owe duties only to their shareholders. Parliament will have to set rules that make people in those companies take account of public fears, as well as shareholder interests. That approach is not well exemplified in this Bill yet. We need to think about how to allow companies to bring forward new initiatives and push back the boundaries of what they are doing, while retaining public confidence. That is the sort of work that was done on the HFEA and that is where we have to go.

Our Amendment 74 has already been spoken to by the noble Lord, Lord Clement-Jones. It is an important one. There is an issue about whether or not an individual—“a natural person”, as the amendment has it—is involved “in the decision-making process”. We should know that.

Amendment 77A would ensure that data controllers must “provide meaningful information … significance and legal consequences” of the processing they are doing.

Amendment 77B states that “A data subject affected by” automated decision-making “retains the right to lodge a complaint to the” ICO.

These are all consequences of the overall approach we are taking. I look forward to further debates and the Minister’s response.