Automated decision-making authorised by law: safeguards

Data Protection Bill [Lords] – in a Public Bill Committee at 3:00 pm on 15 March 2018.

Amendments made: 23, in clause 50, page 30, line 11, leave out “21 days” and insert “1 month”.

Clause 50(2)(b) provides that where a controller notifies a data subject under Clause 50(2)(a) that the controller has taken a “qualifying significant decision” in relation to the data subject based solely on automated processing, the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.

Amendment 24, in clause 50, page 30, line 17, leave out “21 days” and insert “1 month”.—(Victoria Atkins.)

Clause 50(3) provides that where a data subject makes a request to a controller under Clause 50(2)(b) to reconsider or retake a decision based solely on automated processing, the controller has 21 days to respond. This amendment extends that period to one month.

Question proposed, That the clause, as amended, stand part of the Bill.

Liam Byrne Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy)

I remain concerned that the safeguards the Government have proposed to ensure that people’s human rights are not jeopardised by the use of automated decision making are, frankly, not worth the paper they are written on. We know that prospective employers and their agents use algorithms and automated systems to analyse very large sets of data and, through the use of artificial intelligence and machine learning, make inferences about whether people are suitable to be hired or retained by a particular company. We have had a pretty lively debate in this country about the definition of a worker, and we are all very grateful to Matthew Taylor for his work on that question. Some differences emerged, and the Business, Energy and Industrial Strategy Committee has put its views on the record.

The challenge is that our current labour laws, often drafted decades ago, such as the Sex Discrimination Act 1975 and the Race Relations Act 1976, are no longer adequate to protect people in this new world, in which employers can deploy such large and powerful tools for gathering data, analysing it and making decisions.

We know that there are problems. We already know that recruiters use Facebook to seek candidates in a way that routinely discriminates against older workers by targeting job advertisements. That is not a trivial issue; it is being litigated in the United States. In the United Kingdom, research by Slater and Gordon, a firm of employment lawyers, found that one in five bosses admits to unlawful discrimination when advertising jobs online. Women and people over 50 are the most likely to be stopped from seeing an advert. Around 32% of company executives admitted to discriminating against those over 50; 23% discriminated against women; and 62% of executives who had access to profiling tools admitted to using them to actively seek out people based on criteria such as age, gender and race. Female Uber drivers earn 7% less than men when pay is determined by algorithms. A number of practices in the labour market are disturbing and worrying, and they should trouble all of us.

The challenge is that clause 50 needs to include a much more comprehensive set of rights and safeguards. It should clarify that the Equality Act 2010 and its protections from discrimination apply to all new forms of decision making that engage core labour rights around recruitment, terms of work or dismissal. There should be new rights to algorithmic fairness at work, to ensure equal treatment where an algorithm or automated system takes a decision that impinges on someone’s rights, and a right to an explanation where significant decisions are taken on the basis of an algorithm or an automated process. There is also a strong case for a duty on large employers to undertake impact assessments, to check whether they are, often unwittingly, discriminating against people in a way that we think is wrong.

Over the last couple of weeks, we have seen real progress in the debate about gender inequalities in pay. Many of us will have looked in horror at some of the news that emerged from the BBC and at some of the evidence that emerged from ITV and The Guardian. We have to contend with the reality that automated decision-making processes now under way in the labour market could make inequality worse rather than better. The safeguards that we have in clause 50 do not seem up to the job.

I hope the Minister will say a bit more about the problems that she sees with future algorithmic decision making. I am slightly troubled that she is unaware of live examples within the Home Office’s own area of responsibility, in one of our most successful police forces, and there are other examples that we know about. Perhaps the Minister might say more about how she intends to improve the Bill on that issue between now and Report.

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

I will pick up on the comments by the right hon. Gentleman, if I may.

In the Durham example given by the hon. Member for Sheffield, Heeley, I do not understand how a custody sergeant could sign a custody record without there being any human interaction in that decision-making process. A custody sergeant has to sign the custody record, review the health of the detainee and check that they have been given their rights under the Police and Criminal Evidence Act 1984. I did not go into the details, because I was surprised that such a situation could arise. I do not see how a custody sergeant could be discharging their duties under that Act if their custody decision were based solely on algorithms, because a custody record has to be entered.

Louise Haigh Shadow Minister (Home Office) (Policing)

I thank the Minister for allowing me to clarify. I did not say that it was already solely an algorithmic decision. Durham is using an algorithm known as the harm assessment risk tool; a human makes a decision based on the algorithm’s recommendations. The point I was making was that law enforcement is already using algorithms to make very important decisions that limit an individual’s right to freedom, let alone the right to privacy or anything else. I appreciate what the Minister is saying about PACE and the need for a custody sergeant, but the Bill will enable law enforcement to take that further and to remove the human right—

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

This has been a moment of genuine misunderstanding. From how the hon. Lady presented it, it sounded to me as though she was saying that the custody record and the custody arrangements for a suspect—detaining people against their will in a police cell—were being handled entirely by a computer. I am grateful that she has clarified the position. She intervened on me when I said that we were not aware of any examples of the police using solely automated decision making, but that is not what she has described. A human being, a custody sergeant, still has to sign the record and review the risk assessment to which the hon. Lady referred. The police use many such tools nowadays, but a human being is still involved in the decision-making process, even in the issuing of penalties for speeding. Speeding penalties may involve automated processes, but there is a meaningful element of human review and decision making, just as there is in the custody record example she gave.

There was a genuine misunderstanding there, but I am relieved, frankly, given that the right hon. Member for Birmingham, Hodge Hill was suggesting that I am unaware of what is going on in the Home Office. I am entirely aware of what is going on; I simply misunderstood what the hon. Lady meant and thought she was presenting the custody record as something produced by a machine with no human interaction.

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

This is line-by-line scrutiny, but I was responding in good faith to an intervention that the hon. Member for Sheffield, Heeley made when I was speaking about examples of the police solely using automated decision making.

I hope it is, Mr Byrne.

Liam Byrne Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy)

May I ask for your guidance on this question? We are in a Bill Committee that is tasked with scrutinising the Bill line by line. Is it customary for Ministers to refuse to give way on a matter of detail?

Whether the Minister gives way is a matter for the Minister—that is true for any Member who has the floor. It is normal practice to debate aspects of legislation thoroughly, but ultimately it remains the choice of the Minister, or of any other Member with the floor, whether to give way.

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

I think it is fair to say that I have given way on interventions, but the right hon. Gentleman seemed to be seeking to argue with me about my understanding of what his colleague, the hon. Member for Sheffield, Heeley, had said. Frankly, my understanding is a matter for me.

Order. We are debating clause 50 of the Bill, so may I suggest that in all parts of the Committee we focus our minds on the clause?

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

I have lost track of which point the right hon. Gentleman wants me to give way on.

Liam Byrne Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy)

Let me remind the Minister. On the question of law enforcement, our concern is whether safeguards currently in place will be removed under the Bill. That is part and parcel of the broader debate we are having about whether the safeguards in the Bill are adequate. So let me return to the point I made earlier: we would like her reflections on what additional safeguards can be drafted into clauses 50 and 51 before Report.

Victoria Atkins The Parliamentary Under-Secretary of State for the Home Department, Minister for Women

Clause 49 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly or adversely affects them, legally or otherwise, unless the decision is required by law. Where such a decision is required by law, clause 50 specifies the safeguards that controllers should apply to ensure that the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and giving that individual one month in which to ask the controller to reconsider the decision, or to retake it with human intervention.

A point was made about the difference between automated processing and automated decision making. Automated processing is where an operation is carried out on personal data using predetermined, fixed parameters, which allow the system no discretion and involve no further human intervention in producing a result or output. Such processing is used regularly in law enforcement to filter large datasets down to manageable amounts for a human operator to work with. Automated decision making is a form of automated processing in which the system exercises discretion, potentially based on algorithms, and the final decision is made without human intervention. The Bill seeks to clarify that distinction, and the safeguards are set out in clause 50.

Question put and agreed to.

Clause 50, as amended, accordingly ordered to stand part of the Bill.

Clause 51