Amendment 292

Police, Crime, Sentencing and Courts Bill - Committee (10th Day) (Continued) – in the House of Lords at 10:00 pm on 22 November 2021.

Moved by Baroness Hamwee

292: After Clause 170, insert the following new Clause—
“Automated decision-making: safeguards
(1) Where data is being processed for a criminal justice purpose, section 14 of the Data Protection Act 2018 is to be read as if the amendments in subsections (2) to (7) had been made.
(2) In subsection (1) after “solely” insert “or significantly”.
(3) In subsection (4) after “solely” insert “or significantly”.
(4) In subsection (4)(a) after “solely” insert “or significantly”.
(5) In subsection (4)(b)(ii) after “solely” insert “or significantly”.
(6) In subsection (5) after paragraph (a) insert—
“(aa) provide to the data subject all such information as may be reasonable regarding the operation of the automated processing and the basis of the decision,”
(7) After subsection (5) insert—
“(5A) The controller’s powers and obligations under this section are not limited by commercial confidentiality claimed by the provider of equipment or programmes used”.”

Baroness Hamwee (Chair, Justice and Home Affairs Committee)

My Lords, changing the subject, Section 14 of the Data Protection Act 2018, reflecting the GDPR, provides that “decisions based solely”—solely—“on automated processing” are “subject to safeguards.” Such a decision

“produces legal effects concerning the data subject, or … similarly significantly affects the data subject.”

The decisions are subject to safeguards under the Act: notification of the data subject, and the right of the data subject to request reconsideration or, importantly, a new decision not based on automated processing. Noble Lords will appreciate the potential importance of decisions affecting liberty, and that the use of artificial intelligence may well involve profiling, which does not have an unblemished record.

This amendment would alter the term “solely”, because “solely” could mean one click on a programme. The term “significantly”, proposed in the amendment, is not the best, but I think it will serve the purpose for this evening. I do not claim that this is the best way to achieve my objective, but I did not want to let the moment pass. The Justice and Home Affairs Committee—I am not speaking as its chair—has had this issue raised a number of times. The Information Commissioner is one who has raised the issue. Elizabeth Denham, before she left the office, said it should not just be a matter of box-ticking. The guidance of the Information Commissioner’s Office provides that there should be the following three considerations:

“Human reviewers must be involved in checking the system’s recommendation and should not just apply the automated recommendation to an individual in a routine fashion;
reviewers’ involvement must be active and not just a token gesture. They should have actual ‘meaningful’ influence on the decision, including the ‘authority and competence’ to go against the recommendation; and
reviewers must ‘weigh-up’ and ‘interpret’ the recommendation, consider all available input data, and also take into account other additional factors.”

The Minister will, I am sure, refer to the current government consultation on data, Data: A New Direction, published in September. We dealt with this issue by putting the amendment down before then but, even so, the consultation questions the operation and efficacy of Article 22 of the GDPR, which, as I said, is the basis for Section 14. I appreciate that the consultation will have to run its course but, looking at it, the Government seem very focused on the economic benefits of the use of data and supportive of innovation.

Of course, I do not take issue with either of those things, but it is important not to lose sight of how the use of data may disadvantage or damage an individual. Its use in policing and criminal justice can affect an individual who may well not understand how it is being used, or even that it has been used. I was going to say that whether those who use it understand it is another matter but, actually, it is fundamental. Training is a big issue in this, as is, in the case of the police, the seniority and experience of the officer who needs to be able to interpret and challenge what comes out of an algorithm. There is a human tendency to think that a machine must be right. It may be, but meaningful decisions require human thought more than an automatic, routine confirmation of what a machine tells us.

The government consultation makes it clear that the Government are seeking evidence on the potential need for legislative reform. I think that reform of Section 14 is needed. AI is so often black-box and impenetrable; even if it can be interrogated on how a decision has been arrived at, the practicalities and costs of that are substantial. For instance, it should be straightforward for someone accused of something to understand how the accusation came to be made. It is a matter of both the individual’s rights and trust and confidence in policing and criminal justice on the part of the public. The amendment would extend the information to be provided to the data subject to include

“information … regarding the operation of the automated processing and the basis of the decision”.

It also states that this should not be “limited by commercial confidentiality”; I think noble Lords will be familiar with how openness can run up against this.

Recently, the Home Secretary told the Justice and Home Affairs Committee twice that

“decisions about people will always be made by people.”

The legislation should reflect and require the spirit of that. A click of a button on a screen may technically mean that the decision has a human element, but it is not what most people would understand or expect. I beg to move.

Lord Paddick (Liberal Democrat Lords Spokesperson, Home Affairs)

My Lords, with the leave of the Committee, I will speak briefly. In my comments on the previous group on which I spoke—the one beginning with Amendment 278—I did not mean to suggest that the noble Lord, Lord Carlile of Berriew, was filibustering. I tried to inject a little humour into proceedings, bearing in mind the wide range of issues that we discussed in the debate on that group and the length of that debate. I joked that it was beginning to look like a filibuster. I have apologised to the noble Lord but I wanted to include that apology in the official record.

We support this important amendment. As my noble friend Lady Hamwee said, Section 14 of the Data Protection Act 2018 provides some safeguards against important decisions being taken by automated processing. It allows the data subject, having been notified, to request a human review of the decision, but only if the decision was taken “solely” by automated means, rather than solely or “significantly”, as my noble friend’s amendment suggests. Experience in the American criminal justice system of using algorithms shows that bias in historical decisions is replicated, even enhanced, by algorithms. We therefore support this amendment.

Lord Rosser (Shadow Spokesperson, Home Affairs; Shadow Spokesperson, Transport)

As has been said, Article 22 of the general data protection regulation provides that a person has

“the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

It also provides that there is an exemption to this if the automated decision-making is explicitly provided for in law. Section 14 of the Data Protection Act 2018 provides, as has been said, some safeguards based on Article 22 for cases where the law allows automated decision-making on matters that may have a significant effect on a person. It provides that where a significant decision is made by automated means, the subject may request that the decision be retaken with human oversight. The section currently provides protections for a decision taken, as has once again been said, “solely” by automated means. The amendment would extend this provision to decisions taken solely “or significantly” by automated means.

The issue of automated decision-making will become, and indeed is becoming, increasingly prevalent in our lives—a point made by all sides during the passage of the 2018 Act, when we tried to add far stronger safeguards to the then Bill to prevent decisions that engaged an individual’s human rights being decided by automated means. On that basis, I am certainly interested in the points raised to extend the right of appeal to decisions that are based “significantly” on automated processing.

Finally, it is potentially concerning that the Government are currently consulting on removing Article 22 of the GDPR and the associated protections from UK law altogether. I believe that consultation closed last week. Can the Government give an indication of when we can expect their response?

Lord Sharpe of Epsom (Lord in Waiting, HM Household; Whip)

My Lords, I am grateful to the noble Baroness, Lady Hamwee, for explaining this amendment, which relates to automated decision-making. Let me first say that the Government are committed to maintaining high standards of data protection and agree that the clarity of safeguards relating to automated decision-making is important. The Government are also aware of some of the difficulties faced by organisations in navigating the terminology of these automated processing provisions.

As all noble Lords have noted, to address this issue the Government are currently seeking evidence via a public consultation, which is being run by the Department for Digital, Culture, Media and Sport. As the noble Lord, Lord Rosser, noted, that consultation closed only last Friday. He also mentioned Article 22. The consultation is looking at the need for legislative reform of the UK data protection framework overall, including the GDPR and the Data Protection Act 2018. It covers Article 22 of the UK GDPR, including organisations’ experience of navigating the “solely automated processing” and “similarly significant” terminology. As I say, that consultation closed on 19 November.

In examining the responses to the consultation, the Government will consider the safeguards in respect of automated decision-making that involve personal data in the round. We will address this matter in the government response to the consultation, which we expect to publish in the spring. We also look forward to the report of the inquiry by the Justice and Home Affairs Committee, chaired by the noble Baroness, Lady Hamwee, and will take its conclusions and recommendations into account when bringing forward our proposals for legislation. In the meantime, with apologies for being brief, I invite the noble Baroness to withdraw her amendment.

Baroness Hamwee (Chair, Justice and Home Affairs Committee)

My Lords, I am grateful for that reply. This amendment and this concern are about far more than navigating terminology; it is actually a fundamental point, but I do not intend to keep the Committee any longer. I think I have made it clear that I am probing but, I hope, probing to an end. I beg leave to withdraw the amendment.

Amendment 292 withdrawn.

Amendments 292A to 292D not moved.