Data Protection Bill [HL] - Report (2nd Day)

Part of the debate – in the House of Lords at 4:49 pm on 13 December 2017.


Lord Clement-Jones, Chair, Artificial Intelligence Committee, Liberal Democrat Lords Spokesperson (Digital), 4:49 pm, 13 December 2017

My Lords, I will speak also to a number of other amendments to Clause 13 in this group. I regret that the rules of drafting on Report mean that I was not able to produce a consolidated clause; it is rather bitty in the way it is presented in the amendments, but I very much hope that the Minister will be able to interpret the bits as eventually forming a perfectly formed whole and a much preferable alternative Clause 13. In addition to those amendments I will speak to Amendment 41, which constitutes a new clause after Clause 13.

Clause 13 concerns the prohibition and exemptions around significant solely automated decisions. However, it can be confusing. There are three grounds on which such decisions are permitted under the GDPR: to enter or to perform a contract, to give explicit consent or to be authorised under UK law. Clause 13 concerns only the safeguards for the last category. Therefore, our amended version of Clause 13 has the following four important aims.

First, it clarifies that an individual’s ability to claim that a decision had a significant effect on them—a prerequisite for triggering any of the protections that the GDPR has to offer relating to automated decision-making—can be grounded in a significant effect on a protected group under the Equality Act 2010. The Equality Act is a strong piece of legislation, but it contains no information rights for individuals to investigate suspicions of machine bias or illegal discrimination. Given that the Information Commissioner will already be overloaded with work, owing to the changes accompanying the GDPR and the speed of technological development, this is a simple and crucial check and balance that will strengthen enforcement not just of data protection but of many UK laws.

Secondly, the amendments further clarify that in order to claim that a decision was not solely automated—and therefore benefits from none of this clause’s protections—there must be “meaningful human input”. The Minister argued in Committee that this is,

“precisely the meaning that that phrase already has”.—[Official Report, 13/11/17; col. 1869.]

Unfortunately, we have reason for concern because, in respect of identical wording in the 1995 data protection directive, German courts, for instance, have previously read “solely” in a restricted, narrow sense. Therefore, having such clarification in the Bill would ensure that the Minister’s understanding of the protection afforded to data subjects is the protection they will receive. This clarification is in line with the Article 29 Working Party guidance—I recognise that the Minister corresponded with me on the subject of Article 29 guidance—but it takes us closer to an adequacy agreement if one is sought upon leaving the EU.

Thirdly, the Explanatory Notes in paragraph 115 promise a safeguard that is not found in any of the articles of the GDPR, nor in the safeguards laid out by the Government: a right to,

“an explanation of the decision reached after an assessment”.

The cause of this is that the safeguard sits in a non-binding recital, and there is a contradiction between the recitals and the main text. This is easily rectified for decisions authorised by law, as the purpose of Clause 13 is to specify safeguards for these particularly impactful and largely public sector decisions.

The safeguard is included as well to indicate—in a very similar way to a recent French law on exactly the same issue—what such an explanation should provide in order to be useful. These explanations are possible even with black-box algorithms. I have tabled an additional simple amendment to include this safeguard explicitly for automated decisions authorised by consent or contract, not just those authorised by law.

Fourthly, in line with the huge successes of open data and with recommendation 4 of the 2013 Macpherson review into the quality assurance of business-critical analytical models in government, it is undoubtedly good practice to publish details of the models being used in the public sector. Clause 13 as reconstituted by these amendments would provide that, where models are used solely or partially as the basis for significant decisions, with the grounds for processing drawing on UK law rather than on consent or contract, relevant metadata about those models must be published, as well as information, where applicable, on how they meet the Equality Act’s public sector equality duty. That duty does not always require documentation, but these systems are, by virtue of their significance, consequential enough to justify it. The reconstituted clause also introduces a requirement for controllers that argue that their significant decisions are only partially automated, and therefore fall outside these protections, to publish information and analysis on how they prevent overreliance.

These changes to Clause 13 are part of the positioning of the UK as the world leader in trustworthy AI systems. Public procurement has typically been a lever to drive new ethical markets; an example is how the UK’s BREEAM green building standard has been built into tenders and spurred a market and national niche in environmentally friendly construction. Explainability in AI is a cutting-edge research field, with recent research calls on it from the EPSRC and DARPA, and it was discussed keenly in the recent Hall and Pesenti government AI review, as well as in the Royal Society/British Academy data stewardship report. Explaining what a complex system did in relation to a particular data subject—in other words, explaining outcomes rather than trying to explain the whole of an algorithm in one go—is perfectly doable. We want economies around the world to turn to us when they want a reliable, trustworthy and ethical approach to assessing automated systems.

These grey areas are real and the outcome in the courts is far from settled. As this Bill passes through our hands, we have a real opportunity to give its protections certainty and renewed vigour.

Amendment 41 attempts to incorporate the end of recital 71 into the Bill. I am sure that the noble Lord is highly familiar with this recital, which deals with automated decision-making. It begins:

“The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing”,

and so on and so forth. The final sentence of the recital says:

“Such measure should not concern a child”.

What is the Government’s answer to the lack of anything in the Bill that reflects that sentence as regards automated decision-making? Clause 13 as amended is intended to fill that gap and I very much hope that the Minister will see it as an attractive and practical way of improving the Bill. I beg to move.