Application of Equality Act (Services and public functions)

Data Protection Bill [Lords] – in a Public Bill Committee on 22nd March 2018.


“(1) Part 3 (Services and public functions) of the Equality Act 2010 (‘the Equality Act’) shall apply to the processing of personal data by an algorithm or automated system in making or supporting a decision under this section.

(2) A ‘decision’ in this section means a decision or any part of a decision that engages a data subject (D)’s rights, freedoms or legitimate interests concerning—

(a) the provision of services to the public and

(b) the exercise of public functions by a service-provider.

(3) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Act, the Equality Act or in any other primary or secondary legislation relating to D’s personal data, employment, social security or social protection.”—(Liam Byrne.)

This new clause would apply Part 3 of the Equality Act 2010 to the processing of personal data by an algorithm or automated system in making or supporting a decision under this new clause.

Brought up, and read the First time.

Gary Streeter (Conservative, South West Devon)

With this it will be convenient to discuss the following:

New clause 8—Application of the Equality Act (Employment)—

“(1) Part 5 (Employment) of the Equality Act 2010 (‘the Equality Act’) shall apply to the processing of personal data by an algorithm or automated system in making or supporting a decision under this section.

(2) A ‘decision’ in this section means a decision that engages a data subject (D)’s rights, freedoms or legitimate interests concerning—

(a) recruitment,

(b) the terms and conditions of employment,

(c) access to opportunities for promotion, transfer or training, and

(d) dismissal.

(3) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Act, the Equality Act or in any other primary or secondary legislation relating to D’s personal data, employment, social security or social protection.”

This new clause would apply Part 5 of the Equality Act 2010 to the processing of personal data by an algorithm or automated system in making or supporting a decision under this new clause.

New clause 9—Right to algorithmic fairness at work—

“(1) A person (“P”) has the right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision under this section.

(2) A “decision” in this section means a decision in which an algorithm or automated system is deployed to support or make a decision or any part of that decision that engages P’s rights, freedoms or legitimate interests concerning—

(a) recruitment,

(b) the terms and conditions of employment,

(c) access to opportunities for promotion, transfer or training, and

(d) dismissal.

(3) “Fair treatment” in this section means equal treatment between P and other data subjects relevant to the decision made under subsection (2) insofar as that is reasonably practicable with regard to the purpose for which the algorithm or automated system was designed or applied.

(4) In determining whether treatment of P is “fair” under this section the following factors shall be taken into account—

(a) the application of rights and duties under equality and other legislation in relation to any protected characteristics or trade union membership and activities,

(b) whether the algorithm or automated system has been designed and trained with due regard to equality of outcome,

(c) the extent to which the decision is automated,

(d) the factors and weighting of factors taken into account in determining the decision,

(e) whether consent has been sought for the obtaining, recording, using or disclosing of any personal data including data gathered through the use of social media, and

(f) any guidance issued by the Centre for Data Ethics and Innovation.

(5) “Protected characteristics” in this section shall be the protected characteristics defined in section 4 of the Equality Act 2010.”

This new clause would create a right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision regarding recruitment, terms and conditions of employment, access to opportunities for promotion etc. and dismissal.

New clause 10—Employer’s duty to undertake an Algorithmic Impact Assessment—

“(1) An employer, prospective employer or agent must undertake an assessment to review the impact of deploying the algorithm or automated system in making a decision to which subsection (1) of section [Application of the Equality Act (Employment)] applies (an ‘Algorithmic Impact Assessment’).

(2) The assessment undertaken under subsection (1) must—

(a) identify the purpose for which the algorithm or automated system was designed or applied,

(b) test for potential discrimination or other bias by the algorithm or automated system,

(c) consider measures to advance fair treatment of data subjects relevant to the decision, and

(d) take into account any tools for Algorithmic Impact Assessment published by the Centre for Data Ethics and Innovation.”

This new clause would impose a duty upon employers to undertake an Algorithmic Impact Assessment.

New clause 11—Right to an explanation—

“(1) A person (“P”) may request and is entitled to be provided with a written statement from an employer, prospective employer or agent giving the following particulars of a decision to which subsection (1) of section [Right to algorithmic fairness at work] applies—

(a) any procedure for determining the decision,

(b) the purpose and remit of the algorithm or automated system deployed in making the decision,

(c) the criteria or other meaningful information about the logic involved in determining the decision, and

(d) the factors and weighting of factors taken into account in determining the decision.

(2) P is entitled to a written statement within 14 days of a request made under subsection (1).

(3) A complaint may be presented to an employment tribunal on the grounds that—

(a) a person or body has unreasonably failed to provide a written statement under subsection (1),

(b) the particulars given in purported compliance with subsection (1) are inadequate,

(c) an employer or agent has failed to comply with its duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment],

(d) P has not been treated fairly under section [Right to algorithmic fairness at work].

(4) Where an employment tribunal finds a complaint under this section well-founded the tribunal may—

(a) make a declaration giving particulars of unfair treatment,

(b) make a declaration giving particulars of any failure to comply with duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],

(c) make a declaration as to the measures that ought to have been undertaken or considered so as to comply with the requirements of subsection (1) or section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],

(d) make an award of compensation as may be just and equitable.

(5) An employment tribunal shall not consider a complaint presented under subsection (3) in a case where the decision to which the reference relates was made—

(a) before the end of the period of 3 months, or

(b) within such further period as the employment tribunal considers reasonable in a case where it is satisfied that it was not reasonably practicable for the application to be made before the end of that period of 3 months.

(6) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Bill or any other primary or secondary legislation relating to P’s personal data, employment, social security or social protection.”

This new clause would create a right to an explanation in writing from an employer, prospective employer or agent giving the particulars of a decision to which the Right to algorithmic fairness at work applies.

Liam Byrne (Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy))

New clauses 7 to 11 touch on the question of how we ensure a degree of justice in decisions that are taken about us automatically. The growth in automated decision making has been exponential, and that growth carries risks. We need to ensure that the law is modernised to provide new protections and safeguards for our constituents in this new world.

I should say at the outset that this group of new clauses is rooted in the excellent work of the Future of Work commission, which produced a long, thought-provoking report. The Committee will be frustrated to hear that I am not going to read through that this afternoon, but, none the less, I want to tease out a couple of points.

The basket of new clauses that we have proposed is well thought through and has been carefully crafted. I put on record my thanks to Helen Mountfield QC, an expert in equality law, and to Mike Osborne, professor of machine learning. Along with Ben Jaffey QC, a specialist in data law, they have been looking at some of the implications of automated decision making, which were discussed at length by the Future of Work commission.

Central to the new clauses is a concern that unaccountable and highly sophisticated automated or semi-automated systems are now making decisions that bear on fundamental elements of people’s work, including recruitment, pay and discipline. Just today, I was hearing about the work practices at the large Amazon warehouse up in Dundee, I think, where there is in effect digital casualisation. Employees are not put on zero-hours contracts, but they are put on four-hour contracts. They are guided around this gigantic warehouse by some kind of satnav technology on a mobile phone, but the device that guides them also tracks how long it takes them to put together a basket.

That information is then arranged in a nice league table of which employees are the fastest and which are the slowest, and decisions are then taken about who gets an extension to their contracted hours each week and who does not. That is a pretty automated kind of decision. My hon. Friend Clive Efford was describing to me the phenomenon of the butty man—the individual who decided who on a particular day got to work on the docks or on the construction site. In the pub at the end of the week, he divvied up the earnings and decided who got what, and who got work the following week. That kind of casualisation is now being reinvented in a digital era and is something that all of us ought to be incredibly concerned about.

These algorithms are, in the jargon, socio-technical: what results is a mixture of conventional software, human judgment and statistical models. The issue is that very often the decisions that are made are not transparent, and are certainly not open to challenge. They are now quite commonly used by employers and prospective employers, and their agents, who are able to analyse very large datasets and can then deploy artificial intelligence and machine learning to make inferences about a person. Quite apart from the ongoing debates about how we define a worker and how we define employment—the subject of an excellent report by my old friend Matthew Taylor, now at the RSA—there are real questions about how we introduce new safeguards for workers in this country.

I want to highlight the challenge with a couple of examples. Recent evidence has revealed how many recruiters use—surprise, surprise—Facebook to seek candidates in ways that routinely discriminate against older workers by targeting advertisements for jobs in a particular way. Slater and Gordon, which is a firm of excellent employment lawyers, showed that about one in five company executives admit to unlawful discrimination when advertising jobs online. The challenge is that when jobs are advertised in a targeted way, by definition they are not open to applicants from all walks of life, because lots of people just will not see the ads.

Women and those over the age of 50 are now most likely to be prevented from seeing an advert. Some 32% of company executives say that they have discriminated against those who are over 50, and a quarter have discriminated in that way against women. Nearly two thirds of executives with access to a profiling tool have said that they use it to actively seek out people based on criteria as diverse as age, gender and race. If we are to deliver a truly meritocratic labour market, where the rights of us all to shoot for jobs and to develop our skills and capabilities are protected, some of those practices have to stop. If we are to stop them, the law needs to change, and it needs to change now.

This battery of new clauses sets out to do five basic things. First, it sets out some enhancements and refinements to the Equality Act 2010, in a way that ensures that protection from discrimination is applied to new forms of decision making, especially when those decisions engage core rights, such as rights on recruitment, terms of work, or dismissal. Secondly, there is a new right to algorithmic fairness at work, to ensure equal treatment. Thirdly, there is the right to an explanation when a decision is taken in a way that affects core elements of work life, such as a decision to hire, fire or suspend someone. Fourthly, there is a new duty for employers to undertake an algorithmic impact assessment, and fifthly, there are new, realistic ways for individuals to enforce those rights in an employment tribunal. It is quite a broad-ranging set of reforms to a number of different parts of legislation.

Daniel Zeichner (Labour, Cambridge)

My right hon. Friend is making a powerful case. Does he agree that this is exactly the kind of thing we ought to have been discussing at the outset of the Bill? The elephant in the room is that the Bill seems to me, overall, to be looking backwards rather than forwards. It was developed to implement the general data protection regulation, which has been discussed over many years. We are seeing this week just how fast-moving the world is. These are the kind of ideas that should have been driving the Bill in the first place.

Liam Byrne (Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy))

Exactly. My hon. Friend makes such a good point. The challenge with the way that Her Majesty’s Government have approached the Bill is that they have taken a particular problem—that we are heading for the exit door of Europe, so we had better ensure that we get a data-sharing agreement in place, or it will be curtains for Britain’s services exports—and said, “We’d better find a way of incorporating the GDPR into British law as quickly as possible.” They should have thought imaginatively and creatively about how we strengthen our digital economy, and how we protect freedoms, liberties and protections in this new world, going back to first principles and thinking through the consequences. What we have is not quite a cut-and-paste job—I will not describe it in that way—but neither is it the sophisticated exercise in public law making that my hon. Friend describes as more virtuous.

I want to give the Committee a couple of examples of why this is so serious, as sometimes a scenario or two can help. Let us take an individual whom we will call “Mr A”. He is a 56-year-old man applying for website development roles. Typically, if someone is applying for jobs in a particular sector, those jobs will be advertised online. In fact, many such roles are advertised only online, and they target users only in the age profile 26 to 35, through digital advertising or social media networks, whether that is Facebook, LinkedIn, or others. Because Mr A is not in the particular age bracket being targeted, he never sees the ad, as it will never pop up on his news feed, or on digital advertising aimed at him. He therefore does not apply for the role and does not know he is being excluded from applying for the role, all as a consequence of him being the wrong age. Since he is excluded from opportunities because of his age, he finds it much harder to find a role.

The Equality Act, which was passed with cross-party consensus, prohibits less favourable treatment because of age—direct discrimination—including in relation to recruitment practices, and protects individuals based on their age. The Act sets out a number of remedies for individuals who have been discriminated against in that way, but it is not clear how the Bill proposes to correct that sin. Injustices in the labour market are multiplying, and there is a cross-party consensus for a stronger defence of workers. In fact, the Member of Parliament for the town where I grew up, Robert Halfon, has led the argument in favour of the Conservative party rechristening itself the Workers’ party, and the Labour party was founded on a defence of labour rights, so I do not think this is an especially contentious matter. There is cross-party consensus about the need to stand up for workers’ rights, particularly when wages are stagnating so dramatically.

We are therefore not divided on a point of principle, but the Opposition have an ambition to do something about this growing problem. The Bill could be corrected in a way that made a significant difference. There is not an argument about the rights that are already in place, because they are enshrined in the Equality Act, with which Members on both sides of the House agree. The challenge is that the law as it stands is deficient and cannot be applied readily or easily to automated decision making.

Gareth Snell (Labour/Co-operative, Stoke-on-Trent Central)

My right hon. Friend is making a powerful case about the importance of the Equality Act in respect of the Bill, but may I offer him another example? He mentioned the Amazon warehouse where people are tracked at work. We know that agencies compile lists of their more productive workers, whom they then use in other work, and of their less productive workers. That seems like a form of digital blacklisting, and we all know about the problems with blacklisting in the construction industry in the 1980s. I suggest that the new clauses are a great way of combating that new digital blacklisting.

Liam Byrne (Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy))

My hon. Friend gives a brilliant example. The point is that employment agencies play an incredibly important role in providing workers for particular sectors of the economy, from hotels to logistics, distribution and construction. The challenge is that the areas of the economy that have created the most jobs in the 10 years since the financial crash are those where terms and conditions are poorest, casualisation is highest and wages are lowest—and they are the areas where productivity is poorest, too. The Government could take a different kind of labour market approach that enhanced productivity and wages, and shut down some of the bad practices and casualisation that are creating a problem.

As it happens, the Government have signed up to some pretty big ambitions in that area. Countries around the world recently signed up to the UN sustainable development goals. Goal 8 commits the Government to promoting decent work and inclusive economic growth, and goal 10 commits them to reducing inequality. However, when I asked the Prime Minister what she was doing about that, my question was referred to Her Majesty’s Treasury and the answer that came back from the Chancellor was, “We believe in raising productivity and growth.” The way to raise productivity and growth is to ensure that there are good practices in the labour market, because it is poor labour market productivity that is holding us back as a country.

If digital blacklisting or casualisation were to spread throughout the labour market in the sectors that happen to be creating jobs, there would be no increase in productivity and the Government would be embarked on a self-defeating economic policy. Although these new clauses may sound technical, they have a bearing on a much more important plank of the Government’s economic development strategy.

Our arguments are based on principles that have widespread support on both sides of the House, and they are economically wise. The costs of the new clauses will be more than outweighed by the benefits they will deliver. I commend them to the Minister and I hope she will take them on board.

Darren Jones (Labour, Bristol North West) 2:15 pm, 22nd March 2018

I want to add some further comments in support of the new clauses.

The Science and Technology Committee, one of the two Committees that I sit on, has had a detailed debate on algorithmic fairness. It is important to understand what the new clauses seek to do. There is a nervousness about regulating algorithms or making them completely transparent, because there are commercial sensitivities in the code and in whether and how it is published.

These new clauses seek to put the obligation on to the human beings who produce the algorithms to think about things such as equalities law, to ensure that we do not hardcode biases into them, as my hon. Friend the Member for Cambridge said on Second Reading. It is important to understand how the new clauses apply to the inputs—what happens in the black box of the algorithm—and the outputs. The inputs to an algorithm are the rules that a human codes and sets, and the data they put into it for it to make a decision.

The new clauses seek to say that the human must have a consistent and legal obligation to understand the equalities impacts of their coding and data entry into the black box of the algorithm to avoid biases coming out at the other end. As algorithms are increasingly used, that is an important technical distinction to understand, and it is why the new clauses are very sensible. On that basis, I hope the Government will support them.
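To make the inputs-and-outputs distinction concrete, here is a minimal sketch of the kind of output-side check that an Algorithmic Impact Assessment under new clause 10 might include. It is illustrative only: the data, age bands and function names are invented, and the four-fifths rule used to flag a disparity is one common heuristic for spotting potential disparate impact, not a test prescribed by the new clauses or the Equality Act.

```python
from collections import Counter

def selection_rates(decisions):
    """Return the rate of favourable outcomes per group.

    `decisions` is an iterable of (group, outcome) pairs, where outcome is
    True if the automated system made a favourable decision (e.g. shortlisted
    the candidate).
    """
    totals, positives = Counter(), Counter()
    for group, outcome in decisions:
        totals[group] += 1
        if outcome:
            positives[group] += 1
    return {group: positives[group] / totals[group] for group in totals}

def flag_disparate_impact(decisions, threshold=0.8):
    """Flag any group whose selection rate falls below `threshold` times the
    best-treated group's rate (the 'four-fifths' heuristic)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {group: rate for group, rate in rates.items()
            if rate < threshold * best}

# Hypothetical shortlisting outcomes by age band: 80 of 100 younger
# applicants are shortlisted against 30 of 100 older applicants.
decisions = ([("26-35", True)] * 80 + [("26-35", False)] * 20
             + [("50+", True)] * 30 + [("50+", False)] * 70)

print(flag_disparate_impact(decisions))  # {'50+': 0.3}, since 0.3 < 0.8 * 0.8
```

In practice such a check would be run on the system's real decision records, alongside the input-side review of the rules and training data described above.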

Gary Streeter (Conservative, South West Devon)

I call the Minister, whose birthday it is today.

Victoria Atkins (The Parliamentary Under-Secretary of State for the Home Department, Minister for Women)

Thank you, Mr Streeter, and what a wonderful birthday present it is to be serving on the Committee.

It is a joy, actually, to be able to agree with the Opposition on the principle that equality applies not only to decisions made by human beings or with human input, but to decisions made solely by computers and algorithms. On that, we are very much agreed. The reason that we do not support the new clauses is that we believe that the Equality Act already protects workers against direct or indirect discrimination by computer or algorithm-based decisions. As the right hon. Member for Birmingham, Hodge Hill rightly said, the Act was passed with cross-party consensus.

The Act is clear that in all cases, the employer is liable for the outcome of any of their actions, or those of their managers or supervisors, or those that are the result of a computer, algorithm or mechanical process. If, during a recruitment process, applications from people with names that suggest a particular ethnicity were rejected for that reason by an algorithm, the employer would be liable for race discrimination, whether or not they designed the algorithm with that intention in mind.

The right hon. Gentleman placed a great deal of emphasis on advertising and, again, we share his concerns that employers could seek to treat potential employees unfairly and unequally. The Equality and Human Rights Commission publishes guidance for employers to ensure that there is no discriminatory conduct and that fair and open access to employment opportunities is made clear in the way that employers advertise posts.

The same principle applies in the provision of services. An automated process that intentionally or unintentionally denies a service to someone because of a protected characteristic will lay the service provider open to a claim under the Act, subject to any exceptions.

Liam Byrne (Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy))

I am grateful to the Minister for giving way, not least because it gives me the opportunity to wish her a happy birthday. Could she remind the Committee how many prosecutions there have been for discriminatory advertising because employers chose to target their adverts?

Victoria Atkins (The Parliamentary Under-Secretary of State for the Home Department, Minister for Women)

If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.

The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.

When automated decision-making is engaged there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject of such processing. Once a significant decision has been made, that must be communicated to the data subject, and they must be given the opportunity to object to that decision so that it is re-taken by a human being.

We would say that the existing equality law and data protection law are remarkably technologically agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.

Daniel Zeichner (Labour, Cambridge)

Going back to the point raised by my right hon. Friend, I suspect that the number of cases will prove to be relatively low. The logic of what the Minister is saying would suggest that there is no algorithmic unfairness going on out there. I do not think that that is the case. What does she think?

Victoria Atkins (The Parliamentary Under-Secretary of State for the Home Department, Minister for Women)

I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.

We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing 250 or more employees must abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.

We are concerned that new clauses 7 and 8 are already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to be clear to employers that there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.

Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.

Liam Byrne (Shadow Minister (Digital, Culture, Media and Sport) (Digital Economy))

I think it was in “Candide” that Voltaire introduced us to the word “Panglossian”, and we have heard a rather elegant and Panglossian description of a perfect world in which all is fine in the labour market. I am much more sceptical than the Minister. I do not think the current law is sufficiently sharp, and I am concerned that the consequence of that will be injustice for our constituents.

The Minister raised a line of argument that it is important for us to consider. The ultimate test of whether the law is good enough must be what is actually happening out there in the labour market. I do not think it is good enough; she thinks it is fine. On the nub of the argument, a few more facts might be needed on both sides, so we reserve the right to come back to the issue on Report. This has been a useful debate. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 13