Visa Processing Algorithms

Part of the debate – in the House of Commons at 5:10 pm on 19th June 2019.


Chi Onwurah, Shadow Minister (Department for Business, Energy and Industrial Strategy) (Industrial Strategy), 5:10 pm, 19th June 2019

I really thank my hon. Friend for that intervention, because he is of course absolutely right. He raises a heartbreaking case, but he also hints at the fact that, as a consequence, we as MPs are seeing more casework and having a higher case load. That in itself is putting more pressure on the Home Office because we raise cases and ask for them to be reviewed. It takes longer to effect a decision—a final, just decision—and the people concerned have their lives disrupted, in some cases heartbreakingly so, for a longer period of time.

I want to mention the case of a United Kingdom mayor who was denied the presence of their sister at their inauguration, presumably because they were not considered to be a credible sponsor. Finally, among these national cases, Oxfam has highlighted that, because of visa rejections, only one of the 25 individuals from Africa expected to attend a blog-writing training course at the recent London School of Economics Africa summit was able to do so. Non-governmental organisations and others are trying to support in-country skills development, but it is often very difficult to bring people, particularly young people, working for Oxfam or other NGOs to this country for training.

The Minister should know that her Department is notorious for a culture of disbelief, with an assumption that visitors are not genuine. I will give one example from my own constituency. Last year, the University of Nigeria Alumni Association UK branch chose to hold its annual meeting in Newcastle—by the way, it is a fantastic location to hold all such events—but a significant number were initially denied visas on the grounds that they might not return to Nigeria. These were all businessmen and women, academics or Government workers with family in Nigeria. After my intervention, their visas were approved, but that should not have been necessary.

Entry clearance officers are set independent targets of up to 60 case decisions each day, and our all-party group investigation found that this impacted on the quality and fairness of decision making. Home Office statistics from September 2018 show that African applicants are refused UK visas at twice the rate of those from any other part of the world. When visitors are denied entry arbitrarily, the UK’s relationship and standing with those countries is damaged, as has been mentioned, and we lose culturally and economically. International conferences and events, new businesses, trading opportunities and cultural collaborations are being lost to the UK because of the failings of the Home Office.

The last report on visa services from the independent chief inspector in 2014 found that over 40% of refusal notices were

“not balanced, and failed to show that consideration had been given to both positive and negative evidence.”

Last month, it was announced that the six-month target for deciding straightforward asylum cases is being abandoned. This was a target that, as the Home Office’s own statistics show, was repeatedly missed. In 2017, one in four asylum cases was not decided within six months, while immigration delays have doubled over the past year, despite a drop in cases. As a constituency MP, I know from personal experience about the significantly longer delays to visa applications.

This is a failing system, but it is run for profit. Applicants are routinely charged up to 10 times the actual administrative costs of processing applications. For example, applying for indefinite leave to remain in the UK costs £2,389, while the true cost is just £243.

Fees for refused visas are not refunded and there is no right of appeal for the refusal of a visit visa application. Within the process, even communication with the Home Office is monetised: people are charged £5.48 to email the Home Office from abroad and non-UK-based phone calls cost £1.37 per minute.

The fact that the Department has reputedly lost 25% of its headcount under the austerity agenda must be part of the reason for these failures, but there is also the culture of disbelief, which I mentioned earlier, the hostile environment, of which we have heard much, and the impact of Brexit, because what staff do remain are being moved on to Brexit preparation. It is in this environment that the Home Office decided that the answer was an algorithm.

According to the Home Office, the use of algorithms in visa processing is part of an efficiency drive. They are being used not to improve the quality of decision making, but to make up for a lack of resources and/or to drive further resources out. As an engineer, I often say that whatever the problem is, the answer is never technology—at least, not on its own. I will say categorically that algorithms should not be used for short-term cost savings at this stage in their evolution as a technology.

Let me define what we are talking about. An algorithm is a set of instructions, acting on data entered in a particular format, to make a decision. If the algorithm learns from performing those instructions how to make better decisions, that might be called machine learning. If it both learns from performing its instructions and can act upon data in different and unpredictable formats, it might be considered to be artificial intelligence—might, but not necessarily is, because not everything that is artificial is intelligent.
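The distinction drawn here can be sketched in a few lines of code. This is a purely illustrative toy, not the Home Office system: the fixed rule, the threshold of £1,000 and the field names are all invented for the example.

```python
# Hypothetical sketch of the distinction above (invented rules and field names).

def fixed_rule_decision(application: dict) -> bool:
    """A plain algorithm: a fixed set of instructions acting on data
    entered in a particular format, producing a decision."""
    return application["sponsor_verified"] and application["funds_gbp"] >= 1000


class LearningDecider:
    """A minimal 'learning' decider: it adjusts its own threshold from
    feedback on past decisions, which is the sense in which machine
    learning improves from performing its instructions."""

    def __init__(self, threshold: float = 1000.0):
        self.threshold = threshold

    def decide(self, application: dict) -> bool:
        return application["funds_gbp"] >= self.threshold

    def learn(self, application: dict, correct_decision: bool) -> None:
        # If feedback says we wrongly refused, lower the threshold to
        # cover this case next time.
        if correct_decision and not self.decide(application):
            self.threshold = application["funds_gbp"]


app = {"sponsor_verified": True, "funds_gbp": 800.0}
print(fixed_rule_decision(app))   # the fixed rule refuses: 800 < 1000

decider = LearningDecider()
decider.learn(app, correct_decision=True)  # feedback: this should have been granted
print(decider.decide(app))        # after learning, the same case is now accepted
```

Neither of these is artificial intelligence in the sense described above: both act only on data in one predictable format.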

Critically, algorithms are only as good as their design and the data they are trained on. They are designed by software engineers, who tend to come from a very narrow demographic: few are women, come from ethnic minorities, or are working-class. The design will necessarily reflect the limits of their backgrounds, unless a significant effort is made for it not to.

There are many examples of problems with the training data for algorithms, from the facial recognition algorithm that identified black people as gorillas because only white people had been used to train it, to the match-making or romantic algorithm that optimised for short-term relationships because the training data showed that they generated more income, due to the repeat business. Unless algorithms are diverse by design, they will be unequal by outcome.

Algorithms are now an integral part of our lives, but without any appropriate regulation. They drive Facebook’s newsfeeds and Google’s search results; they tell us what to buy and when to go to sleep; they tell us whom to vote for and whom to hire. However, there is no regulatory framework to protect us from their bias. Companies argue that the results of their algorithms are a mirror to society and are not their responsibility; they say that the outcomes of algorithms are already regulated because the companies that use them have to meet employment and competition law. But a mirror is not the right metaphor; by automating decision making, algorithms industrialise bias. Companies and especially Governments should not rely on algorithms alone to deliver results.

I hope that the Government are not accepting algorithms in their decision making processes without introducing further regulation. The Home Office has denied that the algorithm for visa streaming takes account of race, but it refuses to tell us anything about the algorithm itself. Home Office guidance on the “genuine visitor” test allows consideration of the political, economic and security situation of the country of application, or nationality, as well as statistics on immigration compliance from those in the same geographical region, which can often be proxies for race.

When I announced this debate, many organisations and individuals sent me examples of how Home Office algorithmic decision making had effectively discriminated against them. Concerns were also raised about other automated decision making in the Home Office—for example, the residency checks in the EU settlement scheme, which uses a person’s Her Majesty’s Revenue and Customs and Department for Work and Pensions footprints to establish residency, but does not consider benefits such as working tax credit, child tax credit or child benefit. All those benefits are more likely to be received by women. Therefore, the automated residency check is likely to discriminate against women, particularly vulnerable women without physical documents.
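The failure mode described here can be sketched as follows. The record names and qualifying list are invented for illustration; this is not the actual EU settlement scheme logic, only the shape of the problem.

```python
# Hypothetical sketch of the residency-check failure mode (record names invented):
# a check that counts only certain HMRC/DWP records finds no "footprint" for
# someone whose only state interaction is, say, child benefit.

QUALIFYING_RECORDS = {"paye_employment", "self_assessment", "jobseekers_allowance"}
# Note what is absent from the qualifying set:
# "child_benefit", "child_tax_credit", "working_tax_credit"

def automated_residency_check(records: set) -> bool:
    """Pass if any qualifying record exists; anyone else must prove
    residency some other way, typically with physical documents."""
    return bool(records & QUALIFYING_RECORDS)

employed_applicant = {"paye_employment"}
carer_applicant = {"child_benefit"}  # statistically more likely to be a woman

print(automated_residency_check(employed_applicant))  # True: footprint found
print(automated_residency_check(carer_applicant))     # False: no footprint found
```

Both applicants may have identical residency histories; the automated check distinguishes them only by which benefits they received, and that choice of data is where the discrimination enters.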

We do not know whether the visa processing algorithm makes similar choices, whether it was written by the same people, or indeed whether it originated in the private sector or the public sector. The Home Office says that algorithmic decisions are still checked by people—a requirement of GDPR, the general data protection regulation—but not how much time is allowed for those checks, and has admitted that the purpose of the algorithm in the first place was to reduce costs.

Unfortunately, the Government’s track record on digital and data does not give confidence. When the Tories and Liberal Democrats entered Government in 2010, big data was a new phenomenon. Now it drives the business model of the internet, but the Government have done nothing to protect citizens beyond implementing mandatory European Union legislation—GDPR. They are happy to preside over a state of utter chaos when it comes to the ownership and control of data, and allow a free-for-all to develop in artificial intelligence, algorithms, the internet of things and blockchain. In 2016, for example, the DWP secretly trialled the payment of benefits using shared ledger or blockchain technology. Despite the privacy implications of using a private company to put sensitive, highly personal data on to a shared ledger that could not be changed or deleted, we still do not know what the process was for approving the use of this technology or the outcome of the trial. The Government should have learned from the debacle that the misuse of technology damages public trust for a long time.

I like to consider myself a champion of the power of shared data. I believe that better use of data could not only reduce the costs of public services, saving money to be better used elsewhere, but improve those services, making them more individual, more personal, faster and more efficient. However, I am not the only one to raise concerns. Algorithmic use in the public sector was recently debated in the Lords, where it was estimated that some 53 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment—as in this case—and assistance in decision making. Now that we find it being used in the Home Office, it is essential that the Government—I am glad to see the Minister here today—answer the following questions. I have, I think, 11 questions for the Minister to answer.

Will the Minister say whether this algorithmic visa processing is part of machine learning or artificial intelligence? Is the algorithm diverse by design? Will the Minister say whether the algorithm makes choices about what data is to be considered, as with the settled status check example? Who was responsible for the creation of the algorithm? Was it the Home Office, the Government Digital Service or a private sector company? What rights do visa applicants have with regard to this algorithm and their own data? Do they know it is being used in this way? How long is their data being stored for and what security is it subject to?

What advice was taken in making the decision to introduce this algorithm? Did the Government consult their Centre for Data Ethics and Innovation, the Department for Digital, Culture, Media and Sport or the Cabinet Office? Does the duty of care in the online harms White Paper from DCMS apply to the Home Office in this case? What redress or liability do applicants have for decisions that are made in error or are subject to bias by the algorithm? What future algorithms are planned to be introduced into visa processing or elsewhere? Finally, why is it that journalists—in this case, from the Financial Times, as well as Carole Cadwalladr—seem to have identified and brought attention to the misuse of algorithms, while neither the Government nor any of their regulators supposedly interested in this area, such as Ofcom or the Information Commissioner’s Office, have done so? Will the Minister say which regulator she feels is responsible for this area?

A Labour Government would work with industry, local authorities, businesses, citizen groups and other stakeholders to introduce a digital Bill of Rights. This would give people ownership and control over their data and how it is used, helping to break the power of the monopoly tech giants, while ensuring a right to fair and equal treatment by algorithms, algorithmic justice and openness. We need to be able to hold companies and Government accountable for the consequences of the algorithms, artificial intelligence and machine learning that drive their profits or cost-cutting. A Labour Government would protect us not just from private companies, but from the cost-cutting of this Government, who I suspect either do not understand the consequences of their technology choices or do not care.

I hope that the Minister can reassure me and answer my questions and that she can demonstrate that the use of algorithms in the Home Office and elsewhere across Government will be subject to proper transparency, scrutiny and regulation in future.