Autonomous Weapons Systems - Question

– in the House of Lords at 2:55 pm on 1 November 2021.


Lord Clement-Jones Liberal Democrat Lords Spokesperson (Digital) 2:55, 1 November 2021

To ask Her Majesty’s Government what assessment they have made of the calls made at the August meeting of the Group of Governmental Experts on Lethal Autonomous Weapons Systems at the Convention on Certain Conventional Weapons for a legally-binding instrument, including both prohibitions and positive obligations, to regulate autonomous weapons systems.

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

My Lords, the UK is an active participant in United Nations discussions on lethal autonomous weapons systems, working with partners to build norms to ensure safe and responsible use of autonomy. The UK and our partners are unconvinced by the calls for a further binding instrument. International humanitarian law provides a robust principle-based framework for the regulation of weapons deployment and use. A focus on effects is most effective in dealing with complex systems in conflict.

Lord Clement-Jones Liberal Democrat Lords Spokesperson (Digital)

My Lords, the Minister’s reply is pretty disappointing. It puts the Government, despite statements in the integrated review, at odds with nearly 70 countries and thousands of scientists in their unwillingness to rule out lethal autonomous weapons. Will the Minister commit to rethinking government policy in terms of giving our representatives at the next meeting of the Convention on Certain Conventional Weapons on 2 December a mandate to go ahead with negotiations for a legally binding instrument, which, after all, has been called for by the UN Secretary-General?

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

I am sorry that the noble Lord is disappointed, because I know the extent of his interest in this issue. I have tried to facilitate engagement with the department to enable him to better understand what the department is doing and why we take the views that we do. He will be aware that international consensus on a definition of LAWS has so far proved impossible. At this time, the UK believes that it is actually more important to understand the characteristics of systems with autonomy that would or would not enable them to be used in compliance with IHL, using this to set out potential norms of use and positive obligations.

Lord West of Spithead Labour

My Lords, nations are sleepwalking to disaster. Engineers are already making autonomous drones the size of my hand, with cameras, that act completely autonomously. They can, for example, have facial recognition and carry a small shaped charge, and will kill the person that the facial recognition identifies. Once you release them, you release them and off they go. The firms producing these are talking in terms of, “Yes, if we had several thousands of these, gosh how wonderful, because we could kill a great chunk of a city without damaging it at all and get rid of the people there.” I find this quite horrifying. Also, these things are AI: they learn; therefore, they will learn how to kill even more than they have been programmed to. This is extremely dangerous. Do the Government agree completely that, wherever there is a kill-chain that ends up with a dead human being, there should be a human somewhere in that kill-chain to make that decision, rather than a robot?

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

All weapon systems, whether with autonomous functions or not, must fully comply with the principle-based international humanitarian law framework. A robust application of that framework, I would suggest, is the best way of ensuring the lawful and ethical use of force in all circumstances. That applies to all states that might be developing autonomy in their weapons systems.

Lord Lancaster of Kimbolton Conservative

Can my noble friend the Minister confirm that the UK has agreed not to develop autonomous weapons? Of course, we run the risk sometimes of confusing autonomous weapons with automated weapons, where there will be a human being in that decision-making cycle. While some are concerned about the UK’s definition of autonomous weapons, I think it is quite far-sighted because it will take into account future developments. Perhaps my noble friend could offer some clarity as to where in that chain, from targeting to operating that weapon, there will be human intervention.

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

I thank my noble friend for acknowledging the difficulties that accompany definitions and prescriptive attempts to define. UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement. This is an important area; it is clearly an area of evolving policy and it is an area where we are absolutely clear that the best way forward is to continue our international engagement with the group of governmental experts.

Lord Coaker Shadow Spokesperson (Defence), Shadow Spokesperson (Home Affairs), Opposition Whip (Lords)

Artificial intelligence is clearly an increasing part of the modern way of warfare but, as we have just heard from the noble Lord, Lord Lancaster, and my noble friend Lord West, it brings with it enormous moral challenges. I think what the House wants is for the Minister to say unequivocally, and as a matter of principle, that there will always be human oversight when it comes to the use of artificial intelligence; in particular, that human oversight is involved whenever there is any decision about the lethal use of force.

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

It is not possible to transfer accountability to a machine. Human responsibility for the use of a system to achieve an effect cannot be removed, irrespective of the level of autonomy in that system or the use of enabling technologies such as AI.

Baroness Smith of Newnham Liberal Democrat Lords Spokesperson (Defence)

My Lords, I have been listening closely to the Minister and I am still not quite sure whether she has said that the Government will unequivocally state that no autonomous drone or other AI could take a life, and that every decision would have to have human engagement. Can she confirm that that is the case? I declare an interest as an officer of the APPG on Drones and Modern Conflict.

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

I simply repeat to the noble Baroness what I said to my noble friend Lord Lancaster: that UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement.

Lord Browne of Ladyton Labour

My Lords, the National AI Strategy was published in September and promises were made that, before the end of the year, “details of the approaches the Ministry of Defence will use when adopting and using AI” will be published. However, on 22 October the AI strategy for NATO, which presumably we agreed to, was published, and it emphasised the principles of lawfulness, responsibility and accountability. Does the Minister not agree that it is now time for the UK to publicly reaffirm our commitment to ethical AI, including international law and human rights, and to tell our public and the international community that our Government are ready, as our Governments always have been, to show global leadership on these issues, particularly on lethal autonomous weapons?

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

The noble Lord is quite correct that the department has said that it will publish a defence AI strategy. When I was told it would be in the autumn, I pointed out that the autumn had pretty well come and gone. I am reassured that significant work has been done on the strategy and we can expect publication in early course. It will set out our vision to be the most effective, efficient, trusted and influential defence organisation of our size, and have principled components to it. I would not wish to pre-empt what the strategy will say, but I would hope that it will serve to answer many of the noble Lord’s questions.

Lord Holmes of Richmond Conservative

My Lords, I declare my technology interests as set out in the register. Does my noble friend agree that, whether in safety or security, the public good or economic growth, the UK has a unique opportunity for the development and deployment of ethical AI? Further, does she agree that we urgently need public debate and engagement if we are to achieve, not just in defence but across all potential applications, optimum outcomes?

Baroness Goldie Lord in Waiting (HM Household) (Whip), The Minister of State, Ministry of Defence

I say to my noble friend, building on what I have already indicated to the Chamber, that AI and autonomy clearly have the potential to transform all aspects of defence, from the back office to the front line. They are a strategic priority for defence and we take that evolution of policy seriously. As I indicated to the noble Lord, Lord Browne of Ladyton, more will be disclosed when we publish our defence AI strategy in early course.