Amendment 29

Armed Forces Bill – Report – in the House of Lords at 8:45 pm on 23 November 2021.

Moved by Lord Browne of Ladyton

29: After Clause 19, insert the following new Clause—
“Use of novel technologies by the UK Armed Forces: review
(1) Within three months of this Act being passed, the Secretary of State must commission a review of the implications of increasing autonomy associated with the use of artificial intelligence and machine learning, including in weapons systems, for legal proceedings against armed forces personnel that arise from military operations, and produce recommendations for favourable legal environments for the United Kingdom’s armed forces, including instilling domestic processes and engaging in the shaping of international agreements and institutions.
(2) The review must consider—
(a) what novel technologies could emerge from the Ministry of Defence and the United Kingdom’s allies, and from the private sector, which could be used in military operations,
(b) how international and domestic legal frameworks governing conflict need to be updated in response to novel technologies,
(c) the United Kingdom’s engagement with current and new routes of international efforts to secure a new legally binding instrument governing the use of novel technologies in conflict, and
(d) what protection and guidance armed forces personnel need to minimise the risk of legal proceedings being brought against them which relate to military operations in response to novel technologies.
(3) Within the period of one year beginning on the day on which the review is commissioned, the Secretary of State must lay a report before Parliament of its findings and recommendations.”
Member’s explanatory statement
The amendment mandates a review within three months of the passing of the Act of implications of increasing autonomy associated with the use of AI and machine learning in weapon systems. The review must focus on the protection and guidance that Armed Forces personnel need to ensure that they comply with the law, including international humanitarian law, and how international and domestic legal frameworks need to be updated.

Lord Browne of Ladyton (Labour)

My Lords, this amendment is also in the names of the noble Lord, Lord Clement-Jones, and the noble and gallant Lords, Lord Houghton of Richmond and Lord Craig of Radley. Once more, I am grateful to them for their continuing support of this amendment.

This is the fourth time this amendment, or a variant of it, has been debated in your Lordships’ House in a relatively short time. This version of it has been shaved. The specific references to overseas deployment and overseas operations have been taken out, but subsection (2)(c), which relates to

“engagement with current and new routes of international efforts to secure a new legally binding instrument governing the use of novel technologies in conflict” has been added to it as part of what the review that it would mandate needs to consider. I will explain that, hopefully in a relatively short period of time.

The amendment mandates, within three months of the passing of the Act,

“a review of the implications of increasing autonomy associated with the use of artificial intelligence and machine learning … in weapons systems”.

The review would be required to focus on the protection and guidance that Armed Forces personnel need to ensure that they comply with the law, including international humanitarian law, and how international and domestic legal frameworks need to be updated.

I have no intention of repeating the points I have previously made. I will just take a few seconds to remind noble Lords of assurances we have been given by the Minister thus far. I draw noble Lords’ attention to cols. GC 437-38 from the Grand Committee. I accept that we have been given some reassurances that the MoD is “alert to” the complex issues that this amendment raises and is working and

“has worked extensively on them over the … last 18 months.”

I also accept that presently the Government’s position is that the Minister

“cannot set out details until these positions have been finalised, but work to set a clear direction of travel for defence AI, underpinned by proper policy and governance frameworks, has reached an advanced stage”— so we are in keen anticipation—and that:

“Key to this is the defence AI strategy”, which, it is hoped, will be published

“in the coming months, along with details of the approaches we will use when adopting and using AI.”—[Official Report, 8/11/21; col. GC 437.]

These are substantially the Minister’s words. I do not intend to read all of this; people can read it for themselves.

Withdrawing the amendment in Committee, I indicated that I expected the issues, which are moving at a dramatic pace, to have moved on by the time we got to Report, and that the probability was that this amendment would come back, because there would be developments. There have been developments. Some of them are that my knowledge of matters relevant to this amendment has increased, but another of them was much more dramatic.

Last Wednesday the “Stories of Our Times” podcast published a podcast—do we publish podcasts?—entitled “The rise of killer robots: The future of modern warfare”. It was hosted by the journalist and podcaster Manveen Rana. The guests were Matthew Campbell, a Sunday Times foreign affairs features editor, General Sir Richard Barrons, former Commander of the UK Joint Forces Command, and General Sir Nick Carter, Chief of the Defence Staff. I think a British academic based in the United States also contributed. If I can find a way to do this—I think it might be possible—it is my intention to ensure that every parliamentarian in this building, here and in the other place, gets access to this podcast because, more dramatically and probably with better effect, it makes the points that I have been trying to get across in the last three attempts and this one, explaining why it is crucial that this work is done.

For reasons I will come to, it is crucially important that this work is done in a context in which responsibility for these weapon systems is taken by elected politicians at the highest level. We in Parliament must know that the politicians who are responsible for decisions about them fully understand the implications of these weapon systems and exactly what their capabilities are and may become. In my view—I cannot overstate this—this is the most important issue for the future defence of our country, future strategic stability, and potentially peace: that those who take responsibility for these weapon systems are civilians, that they are elected, and that they know and understand.

Anyone who listens to this podcast will dramatically realise why, because there are conversations going on among military personnel that, in my view, demand the control of politicians. I have no intention of going through all of this, as it takes 33 minutes; it would have been helpful if we had all heard it before I spoke. That was impossible, though I did share it with a limited number of your Lordships, and I sent it to the Minister. I gather that she was not able to access it, but she would be surprised at some of the vocabulary used in it. In it, there were some sentences deployed which the House must know and understand.

General Sir Richard Barrons says that

“artificial intelligence is potentially more dangerous than nuclear weapons.”

If that is a proper assessment of the potential of these weapon systems, then that is the reason they must come under the control of elected politicians who know and understand their implications. That debate, after nuclear weapons were first used, occupied the United States of America for the best part of a decade. They decided, at the end, to split responsibility for nuclear weapons between the civilian side of the government and the military, but that the civilian side would have ultimate responsibility. That is why we talk of these weapon systems, as we do in the United Kingdom, as being the Prime Minister’s weapons. They are awesome in their abilities. These weapon systems as described in this podcast are equally awesome. Even more worrying, once we make the development from AI to AGI, they potentially have the ability to develop at a speed we cannot physically keep up with.

There is an existing process under the United Nations Convention on Certain Conventional Weapons, known as the GGE process, which seeks to find a way at the UN level of making a regulatory agreement in relation to artificial intelligence and artificial general intelligence-enabled weapon systems. There is a frustration developing in that discussion, and 68 countries are calling for a new legal instrument to regulate lethal autonomous weapons. These sorts of frustrations in that environment are not unusual; they led to the cluster munitions convention and the anti-personnel landmines treaty. The UK was involved in helping to lead both. Developments there also led to the ban treaty relating to nuclear weapons. They are an unhelpful development sometimes but at others the only way that progress can be made in relation to certain weapon systems. If this happens, it is incumbent on us to decide where we will be in this discussion, which is why subsection (2)(c) is added to the amendment in its current form.

Specifically, these states that are pulling away are calling for a combination of both prohibitions and regulations in the form of a legally binding instrument. A smaller subset of them have mentioned their support for the less nuanced approach of a simple ban. Presently, there are no NATO states in this group, but Austria, which is a member of NATO’s Partnership for Peace, is on the list, and Belgium looks to be close to national support for a new international instrument. Its Parliament recommended incorporating a ban into national legislation in 2018, and there is some indication that the defence committee in Belgium is considering making another powerful recommendation on this. Therefore, if this develops, it will impact the alliance that we depend on for our strategic and other defence.

The second point in addition to what I have said before comes from the words that the Minister deployed in Committee. Several times in debates of this nature, parliamentarians, including my noble friend Lord Coaker and the noble Baroness, Lady Smith of Newnham, have asked for an unequivocal statement that there will always be a human in the loop when decisions over the use of lethal force are taken. Responding to these calls most recently, the noble Baroness repeatedly said that the UK does not use systems that employ lethal force without “context-appropriate human involvement”. It has been brought to my attention that this novel formulation offers less assurance over the UK’s possible future use of these weapons than the UK’s previous position, which was that Britain does not possess fully autonomous weapons systems and has no intention of developing them.

I have a letter written on 8 December 2017 by the Foreign and Commonwealth Office, and therefore now somewhat dated. It is to the United Nations Association of the UK:

“The UK commits to maintaining human control over its weapon systems as a guarantee of oversight and accountability. The UK does not possess fully autonomous weapon systems and has no intention of developing them.”

That was a strong reassurance, but it seems that the language has changed. If it has, can the Minister tell us why the UK is no longer stating that it has no intention of developing LAWS, and why the UK appears unwilling to state that humans will always remain in control of the decision to use lethal force?

Finally, on the issue of the long-awaited AI strategy, it appears that the MoD has confirmed, in response to a freedom of information request, that it has carried out no public consultation on it. There has been some informal consultation but, surprisingly, no public or open consultation. When the strategy is eventually published, will it be a done deal, or will it be in White Paper form for further discussion in a public and open way?

I have nothing more to add today. Bearing in mind everything that I have said about these weapons systems in the past, I have made my position plain. I do not think the issue is going to go away. The way the amendment has been formed has been interpreted as a one-off event but I have to make it clear to Parliament, the House and the Minister that this is not my intention. The review that I think has to take place, which has to be reported on to Parliament by senior Ministers, who must come and explain it in a way that makes it clear that they fully understand these weapons and why they have made these decisions, is just the beginning of a long-standing process. This is an issue that will be with us for a long time, and we need to start thinking, in a relationship between the Government, Parliament and the country, about where we want to be with these weapons systems.

Lord Craig of Radley (Crossbench) 9:00 pm, 23 November 2021

My Lords, the noble Lord, Lord Browne of Ladyton, has given us a very thoughtful, well-researched and deeply troubling series of remarks about the future in this area. I wanted to concentrate on a rather narrower point. Those who are ordered to fight for the interests of this country must do so—now and in the future, as more novel technologies find their way into kinetic operations—in the certain knowledge that their participation, and the way in which they participate, is lawful in both national and international jurisdictions. As has become evident in some of the asymmetric operations of recent years, there is real evidence that post-conflict legal challenges arise, and those arising from future operations may prove impossible to clear up quickly and comprehensively unless we have thought deeply about them.

Risking one’s life is a big ask, but to combine it with a risk of tortuous and protracted legal aftermath is totally unacceptable. I support the simple thrust of the amendment to demonstrate that the Government indeed have this matter under active review, as one must expect them to. It is infinitely better that the answers to these issues are there before a further operation has to be waged, not after it is over, when issues that should have been foreseen and dealt with press on individuals and others in our Armed Forces. Should the protection of combat immunity not be brought into the frame of discussion and resolution of this seriously troublesome issue?

Lord Clement-Jones (Liberal Democrat Lords Spokesperson, Digital)

My Lords, it is a great pleasure to follow the noble Lord, Lord Browne of Ladyton, and the noble and gallant Lord, Lord Craig, in supporting Amendment 29, which the noble Lord introduced so persuasively, as he did a similar amendment on the overseas operations Bill that I signed and in Grand Committee on this Bill—I apologise for being unable to support him then. Since we are on Report, I will be brief, especially given the hour. Of course I do not need to explain to the Minister my continuing interest in this area.

We eagerly await the defence AI strategy coming down the track but, as the noble Lord said, the very real fear is that autonomous weapons will undermine the international laws of war, and the noble and gallant Lord made clear the dangers of that. In consequence, a great number of questions arise about liability and accountability, particularly in criminal law. Such questions are important enough in civil society, and we have an AI governance White Paper coming down the track, but in military operations it will be crucial that they are answered.

From the recent exchange that the Minister had with the House on 1 November during an Oral Question that I asked about the Government’s position on the control of lethal autonomous weapons, I believe that the amendment is required more than ever. The Minister, having said:

“The UK and our partners are unconvinced by the calls for a further binding instrument” to limit lethal autonomous weapons, said further:

“At this time, the UK believes that it is actually more important to understand the characteristics of systems with autonomy that would or would not enable them to be used in compliance with” international human rights law,

“using this to set our potential norms of use and positive obligations.”

That seems to me to be a direct invitation to pass this amendment. Any review of this kind should be conducted in the light of day, as we suggest in the amendment, in a fully accountable manner.

However, later in the same short debate, as noted by the noble Lord, Lord Browne, the Minister reassured us, as my noble friend Lady Smith of Newnham noted in Committee, that:

“UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement.”

Later, the Minister said:

“It is not possible to transfer accountability to a machine. Human responsibility for the use of a system to achieve an effect cannot be removed, irrespective of the level of autonomy in that system or the use of enabling technologies such as AI.”—[Official Report, 1/11/21; cols. 994-95.]

The question remains: does that mean that there will always be a human in the loop and that a fully autonomous weapon will never be deployed? If the legal duties are to remain the same for our Armed Forces, these weapons must surely at all times remain under human control, and there must never be autonomous deployment.

However, that has recently been directly contradicted. The noble Lord, Lord Browne, described the rather chilling Times podcast interview with General Sir Richard Barrons, the former Commander Joint Forces Command. He contrasted the military role of what he called “soft-body humans”—I must admit, a phrase I had not encountered before—with that of autonomous weapons, and confirmed that weapons can now apply lethal force without any human intervention. He said that we cannot afford not to invest in these weapons. New technologies are changing how military operations are conducted. As we know, autonomous drone warfare is already a fact of life: Turkish autonomous drones have been deployed in Libya. Why are we not facing up to that in this Bill?

I sometimes get the feeling that the Minister believes that, if only we read our briefs from the MoD diligently enough and listened hard enough, we would accept what she is telling us about the Government’s position on lethal autonomous weapons. But there are fundamental questions at stake here which remain as yet unanswered. A review of the kind suggested in this amendment would be instrumental in answering them.

Lord Houghton of Richmond (Crossbench) 9:15 pm, 23 November 2021

My Lords, I support this amendment. I am sorry that my name has not found its way on to the Order Paper; I had Covid last week and I failed the IT test of getting it properly registered.

I come at this from perhaps a different angle. I have spent perhaps rather too much of my latter career in the Ministry of Defence and understand the way it functions. It spends the vast majority of its time—and I think this is understandable—managing the crisis of the moment. It spends very little time, in truth, on strategic foresight, and therefore it spends quite a bit of the other part of its time on making good that lack of strategic foresight—and much of what this whole Armed Forces Bill is about is making good that lack of foresight. The thing that I support so much about this amendment is that it is an attempt to get ahead of the game.

The MoD properly stops and looks to the future in the times of its periodic reviews, and there was much to commend the last integrated review. There are two things I would pluck from it that are relevant to this amendment. First, the review was littered with the idea that the country was making a strategic bet on the future by way of investment in technology: technology would be the source of our new prosperity; it would be the source of our technological edge; we would become a superpower; it was the reason that we could reduce the size of our Armed Forces; it was through the exploitation of novel technology that we could hold our heads up high and not fear for our safety.

At the same time, elsewhere in the review—this is my formulation, not the review’s—two forms of warfare were identified. There is the one we do not want to fight—the reversion to formalised war at a scale above the threshold of kinetic conflict—and then there is this grey area of hybrid war; the war that we are currently engaged in, where our malevolent and malicious enemies seek to exploit every trick in the book and the rules of warfare in order to exploit new vectors of attack to effectively defeat us during peacetime in mendacious ways.

You can read as much as you want into the second thing, but this idea of a permanent competition for relative survival and advantage is undoubtedly a feature of the current global security situation. Therefore, in those moments of strategic foresight in the integrated review, we have in some ways identified the fact that the advantage given by novel technologies will be decisive and that we have enemies who will be mendacious in ways that we cannot quite comprehend.

I worry that, in the months to come, this Chamber might revert to its defence arguments being about counting the number of ships, air squadrons or tanks. The amendment would enable parliamentarians to hold the Ministry of Defence and its generals to account for the ways in which these weapons evolve—they will evolve at pace—and for the rules that are to be employed, by not just us but our adversaries, and what is and is not their proper exploitation.

Having paused in that integrated review and discerned the future, however darkly, it would be gross negligence if we did not wish upon ourselves an instrument by which the evolution of these weapons and the rules involved in their employment were the closest interest of parliamentarians and this House. The Ministry of Defence should be held to account over the coming months and years to see how it all plays out. This amendment would do so, and it has my unreserved support.

Baroness Bennett of Manor Castle (Green)

My Lords, I apologise again for not speaking in Committee due to being at COP. I offer support and regret that I did not attach my name to this amendment. What the noble Lord, Lord Browne, said about public consultation in this process is really important, as is what the noble and gallant Lord, Lord Houghton, said about parliamentary scrutiny. Those two things very much fit together.

I am very aware that the Minister started this day, many hours ago now, promising to read a book, so I will refer to a book but not ask her to read it. It is entitled Exponential: How Accelerating Technology is Leaving Us Behind and What to Do About It, and it is by Azeem Azhar. The thesis is that there is an exponential gap: technologies are taking off at an exponential rate, but society is only evolving incrementally. In terms of society, we can of course look at institutions like politics and the military.

Another book is very interesting in this area. Its co-author, Kai-Fu Lee, has described it as a scientific fiction book, and it posits the possibility of, within the next couple of decades, large quantities of drones learning to form swarms, with teamwork and redundancy. A swarm of 10,000 drones could wipe out half a city and theoretically cost as little as $10 million.

It is worth quoting the UN Secretary-General, António Guterres, who said:

“The prospect of machines with the discretion and power to take human life is morally repugnant.”

That relates to some of the words in the podcast that the noble Lord, Lord Browne, referred to; I have not listened to it, but I will.

Fittingly, given what the Secretary-General said, the United Nations Association of the UK has very much been working on this issue, and communicating with the Government on it. In February, the Government told it that UK weapons systems

“will always be under human control”.

What we have heard from other noble Lords in this debate about how that language seems to have gone backwards is very concerning.

This is very pressing because the Convention on Certain Conventional Weapons will hold an expert meeting on 2 December, I believe, which will look at controls on lethal autonomous weapons systems—LAWS, as they are known. It would be very encouraging to hear from the Minister, now or at some future point, what the Government plan to do if there are no positive outcomes from that—or, indeed, whatever the outcomes are. While the Government have ruled out an independent process, both the mine ban convention and the Convention on Cluster Munitions were ultimately negotiated outside the CCW.

Finally and very briefly, I will address proposed new subsection (2)(d) and how individual members of the Armed Forces might be held responsible. There is an interesting parallel here with the question of deploying autonomous vehicles—the issue of insurance and who will be held responsible if something goes wrong. Of course, the same issues of personal responsibility, and how it is assigned, will face military personnel. This may sound like a distant thing, talking about decades, but a report from Drone Wars UK notes that Protector, the new weaponised drone, is “autonomy enabled”. I think Drone Wars UK says it has been unable to establish what that means and what the Government intend to do with that autonomy-enabled capability, but the first of an initial batch of 16 Protectors is scheduled to arrive between 2021 and 2024, and the Protector is scheduled to enter service with the RAF in mid-2024.

So I think this is an urgent amendment, and I commend the noble Lord, Lord Browne, and the others on this, and I would hope to continue to work with them on the issue.

Baroness Smith of Newnham (Liberal Democrat Lords Spokesperson, Defence)

My Lords, I support this amendment, in the name of the noble Lord, Lord Browne of Ladyton, the noble and gallant Lord, Lord Craig, and my noble friend Lord Clement-Jones. The noble Lord, Lord Browne, has probably spent an hour in aggregate, this evening and previously, explaining to the Chamber the need for this amendment.

As the noble Lord and my noble friend Lord Clement-Jones have pointed out, some of the issues about novel technologies and autonomy were raised on 1 November; I am not sure the House was wholly persuaded by the answers the Minister was able to give on that occasion. I think it is essential that the Government think again about how they might respond to the noble Lord, Lord Browne, and to this amendment, because we have heard how vital it is that we understand the danger that the world is in. We cannot just ignore it or say we might think about it at some future date because it is not a matter for today.

If we are keen to recruit for the 21st century, recruitment is not just about cannon fodder; it is about people who are able to understand the legal aspects of warfare and the moral issues we need to be thinking about. We need service personnel, but we also need—as the noble Lord, Lord Browne, so eloquently argued—politicians and officers who are able to make decisions. There are questions about autonomy that need to be understood and focused on now, and it is crucial that we talk with our partners in NATO and elsewhere. We cannot simply say we are not interested at the moment in debating and negotiating international agreements; we absolutely have to. The time to act on this is now, not at some future date when the Government think they might have time. We need to do it today.

Lord Coaker (Shadow Spokesperson (Defence), Shadow Spokesperson (Home Affairs), Opposition Whip (Lords))

My Lords, this is one of those debates that takes place very late at night but should have a packed Chamber listening. It is not a criticism, but the importance of the debate is immense. I thought the introduction from my noble friend Lord Browne was tremendous—I really did. We went from a situation where we all thought “Hopefully we won’t be too long on this amendment” to everybody listening to what he had to say and then thinking they had important contributions to make.

Lots of noble Lords have made outstanding contributions, but this is a bit of a wake-up call, actually. This is happening. My noble friend Lord Kennedy mentioned that he was in a Home Office debate and they were talking about what the police were looking at and, no doubt, what Border Force and all sorts of other people are looking at. But in the sense of the military here, as the noble and gallant Lord, Lord Craig, pointed out, we are going to ask people to operate within a context and a legal framework. What will that be? Because we are going to order them to do things.

This is the change—I make no apology for spending a couple of minutes on it—which my noble friend Lord Browne mentioned. The Government’s policy was:

“The UK does not possess fully autonomous weapon systems and has no intention of developing them.”

I asked the Minister in Committee just a couple of weeks ago to unequivocally state that there will always be a human in the loop when decisions over the use of lethal force are taken. The Minister said, as my noble friend Lord Browne said:

“UK Armed Forces do not use systems that employ lethal force without context-appropriate human involvement.”—[Official Report, 1/11/21; col. 995.]

We all know what that means. It is a very careful use of language, but it has shifted considerably from one statement to the next. As the noble and gallant Lord, Lord Houghton, asked, as did my noble friend and other noble Lords, where is the parliamentary accountability? Where has the decision for that been taken? What parliamentary debate took place that said it was now okay for the UK to make that quite considerable change of policy?

It may be, I suspect, that some of the answer we get from the Minister will be, “We can’t talk about this, it’s secret”. Yet it is not secret, so what is going on? This is why I said it was a bit of a wake-up call. Parliament needs to debate this; I could not agree more with the noble and gallant Lord, Lord Houghton. It is for this Parliament as the democratic part of the process of government that runs this country to determine what is appropriate. It is not for meetings—wherever they take place—to determine that.

The most important part of the amendment before us is proposed new subsection (3), which talks about the review commissioned by the amendment being reported to Parliament with its findings and recommendations, so that Parliament would have the opportunity to debate and discuss what the policy of Her Majesty’s Government, those who represent it and the establishment of this country, was with respect to the use of artificial intelligence, increasing autonomy, machine learning and all of those sorts of things.

The one thing I would say to the Minister is that she is a member of Her Majesty’s Government. She is the representative of the Government that all of us here are collectively talking to around this amendment. I think what noble Lords want—certainly what I want—is for the Minister to go back and say “These were the sorts of comments that were made in Parliament by numerous Lords”—and no doubt it would happen in the other place as well—“so what is it that we are going to do about this? What is going to happen as a consequence of the crucial amendment that Lord Browne put before us?”

Knowing the Minister in the way that I do, I know she will go back and ask this. But the system needs to respond to us, to this Chamber, to this debate and to all the various points that have been made; that is what democracy is about. It is about the Chamber that represents the people speaking up for the people to the system and demanding that it change and respond to them. That is what we expect from this debate. I thank my noble friend Lord Browne again for putting the amendment and for his continued efforts with respect to this really important issue.

Baroness Goldie (Lord in Waiting (HM Household) (Whip), Minister of State, Ministry of Defence) 9:30 pm, 23 November 2021

My Lords, the noble Lord, Lord Coaker, is right: we have kept until the end of the day—unfortunately when few people are around—one of the best debates we have had during this stage of the Bill. I thank the noble Lords, Lord Browne and Lord Clement-Jones, and the noble and gallant Lord, Lord Craig, for tabling this amendment. I know that their interest is informed and determined, and I can tell them that it is welcome. Having debated this issue with them now on several occasions, I understand the depth of their concern in this important area. I am grateful to them for the way they have engaged with me and officials and I look forward to further engagement, for we will surely debate these issues in this House for many years to come. I say to the noble Lord, Lord Coaker, that any Government would expect to be accountable to Parliament in respect of matters of such significance.

As with so many issues relating to the rapid march of new technology, this is both complex and pressing. The Government continue to welcome the challenge and scrutiny being brought to this question, and, as I noted on previous engagements, I do not dispute the noble Lords’ analysis of the importance of proper legal consideration of novel technologies. Indeed, I attempted to access the podcast to which the noble Lord, Lord Browne, referred. I do not know whether the Chamber will be delighted or disappointed to learn that, such is the security of my MoD computer, I could not get anywhere near it, so I have still to enjoy the benefit of listening to that podcast, which I intend to do.

As I said, I know that the amendment is extremely well intended and timely, but I hope to persuade your Lordships that the proposed review is not the right means of addressing these issues. However, I assure your Lordships that the department is alert to these questions and has been working extensively on them over the course of the last 18 months. Indeed, the noble Lords, Lord Browne and Lord Clement-Jones, have been engaging with officials in the department. They might have a better understanding than most of what is taking place.

Setting a requirement for a review in law would actually risk slowing down the work needed to develop the policy, frameworks and processes needed to operate AI-enabled systems responsibly, and to address the legal risks that service personnel might otherwise face. That is an issue of profound importance and one in which the noble and gallant Lord, Lord Craig of Radley, is rightly interested.

Noble Lords will understand that I cannot set out details of the department’s position until these have been finalised, but I can assure your Lordships that work to set a clear direction of travel for defence AI, underpinned by proper policy and governance frameworks, has reached an advanced stage. The noble Lord, Lord Browne, will I am sure have a sense of where that is headed. Key to it is the defence AI strategy, which we hope to publish in early course, along with details of the approaches we will use when adopting and using AI.

These commitments, which are included in the National AI Strategy, reflect the Government’s broader commitment that the public sector should set an example through how it governs its own use of the technology. Taken together, we intend that these various publications will give a much clearer picture than is currently available, because we recognise that these are vital issues that attract a great deal of interest and we need to be as transparent and engaged as possible. I wish specifically to reassure the noble Lord, Lord Coaker, about that.

I know from their contributions, to which I listened, that noble Lords will understand that this AI strategy cannot be the last word on the subject, but I hope that, when we do publish details, your Lordships will be substantially reassured that we are on the right track, and that substantial effort and engagement will follow. There is no end to the march of technology—that is one of the reasons why we have questioned the utility of a snapshot review process—nor will there be an end to our challenge of ensuring that we do the right thing with that technology, especially where grave matters of life and death and national security are concerned.

As we undertake this work, one of our top priorities must be to develop the terminology and vocabulary necessary to ensure we illuminate, clarify and improve understanding and awareness, and to find the right way to debate these issues. This is by no means a comment on any of the discussions that we have engaged on in this House; it is more a general observation on the difficulty of debating concepts such as lethal autonomous weapon systems when there is no definition and different views are not always clearly differentiated.

Are we concerned that AI could usher in a new era of weapons which, whether controlled by a human or not, could result in devastation and atrocities? Or are we concerned at the ethical implications of a machine, rather than a human, taking decisions which result in the death of even a single human? The answer is both, but the discussion is not best served when it jumps between such disparate topics.

The MoD has to keep pace with the threats that confront this country and consider how to deal with them. When I spoke in Grand Committee, I commented, in response to the noble Baroness, Lady Smith, that context-appropriate human involvement could mean some form of real-time human supervision, which might be called “human in the loop”, or control exercised through the setting of a system’s operational parameters. The noble Lord, Lord Browne, correctly observed that some might call the latter a fully autonomous weapon. But I wonder whether they would use that term, or perhaps more importantly be concerned, if the use case they had in mind was a system mounted on a Royal Navy vessel to defend against hypersonic threats. Such a system might well be lethal—that is, capable of taking human life—but in many ways it would not be considered fully autonomous, even if it detected the threat and opened fire faster than a human could react.

We must be careful to avoid generalisations in this debate. We in the Ministry of Defence have a responsibility to ensure that our position is properly communicated. That is a responsibility we acknowledge, and I say again to the noble Lord, Lord Coaker, that it is a responsibility of which we are cognisant and about which we will be vigilant.

The crucial point, which is also the reason why this amendment is unnecessary, is that all new military capabilities are subject to a rigorous review process for compliance with international humanitarian law. Any determination as to the exercising of context-appropriate human involvement will similarly be done carefully on a specific case-by-case basis. We also adjust our operating procedures to ensure that we stay within the boundaries of the law that applies at the time.

International and domestic frameworks provide the same level of protection around the use of novel technologies as for conventional systems because their general principle is to focus on the action, rather than the tool. These frameworks therefore offer appropriate levels of protection for our personnel. We are committed to ensuring that our Armed Forces personnel have the best possible care and protection, including protection against spurious legal challenges. I think I said in Committee that, earlier this year, we acted to bolster this protection in historical cases through the overseas operations Act.

This is a fascinating and complex area. I hope my remarks provide reassurance to your Lordships that the Ministry of Defence takes these matters very seriously, is already doing all that needs to be done and is planning to be proactive in communicating its approach appropriately to Parliament and the public. On this basis, I suggest that this amendment is not needed. The noble Lord, Lord Browne, has been kind enough to indicate that he will not press it, but I hope that he and other Members of this House will remain engaged with us in the MoD, as we will remain engaged with our international partners and allies, and our own public and civil society, so that we can make rapid progress on these important and challenging questions.

Lord Browne of Ladyton (Labour)

My Lords, I thank all noble Lords who contributed to this debate, including the noble Baronesses, Lady Bennett of Manor Castle and Lady Smith of Newnham, my noble friend Lord Coaker, the noble and gallant Lord, Lord Craig of Radley, and the noble Lord, Lord Clement-Jones. I am sorry that the noble and gallant Lord, Lord Houghton, could not add his name to the amendment, but in my head it is there.

I thank the Minister, who was characteristically engaged with the debate and the issues. At this time of night, I do not want to start debating with her on whether some of her comments about this amendment and what it would do are justified. I do not believe that this would slow down the work; it is just a compilation of the things that the Government ought to be doing anyway. I do not care about the three months; a promise that this will be done, and done transparently, is what I, as a parliamentarian, demand of the Government. At some point, this will need to be done and need to be shared with Parliament. We will need to take joint responsibility for these weapons systems if we seek to deploy them in any fashion—even limited versions of them.

My second point is that I am glad to see that our country is complying with its international legal obligations to subject new technology to a rigorous review to make sure that it is compatible with international humanitarian law. I am satisfied that that is happening. I do not understand why my Government do not publish those reviews. The United States and many other countries publish such reviews. Why are they not published, so that we, the politicians who engage, not so much in this House but in the other House, in paying for them with taxpayers’ money, know that we are complying with this? Other countries can do so perfectly well.

I have been obsessed with this issue since 2013, when I read the Resilient Military Systems and the Advanced Cyber Threat report of the US Department of Defense’s Defense Science Board. It said specifically that the United States did not have a resilient weapons system that could not be penetrated by cyber, because it had penetrated them. It went on to say that the same was true of “all of our allies”. It did not say in the report that it did that to all of their allies, but I would not be surprised if it did.

In 2013, I took that to the then Ministers in the Ministry of Defence and said, “Have you read this? We are deploying some of this tech that has been penetrated, and it can be penetrated by cyber threat.” I have to say that it was penetrated with software downloaded from the web; no one wrote a single line of code in order to do it. I have yet to meet a Defence Minister of that generation who ever even bothered to read the report.

This is where we are now—this will be my last word on this. General Sir Richard Barrons, Commander Joint Forces Command from 2013 to 2016, is publicly saying of autonomous weapon systems that it is not a question of tomorrow—the technology exists now, it is unstoppable and we need to get on to that bandwagon. He has been saying that for years. I do not know how many senior military officers who have worn our uniform are involved in this and saying this, but one of them doing so publicly terrifies me, because I am far from satisfied that I—a former Secretary of State for Defence—or any of our current Ministers understand this well enough to keep people who think like that under proper control. That is what concerns me. I beg leave to withdraw the amendment.

Amendment 29 withdrawn.

House adjourned at 9.48 pm.