Lord Browne of Ladyton:
Moved by Lord Browne of Ladyton
32: After Clause 12, insert the following new Clause—
“Liability for using novel technologies: review
(1) Within 3 months of this Act being passed, the Secretary of State must commission a review of the implications of increasing autonomy associated with the use of artificial intelligence and machine learning, including in weapons systems, for legal proceedings against armed forces personnel that arise from overseas operations, and produce recommendations for favourable legal environments for UK armed forces operating overseas, including instilling domestic processes and engaging in the shaping of international agreements and institutions.
(2) The review must consider—
(a) what protection and guidance armed forces personnel need to minimise the risk of legal proceedings being brought against them which relate to overseas operations in response to novel technologies,
(b) how international and domestic legal frameworks governing overseas operations need to be updated in response to novel technologies, and
(c) what novel technologies could emerge from the Ministry of Defence and the United Kingdom’s allies, and from the private sector, which could be used in overseas operations.
(3) Within the period of one year beginning on the day on which the review is commissioned, the Secretary of State must lay a report before Parliament of its findings and recommendations.”
My Lords, Amendment 32 stands in my name and in the names of the noble and gallant Lord, Lord Houghton of Richmond, and the noble Lord, Lord Clement-Jones. It raises a very different matter from those with which we have been dealing until now in Committee. At first sight, the amendment may appear out of place in this Bill. I hope, however, to persuade your Lordships that, far from being irrelevant, it is directly relevant to many personnel who are, or will be, engaged in overseas operations, and that the numbers of those to whom it is relevant will only increase.
The amendment focuses on the protection and guidance that Armed Forces personnel need to ensure that they comply with the law, including international humanitarian law; the best way of minimising the risk of legal proceedings being brought against them; and explaining how international and domestic legal frameworks need to be updated. These are all as a consequence of the use of novel technologies which could emerge from or be deployed by the Ministry of Defence, UK allies or the private sector. In this day and age, the private sector is often deployed with our Armed Forces in overseas operations as part of a multinational force.
The amendment imposes an obligation on the Secretary of State, within three months of the passing of this Act, to commission a review of the relevant issues; sets out what that review must consider; and obliges the Secretary of State, within a year of the date from which it is commissioned, to lay a report before Parliament of its findings and recommendations.
It is remarkable that almost all the debate in Committee so far—both on the first day and today—has been about deployment of military force and the risk to which it exposes our forces, based on past experience. Little or no mention has been made of the changing face of war. I may have missed it, but I cannot recollect any mention being made of that element.
We often criticise armies that train “to fight the last war”. The real problem, however, is that training is based on mistaken notions of what the next war will be like. We have a fair idea of what a future conflict will be like, so we should not fall victim to that mistaken notion. I can easily think of a relatively straightforward current example of modern warfare which encapsulates the challenges that will be generated for our military.
The provisions of Clause 1(3) set out that the presumption against prosecution applies only in respect of alleged conduct which took place outside the British Isles and when the accused was deployed in overseas operations. If a UAV operator works from a control room here in the UK, in support of troops on the ground in a country beyond the British Isles, are they deployed on overseas operations for the purposes of this legislation? Is their conduct taking place beyond the British Isles? Consequently, are the protections afforded by this legislation offered to them? How can this legislation for overseas operations be kept up to date with the blurring of lines between what is and is not the battlefield, without provisions of this nature being made in the Bill?
On the face of it, these may appear simple questions, but I expect the answers are complex. At some time in the future, it is at least possible that a court will disagree with an answer given by a Minister today.
Next week, the integrated review will finally be published. This is the third defence and security review since 2010. It promises to be forward facing, recognising both current and future threats against the UK and describing the capabilities that will need to be developed to deter or engage them.
When the Prime Minister made his Statement on the review last November, he said that
“now is the right time to press ahead”
with a modernisation of the Armed Forces, because of
“emerging technologies, visible on the horizon.”
The CGS, General Sir Mark Carleton-Smith, recently said that he foresees the army of the future as an integration of “boots and bots”. The Prime Minister has said that the UK will invest another £1.5 billion in military research and development designed to master the new technologies of warfare, and establish a new centre dedicated to AI. He rightly stated that these technologies would revolutionise warfare, but the Government have not yet explained how legal frameworks and support for personnel engaged in operations will also change—because change they must.
The noble and gallant Lord, Lord Houghton of Richmond, has, in interventions in your Lordships’ House, warned about the risks posed by the intersection of artificial intelligence and human judgment, and has spoken wisely about the risks posed by technology interacting with human error. As military equipment gets upgraded, we do not know how the Government plan to upgrade legal frameworks for warfare, both on the domestic and the international level, what this will mean for legal protection for our troops, and where accountability will lie if mistakes are made. There is nothing in the Bill that reflects the forward-facing nature of the integrated review.
I am sure the Minister will have been briefed on the provisions of Article 36 of Protocol 1, additional to the 1949 Geneva conventions, which commits states to ensure the legality of all new weapons, means and methods of warfare by subjecting them to rigorous and multidisciplinary review. Unfortunately, as we, the United Kingdom, are not one of the eight nations in the world that publish their reviews of legal compatibility, and I have not been able to source a copy of such a review, I am unable to see just how up to date that process presently is. I have no doubt that we have complied with our legal obligations in that respect, and if they are tendered today, I will accept the Minister’s reassurances in that regard. If she is unable to comment, will she commit to write to me on this?
It is right that we tackle vexatious claims and improve investigations, but what happens when claims focus on personnel who were operating drones? The Government have said that they have no plans to develop fully autonomous weapons, but what if claims target the chain of command in charge of them? There remain many unanswered questions which could result in legal jeopardy for our troops. My assessment is that our engagement in future international conflict is more likely to involve military operatives of new technology than it is boots on the ground.
The seminal report of the Committee on Artificial Intelligence—ably chaired by the noble Lord, Lord Clement-Jones—expressed this concern:
“The Government’s definition of an autonomous system used by the military as one where it ‘is capable of understanding higher-level intent and direction’ is clearly out of step with the definitions used by most other governments.”
The committee recommended that
“the UK’s definition of autonomous weapons should be realigned to be the same, or similar, as that used by the rest of the world”,
but that has not happened. That, of course, generates serious questions, not only about interoperability but about the implications for the responsibilities of our troops when they are deployed in a multinational context. My expectation is that the noble Lord, Lord Clement-Jones, will expand on this aspect.
The UN chief, António Guterres, argues:
“Autonomous machines with the power and discretion to select targets and take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law.”
Does the Minister agree? If not, why not?
The final report of the US National Security Commission on Artificial Intelligence, helpfully published only recently, states:
“The U.S. commitment to IHL”— international humanitarian law—
“is long-standing, and AI-enabled and autonomous weapon systems will not change this commitment.”
Do the Government believe the same?
In its consideration of autonomous weapons systems and risks associated with AI-enabled warfare, the commission came to several judgments and recommendations. I shall refer to only three of them. In its first judgment, it says:
“Provided their use is authorized by a human commander or operator, properly designed and tested AI-enabled and autonomous weapon systems have been and can continue to be used in ways which are consistent with IHL”— international humanitarian law. Have the Government reached the same judgment and, if so, are they willing to share their reasoning with Parliament? Publishing the current Article 36 review of legal compatibility, as the US does, would be a good first step. Is the Minister willing at least to consider doing so and, if not, why not?
Secondly, the commission concluded:
“Existing DoD procedures are capable of ensuring that the United States will field safe and reliable AI-enabled and autonomous weapon systems and use them in a manner that is consistent with IHL.”
Is the noble Baroness in a position to share a similar judgment in respect of MoD procedures and to explain why she has reached it?
Finally, among the commission’s recommendations was that the US
“Work with allies to develop international standards of practice for the development, testing, and use of AI-enabled and autonomous weapon systems.”
In the event that such an invitation is extended to the UK by the US, would the Government welcome it and participate in such a discussion?
We should not underestimate the worryingly high chance that drone operators face of developing post-traumatic stress disorder. In 2015, Reaper squadron boss Wing Commander Damian Killeen told the BBC that staff operating drone aircraft in Iraq and Syria may be at greater risk of mental trauma. Does the Minister recognise this effect of machines on their operators, despite the fact that they may be physically far away from the action? The Government have said that they want the Bill to protect service personnel from repeated investigations and vexatious claims. Do service personnel who operate UAVs not deserve to be protected, and will they be by this legislation?
No legislation designed to deliver on an overall policy intention to reassure our service personnel in the event that they are deployed overseas can deliver on that intention in this part of the 21st century without engaging the issues which this amendment addresses. Without this or a similar amendment, I fear that this legislation will be out of date as soon as it receives Royal Assent. I beg to move.
My Lords, it is a pleasure to follow the noble Lord, Lord Browne of Ladyton, in supporting his Amendment 32, which he introduced so persuasively and expertly. A few years ago, I chaired the House of Lords Select Committee on AI, which considered the economic, ethical and social implications of advances in artificial intelligence. In our report published in April 2018, entitled AI in the UK: Ready, Willing and Able?, we addressed the issue of military use of AI and stated:
“Perhaps the most emotive and high-stakes area of AI development today is its use for military purposes”,
recommending that this area merited a “full inquiry” on its own. As the noble Lord, Lord Browne of Ladyton, made plain, regrettably, it seems not yet to have attracted such an inquiry or even any serious examination. I am therefore extremely grateful to the noble Lord for creating the opportunity to follow up on some of the issues we raised in connection with the deployment of AI and some of the challenges we outlined. It is also a privilege to be a co-signatory with the noble and gallant Lord, Lord Houghton, who too has thought so carefully about issues involving the human interface with technology.
The broad context, as the noble Lord, Lord Browne, has said, is the unknowns and uncertainties in policy, legal and regulatory terms that new technology in military use can generate. His concerns about complications and the personal liabilities to which it exposes deployed forces are widely shared by those who understand the capabilities of new technology. That is all the more so in a multilateral context where other countries may be using technologies that we would either not deploy or the use of which could create potential vulnerabilities for our troops.
Looking back to our report, one of the things that concerned us more than anything else was the grey area surrounding the definition of lethal autonomous weapon systems—LAWS. As the noble Lord, Lord Browne, set out, when the committee explored the issue, we discovered that the UK’s then definition, which included the phrase
“An autonomous system is capable of understanding higher-level intent and direction”,
was clearly out of step with the definitions used by most other Governments and imposed a much higher threshold on what might be considered autonomous. This allowed the Government to say:
“the UK does not possess fully autonomous weapon systems and has no intention of developing them. Such systems are not yet in existence and are not likely to be for many years, if at all.”
Our committee concluded that, in practice,
“this lack of semantic clarity could lead the UK towards an ill-considered drift into increasingly autonomous weaponry.”
This was particularly in light of the fact that, at the UN Convention on Certain Conventional Weapons group of governmental experts in 2017, the UK opposed the proposed international ban on the development and use of autonomous weapons. We therefore recommended that the UK’s definition of autonomous weapons should be realigned to be the same as, or similar to, that used by the rest of the world. The Government, in their response to the committee’s report in June 2018, replied:
“The Ministry of Defence has no plans to change the definition of an autonomous system.”
They did say, however,
Later, thanks to the Liaison Committee, we were able on two occasions last year to follow up on progress in this area. On the first occasion, in reply to the Liaison Committee letter of last January which asked,
“What discussions have the Government had with international partners about the definition of an autonomous weapons system, and what representations have they received about the issues presented with their current definition?”
The Government replied:
“There is no international agreement on the definition or characteristics of autonomous weapons systems. Her Majesty’s Government has received some representations on this subject from Parliamentarians”.
They went on to say:
“The GGE is yet to achieve consensus on an internationally accepted definition and there is therefore no common standard against which to align. As such, the UK does not intend to change its definition.”
So, no change there until later in the year in December 2020, when the Prime Minister announced the creation of the autonomy development centre to,
“accelerate the research, development, testing, integration and deployment of world-leading AI,” and the development of autonomous systems.
In our follow-up report, AI in the UK: No Room for Complacency, which was published in the same month, we concluded:
“We believe that the work of the Autonomy Development Centre will be inhibited by the failure to align the UK’s definition of autonomous weapons with international partners: doing so must be a first priority for the Centre once established.”
The response to this last month was a complete about-turn by the Government, who said:
“We agree that the UK must be able to participate in international debates on autonomous weapons, taking an active role as moral and ethical leader on the global stage, and we further agree the importance of ensuring that official definitions do not undermine our arguments or diverge from our allies.”
They go on to say:
“the MOD has subscribed to a number of definitions of autonomous systems, principally to distinguish them from unmanned or automated systems, and not specifically as the foundation for an ethical framework. On this aspect, we are aligned with our key allies. Most recently, the UK accepted NATO’s latest definitions of ‘autonomous’ and ‘autonomy’, which are now in working use within the Alliance. The Committee should note that these definitions refer to broad categories of autonomous systems, and not specifically to LAWS. To assist the Committee we have provided a table setting out UK and some international definitions of key terms.”
The NATO definition sets a much lower bar for what is considered autonomous: a
“system that decides and acts to accomplish desired goals, within defined parameters, based on acquired knowledge and an evolving situational awareness, following an optimal but potentially unpredictable course of action.”
The Government went on to say:
“The MOD is preparing to publish a new Defence AI Strategy and will continue to review definitions as part of ongoing policy development in this area.”
I apologise for taking noble Lords at length through this exchange of recommendation and response but, if nothing else, it demonstrates the terrier-like quality of Lords Select Committees in getting positive responses from government. This latest response is extremely welcome. In the context of the amendment from the noble Lord, Lord Browne, and the issues that we have raised, we need to ask a number of further questions. What are the consequences of the MoD’s thinking? What is the defence AI strategy designed to achieve? Does it include the kind of inquiry that our Select Committee was asking for? Now that we subscribe to the common NATO definition of LAWS, will it deal specifically with the liability and international and domestic legal and ethical framework issues which are central to this amendment? If not, a review of the type envisaged by this amendment is essential.
The final report of the US National Security Commission on Artificial Intelligence, referred to by the noble Lord, Lord Browne, has taken a comprehensive approach to the issues involved. He has quoted three very important conclusions and asked whether the Government agree in respect of our own autonomous weapons. Three further crucial recommendations were made by the commission:
“The United States must work closely with its allies to develop standards of practice regarding how states should responsibly develop, test, and employ AI-enabled and autonomous weapon systems”,
It also recommended that the
“United States should actively pursue the development of technologies and strategies that could enable effective and secure verification of future arms control agreements involving uses of AI technologies.”
Finally, of particular importance in this context,
“countries must take actions which focus on reducing risks associated with AI-enabled and autonomous weapon systems and encourage safety and compliance with IHL when discussing their development, deployment, and use”.
Will the defence AI strategy or indeed the integrated review undertake as wide an inquiry, and would it come to the same or similar conclusions?
The MoD seems to have moved some way towards getting to grips with the implications of autonomous weapons in the last three years but, if it has not yet considered the issues set out in the amendment, it clearly should do so as soon as possible and update the legal frameworks for warfare in the light of the new technology; otherwise, our service personnel will be at considerable legal risk. I hope it will move further in response to today’s short debate.
My Lords, I can only commend my noble friend Lord Browne of Ladyton and the noble Lord, Lord Clement-Jones, on two of the most powerful, if terrifying, contributions to this Bill’s proceedings so far. In particular, I shall be having nightmares about their projections for the potential dissonance between varying international approaches to the definition of autonomous weapons and the way in which their deployment and development matches, or does not match, traditional approaches to humanitarian law.
Regarding the Bill, my noble friend has a very good point. He makes a specific observation about the fact that a drone operator in the UK will suffer many of the traumas and risks of a traditional soldier in the field but, on the face of it, that is not covered by this legislation at all. I look forward to the Minister’s response to that in particular, but also to the broader questions of risk—not just legal risk in a defensive way to our personnel but ethical and moral risk to all of us. In this area of life, like every other, the technology moves apace, but the law, politics, transparency, public discourse and even ethics seem to be a few paces behind.
My Lords, I am delighted to follow on from the noble Baroness, Lady Chakrabarti, who always seems to be a great source of common sense on complex moral issues. I am similarly delighted to support the amendment in the name of my one-time boss, the noble Lord, Lord Browne of Ladyton. I will not seek to repeat his arguments as to why this amendment is important, but rather to complement his very strong justification with my own specific thoughts and nuances.
I will start with some general comments on the Bill, as this is my only contribution at this stage. At Second Reading I made my own views on this Bill quite clear. I felt that it missed the main issues regarding the challenges of Lawfare. Specifically, I felt that the better route to reducing the problem of vexatious claims was not through resort to legal exceptionalism, but rather rested on a series of more practical measures relating to such things as investigative capacity, quality and speed; better training; improved operational record keeping; more focused leadership, especially in the critical area of command oversight; and a greater duty of care by the chain of command. On this latter, I wholly support the amendment of my noble friend Lord Dannatt.
Having listened to the arguments deployed in Committee, I am struck by the seeming inability of even this sophisticated Chamber to reach a common view as to whether the many provisions of this Bill offer enhanced protections or increased perils for our servicemen and women. This causes me grave concern. How much more likely is it that our servicemen and women—those whose primary desire is to operate within the law—will be confused; and how much more likely is it that our enemies—those who want to exploit the law for mischief—will be encouraged?
I hold to the view that the law, in any formulation, cannot be fashioned into a weapon of decisive advantage in our bid to rid our people of vexatious claims. Rather, the law will increasingly be exploited by our enemies as a vector of attack, both to frustrate our ability to use appropriate force and to find novel ways of accusing our servicemen and women of committing illegal acts. The solution to this problem is a mixture of functional palliatives and better legal preparedness. This amendment addresses one element of this preparedness.
As we have already heard, one area of new legal challenge will undoubtedly be in the realm of novel technologies, particularly those which employ both artificial intelligence and machine learning to give bounded autonomy to unmanned platforms, which in turn have the ability to employ lethal force. We are currently awaiting the imminent outcome of the integrated review, and we understand that a defence command paper will herald a new era of technological investment and advancement: one that will enable a significant reduction in manned platforms as technology permits elements of conflict to be subordinated to intelligent drones and armed autonomous platforms.
However—and this is the basic argument for this amendment—the personal liability for action in conflict to be legal will not cease, although it may become considerably more opaque. We must therefore ask whether we have yet assessed the moral, legal, ethical and alliance framework and protocols within which these new systems will operate. Have we yet considered and agreed the command and control relationships, authorities and delegations on which will rest the legal accountability for much new operational activity?
Personally, I have a separate and deep-seated concern that a fascination with what is technically feasible is being deployed by the Government, consciously or unconsciously, primarily as the latest alchemy by which defence can be made affordable. It is being deployed without properly understanding whether its true utility will survive the moral and legal context in which it will have to operate. I therefore offer my full support to this amendment, in the hope that it will assist us in getting ahead of the problem. The alternative is suddenly waking up to the fact that we have created Armed Forces that are both exquisite and unusable in equal measure.
My Lords, I thank my noble friend Lord Browne, the noble Lord, Lord Clement-Jones, and the noble and gallant Lord, Lord Houghton, for bringing forward this important amendment and debate. I understand my noble friend Lord Browne’s concerns about the mismatch between the future-focused integrated review, which has had long delays but will be hopefully published next week, and the legislation we have in front of us.
Technology is not only changing the kinds of threats we face but changing warfare and overseas operations in general. In Committee in the other place, Clive Baldwin of Human Rights Watch neatly summed this up by suggesting that
“we are seeing a breakdown in what is the beginning and the end of an armed conflict, what is the battlefield and what decisions are made in which country … The artificial distinction of an overseas operation with a clear beginning, a clear theatre and a clear end is one that is very much breaking down.”
How is this reflected in the Bill?
When the Prime Minister gave his speech on the integrated review last year, he rightly said that “technologies … will revolutionise warfare” and announced a new centre dedicated to AI and an RAF fighter system that will harness AI and drone technology. This sounds impressive but, as my noble friend Lord Browne said, as military equipment gets upgraded, we do not know how the Government plan to upgrade legal frameworks for warfare and what this means in terms of legal protection for our troops.
We must absolutely tackle vexatious claims and stop the cycle of reinvestigations, but how will claims against drone operators or personnel operating new technology be handled? Do those service personnel who operate UAVs not deserve to be protected? And how will legal jeopardy for our troops be avoided?
As new technology develops, so too must our domestic and international frameworks. The final report of the US National Security Commission on Artificial Intelligence stated that the US commitment to international humanitarian law
“is longstanding, and AI-enabled and autonomous weapon systems will not change this commitment.”
Do the Government believe the same?
I would also like to highlight the serious impact on troops who might not be overseas, but who are operating drones abroad. A former drone pilot told the Daily Mirror:
“The days are long and hard and can be mentally exhausting. And although UAV pilots are detached from the real battle, it can still be traumatic, especially if you are conducting after-action surveillance.”
The RUSI research fellow Justin Bronk also said that, as drone operators switched daily between potentially lethal operations and family life, this could be extremely draining and psychologically taxing. What mental health and pastoral support is currently given to these troops? Drone operators may not be physically overseas, but they are very much taking part in overseas operations. With unmanned warfare likely to be more common in future conflicts, I would argue that failing to include those operations in the Bill may cause service personnel issues down the line.
I would like to hear from the Minister how this legislation will keep pace with how overseas operations are conducted, and whether she is supportive of a review along the lines of Amendment 32—and, if not, why not?
My Lords, first, I thank the noble Lord, Lord Browne of Ladyton, for tabling this amendment, which is fascinating and raises substantial issues. One only had to listen to the informed but very different contributions from the noble Lord himself, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Chakrabarti, then to a different perspective from the noble and gallant Lord, Lord Houghton of Richmond, and, finally, the noble Lord, Lord Tunnicliffe, to get a flavour of both the depth and the technical complexity of these issues.
There is no doubt that the increasing adoption of new and innovative technologies on the battlefield is changing how military operations are conducted. Gone are the three domains; we are now in the five domains. Military effects can now be delivered in cyberspace, and precision weapons systems can now be operated remotely from the UK and from third countries. I appreciate that the noble Lord, Lord Browne of Ladyton, is motivated by a genuine interest in these new technologies, how they influence military operations and the implications for our Armed Forces personnel involved in overseas operations—and that is an important question to ask.
However, I suggest to the noble Lord that it is not within the remit of the Bill to consider the effect that developing technologies might have on the future international and domestic legal frameworks of the battlefield. At this early stage I am perhaps going to give him a slightly disappointing response: I am not persuaded that it would be appropriate to insert a prescriptive provision for such matters into the Bill. I know he is slightly pessimistic about the fortunes of the Bill without that added dimension, but I am not sure that I share his pessimism. I assure your Lordships that emerging technologies are subject to a rigorous review process for compliance with the law of armed conflict, and we adjust our operating procedures to ensure that we stay within the boundaries of the law that applies at the time.
The noble Lords, Lord Browne and Lord Clement-Jones, had a wide range of complex questions covering many diverse issues on which—I am being quite frank—I do not have information, so I cannot respond to them from the Dispatch Box. However, I found their points compelling, and I offer to write to them.
We invest consistently in research and development through NATO. The UK is a world leader in innovation in areas of new capability like cyber and, if I may say so, in our response to new world threats such as climate change. Because we place NATO at the heart of our defence, we set interoperability at the core of our developments. We very much do this in tandem and in partnership.
Having said all that, I am aware of the expertise that the noble Lord, Lord Browne of Ladyton, has in these technologies and new domains, conjoined, importantly, with his legal background. I should very much welcome a meeting with him in order to be further briefed on how he sees their potential impact on Armed Forces personnel and the law of armed conflict, and to hear his thoughts on the nature of that important component of engagement with international institutions. That is an invitation I extend to him with sincerity and in good faith, and I very much hope, in light of that overture, that he is persuaded to withdraw his amendment.
My Lords, I thank the Minister, for whom I have as much respect and regard as anyone else in this debate. She has been showered with this compliment throughout the whole course of this Committee—quite rightly, in my view. I welcome her invitation to a meeting as much as I welcome the undertaking she has given to write to answer the many questions that have been posed to her. I look forward to all of that information.
I say at the outset that whether it is appropriate for this Bill to contain a provision of this nature should be tested against the proxy question I asked, which is whether a UAV operator in this country controlling a UAV or a drone over another country in an overseas operation is covered by the provisions of this Bill. If that cannot be answered in the affirmative, it is appropriate to do exactly what has been proposed in Amendment 32, if not in this fashion then somehow before this Bill becomes law, because we are asking and will continue to ask people to operate machinery in that way and we should not expose them to risks that others are not exposed to. This amendment seeks to future-proof this Bill. It expects the Government not to have all the answers now but to carry out a review of the implications of the increasing autonomy associated with AI and machine learning for legal proceedings against Armed Forces personnel arising from overseas operations.
I thank all noble Lords and noble and gallant Lords who spoke in this debate. I thank the noble Lord, Lord Clement-Jones, who has an enviable and well-deserved reputation for understanding one of the most difficult issues facing our country in the future, particularly in the security and military environment: artificial intelligence, machine learning and autonomous weapons systems. His contribution was full of rich information about the nature of the challenges we face, and I thank him for his support for this amendment.
I thank my noble friend Lady Chakrabarti for her support, and I am grateful that she suggested, or perhaps implied, that my interpretation of the Bill as it stands is probably correct. I am reinforced in my desire to see this through because of her support. The noble and gallant Lord, Lord Houghton of Richmond, in his own characteristic way, made a clear argument for engagement with these issues. He has a record of service to our country, an experience which has informed his advice to your Lordships’ House. I would be interested to explore further with him his conclusion that we may end up with forces that are exquisite and unusable in equal measure.
My noble friend Lord Tunnicliffe clearly understands this issue and shared with the Committee on a human level why this matter is important. In a sense, the test that he set for the Minister is a test that she has set herself: that this legislation must deliver on the Government’s policy intention to reassure service personnel in the event that they are deployed. It will not do so unless these issues are dealt with properly and openly, so that those whom we send on these operations understand and engage with our appreciation of the legal implications.
I will seek leave to withdraw this amendment, but I warn the Minister that it may come back again—maybe in a slightly different form—at the later stages of this Bill. I also warn her that this is but a preface to an issue that will come back before the Government in this and other forms—that is, in debates in this House—because this is going to be the reality of our security and military operations of the future. I say as a caution to her that the committee report to which both I and the noble Lord, Lord Clement-Jones, referred is almost 800 pages long. This is a complicated and difficult subject. I beg leave to withdraw the amendment.
Amendment 32 withdrawn.
Amendment 33 not moved.