U.S. DELEGATION OPENING STATEMENT
As Delivered by Stephen Townley
Thank you, Mr. Chairman. The United States Delegation appreciates the work you have done in laying the groundwork for this important informal meeting on emerging technological challenges associated with increasing autonomy in weapons systems; we look forward to productive discussions under your guidance this week. We are confident that over the next four days we will all come to a better understanding of the varied and complex issues related to lethal autonomous weapons systems. You can count on our delegation to participate fully throughout.
We will provide specific comments during the sessions to come, but at the outset, we would like to make three framing points and then highlight one issue that is, to us, critical in thinking about autonomous features of weapons systems.
First, this important discussion is just beginning, and we believe considerable work still needs to be done to establish a common baseline of understanding among states. Too often, the phrase “lethal autonomous weapons system” still appears to evoke the idea of a humanoid machine independently selecting targets for engagement and operating in a dynamic and complex urban environment. That is a far cry from what we should be focusing on: the likely trajectory of technological development, not images from popular culture.
To move toward a common understanding does not mean that we need to define “lethal autonomous weapons systems” at the outset. Recent discussions in which we have participated, along with other states and with scientists, roboticists, lawyers, and ethicists, have shown that some ideas about lethal autonomous weapons systems are so widely divergent that it would be imprudent, if not impossible, to define the term precisely now. Much examination and discussion is necessary before we undertake that task. As we begin our discussions here, though, we must be clear on one point – we are here to discuss future weapons or, in the words of the mandate for this meeting, “emerging technologies.” To be clear, in these discussions we are not referring to remotely piloted aircraft, which, as their name indicates, are not autonomous and are therefore conceptually distinct from LAWS.
Second, precisely because we are at such an early stage of our discussions, the United States believes it is premature to determine where these discussions might or should lead. In our view, it is enough for now for us collectively to acknowledge the value of discussing lethal autonomous weapons systems in the CCW, a forum focused on international humanitarian law, which is the relevant framework for this discussion.
Third, we must bear in mind the complex and multifaceted nature of this issue. This complexity means we need to think carefully through the full range of possible consequences of different approaches. For instance, our discussion here will necessarily touch on the development of civilian technology, which we expect to continue unrestricted by these discussions.
With that said, the United States would like to highlight one of the key issues we think states should focus on in considering autonomy in weapons systems – and that is risk. We will elaborate on this further in the coming days, but, to give just one example, how does the battlefield – whether cluttered or uncluttered – affect the risk of using a particular weapons system?
In order to assess the risk associated with the use of any weapons system, states need a robust domestic legal and policy process and methodology. We think states may also need to tailor those legal and policy processes when considering weapons with autonomous features. For that reason, as you know, after a comprehensive policy review, the United States Department of Defense issued DoD Directive 3000.09, “Autonomy in Weapon Systems,” in 2012. The Department developed the directive in order to better understand and identify the risks posed by autonomy, as well as to consider possible ways to mitigate risks that are identified. The Directive established a high-level, detailed process for considering weapons with autonomous features and issued specific guidelines designed to “minimize the probability and consequences of failures that could lead to unintended engagements.”
The United States intends to discuss the risks of autonomy, as well as possible benefits, and means of analyzing those risks, over the coming days, using the Directive as an example. We would likewise encourage other states to consider presenting their own ways and means of thinking about the risks of autonomy and whether they have their own domestic processes for evaluating those risks.
Mr. Chairman, as we have said, the issues related to LAWS are complex. We are here to share our thoughts on autonomy in weapons systems and learn from others. We have brought a broad range of experts with us and look forward to engaging with all interested delegations. At this early stage, we cannot say, and, to reiterate, should not prejudge, where the discussion will lead, but we do recognize that it is a good time for this discussion to begin.