U.S. Delegation Statement on “Appropriate Levels of Human Judgment”
The Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems
Geneva, April 12, 2016
Delivered by Michael Meier.
Thank you, Madame Ambassador. We would also like to thank the current panelists and those from this morning’s discussion for their interesting and informative presentations.
As an initial matter, these discussions can be greatly aided by a common understanding of what is meant by the term Lethal Autonomous Weapons Systems, or LAWS. Although we do not believe that we need to try to come to a formal definition for LAWS at this early stage, these discussions should be very clear as to what concept of LAWS is being discussed by participating States, civil society groups, and experts. These discussions of LAWS in the CCW encompass potential future systems, not existing weapons systems that use some form of autonomy, including purely defensive systems such as Patriot and AEGIS.
As many of you know, the U.S. Delegation previously has referred to the concept of “appropriate levels of human judgment over the use of force” during these discussions. As a result, we often get asked: “What are the aspects of designing and developing weapon systems that allow for the exercise of appropriate levels of human judgment over the use of force?” “Why does the United States think that this concept is important?” “How does this concept relate to ‘meaningful human control’?”
We would therefore like to take this opportunity to explain what we mean by “appropriate levels of human judgment,” as well as the aspects of our policy that ensure appropriate levels of human judgment can be exercised in the use of autonomous and semi-autonomous weapon systems, including potential LAWS.
The U.S. Department of Defense Directive 3000.09, Autonomy in Weapon Systems, requires that “[a]utonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” The Directive then articulates more specific requirements that supplement and contribute to fulfillment of this general requirement. For purposes of the Directive, an “autonomous weapon system” is defined as a “weapon system that, once activated, can select and engage targets without further intervention by a human operator.”
Of course, an essential aspect of designing weapon systems to make sure that “appropriate levels of human judgment over the use of force” may be exercised by commanders and operators is to ensure that any use of such weapons can comply with the requirements of the law of war. In particular, under the customary law of war, weapons must not be inherently indiscriminate; weapons must be able to be used in accordance with the principles of distinction and proportionality. Autonomous and semi-autonomous weapon systems, including potential LAWS, would need to be used in armed conflict consistent with the law of war, just like with any other weapon.
Notably, the standard in the U.S. Department of Defense Directive does not only refer to the existing requirements of the law of war. Our standard is a policy one that seeks to ensure weapons can be used, not only lawfully, but also appropriately – in accordance with rules of engagement and the mission objectives of commanders.
Some may criticize “appropriate” as an overly vague standard. But we chose “appropriate” because there is no “one-size-fits-all” standard for the correct level of human judgment to be exercised over the use of force with autonomous and semi-autonomous weapon systems, including potential LAWS. Rather, as a general matter, autonomous and semi-autonomous weapon systems vary greatly depending on their intended use and context. In particular, the level of human judgment over the use of force that is appropriate will vary depending on factors such as: the type of functions performed by the weapon system; the interaction between the operator and the weapon system, including the weapon’s control measures; particular aspects of the weapon system’s operating environment (for example, the proximity of civilians); the expected fluidity of or changes to the weapon system’s operational parameters; the type of risk incurred; and the weapon system’s particular mission objective. In addition, engineers and scientists will continue to develop technological innovations, which also counsels for a flexible policy standard that allows for an assessment of the appropriate level of human judgment for specific new technologies.
Although our policy does not determine the specific level of human judgment over the use of force that is needed for each weapon system, it does provide some general guidance on how to ensure that the appropriate level of human judgment can be exercised.
One important part of ensuring that appropriate levels of human judgment over the use of force may be exercised is to ensure that the weapons operate as intended, in a fashion predictable to their users, and in accordance with legal and policy requirements. For example, rigorous hardware and software verification and validation should be applied to weapons that have certain autonomous functions, and those weapons should go through realistic system development and operational testing and evaluation.
Such measures are integral to ensuring that those weapon systems:
- “[f]unction as anticipated in realistic operational environments against adaptive adversaries”;
- “[c]omplete engagements in a timeframe consistent with commander and operator intentions and, if unable to do so, terminate engagements or seek additional human operator input before continuing the engagement”; and
- “[a]re sufficiently robust to minimize failures that could lead to unintended engagements or to loss of control of the system to unauthorized parties.”
These requirements help ensure that the weapon system operates as expected. To allow for an appropriate level of human judgment to be exercised in the use of a weapon system, the weapon system must operate in a predictable manner and in accordance with legal, policy, and mission requirements.
Another aspect of ensuring appropriate levels of human judgment in the use of force is making sure that the human beings who use weapon systems are properly trained in how to use them. This is a lesson learned from the study of incidents in which weapons with autonomous capabilities have resulted in unintended engagements, such as the well-known incidents in 2003 when PATRIOT air defense batteries shot down a U.S. jet and an allied jet. As in those cases, for example, the problem might lie not with the engineering of the machine, but with the users of the weapon system not understanding how to use it properly. Thus, our policy requires that appropriate training, doctrine, and tactics, techniques, and procedures for the use of such weapons be established.
This point is critical to understanding the issues involved with autonomy and weapon systems. How humans and machines interface with one another can be just as important as what types of machines should be designed and developed. Thus, our policy further provides that, “[i]n order for operators to make informed and appropriate decisions in engaging targets, the interface between people and machines for autonomous and semi-autonomous weapon systems” shall:
- “[b]e readily understandable to trained operators”;
- “[p]rovide traceable feedback on system status”; and
- “[p]rovide clear procedures for trained operators to activate and deactivate system functions.”
In sum, these three aspects—namely first, reliable and tested weapons that are engineered to perform as expected; second, established training, doctrine, and procedures for users of the weapon systems; and third, clear and readily understandable interfaces between weapons systems and users—collectively help ensure that weapon systems can be used with the appropriate level of human judgment over the use of force, and in turn, consistent with the law of war and any applicable policy and mission requirements.
Over the past two days, as well as in the previous meetings, many States have stated that human control is essential for the use of LAWS. The United States agrees with the importance of humans in the development and use of LAWS, but considers “appropriate levels of human judgment” to be a more comprehensive and useful working concept than “meaningful human control” for purposes of addressing the significant issues raised when designing and developing autonomous and semi-autonomous weapon systems, including potential LAWS. The U.S. Department of Defense Directive therefore focuses on “appropriate levels of human judgment” in articulating appropriate procedures for designing and developing autonomous and semi-autonomous weapon systems, including a legal review in all cases, and a policy review by senior officials of weapon systems with certain autonomous functions.
We appreciate the States and non-governmental observers at this meeting who have spoken favorably of the concept of “appropriate levels of human judgment” and its subsidiary requirements, which ensure that autonomous and semi-autonomous weapon systems, including potential LAWS, can be used, not only lawfully, but also appropriately – in accordance with rules of engagement and the mission objectives of commanders.