Delivered by Katherine Baker
Geneva, April 11, 2018
Further consideration of the human element in the use of lethal force; aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of lethal autonomous weapons systems
Humans do, and must, play a role in authorizing lethal force. This does not change with the type of weapon system that delivers that force. The lawful use of force is context dependent, and a human must authorize such use of force against an appropriately targeted objective.
With all weapons systems, the commander’s authorization is made within the bounds established by the rules of engagement and international humanitarian law (IHL) based on:
- The commander’s understanding of the tactical situation, informed by his or her training and experience;
- The weapon system’s performance, informed by extensive weapons testing as well as operational experience; and
- The employment of tactics, techniques, and procedures for that weapon.
In all cases, the commander is accountable and has the responsibility for authorizing weapon release in accordance with IHL.
The human element in the use of lethal force is an important topic in the area of lethal autonomous weapons systems, and we share our own experience in this regard. In order to provide commanders with the necessary information to evaluate whether or not a weapon system is appropriate for use in a certain situation, U.S. Department of Defense policy mandates that “autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”
As a general matter, autonomous and semi-autonomous weapon systems vary greatly depending on their intended use and context. Therefore, the methods and degree of human-machine interaction will vary; there is and can be no “one-size-fits-all” standard.
Ensuring that weapons operate as intended, in a fashion predictable to their users and in accordance with legal requirements, is essential to implementing this policy. In general, this requires that rigorous testing under realistic conditions be applied to weapons that have certain autonomous functions, and that the human beings who use those weapons are properly trained.
Later today, my delegation will provide a briefing on a real-world system that makes use of autonomy in certain critical functions and that is in operation today, helping provide enhanced protection for U.S. and friendly forces and personnel in high-threat areas. The Counter-Rocket, Artillery, and Mortar system, or “C-RAM,” bolsters force protection at forward bases in a manner that also helps support humanitarian interests.
We hope that this presentation will highlight a number of important issues regarding the human element in the use of lethal force and aspects of human-machine interaction in the development, deployment, and use of emerging technologies. Importantly, this presentation will demonstrate that while the C-RAM uses autonomy to perform a number of critical functions, human judgment is exercised at various points in the development, fielding, and deployment of the system to ensure that the weapon functions as anticipated and is used in a manner consistent with IHL. It will also explain how the weapon system is designed to collect and analyze data at speeds that far exceed human capabilities, and to then relay the most relevant information to the human operator. This allows the human operator to make informed decisions that promote greater compliance with IHL, as well as greater protection of his or her own forces, friendly forces, and the civilian population.
In this regard, the C-RAM provides a real-world example of the importance of operational context in understanding the use of autonomy in different weapon systems. For example, C-RAM and the Close-In Weapon System used by many modern navies are both built around the Phalanx Weapon System, yet their operational contexts are very different. As a result, although the two systems might seem very similar at first, you will see key differences between them, such as how they employ autonomy and the ammunition they use, that flow from those differing operational contexts. We would invite other delegations to also consider sharing their practice in addressing real issues related to the use of autonomy in weapon systems.