Convention on Certain Conventional Weapons Group of Governmental Experts on Emerging Technologies in the Area of LAWS.
Characterization of the systems under consideration in order to promote a common understanding on concepts and characteristics relevant to the objectives and purposes of the Convention
Geneva, March 25, 2019
as delivered by Katherine Baker
The United States supports identifying general characteristics of systems that should be under the GGE’s consideration. The U.S. Department of Defense definitions, found in our working paper from November 2017, turn on the functions that autonomy performs in the weapon system, in particular the function of selecting and engaging targets. The U.S. Department of Defense definitions do not rest upon articulating specific definitions of autonomy or types of machine reasoning and are not intended to decide in the abstract what may be good or bad uses of autonomy. Whether the use of autonomy in a weapon system complies with IHL necessarily depends on its specific intended application, rather than on considerations in the abstract.
A weapon system can only be properly assessed under IHL in light of the operational context in which it is intended to be used. Much like weapons that do not use autonomy, controls such as temporal limits, spatial limits, and warhead/effector constraints could enable the legal and ethical use of weapon systems that employ autonomy. Assessing such systems in the abstract ignores the considerations most relevant to their legal or ethical use.
As we discussed in our intervention this morning, the use of autonomy in weapon systems could enhance the way law of war principles are implemented in military operations. Moreover, it is expected that further developments in autonomous and semi-autonomous weapon systems will allow military forces to apply force more precisely and with less collateral damage than would be possible with existing systems.
Our sense is that characterizing systems through technological categories, or seeking to define concepts like “artificial intelligence,” would be especially ill-advised, both because diverse taxonomies along these lines already exist and because scientists and engineers continue to make technological advances. Such characterizations could soon be rendered obsolete by technological developments. Moreover, seeking to characterize systems based on the sophistication of the machine intelligence would incorrectly focus on the machine, rather than on what is important for the law – how human beings are using the weapon and what they expect it to do. For example, it is irrelevant under the law of war whether a rocket engine is powered by a solid fuel or a liquid propellant. Rather, the law of war is concerned with how that power is used in combat. Similarly, focusing on the sophistication of the “analytical engine” powering a weapon (e.g., what type of algorithm or method of machine learning is employed) risks ignoring the focus of the law – how humans will use that weapon (e.g., using the machine to select and engage targets without further intervention by a human operator).
Regarding the chair’s specific questions, we recommend focusing our discussions on the functions that autonomy performs, rather than considering autonomy as an attribute of a weapon system as a whole.
The factors that the chair noted – the environment of deployment, specific constraints on the time of operation, or the scope of employment over an area – can be important from the IHL/CCW perspective insofar as they contribute to the assessment of possible civilian casualties from the intended use of the weapon. These are important factors to consider in applying IHL requirements. However, we do not believe they are properly deemed characteristics of lethal autonomous weapon systems as such.
Lastly, we do not think that a differentiation between anti-personnel and anti-materiel weapons is a particularly meaningful distinction to draw from an IHL/CCW perspective.
We are happy to discuss in more detail our thinking on these issues.