Meeting of the Group of Governmental Experts of the High Contracting Parties to the CCW on Lethal Autonomous Weapons Systems (LAWS)

Statement by the United States Delegation
As delivered by Shawn Steene, Senior Force Planner, U.S. Department of Defense

Geneva, August 27, 2018

Thank you to the Chair for his excellent leadership, and to the other delegations for their thoughtful comments and working papers.  The good discussion that we’ve had, including valuable contributions by military and technical experts, reaffirms our view that the CCW is the right forum for these discussions.

The law of war provides a robust and coherent system of regulation for the use of all weapons, including weapons with autonomous functions.  For example, the U.S. Department of Defense reviews the legality of all weapons that it acquires for consistency with international humanitarian law.

Therefore, it is not necessary or desirable, at this time, to define LAWS.  However, it is useful to develop a common understanding of the concepts involved.

As the working paper from Finland and Estonia pointed out, the distinctions between “automated” and “autonomous” functions are not clear cut.

Similarly, there is no clear technological separation point between systems that are “autonomous” and “fully autonomous.”

Autonomy is fundamentally a tool.  It is a tool that militaries can use to better ensure that a weapon system achieves the military effects intended by the commander.  As stated in our working paper, the use of “smart” weaponry with autonomous functions has increased the degree of control that States exercise over the use of force.  Of course, humans cannot maintain complete control over weapons at all times.  Once a bullet leaves a gun, the rifleman ceases to have control over that bullet.  Autonomy is a way of extending human control beyond the time a munition is deployed.  The use of technology, such as sensors and artificial intelligence, allows personnel to set the parameters for when, where, and how force is deployed.  This means that commanders are more likely to achieve the military effects that they intend and that autonomy can enhance human control over the use of force.

With respect to the characterization of lethal autonomous weapons, and the use of autonomy in the functions that relate to selecting and engaging targets, we would emphasize that autonomy in these functions is not inherently problematic and should not be stigmatized.  LAWS should not be defined with the purpose of banning or stigmatizing LAWS or emerging technologies.  As we noted in our working paper for the April session, and in the expert discussion this morning, these technologies have potential humanitarian and safety benefits.

Perhaps more important than specific characteristics is context: the legality and ethics of employing weapon systems, including weapon systems that use autonomy in functions like selecting and engaging targets, depend on the context in which they are employed.

A given weapon can be used lawfully and legitimately in one operational context, but not in another.

Thus, it is important to consider the processes that ensure that weapons are used lawfully and ethically.  These include: testing, evaluation, and legal reviews during development; training of personnel and the establishment of doctrine for the proper use of such weapons; and strategy, policy, and rules of engagement.

Moreover, regardless of the specific characteristics of the weapon system, it is essential to ensure that force is being used to effectuate the intentions of commanders and operators of weapon systems.  Practical steps should be taken to ensure that an appropriate level of human judgment can be exercised over the use of force and to reduce the risk of unintended engagements.

In general, this requires that rigorous testing under realistic conditions be applied to weapons that make use of autonomy in critical functions relating to the use of force – particularly target selection and weapon release.

It also requires that human commanders and operators be properly trained on the system’s performance and understand the doctrine for its proper use.