Convention on Certain Conventional Weapons (CCW)
Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS)
Intervention on Appropriate Levels of Human Judgment over the Use of Force delivered by John Cherry
Geneva, November 15, 2017
We thank you for the “food for thought” paper that you circulated prior to this GGE and believe it raises some excellent questions, one of which we’d like to address.
Your paper asks whether potential LAWS could be accommodated under existing chains of military command and control. We believe the answer to this question is yes.
As with all weapons systems, commanders must authorize the use of lethal force against an appropriate military objective. That authorization is made within the bounds established by the rules of engagement and international humanitarian law (IHL) based on:
- The commander’s understanding of the tactical situation, informed by his or her training and experience
- The weapon system’s performance, informed by extensive weapons testing as well as operational experience; and
- The employment of tactics, techniques, and procedures for that weapon
In all cases, the commander is accountable and has the responsibility for authorizing weapon release in accordance with IHL. Humans do, and must, play a role in authorizing lethal force.
States will not develop and field weapons that they cannot control. Uncontrollable weapons are not militarily useful. So, although the thought of uncontrollable weapons or machines may be prevalent in the popular imagination, we do not think that this is a realistic issue for States to consider in their work in the CCW.
Rather than focusing on the controllability of the weapon system, the United States has used the phrase “appropriate levels of human judgment over the use of force” in these discussions. This standard is used in U.S. Department of Defense policy concerning the use of autonomy in weapon systems, which requires that autonomous and semi-autonomous weapon systems “be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”
In our view, ensuring “appropriate levels of human judgment over the use of force” more accurately frames the question that States should consider:
- First, this formulation focuses on human beings to whom IHL applies, rather than suggesting a new requirement that machines make legal determinations. As we discussed in one of our working papers, IHL does not require that a weapon determine whether the target is a military objective, but rather that the weapon be capable of being employed consistent with the principle of distinction by a human operator.
- Second, the formulation “appropriate levels of human judgment over the use of force” also reflects the fact that there is no fixed, one-size-fits-all level of human judgment, or single “minimum level of control,” that should be applied to every weapon system. Different weapon systems and different operational contexts mean that the appropriate level of human judgment can differ across weapon systems and even across different functions in a weapon system. Some functions might be better performed by a computer than by a human being, while other functions should be performed by humans.
Thank you, Mr. Chair.