Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS) – Agenda item 5(c)
September 30, 2020

Further consideration of the human element in the use of lethal force and aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of LAWS

Statement
September 11, 2020

The United States recognizes the keen interest many GGE participants have expressed in discussing further the human element and aspects of human-machine interaction in the development and use of emerging technologies in the area of LAWS. We agree that these are topics upon which further common understandings can and should be built, given the range of views currently expressed by various States. We believe that Guiding Principle (c) is an excellent basis on which we can build these additional common understandings. This guiding principle recognizes that human-machine interaction should ensure IHL compliance, and also recognizes the need to consider human-machine interaction comprehensively, across the life cycle of the weapon system. Therefore, in our view, a positive next step for the GGE in this area would be to elaborate on good practices in human-machine interaction that can strengthen compliance with IHL.

In our commentary on Guiding Principle (c), the United States proposed a new conclusion on human-machine interaction for the GGE’s consideration, along these lines. It begins by stating that:

“Weapons systems based on emerging technologies in the area of LAWS should effectuate the intent of commanders and operators to comply with IHL, in particular, by avoiding unintended engagements and minimizing harm to civilians and civilian objects.”

This conclusion is drawn from real-world practice in human-machine interaction and also recognizes that IHL imposes requirements on human beings. Therefore, good practices in human-machine interaction to strengthen compliance with IHL should effectuate human beings’ intent to comply with IHL. Our commentary then goes on to elaborate three categories of measures to effectuate this objective:

a. Weapons systems based on emerging technologies in the area of LAWS should be engineered to perform as anticipated. This should include verification and validation, and testing and evaluation, before fielding systems.

b. Relevant personnel should properly understand weapons systems based on emerging technologies in the area of LAWS. Training, doctrine, and tactics, techniques, and procedures should be established for the weapon system. Operators should be certified by relevant authorities as having been trained to operate the weapon system in accordance with applicable rules. And,

c. User interfaces for weapons systems based on emerging technologies in the area of LAWS should be clear in order for operators to make informed and appropriate decisions in engaging targets. In particular, the interface between people and machines for autonomous and semi-autonomous weapon systems should: (i) be readily understandable to trained operators; (ii) provide traceable feedback on system status; and (iii) provide clear procedures for trained operators to activate and deactivate system functions.

We are interested in the views of GGE participants on these proposed new conclusions, which reflect good practices that can strengthen compliance with IHL. These proposed new conclusions provide the basis for more detailed discussion regarding good practices in engineering weapons systems, training personnel, and human-machine interfaces. We hope the GGE can productively articulate good practices that can strengthen compliance with IHL as a means of developing consensus recommendations on the issue of human-machine interaction.

Human responsibility is a critical dimension of the human element in the use of lethal force.

We also believe it would be productive for the GGE to address how well-established international legal principles of State and individual responsibility apply to States and persons who use weapon systems with autonomous functions. In its commentary on Guiding Principle (b), the United States has proposed eight new conclusions along these lines for the GGE’s consideration.

1. Under principles of State responsibility, every internationally wrongful act of a State, including such acts involving the use of emerging technologies in the area of LAWS, entails the international responsibility of that State.

2. A State remains responsible for all acts committed by persons forming part of its armed forces, including any such use of emerging technologies in the area of LAWS, in accordance with applicable international law.

3. An individual, including a designer, a developer, an official authorizing acquisition or deployment, a commander, or a system operator, is responsible for his or her decisions governed by IHL with regard to emerging technologies in the area of LAWS.

4. Under applicable international and domestic law, an individual remains responsible for his or her conduct in violation of IHL, including any such violations involving emerging technologies in the area of LAWS. The use of machines, including emerging technologies in the area of LAWS, does not provide a basis for excluding legal responsibility.

5. The responsibilities of any particular individual in implementing the obligations of a State or a party to a conflict under IHL may depend on that person’s role in the organization or in military operations, including whether that individual has the authority to make the decisions and judgments necessary to the performance of that duty under IHL.

6. Under IHL, a decision, including decisions involving emerging technologies in the area of LAWS, must be judged based on the information available to the decision-maker at the time and not on the basis of information that subsequently becomes available.

7. Unintended harm to civilians and other persons protected by IHL from accidents or equipment malfunctions, including those involving emerging technologies in the area of LAWS, is not a violation of IHL as such. And,

8. States and parties to a conflict have affirmative obligations with respect to the protection of civilians and other classes of persons under IHL, which continue to apply when emerging technologies in the area of LAWS are used. These obligations are to be assessed in light of the general practice of States, including common standards of the military profession in conducting operations.

We look forward to discussing these and other proposals with other delegations.

Resources

· Human-Machine Interaction in the Development, Deployment, and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems – United States (CCW/GGE.2/2018/WP.4). https://unog.ch/80256EE600585943/(httpPages)/7C335E71DFCB29D1C1258243003E8724?OpenDocument

· Remarks by Karl Chang and Amanda Wall, Rio Seminar on Autonomous Weapon Systems, February 2020 (forthcoming).