
Opening Statement at GGE on Lethal Autonomous Weapons Systems (LAWS)
August 3, 2021

U.S. Opening Statement at the Group of Governmental Experts on Lethal Autonomous Weapons Systems 

Delivered by Amanda Wall

1st session of the 2021 Group of Governmental Experts (GGE) on emerging technologies in the area of lethal autonomous weapons systems (LAWS)

Geneva, August 3, 2021


Thank you, Ambassador Pecsteen.  Let me first thank you and your team for the leadership you’ve provided this year and for facilitating such a useful and productive process so far.  We are grateful that the circumstances have permitted us to meet formally again after the hiatus.  Although we’ve all been hard at work during the last 18 months, it is a pleasure to be conducting our work side-by-side again.

Our goal as a delegation for this session, and I hope our shared goal as a GGE for these two weeks, is simple: we seek to make as much progress as possible toward consensus on substantive conclusions or recommendations on aspects of the normative and operational framework.  This is the prime opportunity for us to do so.  We look forward to seizing this chance to make as much substantive progress as possible, and urge delegations to remain focused on fulfilling our present mandate.  Our future mandate is necessary and important, but we will be in a better position to see what that mandate should look like after we have done more of the substantive work that High Contracting Parties have already identified in their recent contributions during informal meetings and in written submissions, which are also before this formal meeting for consideration.

To that end, during our informal meetings in June and July, the U.S. Delegation presented a joint paper on behalf of Australia, Canada, Japan, the United Kingdom, and the United States, which recommended that the GGE focus on four aspects of the normative and operational framework on emerging technologies in the area of LAWS.  Those four aspects are: (1) Application of IHL; (2) Human Responsibility; (3) Human-Machine Interaction; and (4) Weapons Reviews.  We continue to believe that focusing our discussions on those four key aspects would be a productive way to advance our work with an eye toward the Review Conference later this year.  Our joint paper shows how much progress the GGE has already made in achieving consensus in these four aspects of the framework, but we believe the GGE can do more to achieve consensus on additional substantive conclusions or recommendations in advance of the Review Conference.  With that in mind, in June, the United States also submitted a working paper proposing additional conclusions under each of these four elements.  I’ll summarize them briefly here, and we hope to discuss these proposed conclusions in greater detail over the course of this week.

First, we believe that a critical aspect of the GGE’s work on the normative and operational framework is to articulate the limits that existing IHL places on the use of emerging technologies in the area of LAWS.  How IHL requirements are implicated could depend on how a weapon system is to be used.  Therefore, if we want to articulate more clearly the limits that IHL places on the use of emerging technologies in the area of LAWS in military operations, a necessary step is to specify the different ways these technologies can be used in military operations.  Consequently, under “application of IHL,” we have proposed clarifying how IHL requirements apply to three general scenarios for the use of autonomous functions in weapon systems:  1) homing munitions that involve autonomous functions; 2) decision support tools that can inform decision-making about targeting; and 3) weapon systems that rely on autonomous functions to select and engage targets.

Under this same aspect, we have also proposed identifying ways in which emerging technologies in the area of LAWS could be used to reduce the risk of harm to civilians in military operations and to strengthen compliance with IHL.  As many delegations have noted, new technologies can create humanitarian benefits, such as increasing a commander’s awareness of civilians on the battlefield; improving the commander’s ability to assess the risk of collateral damage; improving the precision and accuracy of weapons; and automatically disabling munitions that miss their targets.  A critical part of the normative and operational framework is using technology to advance the objects and purposes of the CCW.

Second, under “human responsibility,” we have proposed a number of conclusions about how existing international legal principles of responsibility apply to human conduct and decisions involving emerging technologies in the area of LAWS.  From a normative perspective, it is important to reaffirm that these general principles of legal responsibility apply in this specific context and to reject the notion that new technologies can be used to evade responsibility or to create an “accountability gap.”  From an operational perspective, we have also proposed a number of good practices that the GGE could endorse to promote accountability in military operations involving the use of emerging technologies in the area of LAWS.  These include general practices – like conducting operations under a clear operational chain of command and establishing and using procedures for the reporting of incidents involving potential violations.  They also include practices specific to the use of weapons systems – like rigorous testing of and training on the weapon system so commanders and operators understand the likely effects of employing the weapon system.

Third, under “human-machine interaction,” our working paper proposes a number of practices that the GGE could endorse that would help ensure that weapon systems based on emerging technologies in the area of LAWS effectuate the intent of commanders and operators to comply with IHL, in particular, by avoiding unintended engagements and minimizing harm to civilians and civilian objects.  For example, testing and evaluation of weapon systems during development, training of personnel, and clear human-machine interfaces can reduce the risk of accidents or unintended engagements in military operations.  These measures are drawn from U.S. military practice in developing and using autonomous and semi-autonomous weapon systems, and emphasize that what constitutes appropriate human-machine interaction at one stage of the life-cycle can depend on human-machine interaction at other stages in the life-cycle.  A one-size-fits-all approach would oversimplify this issue, in our view.

Finally, under the fourth aspect, we have proposed that the GGE endorse guidelines and good practices for conducting legal reviews of weapon systems based on emerging technologies in the area of LAWS.  These include, for example, conducting legal reviews when weapon systems are modified, or when new concepts for use of existing weapons are being developed.

We hope these proposals can serve as a starting point for real substantive discussions over the next two weeks.  We support a strong, substantive outcome for the Review Conference this year, and the best way to achieve that, in our view, is to find the areas of substantive consensus and develop them as fully as possible this week and next week.  We believe there is emerging consensus to focus on these four areas in order to achieve that goal.  In that common effort, we look forward to working from these areas and to discussing the substantive proposals we have put forth, as well as those that other delegations have put forth.

Thank you.