U.S. Delegation Statement on “LAWS and Human Rights/Ethics”
April 14, 2016

The Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)
As delivered by Catherine Amirfar

Thank you, Ambassador. The United States is committed to ensuring the utmost respect for and adherence to human rights in the use of any weapon system, including potential LAWS. The use of LAWS does not change a State’s applicable legal obligations: all international humanitarian law and international human rights law obligations that would otherwise apply would continue to apply in any situation where LAWS are used. For example, in the context of law enforcement and border or crowd control, international human rights obligations would play a vital role in discussions about the use of LAWS. In addition, in deciding whether to authorize the export of any weapon system, including potential LAWS, the United States considers a potential recipient’s human rights record and the likelihood that the recipient would use those weapons to commit human rights abuses.

The United States also recognizes that it is important for the CCW discussions to take up the moral and ethical issues related to the use of autonomous capabilities in weapon systems. However, we must be very clear in distinguishing between ethical considerations on the one hand and legal requirements on the other. Otherwise, we run the risk of confusing an already complex discussion in a manner that would not be helpful in promoting concrete outcomes.

Generally speaking, adherence to ethical and moral norms will depend less on the inherent nature of a technology and more on its potential use by humans. As with any emerging technology, it is important to consider fully the ethical implications of how that technology might be used or misused. At the same time, we believe that it is important that our discussion of autonomy does not lead to categorical statements about good versus evil. History has taught us that technology can be used to promote humanitarian goals, and we should not preclude or diminish our ability to use technology to mitigate human suffering in times of armed conflict. Nor should we succumb to the fear that taking advantage of emerging technology renders a future vision of killer robots run amok inevitable.

Indeed, sophisticated weapons with some autonomous functions have been used to reduce the risk of civilian casualties in armed conflict, exceeding the capabilities of past weapons. For example, precision-guided munitions apply force with greater precision than human beings could achieve without such tools. Some munitions self-deactivate or self-destruct so as to reduce the risk they may pose to civilians and to a military’s own forces during or after hostilities. LAWS may continue that trend, particularly as human-machine collaboration develops to help humans make better decisions faster. Autonomy in weapons may also find use in peacekeeping operations and help to prevent the commission of atrocities and other serious IHL or human rights violations. It might be possible to develop autonomous systems for use in hazardous environments, for example in search and rescue missions.

A critical consideration is how cultural and ethical norms applicable to military conduct can function to promote humanitarian goals with respect to the use of autonomous weapons systems. Such norms are generally deep-seated in militaries and promote the exercise of caution and prudence in any potential integration of autonomy into military forces. For example, human-machine teaming in targeting has brought not only enhanced situational awareness to help reduce the immediate risk to soldiers, but also better discrimination and the ability to exercise tactical patience, where additional time can be taken to ensure accurate target identification and avoid civilian casualties. The U.S. Department of Defense will employ human-machine teaming in future systems like the next generation of fighter aircraft. We are testing systems in which a computer receives, correlates, and analyzes vast amounts of data and displays it to its human pilot in a helmet-mounted display. This display will enable the human to make better decisions.

Even as we try to capture the concrete benefits provided by autonomous systems, we must take all necessary steps to ensure that the use of potential LAWS adheres to legal requirements, upholds moral and humanitarian goals, and is not subject to misuse. This is why the United States considers it vital that States understand and implement a comprehensive weapons review process of the type we described yesterday. Such a process, properly implemented, should account not only for a weapon system’s positive contribution to compliance with applicable law and policy, but also for the potential for failures or misuse. It should also ensure that appropriate levels of human judgment are exercised over the use of force.

We remain committed to promoting discussions in the CCW around these core values.

Thank you.