Volume 48

Of Robots and Rules: Autonomous Weapon Systems in the Law of Armed Conflict

by Michael Press

In September 2016, hoping to quiet fears about “killer robots,” then-Defense Secretary Ashton Carter promised that whatever weapons systems the United States might develop in the future, “there’s always going to [be] human judgment and discretion.” Carter’s policy assurance on the future development of autonomous weapons systems (AWS) was clearly informed by ethical considerations, but it also had a distinct basis in the law of armed conflict (LOAC), embodied in various international treaties. This Note analyzes the legal regime governing the construction and use of AWS. To do so, it examines comparable weapons systems the U.S. military has employed in operations across the globe, current U.S. doctrine on robotic autonomy in weapons, and the arguments for and against fielding AWS. It then surveys LOAC and international humanitarian law (IHL) principles through the lens of the legal review that should precede any weapon’s lawful deployment in a battlespace. That legal review is then applied to AWS to determine whether these weapons systems should be legally prohibited and how certain uses should be restricted. This Note asserts that nothing in the law fundamentally prohibits the use of AWS in combat. Rather, as with other weapons, it is the implementation of AWS that could come into conflict with LOAC and IHL. Recommendations for creating and using AWS consistent with international legal principles are interspersed throughout the Note and summarized in the conclusion. Key recommendations include limiting the use of AWS to situations where a system can reliably and predictably abide by the core principles of LOAC, and establishing standards and guidelines to ensure that AWS are fielded in such a manner.
