The American Society of International Law has released a new “ASIL Insight” on the law applicable to autonomous weapon systems. (ASIL Insights are short, descriptive pieces on topical issues, intended as non-technical “backgrounders” for journalists, the general public, and anyone looking for a quick path into an international law topic; they represent solely the author’s views, but are written to give readers a grounding in the underlying legal issues.)
“The Law That Applies to Autonomous Weapon Systems” is written by Jeffrey S. Thurnher, a JAG officer on the faculty of the Naval War College; it is short, crisp, and a useful guide to the legal issues raised by increasingly automated weapon systems that might one day be fully autonomous. (Also recommended is Major Thurnher’s more detailed article, “No One at the Controls: Legal Implications of Fully Autonomous Targeting,” Joint Force Quarterly (National Defense University, Washington DC), No. 67, Oct. 2012.) The ASIL Insight is organized around two fundamental legal questions:
It is incontrovertible that the law of armed conflict applies to autonomous weapon systems. When determining the overall lawfulness of a weapon system, there are two distinct aspects of the law that need to be analyzed: weapons law and targeting law. The former verifies that the weapon itself is lawful. The latter determines whether the use of the weapon system during hostilities might be prohibited in some manner under the law of armed conflict. A weapon must satisfy both aspects before it may be lawfully used on a battlefield.
Weapons law, the article goes on, can be broken down into three fundamental legal requirements:
When analyzing whether the weapon system itself is lawful, there are two distinct rules that apply. The first rule is that the weapon system must not be indiscriminate by its very nature. A weapon is deemed indiscriminate by nature if it cannot be aimed at a specific target and would be as likely to strike civilians as combatants. Found in Article 51(4)(b) of Additional Protocol I to the Geneva Conventions, the rule is considered to be reflective of customary international law. Accordingly, all states, even those not a party to the Protocol (such as the United States), are bound to comply with this customary law rule against indiscriminate attack. The mere fact that an autonomous weapon system rather than a human might be making the final targeting decision would not render the weapon indiscriminate by nature. Instead, as long as it is possible to supply the autonomous system with sufficiently reliable and accurate data to ensure it can be aimed at a military objective, then the system would not be deemed indiscriminate by nature. In the end, any proposed autonomous weapon system must comply with this provision to be lawful.
The second rule, codified in Article 35(2) of Additional Protocol I, is that a weapon system cannot cause unnecessary suffering or superfluous injury. This rule, which is also reflective of customary international law, seeks to prevent needless or inhumane injuries to combatants. A classic example of an unlawful weapon under this rule is a warhead that is filled with glass. Such a warhead would unnecessarily complicate medical treatment and would consequently be unlawful. This rule only presents a problem for an autonomous system if the specific warheads or weapons installed on the system would violate the rule. The fact that the system autonomously decides to engage a target does not itself affect or violate the prohibition on unnecessary suffering or superfluous injury. To potentially be deemed lawful, a fully autonomous weapon system must only be armed with weapons and ammunition that comply with this rule.
To verify compliance with the two rules listed above, a state intent on fielding a new weapon must conduct a thorough legal review. This requirement, which appears in Article 36 of Additional Protocol I, ensures that the weapon is not indiscriminate and that it would not cause unnecessary suffering or superfluous injury. The review also determines whether any other provision of the law of armed conflict would prohibit the use of the weapon. Customary law requires this legal review of weapons and weapon systems (also referred to as means of warfare), so such reviews are required of all states, including those not party to the Protocol. The DoD Directive [3000.09, “Autonomy in Weapon Systems”] sets forth the U.S. policy of requiring such a review at the early stages of development and again just prior to actually fielding the system. Furthermore, if a weapon system is significantly modified after its initial fielding, an additional review is necessary. The development of any fully autonomous weapon system would clearly require such legal reviews.
(For sources and links on autonomous weapons systems and the legal and ethical debates around them, see this periodically updated Lawfare Readings post. – His Serenity, The Book Reviews Editor.)