
Readings: Autonomous Weapon Systems and Their Regulation

Tuesday, December 11, 2012 at 6:26 PM

This Lawfare post serves as a running list of links to articles, documents, and other materials related to the regulation and legal review of autonomous weapon systems (or increasingly automated weapon systems).  (Current as of February 18, 2013; this post will be updated periodically and runs in roughly reverse chronological order.)

  • Michael N. Schmitt and Jeffrey S. Thurnher, “Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict,” SSRN Draft, February 5, 2013 (forthcoming, Harvard National Security Journal).  This article, by two members of the international law department of the US Naval War College, discusses various issues related to lethal autonomous weapon systems.  It argues vigorously against recent calls by Human Rights Watch (see report below) and others for a sweeping, total international ban on autonomous weapon systems.  The law of armed conflict, the article begins, has “never been about ensuring a ‘fair fight’; rather, it comprises prohibitions, restrictions, and obligations designed to balance a State’s interests in effectively prosecuting the war (military necessity) with its interest in minimizing harm to those involved in a conflict (humanity).”  The legal question is therefore whether “autonomous weapon systems comply with the legal norms that States have put in place to achieve this balance.”  The article’s substantive conclusion is that whereas “some conceivable autonomous weapon systems might be prohibited as a matter of law, the use of others will be unlawful only when employed in a manner that runs contrary to the law of armed conflict’s prescriptive norms governing the ‘conduct of hostilities’.”  The article walks through important definitional, conceptual, and terminological distinctions in both law and engineering related to autonomous systems.  Its final conclusion is that an “outright ban of autonomous weapon systems is insupportable as a matter of law, policy, and operational good sense.”
  • Jeffrey S. Thurnher, “No One at the Controls: Legal Implications of Fully Autonomous Targeting,” Joint Force Quarterly 67 (National Defense University Press), October 2012.  Major Thurnher is a US Army JAG and faculty member at the US Naval War College.  This article, written for a non-legal military and policy audience, discusses the legal issues that might arise from fully autonomous targeting by future weapon systems.  On the strategic issues, the article argues that lethal autonomous robots (LARs) might provide the best counter to future asymmetric threats, since they can “operate faster than humans and achieve lethal outcomes even where there are no communications links.”  Because these technologies are “apt to prove attractive to a number” of states, the US should “act at once to secure a commanding capability in fully autonomous targeting.”  As to the legal issues of LARs, Thurnher says that legal concerns “do not appear to be a game-ender,” and that with “appropriate control measures, these unmanned systems will be safe, effective, and legal weapons as well as force multipliers.”
  • Jeffrey S. Thurnher, “The Law That Applies to Autonomous Weapon Systems,” ASIL Insights, Vol. 17, Issue 4, January 18, 2013 (American Society of International Law).  ASIL Insights provide short backgrounders for the general public, journalists, and nonspecialists on current topics in international law.  Major Thurnher is a US Army JAG and a faculty member at the Naval War College.  This short piece describes the legal elements of weapons review and discusses what they might mean as applied to autonomous, semi-autonomous, or gradually automating weapon systems.
  • Alan Backstrom and Ian Henderson, “New capabilities in warfare: an overview of contemporary technological developments and the associated legal and engineering issues in Article 36 weapons reviews,” SSRN Draft, October 22, 2012 (forthcoming, International Review of the Red Cross).  Backstrom is a mechanical engineer and Henderson is the author of a leading work on targeting law under Protocol I, The Contemporary Law of Targeting.  The increasing complexity of weapon systems, they write, requires “an interdisciplinary approach to the conduct of weapon reviews. Developers need to be aware of international humanitarian law principles that apply to the employment of weapons. Lawyers need to be aware of how a weapon will be operationally employed and use this knowledge to help formulate meaningful operational guidelines in light of any technological issues identified in relation to international humanitarian law.”  The paper calls for cooperative work among designers, legal and policy regulators, and end users in the field.  It adds that because details of a weapon’s capability are “often highly classified and compartmentalised, lawyers, engineers and operators need to work cooperatively and imaginatively to overcome security classification and compartmental access limitations.”
  • Michael N. Schmitt, International Law Department, US Naval War College, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics,” Harvard National Security Journal, online edition (February 5, 2013).  Professor Schmitt, a leading laws of war scholar at the Naval War College, wrote this relatively brief (24-page) paper as a reply to the HRW report, “Losing Humanity.”  The article analyzes weapons automation and autonomy through the lens of international humanitarian law, as well as the specific US practices for complying with international law obligations to review weapon systems, and measures HRW’s analysis against them.  The article finds that the “Human Rights Watch position” in favor of a preemptive ban on development, production, and use of autonomous weapon systems is “unlikely to find traction.”  It concludes that “autonomous weapon systems are not unlawful per se. Their autonomy has no direct bearing on the probability they would cause unnecessary suffering or superfluous injury, does not preclude them from being directed at combatants and military objectives, and need not result in their having effects that an attacker cannot control. Individual systems could be developed that would violate these norms, but autonomous weapon systems are not prohibited on this basis as a category.”
  • Human Rights Watch and the Harvard Law School International Human Rights Clinic, “Losing Humanity: The Case Against Killer Robots,” November 19, 2012.  This report makes HRW’s case to “[p]rohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument. Adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons. Commence reviews of technologies and components that could lead to fully autonomous weapons. These reviews should take place at the very beginning of the development process and continue throughout the development and testing phases.”
  • Ashton B. Carter, Deputy Secretary of Defense, “Autonomy in Weapon Systems,” Department of Defense Directive, Number 3000.09, November 21, 2012.  The Directive establishes “DoD policy and assigns responsibilities for the development and use of autonomous and semi-autonomous functions in weapon systems, including manned and unmanned platforms.”  It further “establishes guidelines designed to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements.”  “Fully autonomous” weapon systems are defined in the Directive as having the ability to independently both select a target and engage it; a “semi-autonomous” weapon system is one in which a human must make or authorize the selection of a target, while the machine can independently engage the target only once that selection has been made.  A key concept in the document – which is designed to be applicable across many different kinds of weapon systems over time and during their development – is that the “appropriate” level and kind of human involvement in the system will depend upon many factors and circumstances.
  • William Marra and Sonia McNeil, “Understanding ‘The Loop’: Regulating the Next Generation of War Machines,” 36 Harvard Journal of Law and Public Policy 3 (2013), Lawfare Research Paper Series 1-2012.  This paper (which inaugurated the Lawfare Research Paper Series in May 2012) provides a detailed explanation of the distinct technical meanings of “automation,” “autonomy,” “The Loop,” and other key concepts essential to intelligently regulating gradually automating weapon systems, using the evolution of today’s rapidly automating drones as an example.  The paper brings together international law, US government regulation and weapons review processes, engineering concepts, and military strategy in order to yield a precise, common vocabulary for debating the proper forms and modes of regulation of incrementally automating weapon systems.  In the language of engineers, say Marra and McNeil, “tomorrow’s drones are expected to leap from ‘automation’ to true ‘autonomy’.”  As a consequence, regulations for today’s drones must be “crafted with an eye towards tomorrow’s technologies. Yet today’s debates about humans and ‘the loop’ rely on language too imprecise to successfully analyze the relevant differences between drones and predecessor technologies.”  They conclude that needless confusion, and possibly policy error, pervade discussions about “when an advanced technological system is autonomous and what the implications of autonomy might be … language useful to the policymaking process has already been developed in the same places as drones themselves — research and engineering laboratories around the country and abroad.”
  • Kenneth Anderson and Matthew Waxman, “Law and Ethics for Robot Soldiers,” Policy Review, December-January 2012-13 (final published version at Policy Review here, and working draft with footnotes at SSRN here).  This brief (6,000-word) policy essay points out that while autonomous weapon systems will enter the battlefield, they will do so only incrementally, through small technological changes that are sometimes unrelated to weapons as such.  The essay rejects both a wait-and-see, “don’t try to regulate what doesn’t exist yet” attitude toward regulation of gradually automating weapon systems and a preemptive ban treaty of the kind urged by HRW.  It argues instead that the US should work to create transparent international expectations about the legal requirements of autonomous weapons through the promulgation of its internal norms, standards, and best practices.  These state-generated norms are not generally matters of international law, but rather what the US government would regard as a plausible way forward, on the basis of existing law of war norms, in an area of novel technological and other issues.  A wait-and-see attitude risks allowing technology to become locked into development paths that might lead to weapons less desirable from a legal and ethical standpoint.  The HRW-type ban treaty would unacceptably give up potentially large gains in humanitarian protection that might emerge from new technologies of automation over the coming decades.
  • Noel Sharkey, “America’s Mindless Killer Robots Must Be Stopped: The rational approach to the inhumanity of automating death by machines beyond the control of human handlers is to outlaw it,” Guardian, December 3, 2012.  UK artificial intelligence scholar Noel Sharkey has been a tireless voice arguing for what he regards as the limits of what machine programming can do – concluding that it will in fact fall short of the judgment necessary for autonomous weapons.  He has also been a leading voice calling for an international ban treaty, and it is not too much to say that HRW’s report simply takes up both his argument and his call for a ban.  In this opinion piece, published a week after HRW’s report, he makes the case.  It’s short, eloquent, and readable – perfect, for example, if you’re looking for a short non-technical piece to introduce the topic to a class or elsewhere; it’s an op-ed, however, and so it gives only one side of the argument.
  • Special Edition: Journal of Law, Information, and Science, “Laws Unmanned: Unmanned Vehicles: Legal, Social, and Ethical Issues,” 2011-12.  This special issue is not limited to autonomous weapon systems, but moves freely between remotely piloted drone aircraft and issues of autonomous weapon systems.  It has one section devoted to military uses of unmanned systems and a second devoted to civilian uses.  Authors include Noel Sharkey, Philip Alston, Mary Ellen O’Connell, and Markus Wagner, among others.