Autonomous Weapons Systems
Decoding the Defense Department’s Updated Directive on Autonomous Weapons
The update to the 2012 directive provides clarity and establishes transparent governance and policy, rather than making substantial changes.
As the use of drones has expanded, so too has interest in autonomous weapon systems. Drones, which are unmanned but remotely piloted, are not themselves autonomous weapons; autonomous weapons are characterized by their ability to cut humans “out of the loop.” A 2012 Department of Defense policy directive defines fully autonomous weapon systems as systems that, “once activated, can select and engage targets without further intervention by a human operator.” By contrast, weapons such as “fire-and-forget” missiles, which require no guidance after firing but will hit only targets pre-selected by a human, are sometimes described as “semi-autonomous.” Both semi-autonomous and fully autonomous weapon systems have triggered concerns that they will increase the toll on civilian life in wartime and reduce accountability for war crimes.
Latest in Autonomous Weapon Systems
To prevent an AI-enabled arms race resulting in semi-autonomous or fully autonomous nuclear weapons, the U.S. and other nuclear-armed states need to negotiate a non-proliferation treaty sooner rather than later.
On Jan. 25, the DoD updated its directive on “Autonomy in Weapons Systems,” the guiding document for U.S. development, implementation, and supervision of autonomous and semi-autonomous weapons systems.
Iranian scientist Mohsen Fakhrizadeh was reportedly assassinated using a remote-controlled machine gun. Such devices are, unfortunately, easy to construct.
On March 25-29, the U.N.’s Group of Governmental Experts (GGE) will meet for the third consecutive year to discuss developments and strategies in the field of lethal autonomous weapons systems (LAWS).
This month, the Defense Advanced Research Projects Agency (DARPA) will assess the first phase of its Explainable AI program, a multi-year, multi-million-dollar effort to enable artificial intelligence (AI) systems to justify their decisions.
As I discussed in a previous post, the Convention on Certain Conventional Weapons Group of Governmental Experts (GGE) on lethal autonomous weapons systems (LAWS) is meeting for the second time to discuss emerging issues in the area of LAWS.
Based on trends in advancing robotics technology, many experts believe autonomous—and even lethal autonomous—robots are an inevitable and imminent development.
Editor’s Note: One of the most successful NGO anti-war efforts was the campaign to ban landmines, which led to a treaty banning their use and production in 1997. Activists, not surprisingly, are using this model as they focus on other technologies, including autonomous weapons systems—and yes, here is a Terminator link.
For Lawfare readers interested in law and regulation of autonomous weapon systems (AWS), we’re pleased to note our new essay, recently posted to SSRN, “Debating Autonomous Weapon Systems, Their Ethics, and Their Regulation Under International Law.” It appears as a chapter in a just-published volume, The Oxford Handbook of Law, Regulation, and Technology, edited by Rog