A new project addresses how artificial intelligence might change how states decide to use force against one another.
Ashley Deeks is a Professor of Law at the University of Virginia Law School. She joined the Virginia faculty in 2012 after two years as an academic fellow at Columbia Law School. She served for ten years in the Legal Adviser's Office at the State Department, most recently as the Assistant Legal Adviser for Political-Military Affairs. In 2007-08 she held an International Affairs Fellowship from the Council on Foreign Relations. After graduating from the University of Chicago Law School, she clerked for Judge Edward Becker on the U.S. Court of Appeals for the Third Circuit.
There is value in putting down a marker that using the technology this way is not acceptable.
The United Kingdom may be on the cusp of new facial recognition software regulations, but for now the technology is developing faster than the government’s ability to ensure its responsible use.
In the context of both criminal justice and military operations, predictive algorithms are likely to be used for a similar purpose: making individual predictions about dangerousness and anticipating the location of future acts of violence.
Governments and businesses are using facial recognition software more and more often. The costs and benefits extend beyond what’s in plain sight.
After Prime Minister Theresa May referred to “unlawful use of force” in her speech concerning the poisoning of Sergei Skripal, it is worth clarifying the possible role of NATO and the range of potential British actions.
The U.S. needs to start thinking about how to respond to other countries’ domestic surveillance practices.