In the past few months there has been a debate about how best to address the development of completely autonomous weapon systems (aka "killer drones"): should we impose an outright ban, as proposed by Human Rights Watch in Losing Our Humanity: The Case Against Killer Robots, or is the Law of Armed Conflict enough, as argued in a forthcoming article by Michael Schmitt and Jeffrey Thurnher?
Kenneth Anderson offers a very useful summary of recent writing on autonomous weapon systems at Opinio Juris that is well worth a read. Here is his analysis of the differences between the Human Rights Watch approach and that taken by the Department of Defense:
Last November, two documents appeared within a few days of each other, each addressing the emerging legal and policy issues of autonomous weapon systems – and taking strongly incompatible approaches. One was from Human Rights Watch, whose report, Losing Our Humanity: The Case Against Killer Robots, made a sweeping, provocative call for an international treaty ban on the use, production, and development of what it defined as “fully autonomous weapons.” Human Rights Watch has followed that up with a public campaign for signatures on a petition supporting a ban, as well as a number of publicity initiatives that (I think I can say pretty neutrally) seem as much drawn from sci-fi and pop culture as anything. It plans to launch this global campaign at an event at the House of Commons in London later in April.
The other was the Department of Defense Directive, “Autonomy in Weapon Systems” (3000.09, November 21, 2012). The Directive establishes DOD policy and “assigns responsibilities for the development and use of autonomous and semi-autonomous functions in weapon systems … [and] establishes guidelines designed to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems.”
By contrast to the sweeping, preemptive treaty ban approach embraced by HRW, the DOD Directive calls for a review and regulatory process – in part an administrative expansion of the existing legal weapons review process within DOD, but reaching back to the very beginning of the research and development process. In part it aims to ensure that whatever level of autonomy a weapon system might have, and in whatever component, the autonomous function is intentional and not inadvertent, and has been subjected to design, operational, and legal review to ensure that it both complies with the laws of war in the operational environment for which it is intended – and will actually work in that operational environment as advertised. (The DOD Directive is not very long, and makes the most sense, if you are looking for an introduction into DOD’s conceptual approach, read against the background of a briefing paper issued earlier, in July 2012, by DOD’s Defense Science Board, The Role of Autonomy in DOD Systems.)
Read it all here. What do you think?
Charles A. Blanchard
United States Air Force