We Robot 2012, a conference which took place last week at the University of Miami in Coral Gables, Florida, billed itself as "the inaugural conference on legal and policy issues relating to robotics." It covered multiple issues, including robotic law enforcement. This article focuses on one conference panel concerning the use of robotics in military actions, and on the papers related to that discussion. The current generation of remotely controlled lethal robotic drones is deployed in the Middle East; a human must target and control each action. The panel discussed the next generation: autonomous drones that identify targets by comparing sensor measurements against parameters in their programming, then attack those targets without human intervention. The panel focused on the use of this next generation as it relates to International Humanitarian Law.
Participating in the panel were specialists in the fields of robotics and military law:
- Markus Wagner, Associate Professor of Law, University of Miami School of Law. Paper: The Dehumanization of International Humanitarian Law: Legal, Ethical, and Political Implications of Autonomous Weapon Systems.
- Richard O'Meara, Brigadier General (U.S.A.), retired, Rutgers University. Paper: The Rules of War and the Use of Unarmed, Remotely Operated, and Autonomous Robotics Systems, Platforms and Weapons... Some Cautions
- Ian Kerr, Canada Research Chair in Ethics, Law and Technology, University of Ottawa, with coauthor Katie Szilagyi, J.D. Candidate 2012, University of Ottawa. Paper: Asleep at the Switch? How Lethal Autonomous Robots Become a Force Multiplier of Military Necessity
- Moderated by Bernard H. Oxman, Richard A. Hausler Professor of Law, University of Miami School of Law
- Not present, but submitting paper to the conference on this topic: Oren Gross, Irving Younger Professor of Law and Director, Institute for International Legal & Security Studies, University of Minnesota Law School. Paper: When Machines Kill: Criminal Responsibility for International Crimes Committed by Lethal Autonomous Robots.
We Robot 2012 Conference - Panel Presentation on Military Robotics - 2012-04-22. Left to right: Markus Wagner, Richard O'Meara, Katie Szilagyi, Ian Kerr, Bernard H. Oxman. Photo by scribillare.com - John Iacovelli
Defining the Next Generation
The key difference between this generation and the next will be autonomy. The current generation may fly or hover for hours with little or no supervision, but targeting and the attack require "a human in the loop." The next generation will be programmed to survey an area, acquire targets based upon parameters such as observation of weapons, and fire at a target without human intervention.
Though no time frame was given for the introduction of the next generation, Kerr and Szilagyi's paper reminds us that it took little time for "RoboWar" to move from computer screens and game consoles to a live military theatre. In 2009, the U.S. Air Force trained more pilots for remotely controlled aircraft than for conventional fighter aircraft. A Congressional Research Service report provides a good snapshot of the current state of autonomy in the U.S. drone fleet.
Gross' definition is specific and provides a clear line between remote control and the next generation of autonomous lethal weapons:
"It would be capable of identifying targets and carrying out military operations including, significantly, lethal attacks. It will have the decision-making capacity to identify a target, determine whether (or not) to use lethal force against it, decide what type of weapons (from a selection of payloads it carries) to use should it decide to attack etc."
This identify/decide/attack scenario can be illustrated with a situation common today, though today a human makes the decision. A drone returns a video image of a man carrying a weapon to a console thousands of miles away; the console operator clicks a mouse and fires a missile at the man.
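The identify/decide/attack structure described above can be sketched in pseudocode-style Python. This is purely illustrative: every name here (Detection, decide, the "rifle" label, the confidence threshold) is hypothetical and does not represent any real system; it exists only to make visible where the decision points sit once the human is removed from the loop.

```python
# Purely illustrative sketch of the identify/decide/attack loop described
# in the text. All names and thresholds are hypothetical assumptions; no
# actual weapons system or classifier is represented here.

from dataclasses import dataclass

@dataclass
class Detection:
    object_type: str   # classifier label, e.g. "rifle", "camera", "pipe"
    confidence: float  # classifier confidence, 0.0 to 1.0

def decide(detection: Detection, threshold: float = 0.9) -> str:
    """Return the action a hypothetical autonomous system would take."""
    # Step 1: identify -- does the detected object match a weapon parameter?
    if detection.object_type != "rifle":
        return "no_action"
    # Step 2: decide -- is confidence high enough to treat the carrier as
    # a combatant? This is exactly where a camera or a metal pipe mistaken
    # for a rifle, or a rifle carried by a non-combatant, breaks the logic.
    if detection.confidence < threshold:
        return "refer_to_human"
    # Step 3: attack -- select a payload and engage (here, just a label)
    return "engage"
```

Note that the classifier's label and confidence carry the entire burden of discrimination: a high-confidence misclassification (the "Collateral Murder" scenario) produces "engage" with no further check.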
There are a number of problems with the identification process. In some parts of the world, such as Afghanistan, men carry rifles in everyday life without being combatants. In other parts, men carry rifles to hunt for food. Further, if a man is carrying a metal pipe or some other tool that could be mistaken for a rifle, how does a drone or remote operator tell the difference? Recall the "Collateral Murder" video, in which a Reuters correspondent carrying a video camera was mistaken by a human operator for a man carrying a rocket launcher.
International Humanitarian Law (IHL)
As the panel pointed out, two articles of Geneva Protocol I apply: Articles 48 and 36.
Article 48. Basic rule: In order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.
Article 36. New weapons: In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.
O'Meara noted that hundreds of thousands of civilians are alive today because operators were taught to comply with IHL restrictions. He noted that the rules of warfare protect both sides' soldiers.
The terms "proportionality" and "discrimination" (as in distinguishing between civilian and combatant) are the key tests with regard to the use of drones (or indeed, any weapons) under IHL. Proportionality allows for a certain amount of civilian casualties: the measure is whether a military attack causes excessive civilian casualties in relation to the military advantage gained. Discrimination means that forces must discriminate between civilians and combatants, and between civilian objects and military targets.
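The two tests described above can be caricatured as a pair of boolean checks. This is an assumption-laden simplification: reducing "military advantage" and "expected civilian harm" to comparable scalars is precisely what IHL does not do (the judgments are qualitative and contextual), which is part of why delegating them to a program is contentious.

```python
# Illustrative simplification of the two IHL tests discussed in the text.
# Treating harm and advantage as comparable numbers is a hypothetical
# modeling assumption, not how the law actually frames the judgment.

def discriminates(target_is_combatant: bool) -> bool:
    """Discrimination: only combatants/military objectives may be targeted."""
    return target_is_combatant

def proportional(expected_civilian_harm: float, military_advantage: float) -> bool:
    """Proportionality: civilian harm must not be excessive relative to
    the concrete and direct military advantage anticipated."""
    return expected_civilian_harm <= military_advantage

def strike_permissible(target_is_combatant: bool,
                       expected_civilian_harm: float,
                       military_advantage: float) -> bool:
    # Both tests must pass; failing either makes the strike unlawful.
    return (discriminates(target_is_combatant)
            and proportional(expected_civilian_harm, military_advantage))
```

A drone "doing the quantifying and qualifying" would have to compute these inputs itself, which is where the legal questions in the next paragraphs arise.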
In a situation in which the drone does the quantifying (proportionality) and qualifying (discrimination), how would it be determined that a drone transgresses IHL? In the simplest case, if a drone were not programmed to discriminate between civilians and combatants at all, it would be little more than a mine or a cluster bomb; its use would be illegal, as noted by Wagner. What should happen to the drone, its operators, the strategists, and the political decision makers in the event of such a transgression?
Wagner notes a situation in which a general located on another continent watches hours of video of a compound. Civilians are visible. Insurgents enter and leave the compound openly carrying weapons. The general reasons that the presence of the weapons should have been a signal to the civilians that the compound might be a target. He authorizes the strike.