We Robot 2012, a conference which took place last week at the University of Miami in Coral Gables, Florida, billed itself as "the inaugural conference on legal and policy issues relating to robotics." It covered multiple issues, including robotic law enforcement. The focus of this article is one conference panel on the use of robotics in military actions, and the papers related to that discussion. The current generation of remotely controlled lethal robotic drones, deployed in the Middle East, requires a human to target and control each action. The panel discussed the next generation: autonomous drones that identify targets by comparing measurements to parameters in their programming, then attack those targets without human intervention. The panel focused on how this next generation relates to International Humanitarian Law.
Participating in the panel were specialists in the fields of robotics and military law:
Ian Kerr, Canada Research Chair in Ethics, Law and Technology, University of Ottawa; co-author, Katie Szilagyi, J.D. Candidate 2012, University of Ottawa. Paper: Asleep at the Switch? How Lethal Autonomous Robots Become a Force Multiplier of Military Necessity
Moderated by Bernard H. Oxman, Richard A. Hausler Professor of Law, University of Miami School of Law
Not present, but submitting paper to the conference on this topic: Oren Gross, Irving Younger Professor of Law and Director, Institute for International Legal & Security Studies, University of Minnesota Law School. Paper: When Machines Kill: Criminal Responsibility for International Crimes Committed by Lethal Autonomous Robots.
We Robot 2012 Conference, panel presentation on military robotics, April 22, 2012. Left to right: Markus Wagner, Richard O'Meara, Katie Szilagyi, Ian Kerr, Bernard H. Oxman. Photo by John Iacovelli, scribillare.com
Defining the Next Generation
The key difference between this generation and the next will be autonomy. The current generation may fly or hover for hours with little or no supervision, but targeting and the attack require "a human in the loop." The next generation will be programmed to survey an area, acquire targets based upon parameters such as observation of weapons, and fire at a target without human intervention.
Though no time frame was given for the introduction of the next generation, Kerr and Szilagyi's paper reminds us that it took little time for "RoboWar" to move from computer screens and game consoles to a live military theatre. In 2009, the U.S. Air Force trained more remotely controlled aircraft pilots than actual fighter pilots. A Congressional Research Service report provides a good snapshot of the current state of autonomy in the U.S. drone fleet.
Gross' definition is specific and provides a clear line between remote control and the next generation of autonomous lethal weapons:
"It would be capable of identifying targets and carrying out military operations including, significantly, lethal attacks. It will have the decision-making capacity to identify a target, determine whether (or not) to use lethal force against it, decide what type of weapons (from a selection of payloads it carries) to use should it decide to attack etc."
This identify/decide/attack scenario can be illustrated with a situation common today, though currently the human makes the decision. A drone returns a video image of a man carrying a weapon to a console thousands of miles away; the console operator clicks a mouse and fires a missile at the man with the weapon.
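In the crudest form, the identify/decide/attack sequence Gross describes could be sketched as a rule chain. Everything in this sketch is hypothetical and invented for illustration: the sensor labels, the confidence threshold, and the payload names correspond to no real targeting system.

```python
# Hypothetical sketch of an identify/decide/attack rule chain.
# All labels, thresholds, and payload names are invented for illustration.

def decide(track):
    """Return an action for one sensor track: attack with a payload, or hold."""
    # Identify: does the object match a programmed target parameter?
    if track["label"] != "person_with_weapon":
        return ("hold", None)
    # Decide: is the classifier confident enough to use lethal force?
    if track["confidence"] < 0.95:
        return ("hold", None)
    # Choose a payload from those carried, based on target type.
    payload = "small_missile" if track["vehicle"] is None else "anti_vehicle_missile"
    return ("attack", payload)

track = {"label": "person_with_weapon", "confidence": 0.97, "vehicle": None}
print(decide(track))  # ("attack", "small_missile")
```

The difficulty discussed below hides entirely inside that `label` and `confidence`: a rifle, a metal pipe, and a video camera can produce very similar sensor signatures.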
There are a number of problems with the identification process. In some parts of the world, such as Afghanistan, men carry rifles in everyday life without being combatants. In other parts, men carry rifles to hunt for food. Further, if the man is carrying a metal pipe or some other tool that could be mistaken for a rifle, how does a drone or remote operator tell the difference? Recall the "Collateral Murder" video in which a Reuters correspondent carrying a video camera was mistaken for a man carrying a rocket launcher by a human operator.
International Humanitarian Law (IHL)
Two articles of Geneva Protocol I apply here, as the panel pointed out: Articles 48 and 36.
Article 48. Basic rule: In order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.
Article 36. New weapons: In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.
O'Meara noted that hundreds of thousands of citizens are alive today because operators were taught to comply with IHL restrictions. He noted that rules of warfare protect both sides' soldiers.
The terms "proportionality" and "discrimination" (as in distinguishing between civilian and combatant) are the key tests in regard to the use of drones (or indeed, any weapons) under IHL. Proportionality allows for a certain amount of civilian casualties: the measure is whether a military attack causes excessive civilian casualties in relation to the military advantage gained. Discrimination means that forces must distinguish between civilians and combatants, and between civilian objects and military targets.
In a situation in which the drone does the quantifying (proportionality) and qualifying (discrimination), how would it be determined that a drone transgresses IHL? In the simplest case, if a drone were not programmed at all to discriminate between civilians and combatants, it would be little more than a mine or cluster bomb. Its use would be illegal, as noted by Wagner. What should happen to the drone, its operators, the strategists, and the political decision makers in the event of such a transgression?
Wagner notes a situation in which a general located on another continent watches hours of video of a compound. Civilians are visible. Insurgents enter and leave the compound openly carrying weapons. The general reasons that the presence of the weapons should have been a signal to the civilians that the compound might be a target. He authorizes the strike.
How would a programmer encode the protocols defining proportionality in such a decision making process? Would there be a mathematical proportion of weapons-carrying-persons vs. non-weapons-carrying persons set as a minimum level? What happens to a measure like this in a country such as Afghanistan, where a very high proportion of civilian men carry rifles whether they're combatants or not? What about parts of the United States in which hunting is popular, for that matter?
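To make the difficulty concrete, here is what such a minimum-proportion rule might look like if naively encoded. The threshold is invented for this sketch; IHL supplies no such number, which is precisely the problem the panel raised.

```python
# A naive, hypothetical encoding of "proportionality" as a ratio threshold.
# The 0.5 threshold is invented for illustration; IHL provides no such number.

ARMED_RATIO_THRESHOLD = 0.5  # hypothetical minimum fraction of armed persons

def strike_permitted(armed_count, unarmed_count):
    """Permit a strike only if armed persons exceed a programmed proportion."""
    total = armed_count + unarmed_count
    if total == 0:
        return False  # no one observed: nothing to strike
    return armed_count / total >= ARMED_RATIO_THRESHOLD

# In a region where most civilian men carry rifles, the measure collapses:
print(strike_permitted(armed_count=8, unarmed_count=2))  # True -- yet all ten may be civilians
```

The rule executes flawlessly and still answers the wrong question: a ratio of weapon carriers says nothing about whether those carriers are combatants.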
General O'Meara noted during discussion that "targeting decisions should be made by people steeped in ethics... not by 25 year olds." He went on to note that he has asked in many situations... "who does your ethics?" The answer, he said, is usually nobody... both in the military and in corporate settings.
As regards Article 36, the obligation rests upon the state to evaluate whether use of remote weapons is prohibited or not. The ACLU has requested documents under the Freedom of Information Act regarding the "legal basis in domestic, foreign and international law" for such drone strikes, including who may be targeted with this weapon system, where, and why. The CIA has requested summary judgment, stating:
"The types of records sought include, for example, targeting information, damage assessments, information about cooperation with foreign governments, and legal opinions about general and specific uses of weaponized drones to conduct these alleged strikes. The CIA has informed Plaintiffs that it can neither confirm nor deny the existence or nonexistence of records responsive to this request without compromising the national security concerns that animate FOIA's disclosure exemptions -- specifically the exemptions set forth at 5 U.S.C. §§ 552(b)(1) and (b)(3) ("Exemption 1" and "Exemption 3"). The CIA's determination in this regard is proper and entitles it to summary judgment."
In this case, the U.S. says that any documents related to its responsibility under Article 36 are so secret, it can't even confirm or deny that such documents exist.
We should note that the U.S. signed the protocols noted above, but never submitted them to Congress for ratification; 170 other countries ratified them. George Aldrich, the head of the U.S. delegation to the 1977 conference that adopted them, criticized then-President Reagan for not submitting them. The panelists acknowledged the ratification question, but stated that the U.S. is bound by the protocols because they are recognized as rules of customary international law valid for all states.
One of the more interesting parts of the panel discussion touched, briefly, on whether the use of drones might reduce transgression of IHL. Wagner referenced the shooting down by the U.S. of Iran Air flight 655, in which 290 civilians were killed. Wagner notes that subsequent reports speculated that "scenario fulfillment" played a role in the order to fire upon the airliner. Humans are subject to interpreting information in ways that fit their pre-existing thought patterns. The officer who ordered the strike expected an F-14 to be on the radar where the Airbus was, and so, saw an F-14. A drone trained to qualify the target might have been better able to distinguish between the F-14, with its 64 ft. wingspan, and the Airbus, with its 147 ft. wingspan.
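The wingspan comparison can be put in simple numeric terms. The following sketch uses the two wingspans cited in the discussion; the ±20 ft. measurement error is an assumed figure, not a real sensor specification.

```python
# Discriminating an F-14 (64 ft wingspan) from an Airbus (147 ft wingspan)
# by measured size alone. The +/- 20 ft sensor error is an assumed figure.

F14_SPAN = 64.0     # feet, as cited in the panel discussion
AIRBUS_SPAN = 147.0  # feet, as cited in the panel discussion

def classify_by_span(measured_span, error=20.0):
    """Return the aircraft type within measurement error, or None if ambiguous."""
    near_f14 = abs(measured_span - F14_SPAN) <= error
    near_airbus = abs(measured_span - AIRBUS_SPAN) <= error
    if near_f14 and not near_airbus:
        return "F-14"
    if near_airbus and not near_f14:
        return "Airbus"
    return None  # within error of both, or of neither: ambiguous

print(classify_by_span(150.0))  # Airbus
```

Unlike the human operator primed to see an F-14, the rule has no expectation to fulfill; on the other hand, it is only as good as the measurement and the assumed error bound.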
The Global Hawk is considered the most autonomous of current Dept. of Defense unmanned aerial vehicles. It is used for surveillance only and does not carry missiles. Photo: U.S. Dept. of Defense
Political and Military Decision Making
The panel discussed whether the availability of drones would affect political decisions regarding going to war, and military decisions regarding how to wage war. There is little doubt their use already has affected such decisions. They made it easier, for example, to commit to military actions in Pakistan and Yemen.
All panelists agreed that by requiring fewer ground troops, the decision to commit to military actions using drones would be easier for politicians to make. Wagner noted that sending an army of machines to war might exact less of an emotional toll upon a population. This would be one factor in making such a decision easier. Drones, it was noted, are also less expensive than soldiers.
Which leads to an unasked question regarding the next generation of drones: who will make the decisions regarding whether to develop them, and will ordinary citizens have any input upon that decision?
As noted above, some of the memoranda regarding the use of offensive unmanned weapons have been classified. Harold Koh, Legal Adviser, U.S. Department of State, made a speech regarding drones in March 2010. Koh, who had frequently criticized the Bush administration's actions in the "Global War on Terror" before joining the Obama administration, stated that the current administration adheres to the principles of discrimination (distinction) and proportionality in its operations, including the use of unmanned vehicles. He stated "great care is taken to adhere to these principles in both planning and execution, to ensure that only legitimate objectives are targeted and that collateral damage is kept to a minimum." The ACLU's suit, if successful upon appeal, may help reveal how much care is taken.
From a military point of view, the panel speculated that the decisions as to whether to produce this next generation will rest with bureaucrats who are "in a bureaucratic drill," and "unlikely to go to the decision makers and say we need to have a discussion." It is unlikely that Congress would disapprove any major weapons program.
Which leaves the generals in the field. If the weapons are developed, it is likely they will be used. As General O'Meara stated, first in regard to IHL: Cicero said, "inter arma leges silent"--in time of war the laws are silent--and then in regard to deployment, purely from a selfish military point of view, he "doesn't want to fight fair." A general wants "all the weapons that he can use that might possibly make an armed conflict shorter or cause fewer casualties. Commanders know they have a responsibility to return the sons and daughters to their families whole."
The general takeaway from the panel was that drones, like all military tools, have their pluses and minuses. No public debate took place regarding the deployment of the first generation. Explanation by the Executive Branch only began in 2010, after years of strikes, and questioning on behalf of the public by the ACLU the same year. Columnists address related issues today.
The very well organized We Robot 2012 Conference was a specialists' gathering to address these issues in public. In addition to the papers themselves (see links above), it is expected that videos of the conference will be available online shortly, if they're not already in place at the time you read this article.
The second generation of drones will be upon us shortly. They will raise questions as to whether machines should have power over human life and who, if anyone, will be responsible for robotic killing in time of war. The somewhat Frankensteinian nature of the question--an autonomous creation that kills members of the species that created it--makes it more likely that there will be some public discussion, though whether that influences the decisions to build and deploy is a different question.
I am a professional in the computer field whose specialty is databases. I grew up, went to school in, and lived in New York for many years. I have lived in Florida for twenty years now, and it is a wonderful place to see and experience nature. I am a liberal, and though sometimes both left and right use that term pejoratively these days, I think it's time for liberals to stand up and take pride in what they are.