A recent, often impassioned, debate at the Naval Postgraduate School (NPS) on the ethics of robotic combat systems offered the public a window into NPS’ attempts to challenge its student body to explore both sides of cutting-edge, defense-focused problems.
NPS Department of Defense Analysis Assistant Professor Bradley J. Strawser moderated a debate between Visiting Associate Professor Heather M. Roff with the University of Denver and freelance journalist Joshua Foust.
The debaters sought to answer the question, “Does the future of unmanned and autonomous weapons pose greater potential ethical dangers or potential ethical rewards?”
Roff, whose writings have been critical of unmanned combat systems, is the author of “Killing in War: Responsibility, Liability and Lethal Autonomous Robots,” which was featured in the “Routledge Handbook of Ethics and War.”
Foust’s work has appeared in The Atlantic, The New York Times and Foreign Policy, among other publications, and he is a frequent guest on BBC World News. He is also the author of the article “A Liberal Case for Drones.” He takes a more favorable view of unmanned systems and has argued that, under limited conditions, they are an ethical, even preferable, alternative to boots on the ground.
Strawser is himself an authority on the ethics of unmanned systems. He came to NPS last year after working with Oxford University’s Institute for Ethics, Law and Armed Conflict. His work, “Killing by Remote Control: The Ethics of an Unmanned Military,” explores the potential ethical pitfalls and gains that unmanned systems pose.
While the debaters were cordial and shared some common ground, they were passionate about their respective positions.
Foust argued in favor of developing unmanned autonomous systems, contending that “machines are quick, better at processing large amounts of data instantly” and therefore superior to actual service members in some respects, and that although machines are imperfect, they are more effective than their human counterparts.
“Humans are deeply flawed moral actors in war,” Foust said. “Machines respond to criteria and input, they lack emotional choices” and the presumed negative effects of those emotions.
“If we can make a machine that is capable of decision making as well as a human being, why shouldn’t we?” asked Foust. “The idea behind autonomous weapons is that if you can somehow minimize some of those bad decisions that humans make in warfare, then you have a net-good.”
He further argued that many believe, “almost religiously,” in the reliability of human agency to make lethal decisions during combat despite examples to the contrary.
“When we are talking about autonomous machines and humans, we are not talking about humans as perfect or even necessarily moral actors. That's an important aspect of this discussion, because people make really, really bad decisions,” Foust said.
“Machines respond to specific criteria and input, they do not have the same kind of emotional failings,” said Foust. “Emotions cut both ways, they consist of both compassion and hatred, mercy and vengeance. If you can take out the desire to kill someone that just killed someone, you have the potential to make combat better. You can make it less deadly and more precise and less detrimental to civilian populations.”
Roff countered that it was precisely the inability of unmanned autonomous systems to draw on human emotion that made unmanned systems generally, and Lethal Autonomous Robots (LAR) specifically, a poor combat option.
“We are focusing on the vices, but we should be looking at the virtues. What about when a soldier shows empathy or mercy? Taking the emotion out of combat is not necessarily a good thing,” said Roff. “You can’t mimic human judgment.”
(Photo caption: Freelance journalist Joshua Foust, left, and University of Denver Visiting Associate Professor Heather M. Roff, right, joined NPS Department of Defense Analysis Assistant Professor Bradley J. Strawser for the Sept. 24 debate on the ethics of unmanned combat systems.)
A central theme throughout the debate was the issue of liability. Debaters exchanged verbal volleys over the ability to hold operators, commanders and even the manufacturers of unmanned systems liable in the event that a system performs contrary to its intended purpose.
“What would the machine be guilty of, algorithmic manslaughter?” quipped Roff.
Roff further contended that while a human chain of command can be held accountable for its actions, “LARs break the chain of command. The machine itself becomes the commander, and that is what we are morally opposed to.”
Still, Foust contended that it would be in the interest of the U.S. and foreign militaries to do the due diligence to ensure that any deployed autonomous system actually worked as well as or better than a man or woman on the ground, thereby reducing the need to focus on liability.
“No military commander would want to deploy a system that is underperforming, unpredictable and leads to uncertain results, it doesn't happen. Ultimately, militaries are rational actors and if something cannot match human performance then there is very little reason to go through the expense of developing a system that will not be used,” Foust said.
Roff wasn’t buying it.
“I am skeptical about the reality of a government taking the time to do due diligence to make a machine that works,” said Roff. She pointed to several historical examples to prove her point. In her estimation, the rushed fielding of flawed systems like the original M16 assault rifle is indicative of a pattern of governments pushing weapons onto the battlefield despite their inefficacy.
The debate was largely future-focused, but Foust was quick to point out that there are already combat systems deployed in Afghanistan making lethal decisions without human input.
“When we look at defensive weapons, they fire largely without human input. Counter-mortar systems identify mortars that have been fired, determine where they were fired from, and fire upon and kill people faster than a human being could possibly react. We do not look down upon this because they [counter-mortar systems] are defensive, but they are used daily and kill people daily. So in a way, the cat is already out of the bag,” said Foust.
“We tell ourselves that because we use these weapons in a limited set of circumstances that it is okay, but that in another set of circumstances it’s not okay, and I think that is a little too simplistic,” he continued.
Roff countered by making a distinction between automated and autonomous systems. Much of her argument focused on the ability or inability of machines to autonomously make complex moral judgments in accordance with International Human Rights Law (IHRL) and International Humanitarian Law (IHL) and to whom society would ascribe liability when these bodies of law were violated.
“If we grant that LARs will never have full moral agency, then we have two conclusions. The first is that LARs cannot actually violate a right and thus cannot uphold a right. That is because violating a right is an intentional wronging of someone … your toaster cannot behave morally, it cannot do you wrong,” said Roff.
Given the complexity and nuances of IHL, Roff made the case that human judgment, flawed though it may be, is superior to what could be accomplished by a machine. In order for a LAR to be effective in combat, it would have to “be able to take all the context-dependent and nuanced facets of International Human Rights Law and convert them to a series of ones and zeroes.”
At times the debate spilled over into areas outside the purview of military operations.
“These are not just military questions, they have moral implications that go beyond military use like bio-ethics, medical ethics, autonomous cars,” said Strawser.
The debate was attended by NPS students as part of a Warfare Innovation Workshop co-sponsored by the Navy Warfare Development Command’s Chair of Warfare Innovation and NPS’ Consortium for Robotics and Unmanned Systems (CRUSER), reflecting NPS’ continuing commitment to providing an intellectually diverse academic environment for naval officers, Department of Defense civilians and allied partners.