Public Attitude Toward Autonomous Weapons Systems


Horowitz presents a lecture on Autonomous Weapons Systems (AWS), or “killer robots,” and discusses the importance of public opinion in policy changes.

Michael Rasmussen, Maroon-News Staff

On Thursday, March 22, University of Pennsylvania Professor of Political Science Michael Horowitz lectured students and faculty on his recent findings regarding American public opinion on the use of Autonomous Weapons Systems (AWS), which he occasionally referred to as “killer robots” since they can select and engage targets “without further intervention” by a human controller. 

Horowitz discovered that public opposition toward AWS depends on how the question is framed; people are more likely to support AWS if told that the robots are deployed to protect United States forces or that other nations are currently developing them.

“When people were told U.S. adversaries were already developing and deploying the technology, all of a sudden those moral qualms go away,” Horowitz said. 

Groups such as the Campaign to Stop Killer Robots, a consortium of about 60 non-governmental organizations opposing the creation and use of AWS, often invoke moral values to support their stance. Horowitz said they argue that machines cannot be held accountable for human actions and that people feel more comfortable with deaths attributed to humans. 

Horowitz found that the public is more likely to support the deployment of AWS when the question is framed in terms of cost-benefit rather than on moral grounds, even when survey participants were told that AWS would be less effective than human forces. 

He also stressed the importance of public opinion on policy decisions. 

“There’s a bunch of research that suggests that public opinion shapes the micro-foundations of foreign policy,” Horowitz said. “Public perception of emerging technologies actually makes a difference.”

Senior Jake Baynum, who is writing his honors thesis in political science on cybersecurity, brought up tactical concerns regarding the use of AWS. 

He cited a 2009 incident in which a group of Iraqi insurgents used $26 software to intercept the video feed of an American drone, according to The Guardian.

“The bottom line is if something is automated, it is vulnerable and susceptible to manipulation,” Baynum said. “America’s nuclear arsenal is pretty much all analog – which some people think is antiquated – but it means that our nuclear system can’t be hacked and remotely detonated. That to me is a good thing.” 

Furthermore, major concerns about AWS stem from the numerous instances in which United States drone strikes in the Middle East have unintentionally caused civilian casualties.  

Junior Annie Hayes attended the lecture and said that she was uneasy about the use of AWS owing to the potential for unintended consequences.  

“How can an automated system [ensure] that there won’t be civilians caught up in the blast?” Hayes said. “I stand for anti-terrorism, but often many innocent people are killed in drone strikes…When innocents are killed that’s when AWS loses my support.” 

As Horowitz discussed the current political situation regarding AWS, he explained that while the United States has halted its production of autonomous aircraft, China is accelerating the manufacturing of its own AWS. He said that the construction of AWS by rival nations such as Russia and China may force the United States to keep up. 

Junior Pat Beljan does not think that new technological advances are necessarily a bad thing.  

“I think [the recent push to create more AWS] is part of a natural progression of an arms race,” Beljan said. “Some of our biggest jumps have come from war, so I’m not surprised that big business and society as a whole have grown to rely on conflict, because without it humanity would experience stagnant growth.”

Contact Michael Rasmussen at [email protected].