Will university researchers create ‘killer robots’?

Campaigners fear a huge increase in EU military research spending will lead to development of autonomous weapons

May 14, 2018

“EuroSWARM” sounds like something out of Nigel Farage’s nightmares.

In fact, it is a European Union-funded research project that has experimented with drones, remote-controlled cars and other sensors to create an autonomously behaving “swarm” of bots that can communicate with each other.

In a demonstration scenario, researchers set the swarm to check out a “suspicious-looking” vehicle, explained Hyo-sang Shin, reader in guidance, navigation and control at Cranfield University, one of the project partners. The idea is that the swarm could be used for scouting an area before troops are deployed, he said.

The project, which came to an end in November last year, did not equip any of the drones or cars with weapons. The swarm is rather about “maximising the information you can collect”, said Dr Shin.

But EuroSWARM’s military uses have critics worried. It is one of the first trial projects in a new era of EU-funded military research; the budget for similar activities is set to explode over the next decade.

This funding splurge, triggered by fears of European backwardness in military technology, has seen the global debate around research into “lethal autonomous weapons” (Laws) – colloquially known as “killer robots” – move to Brussels.

“Although the EU hasn't given any funding (yet) to ‘killer robots’ in the strict sense,” said Bram Vranken, a researcher at Vredesactie, a Belgian peace organisation, “it is clearly prioritising robotic systems which are pushing the boundaries towards increasingly autonomous systems”, such as swarm systems or “integrated and autonomous surveillance technology”.

Vredesactie is one of several groups, hailing from Germany, Italy, the UK and Spain, that have formed Researchers for Peace to campaign against what they call the “further militarisation of the European research budget”. The group accuses the EU of developing autonomous weapons “without any public debate”. So far, more than 600 researchers have signed a petition in support.

Aside from EuroSWARM, Mr Vranken said that he was also worried about Ocean 2020, a €35 million (£30.8 million) project that aims to “integrate drones and unmanned submarines into fleet operations”. The project, led by Leonardo, an Italian weapons contractor, involves several European ministries of defence, plus the Fraunhofer Society, a German applied research network.

These projects are potentially just the beginning. Earlier this month, the EU announced its spending plans for 2021-27, pledging €13 billion over the period for the European Defence Fund – even more than had been expected. Of this, €4.1 billion will be set aside explicitly for research – a huge leap on current levels – with the rest spent on development.

This will place the EU “among the top four” defence research and technology investors in Europe, according to the European Commission. However, this will still be peanuts compared with the US, where the Department of Defense is spending about $16 billion (£11.8 billion) a year on science and technology. 

The fight in Brussels is now over how this money should be used. Back in 2014, the European Parliament was one of the first bodies to take seriously warnings about “killer robots”, calling on member states to “ban the development, production and use of fully autonomous weapons which enable strikes to be carried out without human intervention”.

In February this year, MEPs amended proposals from the commission – the EU’s executive arm – to prevent EU funds being spent on “fully” autonomous weapons that “enable strikes to be carried out without meaningful human intervention and control”. Asked whether it supports this prohibition, a European Commission spokeswoman declined to comment on the record. For now, it is not clear if the MEPs’ prohibition will stand.

Those pushing for increased EU-wide military research point out that the Continent lags behind rivals when it comes to developing new military technologies such as drones.

But this is not an argument that impresses Laëtitia Sédou, EU programme officer at the European Network Against Arms Trade. “One of the reasons [for the creation of the EU] is to try and prevent going back into this arms race,” she said.

Despite an international effort by the Campaign to Stop Killer Robots, governments are yet to agree to a ban on weapons where humans no longer have “meaningful control” over the use of force. What, if anything, can universities and researchers do in the meantime?

One option is to boycott institutions seen to be taking their research too far. In March, dozens of researchers threatened to boycott the Korea Advanced Institute of Science and Technology (KAIST), a university in South Korea, after it opened a Research Center for the Convergence of National Defense and Artificial Intelligence with an arms company. This spurred a pledge from KAIST’s president that the university would avoid developing “autonomous weapons lacking meaningful human control”.

But this raises the question of how far scientists should collaborate on research projects that come close to – but stop short of – creating a fully autonomous weapon; there is a huge range of processes that can be automated along the way, some more ethically challenging than others.

The “biggest ethical issue” is automating the decision to fire, said Stuart Parkinson, executive director of Scientists for Global Responsibility, a UK-based organisation with about 750 members. But automatic take-off and landing for drones is arguably “less problematic”, he said.

These complexities mean that “it’s hard to say this project is ethical; this is not”, Dr Parkinson added. For this reason, universities need to make sure that researchers are ethically trained, while ethicists should be included in research teams, he said.

As with any area of fast developing research, a decent proportion of research spending should be devoted to looking into how the technology might be misused, Dr Parkinson argued. And at the moment “we don’t have that”, he said.

And when in doubt over the ethics of a project, just look at the funders, Dr Parkinson advised. If your backers are military, “whatever you do will be sucked into that world”, he said.

Once an electrical engineer, Dr Parkinson left the field after concluding that it was simply too dominated by military research funders. For some academics, “maybe it’s time to look for a different direction”, he said.

But there will be no shortage of young researchers willing to take the place of the disenchanted, hence the need for the military funders themselves to abide by proper research ethics guidelines, Dr Parkinson pointed out.

For his part, Dr Shin acknowledged that his EuroSWARM project might one day be a building block of a lethal autonomous weapon system, but argued that “any technology can be dangerous”.

He said that he would “probably” agree to work on a research project that actually involved weapons. “But I would restrict myself to things that might benefit or reduce risk to human troops or [reduce] civilian casualties,” Dr Shin added. He is against drones ever using their own judgement to fire.

Proper regulation is likely to be more effective than academic boycotts such as the one proposed against KAIST, Dr Shin said.

It will be “years rather than decades” before drones are able to fire on their own initiative, said Dr Parkinson, although then their “reliability will be in the eye of the beholder”.

But in a sense, fully autonomous weapons are already with us: the Korean border already has machine-gun turrets that can in theory fire automatically on movement, Dr Parkinson said (although the South Korean military has reportedly made sure that a human has to authorise any attack). He warned: “That’s an example of where something is already happening.”

david.matthews@timeshighereducation.com


Slaughterbots: necessary warning or overblown scaremongering?

Academics have taken the lead in warning the world about the havoc that uncontrolled autonomous weapons could cause if not banned.

Slaughterbots, a video released last year that has been viewed more than 2.5 million times on YouTube, depicts a world in which a corporation has developed flying drones – small enough to slip through a half-open window – that can hunt down and kill pre-selected victims with a single explosive bolt to the brain.

Unknown groups get their hands on the drones, however, and use them to massacre US politicians on the basis of their party affiliation, and university students who have shared a particular political video online – all using facial recognition technology and data scraped from social media.

“This short film is more than just speculation,” warns Stuart Russell, an artificial intelligence expert at the University of California, Berkeley, after the short film’s bloody conclusion. “It shows the results of integrating and miniaturising technologies that we already have.”

Others are unconvinced that this nightmare scenario could come to pass. Paul Scharre, a senior fellow at the Center for a New American Security in Washington DC, has written that while such drones are technologically feasible, they could be “defeated by something as simple as chicken wire”.

Even if states did start building drones that could target civilians rather than other militaries – and there is no evidence that they are doing so now, Mr Scharre has argued – there is no reason to believe that terrorists would find it easy to get their hands on them. “Terrorists use airplanes and trucks for attacks precisely because successfully smuggling military-grade weapons into a Western country isn’t that easy,” he writes.

Overall, the film’s doomsday prophecy “plays into the hands of those who argue that these fears of autonomous weapons are overhyped and irrational”, Mr Scharre has argued.


POSTSCRIPT:

Print headline: For every killer robot, a Prometheus
