Korean university ‘killer robots’ boycott called off

Robotics experts cancel boycott after KAIST denies developing autonomous weapons

April 9, 2018
[Image: a robot wearing a sign saying "Campaign to Stop Killer Robots". Source: Getty]

Artificial intelligence experts have called off a threatened boycott of one of Korea’s top universities, after it undertook not to use a new research centre to develop so-called killer robots.

Fifty-six AI and robotics researchers said they would maintain academic contacts with the Korea Advanced Institute of Science and Technology (KAIST) after its president, Sung-Chul Shin, gave public assurances about the activities of its Research Centre for the Convergence of National Defence and Artificial Intelligence.

Korean media reports had suggested that the centre was working on autonomous weapon projects including “AI-equipped unmanned submarines and armed quadcopters”.

Researchers also had concerns about KAIST’s partner in the centre, Korean arms company Hanwha Systems, which they said had developed cluster munitions and an autonomous “sentry” robot called the SGR-A1.

The experts aired their concerns in an open letter scheduled for release on 4 April. However, KAIST vehemently denied any intention to work on autonomous weapons systems after it was contacted by Times Higher Education.

The researchers said they had now accepted KAIST’s guarantees. “Given this clear commitment, the signatories to the boycott have rescinded the action,” organiser Toby Walsh, Scientia professor of artificial intelligence at the University of New South Wales, said in a statement.

“They will once again visit and host researchers from KAIST and collaborate on scientific projects.”

KAIST was ranked 95th in the world, and the second-ranked Korean institution behind Seoul National University, in this year’s THE World University Rankings. Professor Walsh described it as “the MIT of Korea”.

He added that AI had legitimate military applications. “No one, for instance, should risk life or limb clearing a minefield – this is a perfect job for a robot,” he said.

“But we should not hand over the decision of who lives or who dies to a machine. This crosses an ethical red line and will result in new weapons of mass destruction.”

A United Nations group is meeting in Geneva this week to discuss the humanitarian and international security challenges posed by lethal autonomous weapons systems. Twenty-two participating nations have backed a call for a pre-emptive ban on such weapons.

john.ross@timeshighereducation.com
