Danger, danger

Noel Sharkey warns that robots do not possess the necessary discriminatory skills to be used as killers by the military

August 13, 2009

With articles titled "March of the killer robots" and "Robot wars are a reality" to his name, Noel Sharkey is clearly keen to catch people's attention.

A press briefing last week prompted further dramatic coverage in the mainstream press when he warned that autonomous robots could endanger human lives.

The professor of artificial intelligence (AI) and robotics at the University of Sheffield said he had been moved to court media attention by the military's use of unmanned vehicles not just for surveillance tasks but also to kill.

"The problem is that the military have a strange view of AI based on science fiction," he said, cautioning that advances in human-robot interactions - such as speech perception and facial recognition - should not be mistaken for true intelligence in machines.

Professor Sharkey believes that academics have a responsibility to alert the public to ethical issues in their work, and that some research in his field is misdirected.

"I have been working in artificial intelligence for 30 years and I love it - it's going really well, but we can't do this type of thing," he said.

He is quick to espouse the benefits of service robots for tasks such as cleaning, harvesting crops and even disarming bombs, but insists that they should not be used to kill. While his early interest in the ethics of robotics focused on such issues as giving machines childcare responsibilities, he is now preoccupied by more pressing concerns.

He said he became aware of the threat of killer robots in early 2007 and subsequently spent six months reading through US Air Force plans.

"The next thing that is coming, which really scares me, is armed autonomous robots (that) will decide where to kill, who to kill and when to kill," he said.

He said that the danger is not super-intelligent "Terminator-style" robots taking over the world, but people giving "dumb" machines the capacity to kill, and that "it is here today rather than in some distant future".

This is vastly overstretching the existing technology, he said, because "robots do not have the necessary discriminatory ability", let alone conscience or reason.

The ability to discriminate between civilians and soldiers and the concept of proportionality are pivotal parts of the Geneva Convention, but at present "there's no software that can make a robot proportional," he said.

David Schley
