The US military is developing devices that link the neural signals in a soldier’s brain to his equipment, but academics are just beginning to debate the ethical issues. Melanie Newman reports
The cyborg soldiers of science fiction will move a step closer to reality if a US military project is successful.
America’s Defence Advanced Research Projects Agency (Darpa) is currently developing a pair of high-tech binoculars that plug into soldiers’ brains to alert them to threats detected by their subconscious. The Cognitive Technology Threat Warning System will use electrodes to monitor the wearer’s neural signals. The system will spot “brain-wave signatures” that indicate if a soldier has subconsciously become aware of a threat or target, and will prompt the wearer to react, eliminating the delay that normally occurs while the brain processes the information.
The binoculars are the first known product of Darpa’s Augmented Cognition programme, which aims to enhance soldiers’ performance under stress. It is also investigating brain stimulation technology aimed at detecting when soldiers are becoming less alert and improving their focus and memory.
Jonathan Moreno, professor of medical ethics at the University of Pennsylvania and author of Mind Wars, predicts that over the course of the 21st century, boundaries between humans and artificial intelligence will become increasingly blurred. “At some point, it will become hard to know what is authentic human capacity and what is artificially enhanced,” he said.
Neuroscience, he pointed out, is one of the fastest-growing interdisciplinary fields, with an explosion of studies over the past decade. Much of the most exciting research has been funded by the US military, although it often also holds great potential benefit for civilian life.
Amy Kruse, programme manager of Darpa’s Defense Sciences Office, has made no secret of her interest in harnessing the speed of subconscious thought for military ends.
In 2005, Dr Kruse called for help in investigating and measuring cognitive functions under battlefield conditions. “A multidisciplinary effort is critical: engineers, mathematicians and signal-processing experts are needed,” she wrote in an article on the Darpa website. Dr Kruse did not call for ethicists or sociologists, but academics in these fields are increasingly probing the consequences of developments such as the binoculars.
Darpa’s Human-Assisted Neural Devices project is looking at using brain signals to control prostheses and other devices, research that could prove invaluable to amputees and the paralysed. Studies into the role of beta-blockers in reducing post-traumatic stress could benefit accident victims as well as returning soldiers.
Darpa’s attempts to combat the effects of sleep deprivation using drugs that enhance neural transmission and devices such as transcranial magnetic stimulation would have civilian applications. Modafinil, a psychostimulant and “anti-sleep” drug, reportedly used by US Air Force pilots, is also taken by university students.
Next week, a multidisciplinary group will gather at a seminar at the London School of Economics to discuss the consequences of some of these developments.
Richard Ashcroft, professor of biomedical ethics at Queen Mary, University of London, who will be an observer at the seminar, believes the US soldier “enhancement” research is an indicator of the sort of wars the military predicts it will be fighting in future. He suggests a shift has occurred from large-scale engagements to special forces operating in the field.
“The establishment has stopped being interested in ways to win battles through mass action; it is concentrating on ways to make the individual soldier as effective as possible,” Professor Ashcroft said. “A lot of the ethical debate about that has centred on whether it is morally acceptable to alter somebody’s body in a certain way.”
First, the ethics of experimenting on soldiers and issues of their consent have already provoked much debate in the context of vaccines. The long-term effects on individuals of interfering with their thought processes remain unexplored.
Second, the ethics of the treatment itself must be considered. “Is a soldier who is overtired, but can’t sleep, more likely to do something dangerous or criminal?” Professor Ashcroft asked. “If beta-blockers can take away the emotional effects of war, will soldiers be less inhibited about committing war crimes?”
The possibility of coupling subconscious human thought to non-human reaction throws up all sorts of questions about future uses for the research. How long before the soldier’s subconscious signals of recognition of a target are plugged directly into a weapon?
A third dimension is that of military ethics. “We should also be examining what sort of wars we think it is ethical to fight,” Professor Ashcroft said.
In the context of sport, the professor points out, it is considered unethical for athletes to take steroids or to otherwise artificially enhance their bodies. “On the other hand, human enhancement technologies are considered acceptable in the military. There is social uncertainty over these issues, yet neuroethicists, bioethicists and military ethicists aren’t really talking,” he said.
Nikolas Rose, professor of sociology at the LSE and one of the seminar’s organisers, is about to begin a project to chart the impact of recent developments in the neurosciences on society. “Some people are hoping we will be able to turn ourselves into more intelligent and alert individuals,” he said. “Others worry we will be robbed of the things that make us human.”
Professor Rose will be investigating the evidence behind both claims. He draws parallels with genetics, in which advances in genetic screening prompted fears that humans were walking into a future governed by eugenics. “The empirical evidence we have on genetic screening suggests that is not happening; human genetics is very much more complicated than people had thought,” he said. “I suspect I will find the same with neuroscience.”
The multidisciplinary seminar The Military and Security Uses of Neuroscience: Ethical, Governance and Policy Challenges is organised by the Institute for Science and Society at Nottingham University and the BIOS Centre for the Study of Bioscience, Biomedicine, Biotechnology and Society at the London School of Economics. It will be held at the LSE in June. Participation is by invitation only. For details, contact: email@example.com
LIE DETECTION — ARE NEURO-IMAGING TECHNIQUES THE WAY FORWARD?
US defence agencies — as well as private companies — are trying to develop lie-detection using neuro-imaging techniques to look for certain patterns of brain activity.
A recent paper, “Neuroethics and National Security”, by psychologist Turhan Canli of Stony Brook University, New York, published in The American Journal of Bioethics, said: “There is currently insufficient evidence that these technologies are reliable, and yet one firm, No Lie MRI, is currently marketing functional magnetic resonance-based lie detection, and another, CEPHOS Corporation, is planning to enter the market soon.”
Another test being explored by security agencies looks for evidence of “guilty knowledge” in a suspect’s brain-wave pattern when they are shown a picture of a crime scene.
Professor Canli notes in his paper that a private firm is also marketing this type of test and claiming that it has been used by the FBI and US police departments.
Paul Wolpe, a professor of sociology at the University of Pennsylvania’s Centre for Bioethics, who will speak at the LSE seminar (see above), says the possibility of brain imaging suspects against their will throws up legal and ethical issues. “In the US, you can refuse to testify on the basis of self-incrimination, but you can be compelled to provide blood, DNA and even diaries. Since brain imaging does not require speech, it will be a real test of jurisprudence.”
Who might be able to use the new technology is also under question.
“Will private firms begin offering deception detection to banks looking for honest employees, parents trying to determine whether their children are using drugs, and boy scout troops looking to weed out child molesters?” Professor Wolpe asked.