Focus on existential threats, philosopher tells researchers

Toby Ord urges academics to reflect far more deeply on the risks of everything from asteroids to rogue artificial intelligence

February 23, 2020
[Image: mushroom cloud from the atomic bomb dropped on Hiroshima, Japan. Source: Getty]

For Toby Ord, humanity is still in its adolescence – and a crucial research goal must be to ensure it reaches maturity and realises its full potential.

Now senior research fellow at the University of Oxford’s Future of Humanity Institute, Dr Ord studied both philosophy and computer science at the University of Melbourne before moving to Oxford to focus on philosophy. In 2009, while working on global health and poverty, he set up the society Giving What We Can. This, he told Times Higher Education, enables members to pledge to “give at least a 10th of their income to where they think it can do the most good. To date, just over 4,500 people have donated almost £100 million.” His own contribution has gone to “people in the poorest countries suffering from easily preventable diseases”. He has also been consulted on such issues by the World Health Organisation, the UK’s Department for International Development and No 10.

Yet, although he continues to work in this area, Dr Ord has now turned most of his attention to the even larger topic explored in his forthcoming book, The Precipice: Existential Risk and the Future of Humanity.

We can trace its origins back to his PhD. His supervisor and mentor, the philosopher Derek Parfit, ended his celebrated 1984 book Reasons and Persons by reflecting on a devastating nuclear war. If it killed 99 per cent of the human race, this would obviously be an unimaginable tragedy, yet we might eventually be able to rebuild some sort of civilisation. But if it destroyed all humanity, it would have an utterly different significance.

“With that last 1 per cent,” Dr Ord explained, “we would lose not only [many millions of] people but the entire future of humanity and all the trillions who could come to exist…Humanity has survived for 2,000 centuries so far. There’s nothing stopping us, other than [a number of ‘existential risks’], surviving for thousands more…We need to be proactive about [that] and avoid developing the kind of things which take us close to the brink.”  

That is the “precipice” we have to get past, and Dr Ord’s book assesses the level of existential threat posed by everything from asteroids to “unaligned artificial intelligence”. Yet he felt the topic had been largely neglected by researchers.  

“When it comes to something like climate change,” he explained, “a huge amount of work is being done, but only a fraction of it looks at the worst outcomes. How bad could they be? Could they realistically threaten the collapse of civilisation or even human extinction? Is there any realistic chance that the warming will be a very extreme 10 degrees?...For each particular risk, people don’t pay special attention to [the small chance of something occurring] that could destroy not only all the lives of the people today but all the people to come and the entire future of humanity.”

Our lack of forward planning is vividly illustrated in The Precipice. On the significant risk of “engineered pandemics”, it points out, “the international body responsible for the continued prohibition of bioweapons (the Biological Weapons Convention) has an annual budget of just $1.4 million [£1.1 million] – less than the average McDonald’s restaurant”. Furthermore, “we can state with confidence that humanity spends more on ice cream every year than on ensuring that the technologies we develop do not destroy us”.

In the case of his own fellowship, Dr Ord writes, money from the European Research Council and a philanthropist has “allowed [him] years of uninterrupted work on a topic [he considers] so important”.

This had provided him with “a safety net”, he admitted, to “work on topics which are less academically fashionable” and might be seen as “too big for the profession. It is hard to place journal articles about them, compared to something thousands of people have already written about where there are clear technical questions.”

There were also issues around the culture of science. Although Dr Ord said that he understood “the case for openness and transparency”, we also had to take greater account of “information hazards”.

“In nuclear physics,” he went on, “there is an awareness that we have to be careful about publishing ideas which could cause nuclear proliferation. With the increasing power of bioengineering, it could be that subfield needs [similar safeguards]. We should be open to different ways of doing things. It’s not just an inherent right of academic freedom that we can publish whatever we want.”

matthew.reisz@timeshighereducation.com

  • The Precipice: Existential Risk and the Future of Humanity is published by Bloomsbury on 5 March.

