The scientists from the Global Challenges Foundation and the Future of Humanity Institute used their research to draw up a list of the 12 most likely ways human civilisation could end on planet Earth.
“[This research] is about how a better understanding of the magnitude of the challenges can help the world to address the risks it faces, and can help to create a path towards more sustainable development,” the study’s authors said.
“It is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities.”
Extreme climate change
The study’s authors see the likelihood of global coordination to stop climate change as the biggest controllable factor in whether environmental catastrophe can be prevented.
They also warn that the impact of climate change could hit the poorest countries hardest, and that mass deaths from famine and large-scale migration could cause major global instability.
Nuclear war
While the researchers concede that a nuclear war is less likely than in the previous century, they say that evidence suggests “the potential for deliberate or accidental nuclear conflict has not been removed”.
The biggest factor they say would influence whether one happens is how relations between current and future nuclear powers develop.
Global pandemic
“There are grounds for suspecting that such a high impact epidemic is more probable than usually assumed,” the researchers believe.
The ability of the world’s medical systems to respond to a pandemic is important in preventing a catastrophe, the researchers say – but the biggest threat is simply whether or not an uncontrollable infectious disease is out there.
Major asteroid impact
An impact by an asteroid larger than 5km across would destroy an area the size of the Netherlands, the researchers warn. They say such events happen roughly every 20 million years.
“Should an impact occur, the main destruction will not be from the initial impact, but from the clouds of dust projected into the upper atmosphere,” the study warns.
“The damage from such an ‘impact winter’ could affect the climate, damage the biosphere, affect food supplies, and create political instability.”
Super-volcano
Like an asteroid impact, the greatest threat from a super-volcano is a global dust cloud that would block the sun’s rays and cause a global winter. “The effect of [historic eruptions] could be best compared with that of a nuclear war,” the researchers said.
Limiting the damage would depend on how well nations could coordinate their response to the eruption.
Ecological catastrophe
The researchers say that humanity must either conserve the ecosystem or hope that civilisation is not dependent on it.
“Species extinction is now far faster than the historic rate,” the study warns. The researchers say humanity must develop sustainable economies in order to survive this threat.
Global system catastrophe
“The world economic and political system is made up of many actors with many objectives and many links between them,” the study warns. “Such intricate, interconnected systems are subject to unexpected system-wide failures caused by the structure of the network”.
Economic collapse could lead to social chaos, civil unrest and a breakdown in law and order.
Synthetic biology
The scientists are worried that someone will intentionally build an “engineered pathogen” to wipe out the human race.
“Attempts at regulation or self-regulation are currently in their infancy, and may not develop as fast as research does,” they warn.
Nanotechnology
Nanotechnology’s proponents tout it as a way to solve problems, but the researchers believe it could create serious new ones.
“[Nanotechnology] could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing,” they warn, before adding: “Of particular relevance is whether nanotechnology allows the construction of nuclear bombs.”
Artificial intelligence
The researchers believe that “extreme” artificial intelligences “could not be easily controlled” and would “probably act to boost their own intelligence and acquire maximal resources”.
Rather spookily, they say one of the key factors in our survival is whether “there will be a single dominant AI or a plethora of entities”.
In a bit of a twist, they concede that a powerful artificial intelligence might make solving all the other risks in the report much easier. All hail our new computer overlords?
Future bad global governance
The message here is that if politicians fail to come up with solutions to the other problems on this list, bad governance becomes a risk in and of itself.
“There are two main divisions in governance disasters: failing to solve major solvable problems, and actively causing worse outcomes,” the study explains.
Unknown consequences
Finally, the researchers warn of “unknown unknowns” and call for “extensive research” into “unknown risks and their probabilities”.
“One resolution to the Fermi paradox – the apparent absence of alien life in the galaxy – is that intelligent life destroys itself before beginning to expand into the galaxy.” It’s all very cheery.