Typical events that most people think could cause humanity’s extinction are a large asteroid impact or a volcanic eruption of sufficient magnitude to cause catastrophic climate change. Although possible, these events actually have a relatively low probability of occurring, on the order of one in fifty thousand or less, according to widely cited estimates.

However, there are other events with higher probabilities that could cause human extinction. In 2008, experts surveyed at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19 percent chance of human extinction over this century, and ranked the five most probable causes of human extinction by 2100 as:

  1. Molecular nanotechnology weapons (i.e., nanoweapons): 5 percent probability
  2. Superintelligent AI: 5 percent probability
  3. Wars: 4 percent probability
  4. Engineered pandemic: 2 percent probability
  5. Nuclear war: 1 percent probability

All other existential events were below 1 percent. There is a subtle point the survey does not explicitly express, namely, that the risk of human extinction increases with time. You may wonder, Why? To answer this question, consider these examples:

  • Nanoweapons and superintelligence become more capable with each successive generation. In the 2008 Global Catastrophic Risk Conference survey, superintelligent AI ties with molecular nanotechnology weapons as the most probable potential cause of human extinction. In my view, molecular nanotechnology weapons and superintelligent AI are two sides of the same coin. In fact, I judge that superintelligent AI will be instrumental in developing molecular nanotechnology weapons.
  • In my new book, War At The Speed Of Light, I devoted a chapter to autonomous directed energy weapons. These are weapons that can take hostile action on their own, potentially resulting in unintended conflicts. Unfortunately, current autonomous weapons don’t embody human judgment. This being the case, wars, including nuclear wars, become more probable as more autonomous weapons are deployed.
  • Lastly, the world is currently facing a coronavirus pandemic. Although most researchers believe this is a naturally occurring pandemic, it has still infected 121,382,067 people and caused 2,683,209 deaths worldwide to date. This suggests a death rate of a little over 2 percent. However, if the virus were more infectious and more deadly, it could render the Earth a barren wasteland. Unfortunately, that is what an engineered pandemic might do.
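
The “a little over 2 percent” figure is simply the ratio of reported deaths to reported infections cited above. A minimal sketch of that arithmetic (using the source’s own figures, which are a snapshot in time):

```python
# Case fatality rate implied by the worldwide figures cited above:
# 121,382,067 reported infections and 2,683,209 reported deaths.
infections = 121_382_067
deaths = 2_683_209

cfr = deaths / infections * 100  # ratio expressed as a percentage

print(f"Implied death rate: {cfr:.2f}%")  # prints: Implied death rate: 2.21%
```

Note that this is a crude estimate: it divides cumulative deaths by cumulative confirmed infections, ignoring undercounting and reporting lag, but it is enough to support the “a little over 2 percent” claim.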

To my eye, the top five potential causes surfaced by the Global Catastrophic Risk Conference at the University of Oxford in 2008 are all possible, and the probabilities associated with them appear realistic. This means that, on our current course, humanity has a 19 percent chance of not surviving the 21st century.

In the next post, I will suggest measures humanity can take to increase the probability that it will survive into the 22nd century.