Will Humanity Survive the 21st Century?

In my last post, I stated, “In making the above predictions [about the singularity], I made one critical assumption. I assumed that humankind would continue the ‘status quo.’ I am ruling out world-altering events, such as large asteroids striking Earth, leading to human extinction, or a nuclear exchange that renders civilization impossible. Is assuming the ‘status quo’ reasonable? We’ll discuss that in the next post.”

Let’s now discuss whether humanity will survive the 21st century.

The typical events most people think of as causing humanity’s extinction, such as a large asteroid impact or a volcanic eruption of sufficient magnitude to cause catastrophic climate change, actually have a relatively low probability of occurring, on the order of 1 in 50,000 or less, according to numerous estimates found via a simple Google search. In 2008, experts surveyed at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction over the next century, citing the top five most probable causes of human extinction by 2100 as:

  1. Molecular nanotechnology weapons – 5% probability
  2. Super-intelligent AI – 5% probability
  3. Wars – 4% probability
  4. Engineered pandemic – 2% probability
  5. Nuclear war – 1% probability

All other existential events were below 1%. Again, a simple Google search may turn up different estimates from different “experts.” If we take the above survey at face value, it suggests that the risk of an existential event increases with time: the technologies at the top of the list, nanoweapons and superintelligent AI, do not yet exist in mature form and become more dangerous with each successive generation, so most of the 19% risk falls in the later decades of the century. This has led me to the conclusion that human survival over the next 30 years is highly probable.
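To make that conclusion concrete, here is a rough back-of-the-envelope sketch. The assumption that the 19% century-long risk is spread evenly over 100 years is mine, not the survey’s; since I expect the risk to be weighted toward the later decades, the true 30-year figure should be even lower than the roughly 6% this simple model gives.

```python
# Back-of-the-envelope sketch (my own simplifying assumption, not a figure
# from the Oxford survey): spread the 19% century-long extinction risk
# evenly over 100 years as a constant annual hazard, then ask what the
# implied risk over the next 30 years would be.

century_risk = 0.19                  # survey estimate for the 21st century
century_survival = 1 - century_risk  # 81% chance of surviving the century

# A constant annual survival probability s satisfies s**100 = century_survival
annual_survival = century_survival ** (1 / 100)

thirty_year_survival = annual_survival ** 30
print(f"Implied 30-year survival probability: {thirty_year_survival:.1%}")   # ~93.9%
print(f"Implied 30-year extinction risk:      {1 - thirty_year_survival:.1%}")  # ~6.1%
```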

It is interesting to note that in the 2008 Global Catastrophic Risk Conference survey, super-intelligent AI ties with molecular nanotechnology weapons for the number-one spot. In my view, molecular nanotechnology weapons and super-intelligent AI are two sides of the same coin. In fact, I judge that super-intelligent AI will be instrumental in developing molecular nanotechnology weapons. I also predict that humanity, in some form, will survive until the year 2100. That population, I predict, will include both humans with strong artificially intelligent brain implants and organic humans (i.e., those without brain implants to enhance their intelligence), although members of either group may have some artificially intelligent body parts.

Let me summarize: based on the above information, it is reasonable to judge that humanity will survive through the 21st century.

Will Humanity Survive The 21st Century?

Examples of typical events that most people think could cause humanity’s extinction are a large asteroid impact or a volcanic eruption of sufficient magnitude to cause catastrophic climate change. Although possible, these events actually have a relatively low probability of occurring, on the order of one in fifty thousand or less, according to numerous estimates found via a simple Google search.

However, there are other events with higher probabilities that may cause human extinction. In 2008, experts surveyed at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19 percent chance of human extinction over this century, citing the top five most probable causes of human extinction by 2100 as:

  1. Molecular nanotechnology weapons (i.e., nanoweapons): 5 percent probability
  2. Superintelligent AI: 5 percent probability
  3. Wars: 4 percent probability
  4. Engineered pandemic: 2 percent probability
  5. Nuclear war: 1 percent probability

All other existential events were below 1 percent. There is a subtle point the survey does not explicitly express, namely, that the risk of human extinction increases with time. You may wonder why. To answer this question, consider these examples:

  • Nanoweapons and superintelligence become more capable with the development of each successive generation. In the 2008 Global Catastrophic Risk Conference survey, superintelligent AI ties with molecular nanotechnology weapons as the number-one potential cause of human extinction. In my view, molecular nanotechnology weapons and superintelligent AI are two sides of the same coin. In fact, I judge that superintelligent AI will be instrumental in developing molecular nanotechnology weapons.
  • In my new book, War At The Speed Of Light, I devoted a chapter to autonomous directed energy weapons. These are weapons that can take hostile action on their own, potentially resulting in unintended conflicts. Unfortunately, current autonomous weapons do not embody human judgment. This being the case, wars, including nuclear wars, become more probable as more autonomous weapons are deployed.
  • Lastly, the world is currently facing a coronavirus pandemic. Although most researchers believe this is a naturally occurring pandemic, it has still infected 121,382,067 people and caused 2,683,209 deaths worldwide to date. As the short calculation after this list shows, that puts the death rate at a little over 2 percent. However, if the virus were more infectious and more deadly, it could render the Earth a barren wasteland. Unfortunately, that is what an engineered pandemic might do.
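As a quick check on the “little over 2 percent” figure, the death rate follows directly from the two worldwide totals quoted above; here is a minimal Python calculation:

```python
# Simple check of the case-fatality figure cited above, using the worldwide
# totals quoted in the text at the time of writing.
cases = 121_382_067
deaths = 2_683_209

fatality_rate = deaths / cases
print(f"Case-fatality rate: {fatality_rate:.2%}")  # ~2.21%, i.e., a little over 2 percent
```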

To my eye, the top five potential causes surfaced by the Global Catastrophic Risk Conference at the University of Oxford in 2008 are all possible, and the probabilities associated with them appear realistic. This means that, on our current course, humanity has a 19 percent chance of not surviving the 21st century.

In the next post, I will suggest measures humanity can take to increase the probability that it will survive into the 22nd century.