Category Archives: War At The Speed Of Light


Will Humanity Survive The 21st Century?

Typical examples of events that most people think could cause humanity’s extinction are a large asteroid impact or a volcanic eruption of sufficient magnitude to cause catastrophic climate change. Although possible, these events actually have a relatively low probability of occurring, on the order of one in fifty thousand or less, according to numerous estimates found via a simple Google search.

However, there are other events with higher probabilities that may cause human extinction. In 2008, experts surveyed at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19 percent chance of human extinction over this century, citing the top five most probable causes of human extinction by 2100 as:

  1. Molecular nanotechnology weapons (i.e., nanoweapons): 5 percent probability
  2. Superintelligent AI: 5 percent probability
  3. Wars: 4 percent probability
  4. Engineered pandemic: 2 percent probability
  5. Nuclear war: 1 percent probability

All other existential events were below 1 percent. There is a subtle point the survey does not explicitly express, namely, that the risk of human extinction increases with time. You may wonder, Why? To answer this question, consider these examples:

  • Nanoweapons and superintelligence become more capable with the development of each successive generation. In the 2008 Global Catastrophic Risk Conference survey, superintelligent AI ties with molecular nanotechnology weapons as the most probable potential cause of human extinction. In my view, molecular nanotechnology weapons and superintelligent AI are two sides of the same coin. In fact, I judge that superintelligent AI will be instrumental in developing molecular nanotechnology weapons.
  • In my new book, War At The Speed Of Light, I devoted a chapter to autonomous directed energy weapons. These are weapons that act on their own to take hostile action, potentially resulting in unintended conflicts. Unfortunately, current autonomous weapons don’t embody human judgment. This being the case, wars, including nuclear wars, become more probable as more autonomous weapons are deployed.
  • Lastly, the world is currently facing a coronavirus pandemic. Although most researchers believe this is a naturally occurring pandemic, it has still infected 121,382,067 people and caused 2,683,209 deaths worldwide to date. This suggests the death rate is a little over 2 percent, as the short calculation after this list confirms. However, if the virus were more infectious and more deadly, it could render the Earth a barren wasteland. Unfortunately, that is what an engineered pandemic might do.
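
For readers who want to check that figure, here is a minimal sketch of the arithmetic, using only the worldwide totals cited above:

```python
# A quick check of the death-rate figure cited in the last bullet above.
infections = 121_382_067   # worldwide infections cited above
deaths = 2_683_209         # worldwide deaths cited above

death_rate_percent = deaths / infections * 100
print(f"Death rate: {death_rate_percent:.2f}%")   # ~2.21 percent, i.e., a little over 2 percent
```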

To my eye, the top five potential causes surfaced by the Global Catastrophic Risk Conference at the University of Oxford in 2008 are all possible, and the probabilities associated with them appear realistic. This means that humanity has a 19 percent chance of not surviving the 21st century on our current course.

In the next post, I will suggest measures humanity can take to increase the probability that it will survive into the 22nd century.


US Military Intends To Use Lasers To Defend Against Hypersonic Glide Missiles

In my new book, War At The Speed Of Light, I discuss why the US military is eager to deploy directed energy weapons, such as lasers. One important reason has to do with hypersonic (i.e., five or more times the speed of sound) glide missiles, which no country can currently defend against. Potential US adversaries, like China and Russia, are developing and deploying hypersonic missiles as a means to destroy US aircraft, drones, missiles, aircraft carriers, and space-based assets, such as GPS and communication satellites. To counter this threat, the United States is developing and deploying laser weapons.

However, the development of laser weapons is in its infancy. For example, in December 2014, the United States Navy installed the first-ever 30-kilowatt laser weapon on the USS Ponce. In field testing, the Navy reported that the laser system worked perfectly against low-end asymmetric threats, such as small unmanned aerial vehicles. Following the field tests, the Navy authorized the commander of the Ponce to use the system as a defensive weapon. However, this is just the beginning. The US Navy’s strategy is to develop higher-energy laser systems with the capability to destroy an adversary’s “carrier killer” missiles, as well as other asymmetric threats such as hypersonic missiles.

In January 2018, the Navy contracted Lockheed Martin for two HELIOS (High Energy Laser with Integrated Optical-dazzler and Surveillance) systems, which Lockheed delivered in 2021. These new lasers are capable of producing a 60-kilowatt beam, double the power of the laser weapon deployed on the USS Ponce. The Navy intends to deploy one on the USS Dewey, an Arleigh Burke-class destroyer. The other will be land-based at the White Sands Missile Range in New Mexico for testing. This is an excerpt from Lockheed Martin’s press release:

MOORESTOWN, N.J., JANUARY 11, 2021 – This year, the U.S. Navy will field the first acquisition program to deploy the High Energy Laser with Integrated Optical-dazzler and Surveillance, or HELIOS, a laser weapon system with high-energy fiber lasers for permanent fielding by the U.S. Department of Defense. This will be the only deployed laser system integrated into an operational Flight IIA DDG. This follows the Lockheed Martin (NYSE: LMT) and Navy’s recent demonstration of full laser power in excess of the 60 kW requirement. The scalable laser design architecture spectrally combines multiple kilowatt fiber lasers to attain high beam quality at various power levels.
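
To give a feel for what “spectrally combines multiple kilowatt fiber lasers” means in practice, here is a toy sketch of the arithmetic. The per-module power and combining efficiency below are purely illustrative assumptions on my part, not published HELIOS specifications:

```python
import math

# Toy sketch of spectral beam combining: several kilowatt-class fiber lasers
# are combined into one beam. The module power and efficiency are assumptions.
target_output_kw = 60          # the 60 kW requirement cited in the press release
module_power_kw = 10           # assumed output of each fiber laser module
combining_efficiency = 0.90    # assumed fraction of power retained after combining

modules_needed = math.ceil(target_output_kw / (module_power_kw * combining_efficiency))
combined_output_kw = modules_needed * module_power_kw * combining_efficiency

print(f"Fiber laser modules needed: {modules_needed}")      # 7 under these assumptions
print(f"Combined output: {combined_output_kw:.0f} kW")      # ~63 kW
```

Under these purely illustrative numbers, roughly seven kilowatt-class modules would be combined to exceed the 60-kilowatt requirement; the actual HELIOS module count and combining efficiency are not disclosed in the material quoted here.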

In the 2020s, the US military plans to usher in the widespread use of laser weapons on land, at sea, in the air, and in space. It is reasonable to assume that these new lasers will continue the US military’s thrust to develop and deploy laser weapon systems capable of destroying an adversary’s hypersonic missiles, intercontinental ballistic missiles, drone swarms, and space assets.


Artificial Intelligence Is Changing Our Lives And The Way We Make War

Artificial intelligence (AI) surrounds us. However, in much the same way we seldom read billboards as we drive, we seldom recognize AI. Even though we use technology like our car’s GPS to get directions, we do not recognize that AI is at its core. Our phones use AI to remind us of appointments or engage us in a game of chess. However, we seldom, if ever, use the phrase “artificial intelligence.” Instead, we use the term “smart.” This is not the result of some master plan by the technology manufacturers. It is more of a statement regarding the status of the technology.

From the late 1990s through the early part of the twenty-first century, AI research enjoyed a resurgence. Smart agents found new applications in logistics, data mining, medical diagnosis, and numerous areas throughout the technology industry. Several factors led to this success:

  • The computational power of computer hardware was getting closer to that of a human brain (i.e., in the best case, about 10 to 20 percent of a human brain’s estimated capacity).
  • Engineers placed emphasis on solving specific problems that did not require AI to be as flexible as a human brain.

New ties between AI and other fields working on similar problems were forged. AI was definitely on the upswing. AI itself, however, was not in the spotlight. It lay cloaked within the application, and a new phrase found its way into our vocabulary: the “smart (fill in the blank)”—for example, we say the “smartphone.”

AI is now all around us, in our phones, computers, cars, microwave ovens, and almost any commercial or military system labeled “smart.” According to Nick Bostrom, a University of Oxford philosopher known for his work on superintelligence risks, “A lot of cutting edge AI has filtered into general applications, often without being called AI because once something becomes useful enough and common enough it’s not labeled AI anymore” (“AI Set to Exceed Human Brainpower,” CNN.com, July 26, 2006). Ray Kurzweil agrees. He said, “Many thousands of AI applications are deeply embedded in the infrastructure of every industry” (Ray Kurzweil, The Singularity Is Near: When Humans Transcend Biology [2005]). The above makes two important points:

  1. AI is now part of every aspect of human endeavor, from consumer goods to weapons of war, but the applications are seldom credited to AI.
  2. Both government and commercial applications now broadly underpin AI funding.

AI startups raised $73.4 billion in total funding in 2020 according to data gathered by StockApps.com. Well-established companies like Google are spending tens of billions on AI infrastructure. Google has also spent hundreds of millions on secondary AI business pursuits, such as driverless cars, wearable technology (Google Glass), humanlike robotics, high-altitude Internet broadcasting balloons, contact lenses that monitor glucose in tears, and even an effort to solve death.

In essence, the fundamental trend in both consumer and military AI systems is toward complete autonomy. Today, for example, roughly one in every three US military aircraft is a drone. Today’s drones are under human control, but the next generation of fighter drones will be almost completely autonomous. Driverless cars, now a novelty, will become common. You may find this difficult or even impossible to believe. However, look at today’s AI applications. The US Navy plans to deploy unmanned surface vehicles (USVs) not only to protect Navy ships but also, for the first time, to autonomously “swarm” hostile vessels offensively. In my latest book, War At The Speed Of Light, I devoted a chapter to autonomous directed energy weapons. Here is an excerpt:

The reason for building autonomous directed energy weapons is identical to those regarding other autonomous weapons. According to Military Review, the professional journal of the US Army, “First, autonomous weapons systems act as a force multiplier. That is, fewer warfighters are needed for a given mission, and the efficacy of each warfighter is greater. Next, advocates credit autonomous weapons systems with expanding the battlefield, allowing combat to reach into areas that were previously inaccessible. Finally, autonomous weapons systems can reduce casualties by removing human warfighters from dangerous missions.”

What is making this all possible? It is the relentless exponential growth in computer performance. According to Moore’s law, computer-processing power doubles roughly every eighteen months. Using Moore’s law and simple mathematics suggests that in ten years, the processing power of our personal computers will be more than a hundred times greater than that of the computers we are currently using. Military and consumer products using top-of-the-line computers running state-of-the-art AI software will likely exceed our desktop computer performance by factors of ten. In effect, the artificial intelligence in such top-of-the-line systems may be equivalent to human intelligence. However, will it be equivalent to human judgment? I fear not, and autonomous weapons may lead to unintended conflicts, conceivably even World War III.
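
As a quick check on that arithmetic, here is a minimal sketch, assuming (as stated above) a constant eighteen-month doubling period:

```python
# Minimal sketch of the Moore's-law projection described above,
# assuming processing power doubles every 18 months.
doubling_period_months = 18
horizon_months = 10 * 12                              # a ten-year horizon

doublings = horizon_months / doubling_period_months   # about 6.7 doublings
growth_factor = 2 ** doublings                        # about 102x

print(f"Doublings in ten years: {doublings:.2f}")
print(f"Projected growth in processing power: {growth_factor:.0f}x")
```

Under that assumption, ten years of growth is 2^(120/18), or a little over a hundredfold, consistent with the figure above.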

I recognize this last paragraph represents dark speculations on my part. Therefore, let me ask you, What do you think?


Artificial Intelligence Threatens Human Extinction

While researching my new book, War At The Speed Of Light, I surfaced some important questions regarding the threat artificial intelligence poses to humanity. For example, Will your grandchildren face extinction? Even worse, will they become robotic slaves to a supercomputer?

Humanity is facing its greatest challenge, artificial intelligence (AI). Recent experiments suggest that even primitive artificially intelligent machines can learn deceit, greed, and self-preservation without being programmed to do so. There is alarming evidence that artificial intelligence, without legislation to police its development, will displace humans as the dominant species by the end of the twenty-first century.

There is no doubt that AI is the new scientific frontier, and it is making its way into many aspects of our lives. Our world includes “smart” machines with varying degrees of AI, including touch-screen computers, smartphones, self-parking cars, smart bombs, heart pacemakers, and brain implants to treat Parkinson’s disease. In essence, AI is changing the cultural landscape, and we are embracing it at an unprecedented rate. Currently, humanity is largely unaware of the potential dangers that strong artificially intelligent machines pose. In this context, the word “strong” signifies AI greater than human intelligence.

Most of humanity perceives only the positive aspects of AI technology. This includes robotic factories, like Tesla Motors’ plant that manufactures ecofriendly electric cars, and the da Vinci Surgical System, a robotic platform designed to expand the surgeon’s capabilities and offer a state-of-the-art minimally invasive option for major surgery. These are only two of many examples of how AI is positively affecting our lives. However, there is a dark side. For example, Gartner Inc., a technology research group, forecasts that robots and drones will replace a third of all workers by 2025. Could AI create an unemployment crisis? As AI permeates the medical field, the average human life span will increase. Eventually, strong artificially intelligent humans (SAHs), with AI brain implants to enhance their intelligence and with cybernetic organs, will become immortal. Will this exacerbate the worldwide population crisis already surfaced as a concern by the United Nations? By 2045, some AI futurists predict that a single strong artificially intelligent machine (SAM) will exceed the cognitive intelligence of the entire human race. How will SAMs view us? Objectively, humanity is an unpredictable species. We engage in wars, develop weapons capable of destroying the world, and maliciously release computer viruses. Will SAMs view us as a threat? Will we maintain control of strong AI, or will we fall victim to our own invention?

I recognize that this post raises more questions than answers. However, I thought it important to share these questions with you. In my new book, War At The Speed Of Light, I devote an entire chapter to autonomous directed energy weapons. I surface these questions: Will autonomous weapons replace human judgment and result in unintended devastating conflicts? Will they ignite World War III? I also provide recommendations to avoid these unintended conflicts. For more insight, browse the book on Amazon.


An Extract From the Intro of War At The Speed Of Light

The pace of warfare is accelerating. In fact, according to the Brookings Institution, a nonprofit public policy organization, “So fast will be this process [command and control decision-making], especially if coupled to automatic decisions to launch artificially intelligent autonomous weapons systems capable of lethal outcomes, that a new term has been coined specifically to embrace the speed at which war will be waged: hyperwar.”

The term “hyperwar” adequately describes the quickening pace of warfare resulting from the inclusion of AI in the command, control, decision-making, and weapons of war. However, to my mind, it fails to capture the speed of conflict associated with directed energy weapons. To be all-inclusive, I would like to suggest the term “c-war.” In Einstein’s famous mass-energy equivalence equation, E = mc², the letter “c” denotes the speed of light in a vacuum. [For completeness, E denotes energy and m mass.] Surprisingly, the speed of light in the Earth’s atmosphere is almost equal to its velocity in a vacuum. On this basis, I believe c-war more fully captures the new pace of warfare.
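
To put a rough number on that last comparison, here is a minimal sketch, assuming a sea-level refractive index for air of about 1.0003 (a standard textbook approximation):

```python
# Minimal sketch comparing the speed of light in a vacuum and in air,
# assuming a refractive index for air of about 1.0003 at sea level.
c_vacuum = 299_792_458    # speed of light in a vacuum, m/s (exact by definition)
n_air = 1.0003            # assumed refractive index of air at sea level

c_air = c_vacuum / n_air
reduction_percent = (1 - c_air / c_vacuum) * 100

print(f"Speed of light in air: {c_air:,.0f} m/s")
print(f"Reduction relative to vacuum: {reduction_percent:.3f}%")   # about 0.03 percent
```

In other words, light travels only about 0.03 percent slower in the atmosphere than in a vacuum, which is the basis for the comparison above.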

Unfortunately, c-war, war at the speed of light, may remove human judgment from the realm of war altogether, which could have catastrophic ramifications. If you think this is farfetched, consider this Cold War account, where new technology almost plunged the world into nuclear war. This historical account is from RAND Corporation, a nonprofit institution that helps improve policy and decision making through research and analysis:

Lt. Col. Stanislav Petrov settled into the commander’s chair in a secret bunker outside Moscow. His job that night was simple: Monitor the computers that were sifting through satellite data, watching the United States for any sign of a missile launch. It was just after midnight, Sept. 26, 1983.

A siren clanged off the bunker walls. A single word flashed on the screen in front of him.

“Launch.”

Petrov’s computer screen now showed five missiles rocketing toward the Soviet Union. Sirens wailed. Petrov held the phone to the duty officer in one hand, an intercom to the computer room in the other. The technicians there were telling him they could not find the missiles on their radar screens or telescopes.

It didn’t make any sense. Why would the United States start a nuclear war with only five missiles? Petrov raised the phone and said again:

“False alarm.”

For a few terrifying moments, Stanislav Petrov stood at the precipice of nuclear war. By mid-1983, the Soviet Union was convinced that the United States was preparing a nuclear attack. The computer system flashing red in front of him was its insurance policy, an effort to make sure that if the United States struck, the Soviet Union would have time to strike back.

But on that night, it had misread sunlight glinting off cloud tops.

“False alarm.” The duty officer didn’t ask for an explanation. He relayed Petrov’s message up the chain of command.

The world owes Lt. Col. Stanislav Petrov an incalculable debt. His judgment spared the world a nuclear holocaust. Now, ask yourself this simple question: If those systems Petrov monitored were autonomous (i.e., artificially intelligent), would they have initiated World War III? I believe this is a profound question, and that it is possible to make persuasive arguments on either side. However, would you want to leave the fate of the world to an artificially intelligent system?

I have devoted a significant portion of my career to developing AI for military applications. My experience leads me to conclude that today’s technology cannot replicate human judgment. Therefore, I think an AI system replacing Petrov might have initiated World War III. I also believe US military planners are acutely aware of this and are taking steps to defend the US against such a mishap. As we discussed earlier, their actions could disrupt the doctrine of MAD, which prevents nuclear war via the threat of mutually assured destruction. Some term this “the balance of terror.” If any country were able to disrupt the doctrine of MAD, it would tilt the balance of terror in its favor.


Breaking News: War At The Speed Of Light

 

New Book by Louis A. Del Monte Grapples with US’ Development of Star Trek-like Weapons

 

Directed Energy Weapons and the Future of 21st Century Warfare

 

Minneapolis, Minnesota, March 2, 2021: For many Americans, the idea of laser weapons and force field shields may seem more at home in a Star Trek film than on the battlefield, but the US development and deployment of directed energy weapons is rapidly changing that reality in 21st-century warfare. Louis Del Monte’s new book, War at the Speed of Light (Potomac Books, March 2021), describes the revolutionary and ever-increasing role of directed-energy weapons, such as laser, microwave, electromagnetic pulse, and cyberspace weapons.

As potential adversaries develop hypersonic missiles, missile swarming tactics, and cyberspace weapons, the US military has turned to directed energy weapons for defensive and offensive purposes. In War at the Speed of Light, however, Del Monte argues that these weapons can completely disrupt the fragile compromises that have kept the world safe through the Cold War.

“Directed energy weapons have the potential to disrupt the doctrine of Mutually Assured Destruction, which has kept the major powers of the world from engaging in a nuclear war,” said Del Monte.

Del Monte analyzes how modern warfare is changing in three fundamental ways: the pace of war is quickening, the rate at which weapons project devastation reaches the speed of light, and cyberspace is now officially a battlefield. In this acceleration of combat from “Hyperwar” to “C-War,” an acceleration from computer speed to the speed of light, War at the Speed of Light shows how disturbingly close the world is to losing any deterrence to nuclear warfare.

Book Reviews

  • “Louis Del Monte has given us a fascinating, sophisticated, and at times disturbing tour of the next stage of warfare, in which directed energy weapons inflict damage at the speed of light.  In terms readily accessible to the general public, he describes how weapons that use energy sources such as lasers, microwaves, and electromagnetic pulses have the potential to profoundly change the balance of power and revolutionize the nature of conflict.” Mitt Regan, McDevitt Professor of Jurisprudence, Co-Director, Center on National Security and the Law, Georgetown University Law Center
  • “Louis Del Monte provides a thought-provoking look at the ever-increasing and revolutionary role of directed energy weapons in warfare… Most importantly, Del Monte surfaces the threat that directed energy weapons pose to disrupting the doctrine of MAD (i.e., mutually assured destruction), which has kept the major powers of the world from engaging in a nuclear war.” COL Christopher M. Korpela, Ph.D.

The book is available at bookstores, from Potomac Books, and on Amazon.

Louis A. Del Monte is available for radio, podcast, and television interviews, as well as writing op-ed pieces for major media outlets. Feel free to contact him directly by email at ldelmonte@delmonteagency.com and phone at 952-261-4532.

To request a book for review, contact Jackson Adams by email at jadams30@unl.edu.

About Louis A. Del Monte

Louis A. Del Monte is an award-winning physicist, inventor, futurist, featured speaker, and CEO of Del Monte and Associates, Inc. He has authored a formidable body of work, including War At The Speed Of Light (2021), Genius Weapons (2018), Nanoweapons (2016), and the Amazon charts #1 bestseller The Artificial Intelligence Revolution (2014). Major media outlets, including Business Insider, The Huffington Post, The Atlantic, American Security Today, Inc., CNBC, and the New York Post, have featured his articles or quoted his views on artificial intelligence and military technology.


Lasers Will Dominate The Battlefield By 2030

Every branch of the US military will deploy lasers. In fact, the US Army is building the world’s most powerful laser weapon. Given the pace at which the US military is developing and deploying laser weapons, I predict they will dominate the battlefield by 2030. You can read more about it in my new book, War At The Speed Of Light.


New Book: War At The Speed Of Light

Announcement

My new book, War at the Speed of Light: Directed-Energy Weapons and the Future of Twenty-First-Century Warfare, is a 288-page hardcover being released by Potomac Books on March 1, 2021.

Overview

War at the Speed of Light describes the revolutionary and ever-increasing role of directed-energy weapons (such as laser, microwave, electromagnetic pulse, and cyberspace weapons) in warfare. Louis A. Del Monte delineates the threat that such weapons pose to disrupting the doctrine of Mutually Assured Destruction, which has kept the major powers of the world from engaging in nuclear warfare.

Potential U.S. adversaries, such as China and Russia, are developing hypersonic missiles and using swarming tactics as a means to defeat the U.S. military. In response, the U.S. Department of Defense’s 2018 National Defense Strategy emphasizes directed-energy weapons, which project devastation at the speed of light and are capable of destroying hypersonic missiles, enemy drones, and missile swarms.

Del Monte analyzes how modern warfare is changing in three fundamental ways: the pace of war is quickening, the rate at which weapons project devastation is reaching the speed of light, and cyberspace is now officially a battlefield. In this acceleration of combat called “hyperwar,” Del Monte shows how disturbingly close the world is to losing any deterrence to nuclear warfare.

Editorial Reviews:

  • “A fascinating, sophisticated, and at times disturbing tour of the next stage of warfare, in which directed-energy weapons inflict damage at the speed of light. [Such weapons] have the potential to profoundly change the balance of power and revolutionize the nature of conflict. . . . This book will be an indispensable reference for the kind of political, military, and ethical debate on these weapons that Del Monte strongly urges us to conduct.”—Mitt Regan, McDevitt Professor of Jurisprudence and co-director of the Center on National Security and the Law, Georgetown University Law Center
  • “Louis Del Monte provides a thought-provoking look at the ever-increasing and revolutionary role of directed-energy weapons in warfare. Del Monte’s background in developing advanced integrated circuits and sensors for some of the most advanced military weapons enables him to provide a unique perspective on emerging laser, microwave, electromagnetic pulse, and cyberspace weapons. Most important, Del Monte surfaces the threat directed-energy weapons pose to disrupting the doctrine of Mutually Assured Destruction, which has kept the major powers of the world from engaging in a nuclear war.”—Col. Christopher Korpela, U.S. Army, PhD
  • “Louis Del Monte gives a thorough and well-researched review of the next generation of laser, microwave, electromagnetic pulse, and cyberspace directed-energy weapons. He describes critical risks regarding the marriage of these weapons with artificial intelligence and discusses the ethical issues that result. He presents important comparisons of the capabilities of the United States versus those of Russia and China and other potential adversaries. He points out the tremendous risk for humanity if the world does not safeguard technology and weapons developments as these technologies mature.”—Ed Albers, retired director of engineering at Honeywell

Available

Pre-release copies are available now from Amazon. By March 1, 2021, the book will also be available from Barnes & Noble, as well as your local bookstore.

Order your copy now.