The pace of warfare is accelerating. In fact, according to the Brookings Institution, a nonprofit public policy organization, “So fast will be this process [command and control decision-making], especially if coupled to automatic decisions to launch artificially intelligent autonomous weapons systems capable of lethal outcomes, that a new term has been coined specifically to embrace the speed at which war will be waged: hyperwar.”

The term “hyperwar” aptly describes the quickening pace of warfare resulting from the incorporation of AI into the command, control, decision-making, and weapons of war. However, to my mind, it fails to capture the speed of conflict associated with directed-energy weapons. To be all-inclusive, I would like to suggest the term “c-war.” In Einstein’s famous mass–energy equivalence equation, E = mc², the letter “c” denotes the speed of light in a vacuum. [For completeness, E denotes energy and m mass.] Perhaps surprisingly, the speed of light in the Earth’s atmosphere is almost equal to its speed in a vacuum. On this basis, I believe c-war more fully captures the new pace of warfare.
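As a quick sanity check on that claim, the speed of light in air can be estimated from the refractive index of air, roughly 1.0003 at sea level (an assumed textbook value; the exact figure varies slightly with temperature, pressure, and wavelength):

```python
# Sketch: how close is light speed in air to light speed in a vacuum?
C_VACUUM = 299_792_458   # m/s, exact by the SI definition of the meter
N_AIR = 1.0003           # approximate refractive index of air at sea level

c_air = C_VACUUM / N_AIR          # light speed in air
fraction = c_air / C_VACUUM       # fraction of the vacuum speed retained

print(f"Light speed in air: {c_air:,.0f} m/s ({fraction:.4%} of c)")
```

The result differs from the vacuum value by only about 0.03 percent, which is why the distinction can safely be ignored when describing war waged at the speed of light.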

Unfortunately, c-war, war at the speed of light, may remove human judgment from the realm of war altogether, which could have catastrophic ramifications. If you think this is farfetched, consider this Cold War account, in which new technology almost plunged the world into nuclear war. The account is from the RAND Corporation, a nonprofit institution that helps improve policy and decision making through research and analysis:

Lt. Col. Stanislav Petrov settled into the commander’s chair in a secret bunker outside Moscow. His job that night was simple: Monitor the computers that were sifting through satellite data, watching the United States for any sign of a missile launch. It was just after midnight, Sept. 26, 1983.

A siren clanged off the bunker walls. A single word flashed on the screen in front of him.

Petrov’s computer screen now showed five missiles rocketing toward the Soviet Union. Sirens wailed. Petrov held the phone to the duty officer in one hand, an intercom to the computer room in the other. The technicians there were telling him they could not find the missiles on their radar screens or telescopes.

It didn’t make any sense. Why would the United States start a nuclear war with only five missiles? Petrov raised the phone and said again:

“False alarm.”

For a few terrifying moments, Stanislav Petrov stood at the precipice of nuclear war. By mid-1983, the Soviet Union was convinced that the United States was preparing a nuclear attack. The computer system flashing red in front of him was its insurance policy, an effort to make sure that if the United States struck, the Soviet Union would have time to strike back.

But on that night, it had misread sunlight glinting off cloud tops.

“False alarm.” The duty officer didn’t ask for an explanation. He relayed Petrov’s message up the chain of command.

The world owes Lt. Col. Stanislav Petrov an incalculable debt. His judgment spared the world a nuclear holocaust. Now, ask yourself this simple question: If the systems Petrov monitored had been autonomous (i.e., artificially intelligent), would they have initiated World War III? I believe this is a profound question, and that persuasive arguments can be made on either side. However, would you want to leave the fate of the world to an artificially intelligent system?

I have devoted a significant portion of my career to developing AI for military applications. My experience leads me to conclude that today’s technology cannot replicate human judgment. Therefore, I think an AI system in Petrov’s place might have initiated World War III. I also believe US military planners are acutely aware of this and are taking steps to defend the US against such a mishap. As we discussed earlier, their actions could disrupt the doctrine of mutually assured destruction (MAD), which prevents nuclear war via the threat of overwhelming retaliation. Some term this “the balance of terror.” If any country were able to disrupt the doctrine of MAD, it would tilt the balance of terror.