How America's Generals Can Learn from Their Errors



Can they do it?

Downplaying the possibility of future loss in war games and military planning is hazardous.

SECRETARY OF Defense James Mattis reportedly said: “I don’t lose any sleep at night over the potential for failure. I cannot even spell the word.” To paraphrase Trotsky, American generals may not be interested in failure, but failure is interested in them. In recent decades, the United States has suffered a number of stalemates and defeats in Vietnam, Iraq and Afghanistan. Despite the recurrent experience of military fiascos, there is a puzzling discrepancy in how U.S. officials think about past versus future loss. When leaders learn from historical cases, debacles often loom large and powerfully shape policy. But when officials plan prospective operations, they tend to neglect the possibility of disaster. As a result, military planners focus too much on avoiding a repeat of prior reversals, and not enough on the possibility that the new strategy will itself unravel.

One solution is to take inspiration from the business realm, where the school of “intelligent failure” encourages a healthier relationship with loss. By adopting the right set of tools, the military can become more adaptable and resilient.

CIVILIAN AND military officials in the U.S. national security community tend to learn more from past failure than from success. By failure we mean military operations that did not achieve the intended core aims, or where the balance sheet was skewed toward costs rather than benefits. Leaders, for example, often draw historical analogies with prior policies to clarify the strategic stakes in a current issue or to suggest the optimum path forward. Strikingly, these analogies are overwhelmingly negative (do not repeat past errors), rather than positive (copy past successes). No more Munichs. No more Vietnams. No more Iraqs. And so on.


What’s more, failure is the primary catalyst for organizational or doctrinal change. It often takes a fiasco to delegitimize standard procedures. For example, America’s negative experience in Vietnam in the 1960s and 1970s, as well as in Lebanon in 1982–1984, spurred the Weinberger-Powell doctrine, which outlined a set of principles to assess the wisdom of military operations. More recently, the desire to avoid a repetition of the Iraq War lay at the core of the Obama doctrine.

The tendency to learn more from failures than successes is rooted in what psychologists call “negativity bias”: a core predisposition of the human brain in which bad is stronger than good. Negative factors loom larger than positive factors in almost every realm of psychology, including cognition, emotion and information processing, as well as memory and learning. Bad events are recalled more easily than good events, prompt more intense reflection and “why” questions, and have a much more enduring impact. “Prosperity is easily received as our due, and few questions are asked concerning its cause or author,” observed David Hume. “On the other hand, every disastrous accident alarms us, and sets us on enquiries concerning the principles whence it arose.”

Recent failures are especially salient because of the “availability heuristic,” where people recall vivid events that just happened, and then mistakenly think these events are representative or likely to reoccur. For example, the purchase of earthquake insurance tends to increase immediately after an earthquake and then drop off as people forget the disaster.

GIVEN THAT past failure is salient in memory and learning, we might expect that planning for future military operations would also highlight the possibility of loss. But, in fact, the opposite happens. When considering prospective uses of force, officials tend to downplay the possibility of disaster and focus instead on taking the first steps toward victory. Put simply, past failure is illuminated in bright lights whereas future failure is hidden.

U.S. military war games, for example, often neglect the potential for loss. A 1971 review of education in the U.S. Army found that war games and other exercises were “generally euphoric in nature—the U.S. Army always wins with relative ease.” By 2001, the war games were more sophisticated, but the outcome was the same. According to a study by Robert Haffa and James Patton, “the good guys win convincingly and no one gets hurt.”

When war games do provide a cautionary tale, the results may simply be ignored. In the early 1960s, before the United States sent ground troops to Vietnam, the Pentagon ran a series of war games called SIGMA to simulate a multiyear U.S. campaign in Southeast Asia. Chairman of the Joint Chiefs Maxwell Taylor led the communist side to a crushing victory. Despite this outcome, the United States pressed ahead with intervention in Vietnam, and Taylor remained confident that the United States would win—perhaps because the communists would lack the benefit of his leadership.

Preparation for real war may also neglect the possibility of failure. Planning for the Iraq War, for example, was overly optimistic about the stabilization phase and downplayed the risks of disorder or insurgency. The special inspector general for Iraq reconstruction concluded that “when Iraq’s withering post-invasion reality superseded [official] expectations, there was no well-defined ‘Plan B’ as a fallback and no existing government structures or resources to support a quick response.”

Why do officials downplay the possibility of future failure? Psychologists have found that mentally healthy people tend to exhibit overconfidence, exaggerating their perceived abilities, their control over events and their chances of upcoming success. Positive illusions in war games and military strategy are consistent with the well-established “planning fallacy,” or the tendency to adopt optimistic estimates of the time and costs required to complete future projects. The Sydney Opera House was supposed to cost AUD$7 million and be completed in 1963, but it was actually finished a decade late at a cost of AUD$102 million. Interestingly, people tend to be too optimistic about the success of their own projects, but more realistic when assessing other people’s projects.

Overconfidence also varies with national culture. Americans are particularly prone to positive illusions because self-confidence, a “can-do spirit” and winning are all valued traits in the United States. On the eve of D-Day, General George Patton told U.S. soldiers that “the very idea of losing is hateful to an American.” Studies suggest that Americans are more likely than Chinese people to believe they can control the environment and actualize their plans. Henry Kissinger noted that U.S. officials tend to see foreign policy problems as “soluble,” and pursue “specific outcomes with single-minded determination.” By contrast, Chinese officials are comfortable handling extended deadlock, and “believe that few problems have ultimate solutions.”

U.S. military culture reinforces overconfidence by lauding success and stigmatizing failure. At the heart of the military’s ethos is a commitment to achieve the mission. As Douglas MacArthur put it: “There is no substitute for victory.” Colin Powell declared that “perpetual optimism is a force multiplier.” James Stavridis, the former Supreme Allied Commander Europe, told me, “U.S. military culture is not particularly compatible with failure planning.”

Men also tend to be more overconfident than women. In a wide variety of domains, from salary negotiations to performance on game shows, men are more positive about their own skills, quicker to congratulate themselves for success and more likely to ignore external criticism in self-evaluations.

High-stakes issues can exacerbate overconfidence. Rationally, as the potential costs of a decision rise, greater consideration ought to be paid to the risks. But this is not always the case in planning for the use of force. At the operational level—for example, when refueling a ship—the U.S. military focuses on potential hazards through a process known as Operational Risk Management. But at the strategic level, the willingness to confront scenarios of failure may actually decline. Once officials choose to place American lives on the line, identifying flaws in the plan can trigger cognitive dissonance and a desire to avoid second-guessing. According to the journalist George Packer, any U.S. official who raised problems with the Iraq invasion plan risked “humiliation and professional suicide.”

In addition, downplaying the potential for future failure may serve organizational interests. The armed services compete for resources, prestige and autonomy, which encourages them to confidently predict they can handle any mission.

These various causes of overconfidence matter because U.S. national security officials may check most or all of the boxes: they are often male, American and in the military, with organizational incentives to play up the odds of success, and they face high-stakes issues.

Overconfidence can also be retrospective. In general, historical debacles loom large, but there is an exception when people consider their own personal responsibility for failure. To protect their self-image (and their image in the eyes of others), people often adopt hagiographic autobiographies, downplay their role in causing negative outcomes and blame external forces. The self-serving “attribution bias” in psychology predicts that people explain triumphs in terms of their own talents and wise strategy, and disasters in terms of environmental factors and random events beyond anyone’s control.

IN SUMMARY, historical debacles are highly salient and preparation for the use of force is often based on avoiding the last big mistake—with the caveat that any personal role in loss is downplayed. At the same time, the potential for future failure tends to be discounted in war games and military planning. Therefore, as the temporal focus shifts from past to future, the salience of loss declines.