I was originally going to post about a book that Andrew Krepinevich wrote in 2009: 7 Deadly Scenarios: A Military Futurist Explores War in the 21st Century. Chapter 3 of the book was entitled “Pandemic.” While the cause of the pandemic in Mr. Krepinevich’s book was a variation on influenza, instead of a coronavirus, many of the effects were eerily the same:
“The summer movie season has been washed away by the government’s ban on public gatherings. Meanwhile, baseball … is experiencing a ‘silent season.’ Major league baseball tried for a few days to play televised games in empty stadiums, then cancelled its schedule when several players came down with pandemic influenza ….”1
“In many parts of the country hospitals are collapsing under the strain, as ever greater numbers of Americans succumb to the virus.”2
“[D]eveloping a vaccine to deal with a novel strain of influenza can take nearly a year.”3
Which gets me to an article Tim Harford wrote for the Financial Times: “Why we fail to prepare for disasters.”4 Mr. Harford writes about all kinds of disasters for which we have had warnings and still failed to prepare. One of the more dramatic was the hurricane that almost hit New Orleans eleven months before Hurricane Katrina. Hurricane Ivan was on course to smash into New Orleans in September of 2004. The mayor urged people to leave. There were no emergency shelters, and the Superdome wasn’t viewed as hurricane-proof. But then, the hurricane changed course. As Mr. Harford says, Hurricane Ivan “had demonstrated the need to prepare, urgently and on a dozen different fronts, for the next hurricane, but New Orleans didn’t.”
A failure which is depressingly common. Mr. Harford identifies several reasons we fail to plan, or plan inadequately. One is “normalcy bias” or “negative panic”: we are slow to recognize danger, or confused about how to respond, so we do nothing. And since we don’t see others doing anything, we don’t do anything either.
Then there is “optimism bias.” People tend to be overly optimistic about their chances of coming through unharmed. They don’t think they will get sick, or that the hurricane, if it hits, will damage their property. Howard Kunreuther and Robert Meyer describe this tendency in a book aptly called The Ostrich Paradox.
Next is “exponential myopia.” This one is particularly relevant in pandemics. People simply don’t grasp the exponential part of exponential growth. They understand fast, and maybe even really fast, but exponential is beyond them, so they don’t appreciate the danger.
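A toy calculation makes the point concrete. The numbers below are illustrative assumptions, not epidemiology: suppose 100 cases and a hypothetical doubling time of three days. In a single month, the case count grows more than a thousandfold.

```python
# Illustrative sketch of exponential growth (assumed, not real, parameters):
# 100 initial cases, doubling every 3 days.
cases = 100
doubling_days = 3

for day in range(0, 31, doubling_days):
    print(f"day {day:2d}: {cases:>9,} cases")
    cases *= 2
```

By day 30 the hypothetical count has passed 100,000 — which is why a threat that looks manageable in week one can overwhelm hospitals by week four.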
Finally, there is “wishful thinking.” This may be the most powerful reason for not planning. Instead of Hurricane Ivan being a warning for New Orleans to plan for the next hurricane and SARS being a warning for the U.S. to plan for a future pandemic, the fact that Hurricane Ivan didn’t hit New Orleans and that SARS was not a major problem in the United States became a reason not to prepare. If Ivan turned away, why wouldn’t Katrina? SARS wasn’t a problem. Why do we need to worry about the next one?
I personally find this happening with winter weather forecasts in Chicago. Whenever a winter storm is approaching, or even may be approaching, the weather people on TV and radio start talking about how bad it could be. Their scary reports are all over the media; it’s all they talk about on the radio. Then, when the storm doesn’t happen or isn’t so bad, which seems to happen a lot, you wonder whether it was ever likely to be bad or whether it was just hype to get people to listen.5 Or at least you think that way until the time it actually does snow a lot, at which point I am surprised, because I had come to rely on the storm not happening. The weather forecasters cried “wolf” so many times that I wasn’t expecting the wolf, or in this case the snow, when it finally did come and I had a lot of shoveling to do.
I don’t know which of these reasons is why we weren’t prepared for Covid-19, though it could be all of them. In any case, it seems likely the United States will be better prepared for the next big pandemic. One of the reasons Taiwan, South Korea, and others were better prepared than we were for Covid-19 is that they were harder hit by SARS and MERS. They learned from their lack of preparation for those epidemics, so this time they were ready.6 Of course, this assumes the next pandemic happens reasonably soon. Preparation may dissipate over time, as Tim Harford noted:
“In 2006, Arnold Schwarzenegger – then governor of California – announced an investment of millions of dollars in medical supplies and mobile hospitals to deal with earthquakes, fires and particularly pandemics. …
It was impressive. But after a brutal recession, Schwarzenegger’s successor, Jerry Brown, cut the funding for the scheme, and the stockpile is nowhere to be found.”7
In terms of preparation, there is also the problem of “fighting the last war.” After World War I, the French built the Maginot Line to stop the kind of invasion the Germans launched in 1914. So, in 1940, the Germans simply went around it. After the 2008-09 financial crisis, we built up banking reserves and ignored the health system. In 2020, we got a health emergency.
Of course, focusing on the last war isn’t the only risk. After Vietnam, the U.S. military decided it didn’t want to fight another war like Vietnam, so it set aside the lessons it had learned and went about preparing for a “big tank war” in Europe. That seemed prescient when the next foe was Saddam Hussein’s Iraq in the 1991 Gulf War: big tanks won that war. But then, when the 2003 invasion of Iraq led to the Sunni insurgency, David Petraeus et al. had to relearn the lessons we had forgotten from Vietnam.
Which brings me back to 7 Deadly Scenarios. Mr. Krepinevich’s book explores scenarios for war in the 21st century. But the ideas in the first and last chapters of the book, which look at planning for the future, are of value for other areas, too.
In the Introduction to his book, Mr. Krepinevich talks about Andrew Marshall, whom he calls a legend “among those in the strategic studies world whose business it is to think about the future, and to identify the dangers that may threaten the security of the American people and their way of life”8:
“Marshall has endured because he has tried to help senior Defense Department officials, and the secretary of defense in particular, understand how to see the future rather than what to think or do. … His role has been to diagnose the emerging security environment. …
Diagnosing a disease correctly is indispensable to determining the appropriate treatment. … [A]s Marshall puts it, ‘I’d rather have decent analysis in response to the right set of questions than brilliant analysis focusing on the wrong set of questions.’ …
One of Marshall’s principal tools for preparing for future threats while they are still dark clouds on the distant horizon rather than storms that are already upon it is the use of scenarios – stories about how future events might come to pass.”9
Mr. Krepinevich wrote the scenarios in his book along the lines of Andrew Marshall’s scenarios. While those scenarios were focused on defense policy, this same kind of thinking can be used in non-defense areas. Would scenarios in the health field have helped us better identify weaknesses in our health sector and what we needed to do to better prepare for Covid-19? Could a Treasury Department futurist have come up with scenarios in 2005 that would have enabled us to see the risks of the Global Financial Crisis before 2008 and gotten us to do something to avoid or at least be better prepared for the GFC?
The answer to those questions is “yes – but.” The problem is, as Tim Harford points out, that even when we have good reasons and good examples to prepare for the future, we too often don’t. Maybe it’s normalcy bias. Maybe it’s optimism bias or wishful thinking. But that doesn’t mean there aren’t things to learn from developing such scenarios. In some cases, with the right leaders, properly done scenarios can demonstrate our weaknesses and/or the possible consequences of an emergency in a way that just-missed examples don’t. Scenarios can bring an immediacy and a spur to action that we don’t get from something that almost happened but didn’t.
But in responding to such scenarios, or near-misses (if you are smart enough6), the key is not to respond too narrowly. The problem with the future is that it is hard to predict. There are lots of risks, but they won’t all happen and we can’t know which will and which won’t. All we can know is that some will. Therefore, our preparations need to be broad-based, so we can respond to whatever black swan flies in.
Let me suggest three things we need to have in order to deal with the emergencies we can’t predict.10 First, we need systems that are robust. Money and resources need to be set aside so we have the capabilities to deal with emergencies if and when they happen.11 And when the emergency is over, we must build up our robustness, i.e., our reserves and systems, again.
Of course, maintaining robustness may mean that at times we will have set aside resources that aren’t used because things don’t happen. Some people may say that, in doing so, we wasted money. But that is not true. Just because we don’t collect on an insurance policy, doesn’t mean the premiums were wasted.
Second, we need resilience – in government, the private sector, and people. We need to be able to bounce back when an emergency hits. It is like one of those inflatable clowns. You hit it and it comes right back up.
Third, we need to be nimble. Nimbleness (is “nimbility” a word?) means that we are positioned to respond to emergencies when they occur. Another word for this might be flexibility. We do not know what is going to happen, but we are prepared to react promptly regardless of what happens. We can go this way or that. Do this thing or something else. Obviously, this is not what big bureaucracies, government especially but private ones too, are good at, so we need to work hard to be nimble. Maybe it means working to instill the spirit of “nimbility” in big bureaucracies. Or maybe it means having a small group ready to jump in right away, with bigger groups to follow.12
Robustness. Resilience. Nimbleness. These are what we need if we are to be ready for the emergencies and disasters that will happen, though we don’t know which ones or when. Most of all, however, we need leaders who understand the importance of these qualities and who can provide the leadership to help us secure and maintain them.
--------
1 Andrew Krepinevich, 7 Deadly Scenarios: A Military Futurist Explores War in the 21st Century (2009), p. 94.
2 Krepinevich, p. 97.
3 Krepinevich, p. 109.
4 Tim Harford, “Why we fail to prepare for disasters,” Financial Times, April 18, 2020.
5 Or because there is nothing else to talk about.
6 Smart people learn from their mistakes. Really smart people learn from other people’s mistakes.
7 Tim Harford, “Why we fail to prepare for disasters,” Financial Times, April 18, 2020.
8 Krepinevich, p. 8.
9 Krepinevich, pp. 9-10.
10 Emergencies that we can predict and that happen on some kind of regular basis aren’t really emergencies, at least of the kind I am talking about here. They’re daily life, at least for society as a whole.
11 The ultimate case of a lack of robustness would be Illinois. Illinois had months and months of unpaid bills backed up and either $137 billion or $240 billion of unfunded pension liabilities (depending on whether you believe Illinois or Moody’s Investors Service) before the coronavirus crisis hit – because Illinois politicians like making promises and spending money, but they don’t like raising the taxes to pay for them. That is the definition of a lack of robustness.
12 In addition, and if possible, systems should be, in the words of Nassim Nicholas Taleb, “antifragile.” Antifragility goes beyond resilience and robustness. Antifragility means the ability to adapt to, and even thrive in, disorder. Resilience absorbs shocks and preserves the system; something that is antifragile gets better when shocked or in an emergency.