Understanding Risk: A Core Competency of Leaders

Take this simple test: which of the following has the greater risk of causing death in North America today? Being eaten by a shark or being killed by falling airplane parts? Being poisoned or having tuberculosis? Having leukemia or having emphysema? Dying from homicide or from suicide? Dying from all accidents combined or from a stroke?

Although the second cause in each pair is far more common, the vast majority of people choose the first. You are thirty times more likely to die from falling airplane parts than from a shark attack. And while most people think we are twice as likely to die from an accident as from a stroke, we are in fact forty times more likely to die of a stroke. (I am indebted to Thomas Kida for this data.)

One of the great paradoxes of our time is that we are healthier, wealthier, wiser and longer-lived than ever before, yet we are increasingly afraid of the unpredictability and complexity inherent in an exponentially changing world, and of the risks we associate with that uncertainty.

We confront a modern world full of risks (and opportunities) with a prehistoric brain, one wired primarily for fight or flight. When faced with fear and risk-reward choices, the amygdala (a subcortical structure central to emotional processing) reacts before the prefrontal cortex (the seat of reason and judgement) has had time to examine and assess, especially when those decisions are couched in unknowns. In neurological terms, much of our risk assessment occurs in the brain’s emotional, dopaminergic circuitry rather than in its deliberative machinery.

Beyond the physiological basis of risk appraisal, our brains are poorly equipped to evaluate uncertainty and discontinuity for a variety of other reasons. In short, our brains are both efficient and “lazy”: they rely on heuristics, or mental short-cuts, to simplify complicated issues. These heuristics include the attraction of similarity (if a phenomenon does not fit a pre-existing category, we add or subtract information to make it fit), stereotyping, availability (the ease with which examples come to mind) and anchoring (the disproportionate influence of initial information on our subsequent judgments).

In addition to our penchant for simplifying and categorizing in order to “make sense” of information, we tend to worry about possibilities while ignoring probabilities. The greater the fear, the less we calculate the odds of the feared event actually happening.

Some people, for example, have a morbid fear of flying, preferring instead to drive. About 44,000 people die annually in car accidents, compared with around 200 who perish in aviation incidents (indeed, more people drown in their own bathtubs each year – around 325 – than die from flying). While the odds are clearly stacked against drivers, the cognitively debilitating phobia associated with flight prevents them from computing the actual probability of death. Following the terrorist attacks of 9/11, many people stopped flying. Actuarial data show that an additional 1,595 people died in car accidents as a consequence of that choice. By one analysis, before your chance of being in a plane crash exceeded 50%, you would have to fly on an airline every day for 18,000 years.
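The 18,000-year figure can be sanity-checked with a short calculation. Note that the per-flight crash probability used below (about 1 in 9.5 million) is an assumed value chosen to be consistent with that figure, not a statistic quoted in the article:

```python
import math

# Assumed per-flight fatality probability (hypothetical value, picked so
# the result matches the article's "18,000 years" claim; real estimates
# vary by era and airline).
p_crash = 1 / 9_500_000

# Number of independent daily flights needed before the cumulative
# probability of at least one crash exceeds 50%:
#   1 - (1 - p)^n >= 0.5   =>   n >= ln(0.5) / ln(1 - p)
n_flights = math.log(0.5) / math.log(1 - p_crash)
years = n_flights / 365

print(round(years))  # roughly 18,000 years of daily flying
```

The key point is that per-flight risk compounds so slowly that even a lifetime of daily flying leaves the cumulative probability of a crash tiny.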

There are countless other examples of our inability to assess risk rationally. A few years ago, despite near-worldwide fear of avian flu, no one died of the disease in North America, yet the common flu kills about 40,000 people a year. While 18 people died of West Nile Virus in 2001, 875 Americans choked to death on their food. Far more people die on beaches from falling into sand-castle holes (about 16 per year) than from shark attacks. And while humans kill 26 million sharks annually, the chance of being killed by a shark today is about 1 in 280 million.

Since you are unlikely to be confronted by a shark in your lifetime, consider instead the risk associated with choosing a partner – something almost every person contemplates in early adulthood. Some 94% of Canadians believe their marriage to that special someone will last the rest of their lives, yet in fact 52% of marriages end in divorce. The point is simply that good data can bring needed perspective to evaluating risk.

Nor are we much better at predicting the occurrence of risk. In his book Expert Political Judgment, Philip Tetlock analyzed 82,361 predictions made by acknowledged, well-established experts in their fields and found their accuracy rate to be well below 50%. Across that many predictions, flipping a coin would actually outperform taking their expertise at face value. (Tetlock also notes that when these experts were asked to recall their predictions, they revised them closer to what actually happened.)

Accurately predicting the movement of the market and managing risk for investors, despite what the gurus might tell you, is little more than a mug’s game. Yet there are over 200,000 investment advisers in North America today who try to convince us it can be done. Again, consider the evidence: the proportion of funds that have performed in the top 50% for four years running is 4% – less than what chance alone would predict.
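The chance-alone baseline here is easy to compute: if landing in the top half of funds in any given year were pure luck – a 50/50 coin flip – the share of funds doing so four years in a row would be 0.5 to the fourth power:

```python
# If finishing in the top half of performers each year were pure luck
# (a 50/50 coin flip), the share of funds that would do so four years
# in a row is:
p_top_half_by_chance = 0.5 ** 4
print(f"{p_top_half_by_chance:.2%}")  # 6.25%

# The observed figure of 4% sits below this luck-alone baseline,
# which is consistent with chance rather than persistent skill.
```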

Regardless of the expertise on offer, and whether the analysis is based on charts, “market fundamentals” or other technical criteria, no adviser has ever produced consistently above-average returns. Indeed, 46 of 48 economic forecasts from America’s major forecasters and think tanks over a 25-year period failed to predict the economy’s major turning points. This is because the market is a psychological soup of fear, greed, hope and superstition, all of which generates chaos and complexity – and those are inherently unpredictable.

So how can today’s leaders become more adept at predicting, assessing and managing risk? The answer lies in enhancing one’s risk intelligence: reducing uncertainty by making strategic choices based on knowledge developed through observation, exploration, learning and sharing. While some risks are learnable – principally a matter of identifying and closing knowledge gaps – others are random or indeterminate, in which case no amount of knowledge will reduce the uncertainty. (The antidote to that situation will be covered in a future article.)

George Day and Paul Schoemaker (Peripheral Vision: Detecting the Weak Signals That Will Make or Break Your Company. Harvard Business School Press, 2006) claim that “97% of executives today lack an early warning system for detecting high impact surprises.” Indeed, few organizations pay sufficient attention to building such an early-detection capability – one that enables key decision makers to identify, interpret and act on the signals that connote either risk or opportunity. Mark Penn’s research and counsel are instructive on this point (Microtrends: The Small Forces Behind Tomorrow’s Big Changes. Twelve, 2007).

To build a risk intelligence capability in your organization, find ways to use more of what people know and to give people more useful things to know – knowledge increases when widely shared. Allow people free time to think, minimize meaningless bureaucracy, create information-sharing networks (not hierarchies or silos) and incentivize high performance. Moreover, seek to gather, then synthesize, the kind of information that enables you to answer questions like those David Apgar suggests (Risk Intelligence: Learning to Manage What We Don’t Know. Harvard Business School Press, 2006):

  • Do we have enough relevant experience in the area of our specific challenge or decision?
  • Are the information and insights we possess sufficiently deep to give us the perspective we need?
  • Are our sources of information and our experiences sufficiently diverse and objective?
  • Have we broadened our information inputs through external partners and networks?
  • Have we methodically captured and assessed what we know (and don’t know)?
  • Have we audited and prioritized our risks to our complete satisfaction?
  • Can we build in shock absorbers and contingencies to minimize damage?
  • Can we minimize or distribute the risks (to others or through resource sharing)?
  • Are the risks known by significant stakeholders, shareholders or parties of direct interest? 

Like just about every other leadership capability, anticipatory prowess is a skill that can be developed and strengthened. You do it by listening to the mavericks and the complainers (without unnecessarily energizing them), by being insatiably curious and asking “horizon” questions, by harvesting the knowledge of departing intellectual capital, by probing the minds of key users and influencers, and by constantly testing, challenging and experimenting with your “antennae” raised.