How to Mitigate Bias

Bias is the crutch our brains use to simplify our lives and keep us happy. We like to believe we’re in control of our brains and that we act rationally in our own self-interest. Nothing could be further from the truth. Our brain has a mind of its own, and much of neuroscience is devoted to unravelling its mysteries. What we do know is that most of our behaviour is driven by subconscious cognitive tendencies and by a reliance on imperfect mental shortcuts known as heuristics. While these shortcuts keep the brain efficient, they also cause most of the misfortune in our lives.

If you do believe you’re an exception to this mental and emotional impairment, think again. The mother of all biases is the belief we’re not biased. We tend to think we update our knowledge with the most recent and accurate information. But how many things do we fervently believe that aren’t actually true? The time in which we now live is testimony to that reality. What happens when our entrenched beliefs are discredited by the latest scientific evidence or when a seemingly reliable account from a trusted source is later retracted? Do we change our minds that easily? (Think anti-vaxxers for starters.)

In our search for certainty, we look for coherence rather than accuracy. If it makes sense to us, we readily embrace it as fact. When information we deem credible supports our understanding of a story’s cause and effect, it becomes sticky. So we accept it as the truth even when we’re subsequently told it’s not. We rarely bother to seek out counterarguments that might disconfirm what we believe is real, preferring instead the comfort of our existing knowledge – however misinformed it may be.

In my research over the years (this topic underscores much of my teaching), I’ve identified at least 35 different hard-wired cognitive impairments or mental distortions that severely impoverish our thinking. Some affect leaders – my primary focus – in particularly unhealthy ways. One has been dubbed the narrative fallacy: a strong desire to impose an explanation on random events so as to bind a limited number of disparate facts into a convenient or plausible theory. This bias particularly haunts those CEOs who hold dear the “predict, command and control” model of leadership.

For illustrative purposes, let’s examine one of these heuristics more closely. William James, deemed by many the father of psychology, once said: “The attention we give to something is directly proportional to what we remember about it.” This is called the availability bias. The more available, shocking, unusual or trivial the information, the more important we may consider it to be. It’s simply too much work for our lazy (albeit efficient) thinking machine to comb through every piece of data that, taken together, would make up the full picture.

Hence, we frequently overestimate the significance of what is immediately in front of us, underestimate the likelihood of quite probable events that contradict our expectations, and form judgements based on hearsay rather than on further investigation – what might be called thinking on all cylinders. We prefer engaging narratives and fairy tales over facts, pictures over data, and personal experiences over objective research. Anything that makes something easier to understand or remember increases its influence on our lives. In assessing risk, for example, we’re handicapped by an indolent brain that relies on mental rules of thumb to simplify highly complicated issues.

Daniel Kahneman, the Nobel laureate widely regarded as the originator of behavioural economics, writes in his classic treatise Thinking, Fast and Slow that “People tend to assess the relative importance of issues by the ease with which they are retrieved – and this is largely determined by the extent of their coverage in the media.” The media doesn’t report the news so much as decide for us what is news. It frames these short stories in its own ideological way, repeatedly portraying rare incidents as everyday occurrences when, in fact, they may be inconsequential risks for most of us.

Unusual events receive our attention more than commonplace ones. We prefer driving to taking a plane because the dangers of the latter are more newsworthy and therefore more memorable. Although flying is 100 times safer than driving, 30% of us suffer from aerophobia (fear of flying). Statistically, we’re more likely to die from being kicked by a donkey or from drowning in the bathtub than in a plane crash. And falling out of bed kills more than 600 people in the U.S. every year. But news of air disasters is far more readily available.

To match the annual death toll on the roads, a fully loaded jumbo jet would have to crash every day. You’d have to fly daily for 18,000 years before your chance of being in a plane crash exceeded 50%. But, unmindful of objective actuarial data, our brains think otherwise. Mountains of studies indicating something is harmful (like the pandemic) don’t convince everyone to deal with it, especially if those dangers aren’t considered real, haven’t been personally experienced or run counter to the views of those we trust.
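For readers who like to see the arithmetic, here is a minimal sketch of where a figure like that can come from, assuming (purely for illustration – this number is not from the article) a one-in-ten-million chance of being in a fatal crash on any single flight:

```python
import math

# Illustrative assumption, not a figure from this article: roughly a
# one-in-ten-million chance of being in a fatal crash on any single flight.
p_per_flight = 1e-7

# Probability of at least one crash across n independent flights:
#   P(n) = 1 - (1 - p)^n
# Setting P(n) = 0.5 and solving for n:
n_flights = math.log(0.5) / math.log(1 - p_per_flight)
years_of_daily_flying = n_flights / 365

print(f"Flights needed for a 50% chance: {n_flights:,.0f}")
print(f"Years of flying once per day:    {years_of_daily_flying:,.0f}")
```

With that assumed rate, the calculation lands at roughly 19,000 years of daily flying – the same order of magnitude as the 18,000-year figure cited above.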

We are what we remember, and our memories shape how we perceive. We remember what we want to believe and what our emotions inspire us to want to happen. We are overly influenced by the source of the information we receive and by the frequency of our exposure to it. But there’s no connection between our memories of the past and what might happen in the future; in truth, the opposite is often the case. Overestimating the risk of unlikely events is an enormous waste of our irreplaceable time and limited resources, and it can prevent us from focusing on what does matter.

It’s virtually impossible to eliminate our biases. They’re hard-wired. We can only try to better understand them, then mitigate their negative effects on our thinking. Even that is difficult. A 2019 meta-analysis (495 studies) found that training in implicit bias recognition involving race and gender does not change behaviour. In some cases, it backfires, leading to even more discrimination. Fewer than 10% of such training programs provide strategies for reducing bias; the rest focus on raising awareness of its existence and manifestations.

Here are a few ways to mitigate this bias. When evaluating the probability of what could happen, consider historical data. Research prior events similar to the one you fear. The odds of being killed by a shark are about one in 280 million (although some 26 million sharks are killed annually by humans). Falling candy or pop machines kill more people than sharks do. The dreaded avian flu resulted in zero deaths, while the common flu caused 40,000 in the same year. The antidote to most biases is to engage our natural skepticism by seeking out credible disconfirming data. Then think about it … and decide for yourself.
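As a small sketch of that habit – the figures below are placeholders, not statistics from this article – a crude base rate is just a historical death count divided by the population exposed:

```python
def annual_odds(deaths_per_year: float, population: float) -> float:
    """Rough per-person annual probability of dying from a given cause."""
    return deaths_per_year / population

# Placeholder numbers for two hypothetical causes; substitute real
# historical data for whatever you actually fear.
population = 330_000_000            # assumed population at risk
vivid_cause_deaths = 5              # rare but heavily reported cause
mundane_cause_deaths = 600          # common but rarely reported cause

for label, deaths in [("Vivid cause  ", vivid_cause_deaths),
                      ("Mundane cause", mundane_cause_deaths)]:
    p = annual_odds(deaths, population)
    print(f"{label}: about 1 in {1 / p:,.0f} per year")
```

The point isn’t precision; even crude base rates from prior events usually tell a very different story than the headlines do.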

When faced with uncertainty or random events, start asking questions. Do I have enough relevant experience to make a sound judgement? Are my sources of information diverse and objective? Have I methodically captured and assessed both what I know and don’t know about the issue? Have I satisfactorily audited and prioritized the risks? Are my assumptions realistic? Can I build in options and contingencies to minimize the potential damage? The objective is to fill the obvious knowledge gaps.

We confront our volatile world with a prehistoric brain wired primarily for fight or flight. Step back and consider the bigger picture. Outlier events may be more memorable, but they’re just that … atypical. Be mindful of how the brain works. It creates shortcuts to save the energy required to parse a ton of unknowns, so it makes rapid judgements – some good and some bad. Overcome that urge to judge quickly and discover how to live more fully and freely. Slow down the process … stop and think!