What makes even the brightest sometimes squander their brilliance in breathtaking acts of stupidity? Although it’s looking in the rear-view mirror, the 2008 market meltdown is instructive. Some of the most revered business leaders illustrate the point. Jamie Dimon, then CEO of JPMorgan Chase, one of America’s largest and most respected banks, said: “What the hell were we thinking? These things were way too complicated!” Alan Greenspan, the architect of America’s meticulously crafted economy at the time, said: “I am in shocked disbelief at our economy’s collapse.” And Ken Griffin, CEO of Citadel, a 15-billion-dollar hedge fund, noted: “What happened was not in my range of realistic scenarios.”
How could so many smart leaders get it so wrong? These are not stupid people. Despite their credentials, they simply didn’t fathom the nature of risk. The paradox of the meltdown was that the strategy of avoiding excessive risk by dispersing it exacerbated the very problem it was designed to prevent. The mental traps that bedevil smart people include the hubris that accompanies their expertise. Therein lies a lesson for us all. No one, not even a bona fide expert, can predict extreme events. Nassim Nicholas Taleb calls them black swans. Past events bear no relation to future shocks. Hindsight is not foresight. In their overconfidence and exuberance, these leaders failed to buy the insurance needed to hedge their bets.
Bright people are predisposed to rely on sophisticated analytical models to the exclusion of common sense. The models used in 2008 were fatally undermined by poor assumptions: they neither perceived real estate to be in a bubble nor grasped the interconnectedness of modern finance. They didn’t account for the fact that borrowers were financing their payments with new loans, so borrower debt was growing, not shrinking. The expert calculations missed that basic truth. Extreme analysis and the addiction to certitude lead inevitably to unnecessary complexity and, eventually, delusion.
The research on failed business decisions indicates that, while those at the top do know their world is changing, they choose not to respond in timely or appropriate ways, and sometimes not at all. This is more than risk aversion. When faced with severe and daunting challenges, executives often don’t think through the new realities. In his analysis of why business leaders get it wrong, Sydney Finkelstein found that catastrophic failures are caused by four destructive patterns of thinking: flawed mindsets that reject perceptions of reality, delusional attitudes that keep this distortion in place, breakdowns in the communication systems designed to handle emergent and urgent information, and leadership qualities that prevent executives from correcting their course.
Why is this so? Madeleine Van Hecke has some answers. She argues that our “blind spots” prevent us from understanding the behaviours required in a crisis – traits that deviate from the normal expectations of what leaders do. Unfortunately, in complex, exponentially changing, discontinuous business environments, crises are becoming the norm. If leaders don’t know what these mental barriers are, how can they compensate for them and adjust accordingly? How can they use crises as opportunities for needed change?
Van Hecke claims Nobel laureates don’t achieve the esteem they richly deserve merely because their work involves a high level of abstract thinking, but because they’re able to recognize and overcome the biases others in their field cannot. They see the possibilities others either ignore or reject as impossible. They grasp a perspective no one else considers. In other words, they choose a different course of action than their colleagues – by stopping to think, for a moment, about what’s actually happening that others take for granted, don’t comprehend or quickly dismiss.
What are the mental distortions and traps that bedevil smart people, especially those in positions of power? The obvious one is failing to pause and consider what is unfolding before their eyes – what the signals are telling their gut. Feelings are data too. It’s remarkable how little people actually think about drastically altered circumstances as they’re happening. The brain prefers normalcy. We especially don’t think about emerging and dire events when we’re distressed, fatigued, overloaded with information to process or lulled by the comfort of existing routines.
Intelligence has little to do with the ability to think differently. Assuming we “know” the answer because of our expertise or experience is nonsensical. The problem is the affliction of equating brilliance with perfection. Being smart is not about making brilliant decisions; it’s about avoiding terrible ones. Leaders need to embrace their ignorance as an opportunity and acknowledge the fundamental reality that no one can know everything. This is where objective and trusted advisors, with the capability to speak truth to power, often come in handy.
During times of radical change, such as the pandemic we are currently living through, there are additional blind spots of which leaders need to be mindful. They often fail because they see themselves or their organizations as successful and dominant, and therefore immune to the risks of uncertainty. They may think they have all the answers, born of internal experts with seemingly impeccable credentials, of their prior experiences or of proven decisiveness. They may be consoled by the fact that intimidating and difficult obstacles in the past proved to be only temporary impediments that were easily dealt with.
Our “default setting” is to view the world from our unique vantage point even when circumstances suggest otherwise. When we want to believe that one thing causes another, we can easily find a way to do so. We accept simplistic explanations rather than examining multiple causes. The mind has a difficult time accepting flukes, coincidences and randomness. The inability to take a systems perspective often prevents us from taking the aerial view and seeing the forest as well as the trees.
Our brains filter out the familiar, screening habitual sensory stimulation from awareness; being smart means getting beyond that filter. When we get “used to” something, we no longer see, hear or smell it. But, like carbon monoxide, it can still kill us. This is why smart people can fail to notice the critical exceptions and extraordinary data – the signals. These ultimately prove to be either lost opportunities or the cause of serious trouble. A lack of self-knowledge, or a failure to see ourselves as we really are, is the driver of incompetence. Wisdom comes when we allow our vulnerabilities to transform us rather than merely inform us.