
Got Lazy Brain? 3 Tricks for Overcoming Cognitive Bias

Breaking news alert: our highly evolved brains continue to let us down. As much as we would like to pretend that we are completely rational, logical, and thoughtful beings – especially at work – most of us struggle with bias of the cognitive kind. Cognitive biases are systematic errors in our thinking that arise because our minds use heuristics, or mental shortcuts, instead of spending the energy required to fully process the concept at hand.

Not surprisingly, these biases are hidden and powerful. Indeed, they are so powerful because they are hidden in plain sight. Because we are largely blind to cognitive biases, we are unaware of how they impact our decision-making, inform our approach to persuading others, and even dictate our perception of past and future events. But as leaders in government and the private sector, we cannot afford to make suboptimal decisions, ineffective resource allocations, or faulty arguments during critical times. A lazy brain just won’t do.

The good news is that these biases can be overcome. The first step is to be aware of what they are and how they impact our thinking. As the old saying goes, “knowledge is power.” Once we know how our brain tricks us in predictable ways, we know what to look for. Then, we can intervene more thoughtfully to correct our thoughts and behaviors by thinking harder and more deeply.

Many of us are familiar with, or have at least heard of, the more common biases. Three common ones are confirmation bias (we seek to reaffirm what we already believe to be true; see rise of fake news), the sunk cost fallacy (we needlessly throw good money after bad because we are already “invested”; see US involvement in Afghanistan), and hindsight bias (everything seems obvious after the fact; see election of Donald Trump). However, there is a set of deeper, more complex biases that also creep into our place of work. They are more subtle and harder to discern. I will share three that I frequently observe in my line of work.

#1 – The Dunning-Kruger Effect

“An expert is an ordinary fellow from another town.” – Mark Twain

Have you ever been in a meeting where a colleague makes a statement regarding a complex topic with absolute certainty and utmost confidence? Of course you have. And how often has this unwavering opinion swayed the collective into siding with your overly confident colleague? Many times. Unfortunately, odds are that your colleague is wrong. What is more, they know far less about the topic than they think they do. The world today is filled with self-proclaimed “experts” who never doubt themselves. They argue louder and more forcefully than the rest of us. Of course, the current political climate and the presence of the internet don’t help matters.

However, the Dunning-Kruger effect (a strange manifestation of illusory superiority) suggests that true experts actually understand the limitations of their knowledge, which can make them seem less like experts. Knowing the boundaries of one’s understanding is the definition of expertise. Thus, experts tend to engage with more intellectual honesty and greater nuance when making recommendations and/or drawing conclusions. In some ways, the more you know about a topic, the less confident you sound about it, precisely because you appreciate how much there is to know.

So, the next time you are debating a topic during a meeting and feel very confident in your viewpoint, take a breath. Remember that if you know a little about a topic – say, the underlying causes of employee attrition – you are in a dangerous spot. Your brain makes it awfully easy to view combating attrition as far simpler than it actually is. Ask yourself: do I really know enough to be that confident? If not, be mindful of how you frame your opinion. Similarly, be watchful of colleagues who brazenly proclaim universal truths despite the underlying complexity.

#2 – The Gambler’s Fallacy

“Every gambler knows that to lose is what you’re really there for.” – U2

The human brain struggles to fully comprehend probabilities in the context of repeated events. Quick – if a coin has been flipped 10 times and came up heads all 10 times, what will the 11th flip be? Did you say tails? The odds are still 50-50; the coin has no memory. If a couple has three boys in a row, are they more likely to have a son or a daughter next? Neither. That’s the gambler’s fallacy at work.
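If that still feels counterintuitive, a quick simulation makes the independence concrete. Below is a minimal Python sketch (purely illustrative – the trial count is an arbitrary assumption) that generates a million 11-flip sequences, keeps only those that open with 10 straight heads, and checks what happens on flip 11:

```python
import random

random.seed(42)  # reproducible illustration

TRIALS = 1_000_000
streaks = 0        # sequences that open with 10 straight heads
heads_on_11th = 0  # ...and then land heads on flip 11

for _ in range(TRIALS):
    flips = [random.random() < 0.5 for _ in range(11)]
    if all(flips[:10]):             # first 10 flips were all heads
        streaks += 1
        heads_on_11th += flips[10]  # True counts as 1

print(f"sequences starting with 10 heads: {streaks}")
print(f"P(heads on flip 11 | 10 heads):   {heads_on_11th / streaks:.3f}")
# Prints a value near 0.500 – the prior streak tells us nothing.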

Like a coin, there are two sides to this fallacy. The first variant is often called the “hot-hand fallacy,” after the notion that it is best to pass the basketball to a shooter who has made the last several shots. We tend to believe that a hot streak is likely to continue. In our mind, having “won” several bets in a row, the probability of winning again is higher than losing – even if the real probability is 50-50. Think of the stock market. If the market has been up for five days in a row, we tend to believe that it will undoubtedly be up again tomorrow (ironically, the hot-hand fallacy itself is part of the reason the market keeps rising).
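Streaks themselves are also far more common than intuition suggests. Here is a hypothetical sketch (a fair 50-50 “market” over a 250-day trading year – the numbers are assumptions for illustration, not market data) showing how often pure chance produces a five-day rally:

```python
import random

random.seed(7)  # reproducible illustration

def has_five_day_rally(days: int = 250) -> bool:
    """True if a 50-50 'market' goes up five days in a row at least once."""
    run = 0
    for _ in range(days):
        run = run + 1 if random.random() < 0.5 else 0
        if run >= 5:
            return True
    return False

YEARS = 10_000
rally_years = sum(has_five_day_rally() for _ in range(YEARS))
print(f"{rally_years / YEARS:.0%} of simulated years contain a 5-day rally")
# Typically around 98% – five up days in a row is simply what chance looks like.
```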

The second variant of this fallacy occurs when we are repeatedly losing. It is human nature to believe that our “bad” luck will change, and we will soon go on a winning streak – please, just one more bet. One need only walk around a casino to observe this sad reality. Said differently, we believe that repeatedly getting a (negative) outcome reduces the probability of that outcome in the future. In our minds, we feel “we are due.” After all, the “arc of the moral universe bends towards justice.” It does, but Dr. Martin Luther King, Jr. also noted that the arc is “long” – long enough to play out over many, many blackjack hands.

The danger of this flawed mindset is that it insulates us from honestly evaluating our behavior in the moment. The gambler’s fallacy impedes teams from making the changes required to affect outcomes. For example, if your team submits five proposals for new government contracts and wins all of them, do you change your approach? No way – you are actually far less likely to change your bid approach. If it ain’t broke… The issue is that once we get on a “hot streak,” it introduces other biases such as overconfidence, the Lake Wobegon Effect, and the illusion of control. Given our natural struggles with repeated probabilistic events, we must train our brains to evaluate each action in isolation instead of linking it to past events.

#3 – The Survivorship Bias

“Victory has a hundred fathers but defeat is an orphan.” – Count Galeazzo Ciano

Far too often, we make decisions and modify our behaviors based on highly visible, anecdotal evidence of success. For example, it would be foolish for a high school graduate to believe that the path to material riches is to drop out of college since Steve Jobs, Bill Gates, and Mark Zuckerberg all did it. We tend to overlook the millions of folks who also dropped out but didn’t turn out to be billionaires. Similarly, how often do you admire an old gadget or appliance and marvel, “They just don’t make them like they used to,” totally forgetting that so many other older purchases have long since gone in the trash? That’s survivorship bias.

At work, we may ask ourselves “What did Google do when it faced this situation?” or “How did Amazon handle this challenge?” because we are acutely aware of the wild successes these companies have had. However, this approach of mirroring well-known examples can lead to false, often dangerous conclusions. By overemphasizing winners, we fail to consider that countless companies may have made the same decisions Google and Amazon made, only to have gone out of business because of those very same choices. The problem is that we just don’t know about the losers. We are only aware of the survivors, the winners. But these forgotten losers also matter when a leader is deciding what to do.

Offsetting this bias requires doing a bit more homework and casting a broader data-collection net. Walk through the graveyard of failures, not just the winner’s circle. Instead of defaulting your decision-making to reflect survivors, consider a broader data set that includes non-survivors as well. This will allow you to better understand the underlying context of the decision and not just the gold-plated outcome.
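A small simulation shows why the non-survivors matter. The sketch below uses made-up numbers (10,000 firms adopting a strategy that pays 5x for 10% of them and wipes out the rest – pure assumptions, not real data) to contrast what the survivors suggest with what the full population reveals:

```python
import random

random.seed(1)  # reproducible illustration

FIRMS = 10_000
# Hypothetical risky strategy: 10% of adopters return 5x; 90% go under.
outcomes = [5.0 if random.random() < 0.10 else 0.0 for _ in range(FIRMS)]

survivors = [o for o in outcomes if o > 0]
print(f"survivors' average return: {sum(survivors) / len(survivors):.1f}x")
print(f"all adopters' average:     {sum(outcomes) / len(outcomes):.2f}x")
# Survivors alone suggest a 5.0x payoff; the full population averages ~0.5x,
# i.e. the strategy destroys value. Studying only the winners hides the 90%
# who made the same bet and lost.
```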

Deep, rational thinking is hard, time-intensive, and costly. No one wants to do it. Indeed, our brains have evolved to short-cut hard thinking with heuristics. Most of the time these latent heuristics serve us quite well. We don’t need to think deeply about what to wear or what to eat.

However, we operate in an increasingly complex world, with more and more information flooding our minds. As such, it is incumbent upon leaders to determine which decisions and behaviors need to transcend our more basic thinking. Not all of them – just the critical few. Awareness of when and how cognitive biases impact us is paramount to staving off our ever-present lazy brain.

Wagish Bhartiya is a GovLoop Featured Contributor. He is a Senior Director at REI Systems where he leads the company’s Software-as-a-Service Business Unit. He created and is responsible for leading a team of more than 100 staff focused on applying software technologies to improve how government operates. Wagish leads a broad-based team that includes product development, R&D, project delivery, and customer success across State, local, Federal, and international government customers. Wagish is a regular contributor to a number of government-centric publications and has been on numerous government IT-related television programs including The Bridge which airs on WJLA-Channel 7. You can read his posts here.
