Sequels crush it at the box office. Not because they are novel but because they are familiar. They require less ingenuity and risk-taking from the studios. They enter the market with an established fan base. That’s why there are so many of them. Low risk, consistently high reward. It’s a tried and true formula for Hollywood (and for writers).
In honor of Avengers 4 (I think), Frozen 2, John Wick 3, and of course Star Wars 9, I am writing a follow-up to my earlier post on cognitive biases in the workplace. After all, there are so many biases that affect our day-to-day decision-making that I had to bring back the main characters. Just one post isn’t enough. These biases are so pervasive that they often lead to inferior outcomes across government and industry without us even knowing it.
My prior post covered three deep cognitive biases that affect all of us – illusory superiority, the gambler’s fallacy, and survivorship bias. The brain, in its attempt to conserve energy and do the least amount possible as fast as possible, uses these heuristics or biases to (often erroneously) guide decision-making.
In an attempt to help us overcome our lazy brains, this post will cover three more foundational biases – the fundamental attribution error, illusory correlation, and the availability heuristic. In terms of sequencing, this post covers more common, even basic, errors of the mind. Perhaps this post is actually a prequel to my prior one…
#1 – The Fundamental Attribution Error
“Don’t blame the holidays. You were already overweight in August.”
Picture this: you are in the checkout line at the grocery store minding your own business. Suddenly, the individual ahead of you starts screaming at the cashier about … the price of socks? Think Black Friday freak out style screaming.
What is your immediate reaction? If you are like most of us, you immediately judge the person ahead of you (whom you do not know at all). He or she is rude, obnoxious, and obviously a jerk. And if you live in the Washington, D.C. metro area, you may also believe that he or she votes for the opposing political party.
Now picture this: you are about to head home after a long, tiring day at work. Your significant other calls and asks you to pick up dinner for the family. OK, no big deal. On the way out the door, your boss “asks” you to work the upcoming weekend because of a looming deadline. Working the weekend will cause you to miss the concert you were really looking forward to. So be it.
The drive home takes longer than usual because an accident has caused a traffic jam. And when you show up at the restaurant (late), you find out that there was a mix-up with your order. Your items aren’t ready. Your stomach is growling. Now you must wait for them to be made… again. Your reaction – a little screaming.
Are you rude, obnoxious, and obviously a jerk? Of course not. You have had a very long day and are tired. You are going to miss your favorite band in concert. There are so many terrible drivers out there. Who mixes up a food order? After all, you are fundamentally a patient, reasonable, empathetic person, right?
“We judge others by their actions and ourselves by our intentions.” – Stephen Covey
This example highlights the fundamental attribution error at work. At its core, this error leads us to overemphasize personal characteristics when judging others’ behavior (generally negatively) and underestimate our own personal flaws.
When it comes to our own behavior, we blame the situation and circumstances for what went wrong. As humans, we paint others in a negative light far more often. And we rarely critique ourselves for the very same behavior.
At work, the fundamental attribution error manifests itself most often in how we assess others – our teammates, our bosses, and our peers. If teammate Tom incorrectly puts together an Excel model, we attribute it to his personality (i.e., Tom is lazy or not detail-oriented) or skill level (i.e., Tom lacks the requisite Excel understanding or is not very analytical).
However, what if we made the very same mistake? We would most likely attribute it to the situation (i.e., I had an overly aggressive timeline and had to rush, I didn’t sleep well the night before, no one actually explained the assignment to me, etc.).
The fundamental attribution error is a basic but pernicious bias. Over time it can poison any team or organization. Managers and leaders must be especially mindful of it. The escalation from a simple judgment of a team member (Jill is late to two meetings) to constantly frustrating staff by how you interact with them (you never assign Jill hard assignments) to creating a disengaged team (Jill looks for a new job) can be rapid and hard to stop. It is a doom loop.
The best way to nip the fundamental attribution error in the bud is to pause before judging others. Is there something else at play here that is circumstantial? Talk it out first. Give others the benefit of the doubt. Use many data points over time, not just a few. Conversely, don’t always let yourself off the hook when something goes wrong. Maybe there is something fundamental about yourself that you should actually work on.
#2 – Illusory Correlation
“Cars with flames painted on them get more speeding tickets. But are the flames making the car go faster?” – Barbara Kingsolver
The human brain naturally tends to see patterns where none exist. Think about all the cloud animals in the sky or religious figures found in our food. Our minds seek order and structure. We don’t do well with randomness and chaos. Not surprisingly then, we also tend to perceive relationships between two unrelated variables (especially regarding people) even when no such relationship exists.
Stereotypes and superstitions are prime examples of illusory correlation. We all have a stereotypical view of how greedy and slimy a used-car salesman is supposed to be. Does that change how we act the next time we go to the dealership? Growing up, many of us were warned to stay indoors during a full moon (see the origins of the word lunatic if you don’t believe me). You make the game-winning goal wearing a pair of red socks you had never worn before. Guess what? You never play a game without them again.
“Superstition and truth cannot go together.” – Mahatma Gandhi
Correlation does not equal causation, but it is very easy to let your lazy brain trick you into thinking otherwise. Imagine you are a sales manager for a widget company. Checking on the health of your territories, you pull the new sales pipeline report for the Eastern Region. Alas, the Eastern Region is under-performing. You may jump to the conclusion that the territory isn’t viable, or that the manager needs to be replaced. Instead, your first step should be to assess the skill level of the sales reps. What if you swapped in stronger reps from higher-performing territories? How would these lower-performing reps do in currently over-performing regions?
So how do we combat conflating correlation with causation? The simple answer is to stay mindful of how easy the two are to confuse. It is for this reason that the value of running “experiments” in the workplace cannot be overstated. Experiments ground our assessments in data rather than in perceived patterns. In addition to experiments, consider bringing in outside participants to pressure-test decisions and choices. They won’t have the “benefit” of past discussions and knowledge.
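To see just how easily randomness can masquerade as a pattern, here is a small illustrative sketch (my own, not from any study). It generates two completely unrelated variables – call them “games played in red socks” and “goals scored,” both pure noise – and measures their correlation. In a small sample, a seemingly meaningful correlation often appears; with more data, it shrinks toward zero.

```python
import random

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

random.seed(42)

# Two unrelated variables: both are pure noise, so any correlation
# we "find" between them is an illusion.
socks = [random.gauss(0, 1) for _ in range(30)]
goals = [random.gauss(0, 1) for _ in range(30)]
r_small = pearson_r(socks, goals)   # small samples often show a "pattern"
print(f"r over 30 games:   {r_small:+.3f}")

# With far more data, the illusory relationship shrinks toward zero.
socks_big = [random.gauss(0, 1) for _ in range(5000)]
goals_big = [random.gauss(0, 1) for _ in range(5000)]
r_big = pearson_r(socks_big, goals_big)
print(f"r over 5000 games: {r_big:+.3f}")
```

This is the cheapest “experiment” there is: before trusting a pattern, ask how strong the same pattern looks in shuffled or larger data.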
#3 – Availability Heuristic
“I have done the calculation and your chances of winning the lottery are identical whether you play or not.” – Fran Lebowitz
Memory is a funny yet highly imperfect thing. Unfortunately, when it comes to making decisions, human beings are overly influenced by what we remember. And what we remember is overly influenced by what we can most easily recall. The human brain is constantly seeking to conserve energy.
One way to do that is to overstate the importance and salience of easy-to-recall memories. Basically – if I can remember something easily, it must be important and accurate. But recalling something quickly and forecasting its likelihood in the future are very different things.
A commonly cited example of the availability heuristic is word composition. Ask 10 people whether there are more words that start with the letter “k” or more words with “k” as their third letter and you will get near unanimity: there are more words that start with “k”.
But that’s wrong. Why do we believe it? Because it is much easier to think of words that start with “k” – kale, kitchen, kick, kangaroo, kiss – than it is to think of words like ask, bikini, bake, ink, awkward, and ankle.
Similarly, vivid, emotionally tinged memories are easier to recall than more mundane ones. For example, most people (quite incorrectly) believe that the likelihood of dying by shark attack or terrorist attack is far greater than being killed by fireworks or heart disease. This is where media and the constant barrage of images on the news damage our ability to parse real statistics from heuristics.
The number of tourists at beaches drops when news of a shark attack is on TV. Conversely, lottery companies love to show highlights of winners holding oversized checks to drive more ticket sales. Purchases of insurance increase after (not before) natural disasters even though the likelihood of the next hurricane is no different. And many travelers feel that flying is riskier right after a major accident than during a long safety streak, even though the underlying risk has barely changed.
To mitigate the impact our memory has on decision-making, we must ask ourselves for data. Our gut instinct – weakened by our faulty memory – is wrong far too often. The worst part is that we don’t actually realize why we are deciding never to go to the beach again. There is a concept in probability known as the base rate: the underlying likelihood of an event occurring, independent of how vivid or recent our memories of it are. For example, the likelihood of being killed by a shark is 1 in 3,748,067. Simply put, before making a decision based on what you believe the likelihood of an outcome to be, consider using the base rate as a balancing data point.
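The base-rate check above can be sketched in a few lines of code. The shark figure comes from this post; the fireworks and heart-disease figures below are commonly cited U.S. lifetime-odds estimates that I am using purely for illustration – treat them as assumptions, not authoritative statistics.

```python
# Lifetime odds of dying from each cause, expressed as "1 in N".
# Shark figure is from the post; the others are illustrative
# placeholder estimates, not authoritative data.
ODDS = {
    "shark attack": 3_748_067,
    "fireworks": 340_000,
    "heart disease": 6,
}

def probability(one_in_n):
    """Convert '1 in N' odds into a probability."""
    return 1.0 / one_in_n

# Rank the risks by their base rate, most likely first.
ranked = sorted(ODDS, key=lambda cause: probability(ODDS[cause]), reverse=True)
for cause in ranked:
    print(f"{cause:>15}: 1 in {ODDS[cause]:,}")

# The mundane risk dwarfs the vivid one, despite what memory suggests.
ratio = probability(ODDS["heart disease"]) / probability(ODDS["shark attack"])
print(f"\nUnder these figures, heart disease is ~{ratio:,.0f}x more likely.")
```

The point is not the exact numbers but the habit: write the base rates down side by side before letting a vivid memory cast the deciding vote.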
To quote my prior post – “Deep, rational thinking is hard, time-intensive, and costly.” Our brains have evolved slowly over many millennia. We no longer live a life of scarcity where conserving our brain’s energy is paramount. Unfortunately, our biology simply cannot keep pace with the tremendous technological and sociological changes of the world we live in today.
Government and industry leaders must be cognizant of the prevalence of decision-making heuristics and the potential downside of relying on them. Cognitive biases are invisible without the right pair of glasses. And awareness is the only lens we can use to see them. Greater awareness gives rise to better decisions. It’s time to stop producing lazy brain sequels.
Wagish Bhartiya is a GovLoop Featured Contributor. He is a Senior Director at REI Systems where he leads the company’s Software-as-a-Service Business Unit. He created and is responsible for leading a team of more than 100 staff focused on applying software technologies to improve how government operates. Wagish leads a broad-based team that includes product development, R&D, project delivery, and customer success across State, local, Federal, and international government customers. Wagish is a regular contributor to a number of government-centric publications and has been on numerous government IT-related television programs including The Bridge which airs on WJLA-Channel 7. You can read his posts here.