Why We Make Bad Decisions in the Workplace

Buster Benson was filling his time on paternity leave when he came across Wikipedia’s exhaustive list of cognitive biases. He synthesized the 175 biases into broad categories and, with the help of John Manoogian, organized these predispositions into a poster they called the cognitive bias cheat sheet. It has been viewed over 700,000 times, which has motivated Benson to start writing a book about the four qualities of the brain that can make us do crazy things.

Too Much Information
It is called priming: our brain interprets new experiences based on previous encounters. Through this lens, things that do not conform to our reality strike us as bizarre, which is why we have trouble embracing change. We are drawn toward thinking that matches our preconceived notions of the world. Once on this slippery slope, we tend to see the speck of dirt in our neighbor’s eye and overlook the log in our own. We write our personal stories based on these patterns, which keep us from seeing other people and things as they really are.

Not Enough Meaning
We fill in the gaps in our understanding of the world with limited data drawn from biases, beliefs, and stereotypes. We develop in-group and out-group bias: we gravitate toward our in-group and move away from our out-group. Our brain thin-slices data by taking shortcuts, framing information in ways that are easy to understand. Projection bias enters the picture as we project our beliefs onto other people, assuming we know what they are thinking. We take the most recent events, supported by limited information, and project them onto our past and future behavior.

Not Enough Time or Resources
We gravitate toward the quick fix and ignore the hard work of taking another look at our preconceived notions. To preserve our king-of-the-hill status and in-group strength, we stubbornly hold on to biases that prevent us from revisiting our decisions. We justify investments in judgments based on prior outlays, even in the face of evidence that our original decisions were wrong, a pattern known as the sunk cost fallacy. In other words, it is too late to turn back now. We fall victim to temporal discounting: when presented with multiple options, we go with the immediate one. We pull the trigger too quickly because we want to preserve our sense of making a difference and do what feels good in the moment.

Not Enough Memory
We categorize what happens in our lives based on how we experience it. We organize data about those experiences into the smallest possible categories, discarding information that would let us see events in a broader light. To make up for this deficit of knowledge, we develop generalities about what we are going through. And to cover our behinds even more, we continue to edit our interpretation of life’s occurrences long after the original experience.

To summarize, our brain processes too much information, gives us limited understanding of that information, leads us to think we do not have enough time or wherewithal to interpret it, and has limited capacity to store it accurately.

Now you know one aspect of why we make poor decisions in the workplace. We often don’t see people and things as they are; we see them as we are. Until we clean the dirty windows of our own lives, we will never see others in ways that allow them to reach their full potential.

One Comment

Mark Hammer

One of the more pivotal articles I ever read was in, of all places, an issue of BYTE magazine in the early ’80s, by Nolan Bushnell, then-president of Atari. The article was on the psychology of video game design. Keep in mind we were barely past Pong at that point, and Frogger, Asteroids, and Space Invaders were considered “au courant”.

Bushnell’s central thesis was that, in a battle of wits, the human would beat the computer every single time, hands down (this was well before Ken Jennings vs. Watson). So the basis of video game design revolved around giving the computer enough of an advantage that the game posed some challenge for the human. I forget all the details, but much of what Bushnell outlined actually mirrors what you have in the piece above, Richard.

For example, the computer gains an edge over the player by restricting the information available to the player. So you don’t know that object X will be useful in a later level of the game until you arrive at that level; now you have to restart and acquire that object in an earlier level. Or perhaps you don’t know that some challenge lurks around a corner, or that some other change in the system is about to occur. Not knowing leads to poor planning and bad decisions that force you to restart the game before you can proceed.

The computer gains an edge by restricting the resources available to the player. That is reflected in things like how many “lives” one has, how many bombs or shots one has left, how many useful objects remain, how many dollars one has to spend, and so on. Limited resources oblige the player to be very deliberate and careful in their actions, and the pressure often leads to mistakes.

The computer gains an edge by asking the player to do too much in too little time. Space Invaders was a perfect example of that in its time: as the game progressed, events would happen more and more quickly, and the “invaders” would inch ever closer and move faster, so that one had less time to respond and more events to respond to.

What has remained a fundamental aspect of making computer-generated amusements harder for humans also forms the basis for why work-related tasks can be hard for people. Human minds tend to work the same way, whether we are seated at a desk beside coworkers or slouched on the sofa with a controller.