Is “Trusting Your Gut” a Viable Decision-Making Method?
June 16, 2011 at 1:34 pm #133034
I don’t know about you, but the book “Blink: The Power of Thinking Without Thinking” by Malcolm Gladwell really left a lasting impression on me. If you haven’t read it, I would encourage you to check it out. Here’s the premise:
“We live in a world that assumes that the quality of a decision is directly related to the time and effort that went into making it… We believe that we are always better off gathering as much information as possible and spending as much time as possible in deliberation. We really only trust conscious decision making. But there are moments, particularly in times of stress, when haste does not make waste, when our snap judgments and first impressions can offer a much better means of making sense of the world. The first task of Blink is to convince you of a simple fact: decisions made very quickly can be every bit as good as decisions made cautiously and deliberately.”
I bolded those last two sentences because it smacked me in the forehead when I first read it and I continue to think about it as I am faced with a deluge of data in making decisions every day. Usually, those decisions need to be made quickly with very little time given to analysis. “Execute. Iterate.” seems to be the guiding mantra of our time.
What ever happened to “Deliberate” and “Evaluate” as the two bookends of that approach?
According to Gladwell, too much analysis — either in the planning or in the post-mortem phases — is overrated or needs to be minimized.
But that way of thinking runs counter to the aim of this group and the typical approaches to problem-solving in the public sector, which is prone to risk mitigation and avoidance in order to protect public resources.
So what do you think?
1 – Should public servants “trust their guts” more in the decision-making process in order to speed time to execution?
2 – Or should you continue to carefully analyze and take more deliberate (even if painstakingly slow) steps to achieve desired goals and outcomes?
June 16, 2011 at 1:48 pm #133090
Is it a viable method for decision making? Yes. Is it well-suited for the public sector? Probably not. Reason being that government leaders have to be prepared to defend themselves, if necessary at hearings. Having a thoroughly reasoned explanation for how you came to a particular decision is important.
In the private sector, a manager may be able to get away with going with his gut. If he’s right, the company succeeds and nobody is going to question him. If he’s wrong, it’s his job that’s on the line. How he arrived at the answer he tried is not as important. The private sector is results-oriented. In government, it’s more about the processes.
June 16, 2011 at 1:50 pm #133088
I would like to add incentive-based behavior to Andrew’s discussion line. It seems that too often in government, our gut instincts are tied to satisfying the incentives and mores of the organization, rather than what’s needed for a “right” decision. The worst case I’ve seen is where a group of agencies formed a team to decide how to integrate systems, data, and processes for faster and better resource allocation relative to national security threats. Each member of the team thought that the least risky approach was to go slow, so that they could methodically tailor the project to changes at the speed their agencies could absorb. When they pulled together their analysis and data, it turned out that approach was six times riskier. Ironically, the greatest source of risk was timeliness of decision-making, as the slower approach required three times the number of management decisions…
June 16, 2011 at 2:00 pm #133086
Oh, the irony! How do you incentivize speed?
June 16, 2011 at 2:02 pm #133084
…but does it have to be this way? Why can’t government be more results-oriented, striving to speed up the processes themselves? My sense is that we could take a careful look at every step of those processes and see where there are bottlenecks or barriers, then work toward removing those through alternative resource allocation.
June 16, 2011 at 2:11 pm #133082
That, my friend, is the question for all government reforms targeted at faster, better, more responsive government. Fast decisions generally mean too much personal risk for the government decision-maker. In other words, lots of people can say NO, but who actually gets to make the YES decision?
June 16, 2011 at 2:13 pm #133080
… The bottlenecks are inherent to democracy. We have an adversarial political process, in which government decisions are questioned in a forum that does not trust the intentions of government leaders. This is quite a shame, as it turns out our ‘gut’ is always involved in decision making. The question of “reason or emotion” is a false choice – your emotions are engaged by definition. It’s how brains work.
There is a reason deliberate planning feels unnatural: it is.
June 16, 2011 at 2:21 pm #133078
Remember, with these sorts of things it’s almost never an either/or. That’s certainly true in this case. Sometimes intuition is valuable, sometimes it’s dangerous.
Gladwell takes his argument too far. There are times when our decision-making process can be highly automated. Typically this is in domains where we are highly knowledgeable and highly experienced. Our brain is able to use that knowledge and experience without our awareness (so to speak), and so we get what we call intuition or instinct.
We are pattern matchers, not information processors. So when our brain spots patterns that it can recognize, it acts accordingly. It all looks a bit like magic, but we’ve understood these things since the early days of Herb Simon and Co doing research with chess players (back in the 60s/70s).
If you want the antithesis of Gladwell’s book, read Madeleine Van Hecke’s book “Blind Spots: Why Smart People Do Dumb Things”. I’ve only skimmed it, but I’ve spoken with Madeleine and I think she’s right on with many of her premises. And she actually contrasts her theories with Gladwell’s, so it’s pertinent to this discussion.
This is a very useful question for our time though, Andrew, so thanks for posting it.
June 16, 2011 at 2:29 pm #133076
As with many things, it depends on the situation. A public safety director arriving at the scene of a major disaster probably needs to go with the gut and respond rapidly, trusting that years of experience in similar situations will lead to more correct than erroneous decisions. When the exact same director prepares to preposition equipment, supplies and personnel to best respond to future disasters, a more analytical and deliberative approach is likely to produce better results.
June 16, 2011 at 2:45 pm #133074
There’s a time and place for everything. If I have the time to do an analysis on an important decision, I’ll do it. However, we don’t always have the luxury of time and have to make snap decisions.
That’s why intuition is important.
I know the common perception is that intuition is something that we’re born with, that comes naturally, and that can’t be learned or trained. I’m not sure that’s 100% true. I think that as we learn and pick up patterns for certain problems, our intuition improves. I think the best example is the military.
The Army and Marines train, train, train, train and then train some more. When they get deployed, things do not always go according to plan. When @*&^ hits the fan, they don’t have the luxury to plan out and analyze a response – they have to respond intuitively, and they do a pretty damn good job at it.
When you ask them “how do you handle something that you didn’t plan for,” the answer is almost always “the training just kicks in.”
So, I don’t think this is an either/or thing. When I’m dealing with problems in a realm that I’m familiar with, I can come up with answers much more intuitively. (As I’m sure most of us could.) Put me in a car in a city that I’m not familiar with and my gut comes up with nothing – and is often wrong (much to the chagrin of my girlfriend).
I would argue that if we train on our decision making process – we would make better snap decisions.
June 16, 2011 at 3:16 pm #133072
Fascinating line here: “We are pattern matchers, not information processors. So when our brain spots patterns that it can recognize, it acts accordingly.”
This notion was captured in another book that has been highly influential in my thinking about decision-making – “Unlimited Power” by Tony Robbins. Based on my memory of the book, our typical thought pattern — influenced partly by genetics, but also by our environment — follows a chain:
A (Event) => B (Cognitive / Emotional Processing) => C (Decision) => D (Response)
As a motivational speaker and performance consultant, Robbins contends that we typically can’t change A, but we can alter everything from B onward by re-programming our minds. I have used this method to successfully change undesired behavior in my life and make better decisions in the moment. In fact, Robbins takes it a step further and suggests that you can learn from people who have successfully performed a particular activity more effectively than you and mimic THEIR pattern. Or, as you say, we can become “pattern matchers.”
Organizations can do the same thing – asking: “How do we process new information and ideas? How do we use that new information to make decisions? What are the outcomes/results of those decisions and are we happy with them?”
One other quick thought: you can even figure out how to change A. Using an Ishikawa diagram (also known as a fishbone diagram), a person or an organization can trace back the source of the event by brainstorming known or potential causes. So not only is it possible to change our reactions, but we can act preemptively to prevent events that lead to bad decision-making.
June 16, 2011 at 3:17 pm #133070
So how does the manager streamline the process of gathering information, analyzing it and packaging it in order to make a faster, better decision?
June 16, 2011 at 4:16 pm #133068
There is some cognitive research that supports Gladwell’s premise – intuitions often do lead to correct decisions. There’s also lots of cognitive research that points out that intuition is fed by a variety of biases.
But that’s not the point. Public institutions need to be credible, and what makes them credible is deliberation – deliberation allows for public input. The deliberative process can be sped up, but often it should be slowed down. If the public feels things are being crammed down their throats, it doesn’t matter if the decision is 100% correct. It will fail unless the decision maker is prepared to explain very thoroughly why the decision is necessary (public safety being a good example, as noted previously). And public institutions are much more averse to Type I error than Type II (the cost of taking an inappropriate action is much higher than the cost of not taking an action one should).
I’d also like to point out that most public institutions are results oriented – the process used for determining those results is evaluation. Communicating those results increases credibility. The evaluation is often as much about the process as it is the action.
June 16, 2011 at 5:49 pm #133066
I agree that intuition is built over time – by being really aware of your decisions, how they’ve turned out in the past, and knowing the topic more and more.
I think data also builds intuition, as you see how your hunches/guesses compare against the data/results.
June 16, 2011 at 5:54 pm #133064
1) A great many daily decisions ARE of an unconscious non-deliberative nature. We also have reams of research on “implicit memory” to demonstrate that humans are frequently learning things without being aware of it, and applying what they’ve learned, also without being aware they are doing so.
2) Humans vary in the expertise they can bring to bear on a content domain. An expert cook can taste something and immediately identify how to “repair” the flavour. An expert auto mechanic can simply hear a vehicle, and know what the issue is. An expert medical diagnostician can zero in on the key pieces of information and quickly, without any overt evidence of deliberation, make a differential diagnosis that is spot on. In my “other life” as a musical effects guru (which has met the 10,000hr requirement several times over), I can often rely on my gut for how to improve or repair things, and have successfully designed commercial products from scratch, simply based on intuition about what this, that, and the other element would result in. Those of us who are NOT experts in the domain in question, however, need conscious deliberation. The question then becomes one of not so much trusting your gut about the decision you make, but rather one of trusting your gut as to whether you’re as much of an expert as you think you are. THAT one is the weakest link. As such, I am reluctant to simply encourage people to “go with their gut”, without regard to actual expertise level.
3) Before I entered government, I studied and taught in adult development and aging, and one of the more interesting areas was that of research in wisdom. “Wise” decision-making IS deliberative, and considers context, multiple stakeholder perspectives, the certainty of information, and the advisability of outcomes. SOME of what we do in government can be adequately served by gut feel, thank goodness, but many important decisions require the time to be deliberative, analytic, and adopt as broad a perspective as is possible.
4) I think it bears noting that “business” decision-making often does not bear the burden of having to consider “the institution” in the same way that public-sector decision-making often has to. So, Gladwell is obviously onto something, but it is important to be mindful of the contexts where that applies and where it doesn’t apply quite as well.
June 16, 2011 at 6:09 pm #133062
It’s interesting to see this post in a group dedicated to analytics. Had to be deliberate. 😉 lol
The challenge is on us to see value in two methods of decision making: the Blink method and one that uses varying degrees of more intense analysis. It is not, in my judgment, a discussion of one versus the other. It’s a question of when to apply which method.
As a former firefighter, EMT, paramedic and Navy Corpsman, I’ve been under fire literally and figuratively. Politics can have a similar pressure-intensifying effect on a decision maker. Analysis doesn’t work when the pressure is on. People who choose (or are forced) to make decisions in these (or similar) environments don’t have time for it. They use preparation and experience to guide them.
The quality of snap decisions in a high pressure environment is heavily dependent on experience. On the line, lack of experience can get you dead. How many times a person has seen similar circumstances, how many similar circumstances they have seen, and a learned ability to draw relationships and to make connections between circumstances play a huge role in the outcome.
When I was doing my graduate work in Knowledge Management, I proposed that people are like incubators. They take in knowledge, mix it around with context they already have, cook it for a while and come up with something entirely new. Experience works this way. Someone with experience seemingly has the ability to “just know” the right answer. Not much analysis needed. The lower the experience level, the more analysis is needed.

Martial arts is a good example. Students who are early in their training tend to break every move down. They think through everything – position of their feet, position of their fingers, tightening of certain muscles, breathing… A more advanced martial artist doesn’t think much. They act and react. Confronted with a known opponent, they move right into counterattack with relative ease. Confronted with an unknown opponent, they give and take a little up front, size up their opponent’s strengths and weaknesses (without thinking about it much) and then act. When I did more tournaments, I used to refer to an automatic ability to know what was open on my opponent as the “cross hair syndrome.” I had no idea why it happened, but I seemed to see cross hairs on my opponent exactly where I needed to strike. Instinct.

In the “ecosystems” of martial arts, fire fighting, or emergency medicine, I am relatively confident that my actions and reactions can be trusted. However, if I were to step into a yoga class or if I were asked to engineer a new plastic polymer, I would be hyper-analyzing my every move – seemingly paralyzed and agonizingly slow to those who are proficient in these areas.

Decision making itself is a learned and practiced art. Decision makers collect experience over a lifetime making decisions in many different environments. To a practiced decision maker, making decisions comes relatively easily (compared to those who don’t make decisions on a regular basis). Decisions seem to flow naturally, and in many cases they hit the mark, or close enough to the mark to be effective. They generally understand the outcomes of their decisions before they make them – to a point.

When a decision maker reaches a level of seniority where they oversee several unfamiliar ecosystems, I think we run into problems. The decision maker’s instinct is to apply the same experience-based decision-making technique to a seemingly familiar environment. But they have blind spots. They literally don’t know what they don’t know. A decision to fix one thing actually breaks two others. If they’re not set up to monitor unfamiliar areas, they don’t even know they broke something.

Information technology is a battleground for this phenomenon. Most senior decision makers are unfamiliar with its nature. It is ubiquitous and it is unforgiving. It crosses the boundaries of many traditional ecosystems. It puts hand-offs from one ecosystem to another into the spotlight, and reveals how ecosystems affect one another. The complexity of IT and the complexity of the ecosystems in which it operates is simply too much for any one human being to process or have experience with. Techies don’t get the business. Business doesn’t get the techie stuff. Decision makers struggle to command it all and produce the outcomes the organization needs. Blinkers meet Analysts. Both have a role to play.

The solution is creating a healthy contemporary decision-making environment. In the perfect environment, decision makers adjust their decision cycles to allow for analysis of appropriate content, and analysts leverage technology to ensure relevant analysis gets to the decision table in time to be consumed. Tactical and strategic decisions are tagged and scheduled as appropriate. One needs more analysis. The other needs less. The balance of it is calibrated to the decision makers and support staffs themselves.

Should public servants trust their gut? Yes.

Should public servants carefully analyze and take more deliberate steps to achieve desired goals and outcomes? Yes.

It depends on the variables.
June 16, 2011 at 7:35 pm #133060
I concur with Chris. Different decision making methodologies for different circumstances.
Problems come up when leaders move into decision making positions, then don’t understand how to balance the two methodologies.
June 16, 2011 at 8:53 pm #133058
Love what you’re saying, Scott – and I’m gonna keep pushing us here: how do we streamline the deliberation process to make equally effective but speedier decisions?
June 16, 2011 at 9:07 pm #133056
William D. Eggers
Great discussion, everyone. I’m doing a lot of work on this issue right now, but the short answer, I think, is that “trusting your gut” is how we got into some of our biggest disasters in recent years in government. In an era of Big Data and ever-improving analytic tools, we have much better options than intuition, which in the public policy arena is too often clouded by confirmation bias. The answer is to put in place mechanisms to force fast failure, make small bets, see what works in the real world with real users, give employees better situational awareness through mobile social media and analytic tools, etc. Here is an article I wrote on the topic a while ago: http://www.governing.com/columns/mgmt-insights/Failing-the-Right-Way.html. In summary: experiment, evaluate and do it rapidly to test programs and hypotheses.
June 16, 2011 at 9:19 pm #133054
As I mentioned on Twitter, there’s some combination here of analysis and intuition. I used the tactical approach / decision to take out Bin Laden as an example. Certainly, there were weeks if not months of build-up and analysis. But when it came down to the decision to do a large-scale operation (bombs and blasts) vs. a surgical maneuver, Obama chose the more difficult one. He had plenty of data… but at some point, he just had to do a gut(-wrenching) check and take a risk => Now’s the time. Let’s do it.
Thoughts on the threshold between analysis and action?
June 17, 2011 at 4:22 am #133052
I also think back to the classic textbook on the Cuban Missile Crisis called Essence of Decision. Failure is in the eye of the beholder, and too often, I think many decisions appear irrational because a decision-maker is making a decision that is in their or their boss’s best interest, although not in the country’s best interest. As explained in Essence of Decision, major government decisions are made based on incentive structures and control structures, not because of a lack of data or “trusting your gut.” One of the most difficult challenges for a Congressional staffer is helping their boss answer a populist outcry that may be the result of an ad hoc horror story. Often a new law or regulation makes a systemic response that constrains a government employee’s ability to act. Generally, I find key decisions such as program and administration budgets being decided totally on self-interest. If good data can be brought into a decision-making process, generally government needs conflict to force a decision based on data. Otherwise, we have data being used to justify a decision made on self-interest — and we hear terms such as “selling the decision.” So, does “trusting your gut” mean acting in your self-interest? And does making decisions on solid data mean forgoing incentives and self-interest if the data contradicts what is in the decision-maker’s self-interest? I think it is human nature to fall into cognitive dissonance: using data to justify the decision rather than as the basis for it, with the decision itself made on instinct.
June 17, 2011 at 12:40 pm #133050
Well said, Mark.
The degree of selfishness or self-less-ness is significant. I believe this individual internal alignment is formed over time through a series of small, often invisible-to-the-world decisions. This factor definitely affects which way a leader’s “gut” may be inclined to roll, and how they process what information is handed them.
My own previous comments on this string assumed a well-intentioned, selfless leader. They assumed experience combined with selfless intention and a commitment to do the right thing. A virtual unicorn…
Selfishness can throw “good gut” and “reasonable response” out the window. It is a toxic and often insidious element that we all have to contend with.
I remember, with a sick feeling in my gut, the number of times I’ve struggled to serve up a solid recommendation to a selfish (or a forced-to-be-selfish) decision maker. I knew the “right thing to do,” but the right thing to do was insufficient to convince the person who needed to make the decision. I had to find a way to make “the right thing to do” appealing to the self interest of the person who needed to make the final decision. That, my friends, felt like wrapping a good piece of steak in a poop wrapper to attract flies. The steak may eventually be consumed (mostly), but the process is disgusting.
June 17, 2011 at 3:16 pm #133048
I really like Deming’s recommendation – allow for decisions to be made at the lowest level of the organization possible but keep stakeholders informed of what’s being decided and why so that the organization learns. Those with good political acumen will be able to discern what issues require public input versus those that don’t. Using the right tool for those decisions that do require public input can be a huge saver of time – sometimes a community meeting is used when a press release would suffice.
June 17, 2011 at 5:22 pm #133046
The decisions aren’t always based on selfishness – implying deliberate manipulation – but may simply be a case of “locally optimized, but not globally optimized,” as operations researchers say. Data and an analysis framework can help prevent these poor decisions by analyzing the larger context and the implications of the decision. Think of modern chess-playing computer programs, which can quickly analyze moves and counter-moves that even a grandmaster can’t consider. While some decisions are obvious enough, or common enough, that they can be made by instinct, government (and the world) seems to be facing more and more non-routine problems. Because we don’t have experience to draw on (the black swan problem), the use of data and analysis is becoming increasingly important. Fortunately, the data and tools are more readily available and capable than ever, and for those agencies that use them on a routine basis, the expertise is there to utilize these tools even when a quick reaction is necessary.
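To make “locally optimized, but not globally optimized” concrete, here is a minimal Python sketch (all payoff numbers are hypothetical, chosen only for illustration): a greedy hill-climber that only compares neighboring options stops at a local peak, while a search over the whole space finds the true best choice.

```python
# A toy sketch of "locally optimized, but not globally optimized".
# A greedy hill-climber that only looks at adjacent options stops at a
# local peak; searching the whole space finds the true best option.

def hill_climb(values, start):
    """Repeatedly move to a better neighboring index; stop at a local peak."""
    i = start
    while True:
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < len(values)]
        best = max(neighbors, key=lambda j: values[j])
        if values[best] <= values[i]:
            return i  # no neighbor improves: a local optimum
        i = best

# Hypothetical payoffs of six candidate policies, in the order considered.
payoffs = [1, 5, 2, 3, 9, 4]

local_choice = hill_climb(payoffs, start=0)  # stops at index 1 (payoff 5)
global_choice = max(range(len(payoffs)), key=lambda i: payoffs[i])  # index 4 (payoff 9)
print(local_choice, global_choice)
```

The greedy search settles on a payoff of 5 because every single-step change looks worse, even though a payoff of 9 exists elsewhere – the same trap an agency falls into when each incremental decision looks safe locally.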
June 17, 2011 at 5:40 pm #133044
I think the right mix of “trusting your gut” and deliberate analysis is what it takes to achieve the best outcomes. There are multiple ways of getting the right answers, and there is no one-size-fits-all approach. We need to be smart about how we look at numbers, results, qualitative evidence and cases to determine the leading practices given the circumstance. For analysis, we should also attempt to speed up the data-to-information cycle, in order to not get stuck in “analysis paralysis”. The ways I have seen to do this are with decision support systems, automation, sharing information, collaborating, social searching and by mixing in a healthy dose of prioritization.
Below is a book that talks about subjective probability, or the art and science of decision-making. I read this almost 10 years ago, while I was doing a fellowship with NASA. For decision-making, this book describes when and how the art (“trusting your gut”) and science (analysis) can work together in the right mix to achieve the best overall results:
“Subjective Probability and Engineering Judgment artfully weaves together three elements at the very core of engineering: uncertainties in knowledge; inductive reasoning; and individual expertise. Written from a geotechnical and earth sciences perspective, it brings new meaning to these concepts for nearly any field of engineering, science, or technology decision making. It begins by examining the tension between theory and practice, showing how this is manifested in the different meanings of probability used in reliability and risk analysis. This book emphasizes the cognitive processes used in conceptualizing uncertainty, with techniques and tools for subjective probability assessment. Turning to what this requires, it lends substance to professional judgment by examining its diagnostic, inductive, and interpretive elements. It also investigates the nature of expertise by bringing to life such towering figures as Roebling, Stevens, and Terzaghi, and their monumental engineering achievements. Masterfully synthesizing a wide variety of sources in a way directly relevant to engineers, the book draws on the history and philosophy of science through the work of Thomas Kuhn and Henri Poincaré. It provides a window on current behavioral research that applies it to everyday thinking. This is all richly illustrated by examples from practice, including a gripping reassessment of the 1986 Challenger incident as seen through the eyes of the engineers who experienced it.”
June 22, 2011 at 4:05 pm #133042
Whether to choose either of the options you present depends on the situation. But I generally rely on my intuition. It has served me pretty well. Sometimes, it’s better not to have a great deal of time to analyze and mull over a situation.
June 22, 2011 at 4:39 pm #133040
One final thought: Bayesian statistics combines beliefs/intuition with data. It has gained a lot of applicability of late and could be helpful for us to read up on to get a better understanding.
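As a minimal sketch of how that blending works (all numbers here are hypothetical, chosen only for illustration), a Beta prior can encode a decision-maker’s “gut feel” about a program’s success rate, which observed results then update:

```python
# A minimal sketch of Bayesian updating: a Beta prior encodes a
# decision-maker's "gut feel" about a success rate; observed results
# update it via the conjugate Beta-Binomial rule.

def beta_update(prior_a, prior_b, successes, failures):
    """Conjugate Beta-Binomial update: returns the posterior (a, b)."""
    return prior_a + successes, prior_b + failures

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Gut feel: roughly a 70% success rate, held with modest confidence.
prior_a, prior_b = 7, 3

# Hypothetical pilot results: 4 successes, 6 failures.
post_a, post_b = beta_update(prior_a, prior_b, successes=4, failures=6)

print(beta_mean(prior_a, prior_b))  # 0.7  (pure intuition)
print(beta_mean(post_a, post_b))    # 0.55 (intuition pulled toward the data)
```

The posterior mean sits between the gut-feel prior and the raw observed rate, which is the formal version of letting data temper, rather than replace, intuition.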
June 27, 2011 at 6:26 pm #133038
The science part of this question is covered in the excellent How We Decide by Jonah Lehrer, which uses good cognitive science and neuroscience to analyse the subject, rather than Gladwell’s anecdote-based method.
The science in there supports emotional and rational parts to decision making, basically saying that we need both and that neither works independently. Also interesting is that we need repeated exposure to making mistakes and good decisions in order to develop the correct gut feel in the future – basically, the emotional part of the brain is one big feedback loop. This is covered in detail.
Taking on board the need for process, the purpose of the process is surely to show due diligence, not to show that you came up with the same decision an automaton would have?
July 30, 2011 at 10:04 pm #133036
It depends on the decision to be made and a host of other factors, like timing and urgency. I think that facts and feelings can often conflict. It’s how we sort them out, and that comes from experience. Personally, I rely on my intuition.