Ever since Malcolm Gladwell came out with his book Blink,
many people have become familiar with our strange ability to know more
than we can explain.  But when does it make sense to act on impulse, and
when should we stop and think things through?

As I noted in an earlier post,
Gladwell is not a researcher, but a journalist and therefore probably
not the best one to guide us.  However, two leaders in the field, Daniel Kahneman and Gary Klein,
representing two opposing views, collaborated for eight years to find
common ground.  The result is an excellent guide to effective decision-making.

A Bit of Background

For years, many thought decision-making was a deductive process.  The rational planning model
described an orderly method of defining problems, generating and
evaluating solutions, implementing choices and so on.  It made perfect
logical sense, but the more researchers looked into it, the less the
evidence supported it.

First, Kahneman, along with Amos Tversky, uncovered numerous cognitive biases.
For example, we tend to be willing to take a chance when faced with a
loss, but turn risk averse when we have the opportunity to gain.  Their
experiments resulted in prospect theory, which is more consistent with empirical data than earlier utility functions.
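
Prospect theory captures that asymmetry with a value function that is concave for gains, convex for losses and steeper for losses than for gains.  Below is a minimal Python sketch, using the functional form and the representative parameter estimates (alpha of roughly 0.88, lambda of roughly 2.25) that Tversky and Kahneman reported in 1992; the exact numbers vary by study, and probability weighting is left out here for simplicity.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function: concave for gains,
    convex and steeper (loss-averse) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Risk averse for gains: a sure $50 beats a 50% shot at $100...
sure_gain  = prospect_value(50)          # ~31.3
risky_gain = 0.5 * prospect_value(100)   # ~28.8
print(sure_gain > risky_gain)            # True

# ...but risk seeking for losses: a 50% chance of losing $100
# feels better than a sure $50 loss.
sure_loss  = prospect_value(-50)         # ~-70.4
risky_loss = 0.5 * prospect_value(-100)  # ~-64.7
print(risky_loss > sure_loss)            # True
```

Because losses loom larger than equivalent gains, people gamble to avoid a sure loss but lock in a sure gain, which is exactly the pattern the experiments revealed.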

Second, Klein’s research on people who make life-or-death decisions,
such as firemen, nurses and soldiers, found that they don’t use any
rational model.  Instead, they employ a form of naturalistic decision making.  They pick a likely solution and see if it works.  If it doesn’t, they move on.  In other words, they satisfice rather than optimize.

The problem was that the two major challenges to the rational model
led to opposite conclusions.  The prospect theory of Kahneman and
Tversky suggested that our failure to act rationally costs us, while
Klein’s naturalistic decision making implied that snap decisions often
produce better results.

The Fireman’s Story

In Klein’s book, Sources of Power,
he gave numerous examples of how experts make quick decisions
effectively.  One was the story of a fireman who was called to put out a
seemingly ordinary kitchen fire.  While his crew was spraying water on it,
he got a bad feeling and ordered his men out of the house.  A few
seconds later, the floor they had been standing on collapsed.

If he and his men had still been inside, they might have been killed
or, at the very least, severely injured.  He attributed his decision to a
“sixth sense” he had about fires.  Many people in various fields seem
to develop similarly effective intuition.  Jack Welch, for instance, was said to be able to decipher complex financial statements at a glance.

In actuality, the fireman’s decision was not so mysterious.  In
further interviews, it became clear that there were subtle cues that
something was amiss.  It was too hot for a mere kitchen fire, wasn’t
responding to the water hose and was too quiet – all of these things
were out of place, and indeed the fire was emanating from the basement,
not the kitchen.

We are, in fact, able to take in information without being consciously aware of it. Neuroscientist Antonio Damasio explains this with his somatic marker hypothesis: bodily signals tied to past experience steer our decisions before we can articulate why.  In his view, gut feelings are very real and very important.

The Failure of Pundits

While expert intuition can be impressive in some instances, it can be absolutely useless in others.  Philip Tetlock, who embarked on a 20-year study
of political pundits, found that experts’ predictions were no better
than flipping a coin.  What’s more, the most famous and highly regarded
analysts performed the worst of all.

He attributes the disparity to two personality types he calls hedgehogs and foxes.
Hedgehogs know one subject intensely, while foxes have broader
knowledge.  The hedgehogs, he found, were much more confident in their
judgments and also wrong more often.  The foxes were more cautious, but
more accurate.

Tetlock’s findings should give us some pause when we feel like going
with our gut.  If pundits who spend their whole careers analyzing a
field can’t predict future events with any accuracy whatsoever, what
confidence can we have in our own judgments?  Moreover, if confidence is
negatively correlated with accuracy, we’re really in trouble.

A Tale of Two Systems

Kahneman provides a framework for navigating the quirks of rational and emotional thinking in his book Thinking, Fast and Slow.  He describes the tension between emotional and rational thinking as a contest between two competing systems:

System 1: This is our more instinctual, automatic system.  It relies on rules of thumb, called heuristics, that enable it to act quickly.  It is, in other words, “fast and frugal” (a simple sketch of one such heuristic appears below).

Incidentally, System 1 is very active in politics.  Studies by political scientists have found that voters often make judgments of candidates based on “rapid, unreflective inferences” about their appearance.

System 2:  This system reflects our more rational
side.  We use it when we stop, think, make difficult calculations and
weigh facts, which takes a bit more time and effort than simply going on
instinct.  We only engage our second system when we realize that the
first one is falling short.

In other words, the first system contains beliefs and biases, and the
second drives deliberate thinking. Small wonder that we favor the fast and frugal
System 1, which allows us to decide quickly, over the slow and
difficult System 2, which immobilizes us.  In effect, System 1 takes
advantage of prior programming and System 2 does not.
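
The phrase “fast and frugal” comes from Gerd Gigerenzer’s research program on simple heuristics.  As a purely illustrative sketch (the cue names and values below are hypothetical, not drawn from any study), here is one such heuristic, take-the-best: instead of weighing every piece of evidence, it checks cues one at a time in order of validity and decides on the first cue that discriminates.

```python
def take_the_best(option_a, option_b, cues):
    """Pick between two options using the first cue that discriminates.
    `cues` is ordered from most to least valid; cue values are 1 (present)
    or 0 (absent).  No weighing, no adding up: fast and frugal."""
    for cue in cues:
        a, b = option_a[cue], option_b[cue]
        if a != b:
            return option_a if a > b else option_b
    return None  # no cue discriminates, so we would simply guess

# Hypothetical example: which of two cities is bigger?
city_a = {"name": "Avonlea",  "is_capital": 0, "has_airport": 1, "has_team": 1}
city_b = {"name": "Brighton", "is_capital": 1, "has_airport": 1, "has_team": 0}
best = take_the_best(city_a, city_b, ["is_capital", "has_airport", "has_team"])
print(best["name"])  # Brighton, decided by the very first cue checked
```

System 2, by contrast, would be the slow version of the same choice: gather every relevant statistic, weight it and compute a total, which is more accurate more often but costly in time and attention.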

Substituting a Hard Question for an Easier One

Kahneman further posits that when we encounter a tough System 2
question, we tend to substitute an easier System 1 question for it.
For example, when we get into the car in the morning to go to work, we
will not google statistics on traffic fatalities, but we will be swayed
by a horrible accident we saw on the news.

This is called the availability heuristic,
and it explains why System 1 is so active in politics. We substitute
the easier question of whether we like the way a candidate looks and
speaks for the very difficult questions of policy.  It is also why
overconfidence often leads to bad decisions: it favors impulse over
deliberation.

Training can help overcome this deficiency.  Emergency workers,
pilots and soldiers are taught to overcome their natural instincts
through deliberate practice.  In effect, they replace their natural programming with that of their field.

When Should We Blink?

So while Kahneman and Klein both agreed that intuition plays a large
role in decision making, they had polar opposite views about whether we
should trust our instincts. Unusually for this type of
disagreement between academics, the two struck up a friendship, a collaboration and, eventually, an agreement on what makes intuition valid:

Regularity:  When we frequently encounter similar
situations, we learn to notice subtle cues.  This recognition can become
so ingrained that it bypasses the rational centers of the brain.  The
fireman had a gut feeling about the kitchen fire because many of the
cues he was used to experiencing were absent.

Opportunity to Receive Feedback and Learn From It:
Simply encountering situations is not enough.  You have to have the
opportunity to make judgments and see how they turn out.  For example,
psychotherapists tend to be very good at understanding how a patient
will react to a stimulus in a session, but less able to predict the
long-term outcomes of treatment.

In a similar vein, Anders Ericsson (who is best known for his “10,000-hour rule”) found in his highly cited research on expertise that
coaching is absolutely essential to superior performance, because
without feedback it becomes very difficult to hone our instincts.

Relevance:  One of the pitfalls that many
accomplished people fall into is thinking that expertise in one area
carries over to another.  Their confidence in being able to spot subtle
clues and make snap judgments in their own field can be disastrous when
they try to apply the same approach to other domains.

In the final analysis, gut feelings should be taken seriously.  They
are often telling us that something is amiss.  However, they are much
more reliable when they are alerting us to danger than when they are
pushing us to overlook pertinent facts.  If there is time, it’s always
better to stop and think.

Confidence and certainty don’t mean that we’ve got the right answer;
they often just mean that we have swapped a tough question for an
easier one.