Let's say you're on a game show.
You've already earned $1000
in the first round
when you land on the bonus space.
Now, you have a choice.
You can either take
a $500 bonus guaranteed
or you can flip a coin.
If it's heads, you win $1000 bonus.
If it's tails, you get no bonus at all.
In the second round, you've earned $2000
when you land on the penalty space.
Now you have another choice.
You can either take a $500 loss,
or try your luck at the coin flip.
If it's heads, you lose nothing,
but if it's tails, you lose $1000 instead.
If you're like most people,
you probably chose to take
the guaranteed bonus in the first round
and flip the coin in the second round.
But if you think about it,
this makes no sense.
The odds and outcomes in both rounds
are exactly the same.
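Here's a minimal sketch of that arithmetic
in Python, using the dollar amounts above:

```python
# Round 1: start with $1000; take a sure $500 bonus,
# or flip a coin for a $1000 bonus or nothing.
round1_sure = 1000 + 500                                # $1500 guaranteed
round1_flip = 0.5 * (1000 + 1000) + 0.5 * (1000 + 0)    # $1500 expected

# Round 2: start with $2000; take a sure $500 loss,
# or flip a coin for no loss or a $1000 loss.
round2_sure = 2000 - 500                                # $1500 guaranteed
round2_flip = 0.5 * (2000 - 0) + 0.5 * (2000 - 1000)    # $1500 expected

print(round1_sure, round1_flip)   # 1500 1500.0
print(round2_sure, round2_flip)   # 1500 1500.0
```

Either way, the sure option leaves you with $1500,
and the coin flip leaves you with $2000 or $1000.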
So why does the second round
seem much scarier?
The answer lies in a phenomenon
known as loss aversion.
Under rational economic theory,
our decisions should follow a simple
mathematical equation
that weighs the level of risk
against the amount at stake.
But studies have found
that for many people,
the negative psychological impact
we feel from losing something
is about twice as strong as the positive
impact of gaining the same thing.
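One common way to model that asymmetry
is the value function from Kahneman and Tversky's
prospect theory. The sketch below is illustrative,
not from the talk: the loss weight of 2.25
and the exponent of 0.88 are commonly cited estimates.

```python
def felt_value(x, loss_weight=2.25, alpha=0.88):
    """Prospect-theory-style value function: gains and losses
    are felt with diminishing sensitivity, and losses are
    weighted roughly twice as heavily as equivalent gains."""
    return x ** alpha if x >= 0 else -loss_weight * (-x) ** alpha

# Round 1: a sure $500 bonus vs. a coin flip for $1000 or $0.
sure_bonus = felt_value(500)                               # ~237
flip_bonus = 0.5 * felt_value(1000) + 0.5 * felt_value(0)  # ~218

# Round 2: a sure $500 loss vs. a coin flip for $0 or -$1000.
sure_loss = felt_value(-500)                               # ~-534
flip_loss = 0.5 * felt_value(0) + 0.5 * felt_value(-1000)  # ~-491

# The sure bonus feels better than the gamble (237 > 218),
# while the sure loss feels worse than the gamble (-534 < -491):
# exactly the pattern most people show on the game show.
```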
Loss aversion is one cognitive bias
that arises from heuristics,
problem-solving approaches based on
previous experience and intuition
rather than careful analysis.
And these mental shortcuts can lead
to irrational decisions,
not irrational like falling in love
or bungee jumping off a cliff,
but decisions based on logical fallacies
that can easily be proven wrong.
Situations involving probability
are notoriously bad places
to apply heuristics.
For instance, say you were to roll a die
with four green faces and two red faces
twenty times.
You can choose one of
the following sequences of rolls,
and if it shows up,
you'll win $25.
Which would you pick?
In one study, 65% of the participants,
who were all college students,
chose sequence B,
even though A is shorter
and contained within B,
and therefore more likely.
This is what's called
a conjunction fallacy.
Here, we expect to see more green rolls,
so sequence B looks more representative
of the die we're rolling,
and our brains trick us into picking
the less likely option.
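To make the comparison concrete,
here's a quick Monte Carlo sketch in Python.
The transcript doesn't show the sequences,
so the ones below, A = RGRRR and B = GRGRRR
(B is just A with an extra green roll in front),
are modeled on the classic version of this experiment
and should be treated as illustrative:

```python
import random

DIE = list("GGGGRR")   # four green faces, two red faces

# Illustrative sequences (assumed, not shown in the transcript):
# every run of rolls that contains B automatically contains A.
A = "RGRRR"
B = "GRGRRR"

def appears(sequence, n_rolls=20):
    """Roll the die n_rolls times; True if sequence shows up."""
    rolls = "".join(random.choice(DIE) for _ in range(n_rolls))
    return sequence in rolls

trials = 100_000
print("A wins:", sum(appears(A) for _ in range(trials)) / trials)
print("B wins:", sum(appears(B) for _ in range(trials)) / trials)
# A's winning probability always comes out higher, because any
# appearance of B contains an appearance of A, but not vice versa.
```

Betting on B can never beat betting on A;
the simulation just makes the gap visible.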
Heuristics are also terrible
at dealing with numbers in general.
In one example, students were split
into two groups.
The first group was asked whether
Mahatma Gandhi died before or after age 9,
while the second was asked whether
he died before or after age 140.
Both numbers were obviously way off,
but when the students were then asked
to guess the actual age at which he died,
the first group's answers averaged out to 50,
while the second group's averaged out to 67.
Even though the clearly wrong information
in the initial questions
should have been irrelevant,
it still affected the students' estimates.
This is an example
of the anchoring effect,
and it's often used in marketing
and negotiations
to raise the prices
that people are willing to pay.
So, if heuristics lead to
all these wrong decisions,
why do we even have them?
Well, because they can be quite effective.
For most of human history,
survival depended on making quick
decisions with limited information.
When there's no time to logically
analyze all the possibilities,
heuristics can sometimes save our lives.
But today's environment requires
far more complex decision-making,
and these decisions are more biased
by unconscious factors than we think,
affecting everything from health
and education
to finance and criminal justice.
We can't just shut off
our brain's heuristics,
but we can learn to be aware of them.
When you face
a situation involving numbers,
probability,
or multiple details,
pause for a second
and consider that the intuitive answer
might not be the right one after all.