Thinking, Fast and Slow (2011) is a popular psychology book dealing with human cognitive biases, including descriptions, research, and examples. Daniel Kahneman, the author, also seeks to develop a theory of human biases, assigning them to the “fast” way of thinking, as opposed to the slower, more rational way of thinking.
Bullet Summary
- We are less rational than we think
- Intuitive heuristics are simpler rules we use when faced with a complex reality
- Only when intuitive heuristics are not enough do we engage our slower, rational brain
Full Summary
About The Author: Daniel Kahneman is a psychology professor, a Nobel laureate, and widely recognized as a leading expert on human psychology.
Kahneman collaborated with Amos Tversky for many years on pioneering research into human cognitive biases and heuristics.
In the book “The Undoing Project,” Michael Lewis tells the story of the friendship and rivalry between Kahneman and Tversky.
Introduction
Daniel Kahneman says economics was blind for many years to the fact that humans are far from rational.
Today, the idea that our mind is susceptible to systematic biases is generally accepted, and, in good part, it’s thanks to the work of Kahneman and Tversky.
Part I: Two Systems
Kahneman describes two different modes of thinking we deploy, which he calls System 1 and System 2.
System 1 is our default system. System 2 is only called upon when System 1 doesn’t have a ready and quick answer and an “escalation” is needed.
System 2 also monitors System 1 to ensure that its suggestions and default decisions are correct, and it suppresses the ones that aren’t. But more likely than not, System 2 will go along with what System 1 dictates. Kahneman says “laziness is built deep into our system”.
Also, self-control shrinks when our brain is fatigued.
System 1: Fast
System 1 is fast and automatic; it needs no voluntary control. It operates on simple heuristics that kick in automatically for the simplest actions and behaviors. System 1 takes care of things like:
- Computing 1+1
- Understanding an angry face
- Driving (after you’ve been driving for a long time)
Some more complex or difficult tasks, such as driving, can be “burned” into System 1 after long enough training and repetition.
The eye-opening part of the book is that System 1, the nonrational part of our brain, is much more influential than most of us would like to believe.
System 2: Slow
System 2 is the slower, rational part of our brain, the one that uses focus and concentration, weighs different variables, and makes pondered decisions.
We use System 2 when, for example, we:
- Fill out a complex form
- Decide between two job offers
- Walk faster than we are used to
Some of the heuristics Daniel Kahneman introduces are:
- Priming
Conscious or subconscious exposure to a particular topic primes one to think in the direction of that topic.
Studies reported that people who read about old people walked more slowly afterward, and that people who were asked to move more slowly were quicker to recognize words related to old age. And it’s true that our body and movement influence our mood (“emotion is created by motion”, says Tony Robbins).
2019 Update:
Priming has been largely criticized and, possibly, debunked. Read the criticism section for more.
- Cognitive Ease
Things that are easier or more familiar seem more true.
Similarly, if a statement is linked to other beliefs or preferences you hold, it will also seem truer.
Repeating something over and over therefore does work in making it sound truer (check how Trump leveraged cognitive ease during his first and second debates).
- Coherent Stories
We always try to make the world and the events fit into stories that make sense to us.
- Confirmation Bias
We look for evidence of what we believe is true and discard evidence proving the contrary (this is why for a long time scientists refused to believe the brain is malleable).
- Halo Effect
When we like someone, we tend to fill in the traits we haven’t even seen yet with positive characteristics.
And the inverse is true.
That’s why first impressions are very important.
- Substitution
When faced with a complex question, we often substitute it with an easier one and answer that one instead.
Part II: Heuristics and Biases
System 1 doesn’t like doubt.
As we’ve seen with the confirmation bias, it’s System 1 in particular that takes care of fitting exceptions and ambiguity into our existing stories.
System 2 is our inner skeptic.
It can handle doubt and can maintain opposing thoughts at the same time. But doubt is more taxing for the brain and sometimes System 2 just doesn’t intervene at all.
- Law of Small Numbers
We often lend high credibility to results from small samples.
It’s a mistake because small samples produce much more variable and extreme outcomes and are not statistically reliable (the quick simulation below illustrates this).
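To make the point concrete, here is a minimal Python sketch (my illustration, not from the book) that draws repeated samples from the same fair coin and shows how wildly the small samples swing:

```python
import random

random.seed(42)

def range_of_heads_share(sample_size, trials=2_000):
    """Flip a fair coin `sample_size` times, repeat `trials` times,
    and return the lowest and highest share of heads observed."""
    shares = [
        sum(random.random() < 0.5 for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]
    return min(shares), max(shares)

# Small samples swing wildly; larger samples cluster around the true 50%.
for n in (10, 100, 1_000):
    low, high = range_of_heads_share(n)
    print(f"sample size {n:>5}: share of heads ranged from {low:.2f} to {high:.2f}")
```

With 10 flips it is common to see 20% or 80% heads from the very same coin; with 1,000 flips the share rarely strays far from 50%, which is why conclusions drawn from tiny samples are so unreliable.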
- Confidence Over Doubt
We tend to see connections and attribute causality where there are actually no connections.
Sometimes it’s pure chance or luck and there’s no law or design behind the events.
- Anchoring
Anchoring (Tversky & Kahneman, 1974) is the tendency to change our own beliefs and guesses based on the number we’ve previously heard.
This is also sometimes used in negotiation as explained by Cialdini.
- Availability Heuristic
We tend to overestimate the odds of something happening when it’s easier for us to retrieve that information.
So after being exposed to news about terror attacks, we tend to overestimate the likelihood of them happening.
- Rare Events Overestimation
We overestimate the chances of rare events happening and we assign them too big a weight in our decision making.
Also read The Black Swan.
- Representativeness
We judge something we don’t know based on how similar it is to something else we know.
For example, baseball scouts used to recruit new talent based on how closely players resembled previous successful players, and the league improved when statistics came into common use instead.
- Conjunction Fallacy
When a description sounds plausible, we judge it more likely that a person has two characteristics together than just one of them, even though the single characteristic is obviously, statistically, at least as likely.
For example, students rated it as more likely that Linda is a feminist bank teller than that she is just a bank teller.
It’s because System 1 overlooks logic in favor of plausibility, and System 2 just doesn’t always intervene (the small check below makes the logic explicit).
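A tiny numerical check, with made-up probabilities purely for illustration, shows why the conjunction can never be the better bet:

```python
# Hypothetical numbers, invented only to illustrate the logic.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # P(Linda is a feminist, given she is a teller)

# The conjunction "feminist AND bank teller" multiplies the two probabilities...
p_feminist_teller = p_teller * p_feminist_given_teller

# ...so it can never exceed the probability of "bank teller" alone.
print(f"{p_feminist_teller:.2f} vs {p_teller:.2f}")  # 0.03 vs 0.05
print(p_feminist_teller <= p_teller)                 # True
```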
- Underestimating Statistics
When we only have statistics, we usually make accurate predictions, but when we have both statistics and plausible stories, we favor the stories over the data.
So we tend to generalize from single stories and events rather than going from the general to the specific.
Also, read “How to Lie With Statistics“.
- Overlooking Luck and Chance
We assign stories and explanations to events that are just random.
If we yell at a student after a bad performance and they then improve, we think it’s our yelling that “set them straight”, but it’s just natural regression to the mean.
Yelling is not the best way to teach: it was bad luck that the student performed poorly, and it was simply likely they would do better the next time anyway (the simulation below shows this).
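Here is a small simulation (my sketch, not Kahneman’s) in which every performance is just fixed skill plus random luck; the worst performers improve on their next attempt without any feedback at all:

```python
import random

random.seed(0)

def performance(skill=0.0):
    """One performance: constant skill plus random luck."""
    return skill + random.gauss(0, 1)

# Each person performs twice; nobody gets yelled at in between.
pairs = [(performance(), performance()) for _ in range(100_000)]

# Take the worst 10% of first attempts...
pairs.sort(key=lambda p: p[0])
worst = pairs[: len(pairs) // 10]

avg_first = sum(first for first, _ in worst) / len(worst)
avg_second = sum(second for _, second in worst) / len(worst)

print(f"worst first attempts averaged {avg_first:.2f}")    # far below the mean of 0
print(f"their second attempts averaged {avg_second:.2f}")  # back near 0: regression to the mean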
- Intuitive Predictions
Sometimes we just “feel right” and trust our intuition.
That’s usually overconfidence bred from System 1, which has a tendency to predict events and extreme occurrences from flimsy evidence.
Part III: Overconfidence
- Narrative Fallacy
We look at the past and create false narratives that make sense to us but that do not necessarily (and often do not) adhere to reality. And we tend to underestimate chance and luck.
- Hindsight Illusion
Our intuitions feel much more true after the event has actually happened.
So we get the famous “I knew it”, or the worse “I told you so” when nobody actually told us anything.
We give too little credit to good decisions because in hindsight they seem obvious, and we judge bad decisions too harshly because in hindsight they look like obvious blunders.
And that’s why Ray Dalio suggests judging the swing more than the shot.
- Illusion of Validity:
Sometimes we feel confident in our judgment because we base it on the coherence of the information we have and on the cognitive ease we experience in processing it.
Of course, we often fail to check other data and neglect to consider data running against our beliefs.
Intuition exists, but it’s more an immediate pattern recognition based on extensive experience or highly emotional events.
- The Planning Fallacy
We have a strong tendency to plan with the best-case scenario in mind and we often fail to take into account worst-case scenarios and likely issues we’ll face.
- Optimistic Bias
We tend to overestimate our capabilities and believe the final result lies fully in our hands.
We discount the element of luck and the uncertainty of our environment. This is accentuated by human nature: confident leaders make us feel more certain and we turn to them.
Part IV: Choices
In this part, Daniel Kahneman introduces Prospect Theory, the theory that earned him the Nobel Prize.
I invite you to read the book for more because it’s very interesting.
But basically, the theory debunked the view of humans as rational and mathematical actors when approaching money and risk (Expected Utility Theory).
For a quick summary:
- The value of money is relative to how much one already has
- Changes in wealth are evaluated relative to a reference point (how much one had before), not in absolute terms
- Losses take up much more real estate in our brain than equivalent gains (a simple value-function sketch follows below)
For more on behavioral psychology see Misbehaving by Richard Thaler.
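As a rough illustration of that last point, here is the prospect-theory value function in code. The parameter values are the commonly cited estimates from Tversky and Kahneman’s 1992 follow-up work, used here as an assumption for illustration; the book itself doesn’t spell out these numbers.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x, measured from a reference point.
    alpha < 1 gives diminishing sensitivity; lam > 1 makes losses loom larger."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Losses hurt roughly twice as much as equivalent gains feel good:
print(round(prospect_value(100), 1))   # ~57.5
print(round(prospect_value(-100), 1))  # ~-129.5
```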
- Loss Aversion
We dislike losing more than we like winning
- Theory Blindness
Once you accept a theory, you tend to use it as a filter through which to see reality.
But then you only see evidence supporting the theory and tend to exclude and omit anything running against it.
- Endowment Effect
We overestimate the value of an object we like and use.
We invest it with emotional attachment and we hate to lose it.
- Possibility Effect
We tend to overestimate the possibility of highly unlikely positive outcomes.
Think of lottery tickets.
- Certainty Effect
We give highly likely outcomes less weight than their probability justifies.
- Expectation Principle
The Possibility and Certainty Effects together explain how people don’t make decisions based on the weighted value of the outcomes, but more on the psychological feelings that winning and losing give us.
This means:
- We are scared large likely gains might not happen and will accept much smaller but sure gains
- We overestimate very small chances of winning huge. And we buy lottery tickets.
- We buy insurance -and often overpay for it- because it buys us peace of mind
- Faced with a high chance of making things worse but a small chance of improving, we’ll take the gamble because it gives us hope.
And that’s why gamblers keep on losing until they have nothing left (or until they can borrow more to lose). The quick expected-value sketch below puts numbers on the first two bullets.
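A back-of-the-envelope comparison (hypothetical amounts, chosen by me just for illustration) shows the gap between expected value and what most people actually choose:

```python
# Case 1: a large, likely gain versus a smaller sure gain.
sure_gain = 9_000
risky_gain_ev = 0.95 * 10_000        # expected value 9,500
print(risky_gain_ev > sure_gain)     # True, yet most people take the sure 9,000

# Case 2: a lottery-style bet with a tiny chance of winning big.
ticket_price = 10
lottery_ev = 0.00001 * 500_000       # expected value 5
print(lottery_ev < ticket_price)     # True, yet people happily buy the ticket
```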
- Disposition Effect
We prefer selling stocks we are making money on rather than the ones we are losing money on.
Selling losers is an admission of failure, while selling winners makes us feel smart.
- Sunk Cost Fallacy
To avoid the bad feeling of cutting losses and admitting failure, we stay too long in negative scenarios, be it losing trades or abusive relationships.
- Framing
How we verbalize the same scenario will lead to different choices.
Doctors are more likely to recommend an operation described as having a 90% survival rate than one described as having a 10% mortality rate, even though the two framings are identical.
Part V: Two Selves
Daniel Kahneman says we have two selves: one who experiences and the other who remembers.
Once the event is over, the one who remembers is the one in control.
And if we had an amazing holiday but on the last day had a negative experience, we are likely to remember the holiday as not really positive.
Kahneman also notes that people are really bad at assessing their overall well-being (they answer based on how happy they are right now) and at predicting what will make them happy in the future.
- Ending Fallacy
How an experience ends plays a disproportionately big role in how you remember that experience.
Also see Pre-Suasion by Cialdini.
- Duration Neglect
How long we experience a positive or negative experience matters little for our memory.
The peak of the experience and its ending are what matter most.
- Focusing Illusion
The most essential thing in our life is whatever we are focusing on at that exact moment.
And when we are asked to rate how happy something or someone makes us, we are likely to focus mostly on one aspect.
And we tend to focus on positive traits, disregarding how often we actually experience those traits.
Criticism
First of all, from a self-development point of view, I warn you of the potential for mental disempowerment if you take the “humans are irrational” idea to the extreme.
And I recommend two articles here:
- Law of optimum balance, such that your own empowerment starts from accepting your (mental) limitations, without going to the extreme and thinking of yourself as totally irrational. Indeed, you grow more limitless as you accept your cognitive biases, while also finding ways around them
- Naive self-help, especially the section on “naive disempowerment”
From a more academic point of view, Thinking, Fast and Slow has been heavily criticized for lending support to the priming research literature, which has spectacularly failed to replicate.
Kahneman originally asked for more clarity and continued to profess optimism that priming research would be confirmed (note: the link to Kahneman’s response has since been taken down).
But in 2017 he eventually admitted that the criticism of chapter 4 of “Thinking, Fast and Slow” was legitimate.
He said:
I placed too much faith in underpowered studies
(…)
However, I now understand that my reasoning was flawed
(…)
Unanimity of underpowered studies provides compelling evidence for the existence of a severe file-drawer problem.
You can read more here:
More Irrational Than We Really Are
One of the issues with Kahneman’s work is the endless list of biases that sometimes frame humans as “irrational”.
At the extreme, that becomes “hopelessly irrational”.
But in truth, those biases work more often than not.
Or, at least, they often work “reliably enough” to keep us alive and well-functioning.
Says Jordan Peterson:
That doesn’t mean that the heuristics themselves are deeply flawed. That’s the problem with the work of Kahneman and Tversky: they take perfectly valid heuristics and look for specific and rare situations in which they fail.
Real-Life Applications
- Trust Yourself… A Bit Less
All the heuristics in Thinking, Fast and Slow have a common thread: we are not rational decision-makers.
And often our decisions do not serve our best interests or the interests of the people around us.
This allows us to make huge improvements in our life if we only learn where we fall short.
As an example applied to human interactions, check 7 dating mistakes women make.
- We Prefer Stories
When given the choice between listening to data or to stories… we choose plausible stories.
So choose plausible stories when you need to influence people.
CONS
Lack of Common Thread
Well, the thread of Thinking, Fast and Slow seems to be “we are less rational than we think we are”, but at times the book reads more like a list/collection of heuristics.
Review
Daniel Kahneman is one of the men behind behavioral economics (on which I based my master’s thesis), and he contributed to finally “dethroning” the idea of humans as rational decision-makers.
And for that, the book earned a spot among the best psychology books of all time.
Thinking, Fast and Slow deals mostly with economics and choices in the face of uncertainty, but its lessons are also relevant to our day-to-day interactions.
To find out your biases, and how you can deal with them, check this post.