The following is an excerpt from my master's thesis on the psychological causes of the subprime mortgage crisis.
The main idea behind it is that financial crises are not the exception, but a natural consequence of human nature and human psychology.
To truly understand them, and prevent them, we need to understand human nature and plan with it in mind.
I will share here how some well-documented psychological biases might have contributed to the financial crisis.
- Hyperbolic discounting
- Diffusion of Responsibility
- Social Rationality & Competition
- Cognitive Dissonance
- Belief manipulation (or “self-deception”)
- Base Risk Bias
- Illusory Superiority
- Bandwagon Effect and Herd Instinct
- Attention Anomalies
- Loss Aversion & Ambiguity Aversion
- Categorization Bias
- Normalcy Bias
- Abundance Bias
- Psychopathy & Sociopathy
- Sociology & psychology of financial crisis: Summary
Hyperbolic Discounting
Hyperbolic discounting is:
the tendency for people to increasingly choose a smaller-sooner reward over a larger-later reward as the delay occurs sooner rather than later in time.
That is, humans show a preference for the temporally nearest reward, and that preference grows disproportionately strong as the reward draws closer in time.
This might also reflect an evolutionary disposition: since we evolved in an environment where food was not readily available at will, it made sense for our ancestors to gorge and take in as much as possible on the few occasions when food was abundant.
Even though rushing might have caused a stomachache a few hours later, it could have given them a big leg up in weathering a possible food shortage in the coming days (note: this is highly speculative; it's just my personal guess and there's no proven science behind it).
In the same way, when traders and bankers can binge on the profits, there’s a tendency to just do it.
Hyperbolic discounting only addresses rewards, but it's probably not too much of a stretch to imagine that short-term rewards can blind people not just to bigger rewards in the future, but also to a lack of rewards, or even full-blown problems, down the road.
After all, we probably all know someone who spends everything they have, or even borrows to consume more.
This is, after all, nothing other than the "grasshopper mindset" of the famous fable The Ant and the Grasshopper.
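The preference reversal at the heart of hyperbolic discounting can be made concrete with a short sketch. This is my own illustration, not part of the thesis or the cited literature: it uses Mazur's hyperbolic discount function V = A / (1 + kD), with a purely illustrative discount parameter k.

```python
# Illustrative sketch of hyperbolic discounting (Mazur's form): V = A / (1 + k*D),
# where A is the reward amount, D the delay, and k a fitted discount parameter.
# The value k = 0.1 is purely illustrative, chosen for a clean example.

def hyperbolic_value(amount: float, delay: float, k: float = 0.1) -> float:
    """Present subjective value of a reward received after `delay` periods."""
    return amount / (1 + k * delay)

# Choice 1: $100 now vs. $120 in 30 days -> the smaller-sooner reward wins.
soon = hyperbolic_value(100, 0)        # 100.0
later = hyperbolic_value(120, 30)      # 30.0
print(soon > later)                    # True: grab the nearer reward

# Choice 2: the same pair pushed 365 days into the future:
# $100 in 365 days vs. $120 in 395 days -> now the larger reward wins.
soon_far = hyperbolic_value(100, 365)  # ~2.67
later_far = hyperbolic_value(120, 395) # ~2.96
print(soon_far < later_far)            # True: the preference reverses with distance
```

An exponential discounter would never show this reversal; it is the hyperbolic shape that makes the nearest reward loom so large, which is exactly the pull a trader feels toward this quarter's bonus.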
Hyperbolic discounting is magnified by the fact that the "costs" are often not borne by the individual, or at least there is a tendency to believe they aren't.
Enter, the diffusion of responsibility:
Diffusion of Responsibility
When many different parties are contributing to a flawed process it is easier for everyone to tell themselves that they are not at fault.
This helps people assuage their moral and ethical concerns about their own actions.
And there were so many parties involved in the securitization process that it was easy for everyone to say, both to themselves and to others: "Look, I'm not doing anything wrong here; everyone else is doing the same."
Most might not even have been aware that something was wrong, but diffusion of responsibility meant that hardly anyone ever asked whether what was happening was sound or dangerous.
Social Rationality & Competition
Social rationality refers to the type of practical rationality displayed by individuals in multi-agent decision-making situations.
Basically, people apply simple heuristics to complex problems and tend to follow the crowd.
Strong economic competition and social interaction push financial institutions toward short-term strategies that take what the majority thinks as a reference (either going with or against it), regardless of the actual relevance of what the majority thinks.
This might have led people to sell and buy CDOs and MBS not so much because they thought it was the best thing to do, but because they were competing with other hedge fund managers who were all doing the same.
"Our competitors are packaging and making mad money off of CDOs.
Shall we play it safe, research thoroughly and plan for the long term, or shall we do the same and make mad money too, right now?"
The question had such an obvious answer that many probably never even asked it.
Social rationality, coupled with the "competitive arousal" model of decision making, suggests that situational factors such as rivalry, time pressure, and being judged by others stimulate a physiological arousal that pushes us to win.
No matter whether winning means doing the right thing or not.
Competitive arousal also crowds out more thoughtful consideration of risks and costs, and pushes people “straight into grabbing the spoils”.
Social rationality and competitive arousal seem to have been instrumental to the exponential increase of leverage.
Cognitive Dissonance
Cognitive dissonance is the discomfort we feel when we take actions that conflict with our (usually positive) self-image.
To avoid coping with the fact that they were doing something wrong, bankers and traders manipulated their own beliefs to convince themselves that everything was ok.
This is a process that Barberis calls:
Belief manipulation (or “self-deception”)
The high complexity of the derivatives might have helped in the “belief manipulation”.
It would have taken considerable effort to disprove the claim that MBS were relatively safe, and since people prefer the path of least resistance, the natural course was to delude themselves about the risks.
Other biases come to the aid of "belief manipulation", such as confirmation bias, which leads people to overweight information that confirms their prior beliefs and to underweight information that disproves them.
Through these two biases, we can also more properly frame the role played by the mathematical models.
Many blamed the models for the crisis, but I don't fully agree.
The mathematical models didn’t fool everyone, but they were a ready excuse and a handy instrument of self-deception.
Why should I believe that silly voice in my head telling me things could turn sour?
The mathematical models of our smart finance whizzes say everything will be all right.
So let’s get back to making money…
People are remarkably willing to follow instructions from an authority figure, even when the task seems inappropriate.
Especially in confusing situations, we look to authority figures for guidance on what to do (our dependence on authority figures rises in stressful situations).
Many people may have felt that what they were doing was not completely sensible but they felt a psychological need to do what their bosses wanted them to do.
It might not be a coincidence that Bridgewater was one of the very few hedge funds to foresee the crisis, since Ray Dalio strongly encouraged a culture of independent thinking and speaking up (see Principles by Ray Dalio).
Representativeness is people's tendency to see patterns, especially trends, in data that are actually completely random.
This is coupled with the tendency to extrapolate too readily (over-extrapolate) from small samples.
Housing prices had been rising for several years, so most borrowers went ahead with their mortgages without ever considering that they could lose money.
Base Risk Bias
Akin to representativeness is the "base risk bias", that is, the tendency to overweight base-rate information and historical statistics.
An example relevant to the crisis is the over-weighting of historical default rates, which might give an impression of safety without assessing how the fundamentals, or the lending standards, had changed (Shefrin, 2009).
Similarly, people looked at the historical data to reassure themselves that house prices had never fallen.
It is telling that Greenspan, then Chairman of the Federal Reserve, used this statistic to convey the idea that a housing crash posed no immediate danger to the economy.
Conservatism reinforces the "base risk bias": it is the tendency to overweight base-rate information relative to new (or singular) information.
This phenomenon is sometimes called “underreaction”, because we tend not to react promptly to the new information we receive.
Conservatism might have slowed people’s reaction when news about higher insolvency rates started to appear.
People have excessively rosy views of their prospects and the outcome of planned actions.
This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events.
In surveys, people overwhelmingly think that setbacks are more likely to happen to others than to themselves.
The optimism bias operated at many levels.
Households may have underestimated the risks of taking out a large mortgage and thought house prices still had a long way to go up.
Hedge fund managers, investment bankers and investors might have concentrated on the possible gains of their investments and underestimated the chances that they could actually lose money.
People strongly overestimate the precision of their forecasts.
This too may have contributed to the poor modeling and the poor predictions about the future course of housing prices.
Overconfidence may have led people to believe that they could forecast future house price movements more accurately than they actually could.
Illusory Superiority
Illusory superiority is a cognitive bias that causes people to overestimate their positive qualities and abilities and to underestimate their negative ones, relative to others.
Optimism, overconfidence and illusory superiority played a role at all levels.
Bandwagon Effect and Herd Instinct
We tend to assume naturally that when other people all say something it’s probably true (bandwagon effect) and we tend to follow them (herd instinct) to feel safer and avoid conflicts.
Therefore, when everybody behaves as if a housing crash is highly unlikely, we follow the herd, assuming they have good reasons for doing so.
And when everyone traded CDOs, it seemed only normal to do the same.
Indeed, people tend to self-censor, and many could have been embarrassed to bring up the possibility that we might have a financial crisis.
Shiller says it affected him and a colleague of his as well, when he refrained from publishing a book on the importance of psychology in finance.
This bias can also be strengthened by the “omission bias”, the tendency to judge harmful actions as worse than equally harmful omissions (inaction).
It might not have been a coincidence that many of the people who predicted the crisis were rather marginal and unconnected figures, sometimes regarded as "oddballs".
In the same family as the biases just described is groupthink, a kind of thinking in which group cohesiveness is more important than carefully considering the facts.
Symptoms include people not voicing contrary opinions so as not to rock the boat.
The pressure is on dissenters to conform and opposing opinions are viewed in a stereotypical, simplistic way.
This is often referred to as conformity, the strong tendency to conform to the group we belong to even when we don't think it makes sense; a behavior that stems from the basic human need to be liked and accepted by others.
Even if some bankers or traders saw the problems, they may have kept quiet for the sake of conformity.
Groupthink is considered particularly strong when: the group is valued and attractive to belong to, which seems quite plausible in the case of Wall Street; the group is isolated from other opinions, and indeed there was not much diversity on Wall Street boards, and risk managers were often not lent an ear; and there is a strong leader who makes his view known, and being a strong, assertive leader is almost a prerequisite for becoming a CEO on Wall Street.
Attention Anomalies
We focus more on emotionally salient stimuli and on what the people and media around us are talking about.
Thus, investors might see their brain resources hogged by the "attractive" yields of subprime borrowers that everybody is talking about, or by the growth targets discussed in company meetings, and spend less time researching what might go wrong.
Loss Aversion & Ambiguity Aversion
People have an aversion to losses and ambiguity, especially under certain circumstances.
For example, loss aversion is stronger when we've recently experienced a loss.
And the ambiguity aversion is stronger when we don’t feel well prepared to understand what’s happening.
Once CDOs and MBS started losing value, both institutional and individual investors experienced heightened loss aversion and ambiguity aversion, leading them to dump their holdings of risky assets faster than the state of the real economy warranted.
This bias possibly accelerated the domino effect.
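Loss aversion can also be quantified. The sketch below is my own illustration, not part of the thesis: it uses the prospect-theory value function of Tversky and Kahneman (1992), v(x) = x^a for gains and v(x) = -λ(-x)^a for losses, with their estimated parameters a ≈ 0.88 and λ ≈ 2.25.

```python
# Illustrative sketch of loss aversion via the prospect-theory value function
# (Tversky & Kahneman, 1992): v(x) = x**a for gains, v(x) = -lam*(-x)**a for losses.
# The estimates a = 0.88 and lam = 2.25 come from their original paper.

def value(x: float, a: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss x, measured from a reference point."""
    return x ** a if x >= 0 else -lam * ((-x) ** a)

gain = value(100)    # felt value of winning $100
loss = value(-100)   # felt value of losing $100
print(abs(loss) / gain)  # 2.25: losing $100 hurts over twice as much as winning $100 pleases
```

This asymmetry is what makes investors who have just taken a hit dump risky assets so abruptly: the same price move feels more than twice as bad on the way down as it felt good on the way up.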
Categorization Bias
Categorization bias leads to partitioning objects into general categories and ignoring the differences among members of the same category.
Categorization bias may produce unintended side effects if the members of the same category are different from each other in meaningful ways.
Categorization likely played a major role once the rating agencies rubber-stamped their ratings.
Investors treated all AAA assets as the same, when it clearly turned out that wasn't the case.
Similarly, AIG, one of the biggest insurers of toxic CDOs, treated the mortgage pools it insured in 2005-2006 as if they were the same as those it had insured in the early 2000s, even though the later pools contained far more subprime mortgages.
This behavior proved devastating, both for AIG and for the ripple effects it triggered.
Normalcy Bias
The normalcy bias refers to a mental state people enter when facing a disaster.
It causes people to underestimate both the possibility of a disaster occurring and its possible effects.
Nassim Nicholas Taleb, in his book “The Black Swan: The Impact of the Highly Improbable”, explains it as the psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare events in historical affairs.
According to the Black Swan Theory, unpredictable events are much more common than people think and are the norm and not the exception. Black Swan events are hard to predict based on historical events.
This, coupled with our tendency toward optimism, helps explain why many thought a catastrophic event was simply impossible.
Mathematical models were also built without taking into account what back then seemed highly unlikely events, and the results of the model further helped many people to overlook the chances of improbable events.
We feel safer with what's familiar, even when what's familiar is not necessarily safer.
Reassured by the familiarity of a certain financial instrument, we are nudged into buying it.
We don't notice this influence, and it distorts our assessment of the risk-reward trade-off.
Maybe this could also be stretched to include the "daily familiarity" we developed with certain securities: after years of lucrative gains with CDOs and MBS, traders might have come to feel those securities were "reassuringly safe".
Anchoring (or reference point bias; Tversky & Kahneman, 1974) is the tendency to make forecasts on the basis of a starting assessment, or of a price thought to be "right".
It is an anchor to which our mind binds, and it prevents us from reacting promptly when new information surfaces.
So, for example, people holding CDOs that were rapidly depreciating might have been led to keep them because the price they were trading at was "far below the right one".
Lehman Brothers, including its CEO, bought a big portfolio of MBS even as the firm sought to deleverage, because they thought "it was a great deal".
It seemed such a great deal only because they were comparing it to the old prices.
This, perhaps even more importantly, is also true the other way around: people who get used to certain profits might use them as reference points.
As risk premiums declined and the same invested amounts yielded less, the "need" to leverage surfaced, in order to reach, and surpass, the reference point of previous years (or quarters, or trades).
A reference point can also be set for emulation purposes: UBS dramatically increased its exposure to subprime products because a consulting firm (McKinsey) had advised doing so.
Abundance Bias
Abundance bias can kick in when there is so much at one's disposal that worry about the future shrinks to zero, and an end to the bonanza doesn't even cross people's minds.
Traders and bankers were so busy splurging that they just couldn't fathom that an end to their plush life might be in the cards.
Psychopathy & Sociopathy
Finally, it's possible that the higher concentration of sociopaths and psychopaths in high-risk, high-reward industries such as investment banking and trading also contributed to the crisis.
Psychopaths tend to be strongly driven toward gains and rewards, while ignoring risks to themselves and, even more, to others.
Many psychopaths are also undeterred by the risk of punishment (which few people were considering anyway, thanks to the diffusion of responsibility bias).
For more on psychopathy in business organizations, please read:
Sociology & psychology of financial crisis: Summary
The markets were (and maybe still are) ill-prepared to cope effectively and efficiently with the less rational side of human nature.
When the bubble was inflating, judgment and decisions about risk, reward, and the evaluation of success had become systematically compromised.
The excitement of gains erased the anxiety about potential consequences, producing herd effects and competitive arousal.
Moreover, any anxiety about the risks of financial collapse and legal consequences was quelled by the "diffusion of responsibility": people were comforted by the fact that "everyone else is doing it", and "it's safe being a scoundrel when everyone else is a scoundrel".
When the bubble started deflating the same psychological forces started acting in the opposite direction.
Anxiety and panic broke out, leading to a 21st-century version of bank runs, far worse than the 20th-century one since the whole system was far more global and interconnected, and this led to a domino-effect collapse.
And in the aftermath of the crisis, the emotional pain and social status loss that are involved in accepting and admitting responsibility stand in the way of learning our lessons.