The following is an extract from my master's thesis.
The idea behind it is that crises are a natural product of human nature and psychology. To really understand them, and prevent them, we need to understand, and correct, human nature.
- Hyperbolic discounting
- Diffusion of Responsibility
- Social Rationality & Competition
- Cognitive Dissonance
- Belief manipulation
- Base Risk Bias
- Illusory Superiority
- Bandwagon Effect and Herd Instinct
- Attention Anomalies
- Loss Aversion & Ambiguity Aversion
- Categorization Bias
- Normalcy Bias
- Abundance Bias
- Sociology & psychology of financial crisis: Summary
Hyperbolic Discounting

Hyperbolic discounting is the tendency, when choosing between two similar rewards, to prefer the one that arrives sooner, with a preference that strengthens as the delay to the later reward grows.
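This preference reversal can be sketched numerically. The snippet below is a pure illustration, not part of the thesis: the discount rate k and the reward amounts are made-up parameters, and the function name is my own. It uses the standard hyperbolic form V = A / (1 + kD) for a reward A delayed by D days.

```python
# Illustration of hyperbolic discounting (made-up parameters, not thesis data).
# Standard hyperbolic form: perceived value V = A / (1 + k*D) for reward A at delay D.

def hyperbolic_value(amount, delay, k=0.1):
    """Perceived present value of `amount` received after `delay` days."""
    return amount / (1.0 + k * delay)

# Smaller-sooner reward: 100 in 1 day; larger-later reward: 120 in 10 days.
v_soon_now = hyperbolic_value(100, delay=1)    # ~90.9
v_later_now = hyperbolic_value(120, delay=10)  # 60.0
print(v_soon_now > v_later_now)   # True: up close, the near reward wins

# Viewed 100 days in advance (delays 101 and 110), the ranking flips:
v_soon_far = hyperbolic_value(100, delay=101)   # ~9.0
v_later_far = hyperbolic_value(120, delay=110)  # 10.0
print(v_soon_far < v_later_far)   # True: at a distance, the larger reward wins
```

The flip between the two comparisons is the signature of hyperbolic (as opposed to exponential) discounting: the bias toward the immediate reward appears only when the reward is close.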
This might also be due to an evolutionary disposition: when resources were abundant, it paid off for our ancestors to guzzle as much as they could find. Even though rushing might have caused a stomachache a few hours later, it might have given them a big leg up against a possible food shortage in the coming days (note: this is highly speculative; it's just my personal guess and there's no science behind it!).
In the same way, when traders and bankers can binge on profits, there's a tendency to just do it. Hyperbolic discounting only addresses rewards, but it's probably not too much of a stretch to imagine that a short-term reward can blind people not just to bigger rewards in the future, but also to a future lack of rewards, or even outright future problems.
After all, we probably all know someone who spends everything they have, or even borrows to consume more. And this is nothing other than the "grasshopper mindset" of the famous fable The Ant and the Grasshopper.
Hyperbolic discounting is magnified by the fact that the "costs" are often not borne by the individual, or at least there's a tendency to believe they are not.
Enter the diffusion of responsibility:
Diffusion of Responsibility
When many different parties contribute to a flawed process, it is easier for each of them to tell themselves that they are not at fault. This helps people assuage moral and ethical concerns about their own actions.
And in the securitization process, where so many parties were involved, it was easy for everyone to say, both to themselves and to others: "look, I'm not doing anything wrong here; everyone else is doing the same".
Most might not even have been aware that something was wrong: the diffusion of responsibility simply meant that nobody ever asked whether what was happening was good or dangerous.
Social Rationality & Competition
Social rationality refers to the type of practical rationality displayed by individuals in multi-agent decision-making situations. Basically, people apply simple heuristics to complex problems and tend to follow the crowd.
Strong economic competition and social interaction push financial institutions toward short-term strategies that take what the majority thinks as their reference (either going with or against the majority), independent of the actual relevance of what the majority thinks.
This might have brought people to sell and buy CDOs and MBS not so much because they thought it was the best thing to do, but because they were in competition with the other hedge fund managers who were all doing the same.
"Bank X is packaging and making money off CDOs; shall I waste time doing lots of research, or shall I do the same and make money right now?" The question had such an obvious answer that many probably never even asked it.
Social rationality, coupled with the "competitive arousal" model of decision making, suggests that situational factors such as rivalry, time pressure, and external people judging our performance stimulate a physiological arousal that pushes us to win, no matter whether winning means doing the right thing or not. And competitive arousal crowds out more thoughtful consideration of risks and costs.
Social rationality and competitive arousal seem to have been instrumental in the exponential increase in leverage.
Cognitive Dissonance

Cognitive dissonance is the discomfort we feel when we take actions that conflict with our (usually positive) self-image. To avoid coping with the fact that they were doing something wrong, bankers and traders manipulated their own beliefs to convince themselves that everything was ok.
This is a process Barberis calls "belief manipulation" (or "self-deception"). The high complexity of the derivatives might have helped the belief manipulation along: it would have taken considerable effort to disprove the claim that MBS were relatively safe, and since we prefer the path of least resistance, the normal path was for people to delude themselves about the risks.
Other biases come to the aid of belief manipulation, like confirmation bias, which leads people to overweight information that confirms their prior views and to underweight information that disconfirms those views.
Through these two biases, we can also see a different role played by the mathematical models that many blamed for the crisis. The mathematical models didn't fool everyone, but they were a ready excuse and a handy instrument of self-deception:
"Why should I believe that silly voice in my head saying things can turn sour? The mathematical models of all our finance whizzes say everything will be all right."
People are remarkably willing to follow instructions from an authority figure, even when the task seems inappropriate. Especially in confusing situations, we look to authority figures for guidance on what to do (so our dependence on authority figures rises in stressful situations).
Many people may have felt that what they were doing was not completely sensible, but they felt a psychological need to do what their bosses wanted them to do.
It might not be a coincidence that Bridgewater was one of the very few funds to foresee the crisis: Ray Dalio strongly encouraged a mentality of independent thinking and speaking up (see Principles by Ray Dalio).
People have a tendency to see patterns, especially trends, in data that are actually completely random. This is coupled with a tendency to extrapolate too readily (over-extrapolate) from small samples. Housing prices had been rising for several years, so most home borrowers went ahead with their mortgages without ever thinking they could lose money.
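How easily trends appear in pure noise can be shown with a tiny simulation. The sketch below is my own illustration (the sample length, correlation threshold, and trial count are arbitrary choices): it generates driftless random walks, in which any "trend" is spurious by construction, and counts how often position correlates strongly with time.

```python
import random

# Illustration (my own, arbitrary parameters): short driftless random walks
# often show a strong spurious "trend" (high |correlation| between time and value).

def looks_trendy(n_steps=30, threshold=0.7, rng=random):
    """True if a driftless random walk of n_steps has |corr(time, value)| > threshold."""
    x, walk = 0.0, []
    for _ in range(n_steps):
        x += rng.choice([-1.0, 1.0])  # fair coin flips: no real trend exists
        walk.append(x)
    t = list(range(n_steps))
    mean_t, mean_x = sum(t) / n_steps, sum(walk) / n_steps
    cov = sum((a - mean_t) * (b - mean_x) for a, b in zip(t, walk))
    sd_t = sum((a - mean_t) ** 2 for a in t) ** 0.5
    sd_x = sum((b - mean_x) ** 2 for b in walk) ** 0.5
    if sd_x == 0:  # defensive guard against a flat series
        return False
    return abs(cov / (sd_t * sd_x)) > threshold

rng = random.Random(42)  # fixed seed so the sketch is reproducible
trials = 2000
trendy = sum(looks_trendy(rng=rng) for _ in range(trials))
print(f"{trendy / trials:.0%} of purely random walks look 'trendy'")
```

An over-extrapolating observer who saw one of these "trendy" stretches (rising house prices, say) would have no statistical grounds for projecting it forward.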
Base Risk Bias
Akin to representativeness is the "base risk bias": the tendency to overweight base-rate information and historical statistics. As concerns the crisis, an example is the overattention paid to historical default rates, which could give an impression of safety without any assessment of how the fundamentals, or the lending standards, had changed (Shefrin, 2009). In the same way, people looked at historical data to reassure themselves that house prices had never fallen.
It is interesting how Greenspan, then chairman of the Federal Reserve, used this statistic to convey the idea that a housing crash was no immediate danger to the economy.
Conservatism is a reinforcing factor of the base risk bias: the tendency to overweight base-rate information relative to new (or singular) information. This phenomenon is sometimes called "underreaction", because we tend not to react promptly to the new information we receive.
Conservatism might have slowed people’s reaction when news about higher insolvency rates started to appear.
People have excessively rosy views of their prospects and of the outcomes of planned actions. This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events.
In surveys, people overwhelmingly think that setbacks are more likely to happen to others than to themselves.
The optimism bias operated at many levels. Households may have underestimated the risks of taking on a large mortgage and thought house prices still had a long way to go up. Hedge fund managers, investment bankers and investors might have focused on the possible gains of their investments and underestimated the chances that they could actually lose money.
People strongly overestimate the precision of their forecasts.
This too may have contributed to the poor modeling and poor predictions about the future course of housing prices. Overconfidence may have led people to believe that they could forecast future house price movements more accurately than they actually could.
Illusory Superiority

Illusory superiority is a cognitive bias that causes people to overestimate their positive qualities and abilities and to underestimate their negative qualities, relative to others.
Optimism, overconfidence and illusory superiority played a role at all levels.
Bandwagon Effect and Herd Instinct
We naturally tend to assume that when other people all say something, it's probably true (bandwagon effect), and we tend to follow them (herd instinct) to feel safer and avoid conflicts.
Therefore, when everybody behaves as if a housing crash were highly unlikely, we follow the herd, believing they have good reasons for doing so. And when everyone traded CDOs, it seemed just normal to do the same.
Indeed, people tend to self-censor, and many could have been embarrassed to bring up the possibility of a financial crisis. Shiller says this happened to him as well, when he and a colleague refrained from publishing a book dealing with the importance of psychology in finance.
This bias can also be strengthened by the “omission bias”, the tendency to judge harmful actions as worse than equally harmful omissions (inaction).
It might not be a coincidence that many of the people who predicted the crisis were rather marginal and unconnected figures, sometimes regarded as "oddballs".
Of the same family as the biases just described is groupthink, a kind of thinking in which group cohesiveness is more important than carefully considering the facts. One symptom is that people decide not to voice contrary opinions so as not to rock the boat:
the pressure is on dissenters to conform, and opposing opinions are viewed in a stereotypical, simplistic way. This is often referred to as conformity, the strong tendency to conform to the group we belong to even when we don't think it makes any sense. It's a behavior that stems from the basic human need to be liked and accepted by others.
Even if some bankers or traders saw the problems, they may have kept quiet for the sake of conformity.
Groupthink is considered to be particularly strong when: the group is valued and attractive to belong to, which seems quite plausible in the case of Wall Street; the group is isolated from other opinions, and indeed there was not much diversity on Wall Street boards, and risk managers often weren't given an ear; and there's a strong leader who makes his views known, and being a strong, assertive leader is almost a prerequisite for becoming a CEO on Wall Street.
Attention Anomalies

We concentrate more on emotionally salient stimuli and on what the people and media around us are talking about.
Thus, investors might see their brain resources hogged by the "attractive" yields of the subprime borrowers that everybody is talking about, or by the growth targets discussed in company meetings, and spend less time researching what might go wrong.
Loss Aversion & Ambiguity Aversion
People have an aversion toward losses and ambiguity, especially under certain circumstances.
For example, loss aversion is stronger when we've recently experienced a loss, and ambiguity aversion is stronger when we don't feel well prepared to understand what's happening.
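Loss aversion is commonly formalized with the prospect-theory value function of Kahneman and Tversky; the thesis text does not give the formula, so the sketch below is only an illustration, using the parameter estimates from Tversky and Kahneman (1992): a loss-aversion coefficient lambda of about 2.25 and a curvature alpha of about 0.88.

```python
# Illustrative prospect-theory value function (parameters from Tversky & Kahneman, 1992).
# x is a gain (x > 0) or loss (x < 0) relative to a reference point, not total wealth.

ALPHA = 0.88   # diminishing sensitivity to larger gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom about 2.25x larger

def subjective_value(x):
    """Felt value of a monetary change x around the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = subjective_value(100)    # ~57.5
loss = subjective_value(-100)   # ~-129.5
print(abs(loss) > abs(gain))    # True: losing 100 hurts more than gaining 100 pleases
```

On this curve, a loss of 100 is felt more than twice as strongly as a gain of 100, which suggests why investors who had just taken losses became so quick to dump anything risky.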
Once CDOs and MBS started losing value, both institutional and individual investors experienced increases in loss aversion and ambiguity aversion, leading them to dump their holdings of risky assets more quickly than the real economy would have warranted. This possibly accelerated the domino effect.
Categorization Bias

Categorization bias is the act of partitioning objects into general categories and ignoring the differences among members of the same category. It may produce unintended side effects if the members of the same category differ from each other in meaningful ways.
Categorization likely played a major role once the rating agencies had rubber-stamped their ratings: investors considered all AAA assets to be the same, when it clearly turned out that wasn't the case.
Similarly, AIG, one of the biggest insurers of toxic CDOs, treated the mortgage pools it insured in 2005-2006 like the ones it had insured at the beginning of the 2000s, even though the later pools contained far more subprime mortgages. This behavior proved devastating for AIG, and for the ripple effects it provoked.
Normalcy Bias

The normalcy bias refers to a mental state people enter when facing a disaster. It causes people to underestimate both the possibility of a disaster occurring and its possible effects.
Nassim Nicholas Taleb, in his book "The Black Swan: The Impact of the Highly Improbable", explains it through the psychological biases that make people, individually and collectively, blind to uncertainty and unaware of the massive role of rare events in historical affairs. According to the Black Swan theory, unpredictable events are much more common than people think: they are the norm, not the exception, and they are hard to predict from historical events.
This, coupled with the tendency toward an optimistic view, helps explain why many thought a catastrophic event was impossible. Mathematical models were also built without taking into account events that back then seemed highly unlikely, and the models' results further helped many people overlook the chances of improbable events.
Another known phenomenon is "familiarity": reassured by the notoriety of a certain financial instrument, we are nudged into buying it. We don't notice this influence, and it distorts our assessment of the risk-reward trade-off.
Maybe this can also be stretched to include the "daily familiarity" we develop with certain securities: after years of lucrative gains with CDOs and MBS, traders might have come to feel those securities were "reassuringly safe".
Anchoring (or reference point bias; Tversky & Kahneman, 1974) is the tendency to make forecasts on the basis of a starting assessment, or of a price thought to be "right": an anchor to which our mind binds and which prevents us from reacting promptly when new information surfaces.
So, for example, people holding CDOs that were rapidly depreciating might have been led to keep them because the price they were currently trading at was "far below the right one". Lehman Brothers, at the beginning of the crisis, bought another portfolio of MBS thinking it was "a great deal".
This, maybe even more importantly, is also true the other way around. People who get used to certain profits might use them as reference points. So as risk premiums declined and the same amounts invested yielded less, the "need" surfaced to leverage up in order to reach, and surpass, the reference point of the previous years (or quarter, or trade). A reference point can also be set for emulation purposes: UBS dramatically increased its exposure to subprime products because a consulting firm (McKinsey) had advised it to do so.
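The "need" to leverage up in defense of a profit reference point is simple arithmetic. The sketch below uses invented numbers (the 15% target and the yields are mine, and funding costs are ignored) just to show how compressing risk premiums mechanically push leverage higher for anyone anchored to past returns.

```python
# Invented numbers, funding costs ignored: how much leverage is needed to keep
# hitting a fixed return target as the yield on the same assets declines.

def required_leverage(target_return, asset_yield):
    """Leverage multiple L such that L * asset_yield matches target_return."""
    return target_return / asset_yield

target = 0.15  # reference point: the 15% return "earned in the good years"
for asset_yield in (0.15, 0.10, 0.05, 0.03):
    lev = required_leverage(target, asset_yield)
    print(f"yield {asset_yield:.0%} -> leverage needed: {lev:.1f}x")
```

As the yield compresses from 15% to 3%, leverage must rise from 1x to 5x just to stand still relative to the anchor; surpassing it requires even more.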
Abundance Bias

Abundance bias might kick in when there is so much at one's disposal that worry about the future shrinks to zero, and an end to the bonanza doesn't even cross people's minds.
Traders and bankers were so busy splurging that they just couldn't fathom that an end to their plush life might be in the cards.
Sociology & psychology of financial crisis: Summary
The markets were, and maybe still are, ill-prepared to cope effectively and efficiently with the less rational side of human nature. When the bubble was inflating, judgement and decisions about risk, reward and the evaluation of success became systematically compromised. The excitement of gains erased the anxiety about potential consequences, producing herd effects and competitive arousal.
When the bubble started deflating, the same psychological forces started acting in the opposite direction: anxiety and panic broke out, leading to bank runs and a collapse in demand.
And in the aftermath, the emotional pain involved in accepting responsibility stands in the way of lessons being learned.