The best analyst note published last week probably came from Michael J. Mauboussin and Dan Callahan at Credit Suisse, who looked at psychological biases that screw up investment decisions.
We all have these biases, yet most of us don't appreciate how powerful they are. They frame information so that it "feels" like we're being rational, while leading us to decisions that are completely wrong. That's why we end up making horrible decisions with our money.
You may have heard of some of them. (The power of “social conformity” is well known, for instance.) But do you know what the “disposition effect” is? Or “hyperbolic discounting”?
Here are 13 of the worst psychological biases living in your brain right now.
Pattern-seeking: Humans are worse than pigeons at optimising their behaviour when confronted with a pattern of rewards.
We overestimate our ability to exploit patterns with our decisions, Mauboussin and Callahan write. For instance, imagine a pigeon confronted with two keys, one red and one white, which deliver food at different rates: the red key provides food 80% of the time, the white key 20% of the time. What should the pigeon do? In real experiments, pigeons figure out that the red key more reliably delivers food, peck it 100% of the time, and collect the maximum 80% reward rate.
Humans also work out that the red key delivers more rewards, but they believe they can predict when each key will pay off. So they spread their picks across both keys while still favouring the red one. Since the maximum reward rate is 80%, every selection of the white key drags that rate down, and in real-life experiments the human's reward rate ends up lower than the pigeon's.
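A back-of-the-envelope calculation (my own sketch, not from the Credit Suisse note) shows why the "probability matching" strategy loses to the pigeon's simple rule:

```python
# The two-key experiment described above: the red key pays off 80%
# of the time, the white key 20% of the time.
P_RED, P_WHITE = 0.80, 0.20

# Pigeon strategy: always peck the better key.
pigeon_reward_rate = 1.0 * P_RED + 0.0 * P_WHITE   # = 0.80

# Human strategy (probability matching): pick each key roughly as
# often as it pays off -- red 80% of the time, white 20%.
human_reward_rate = 0.8 * P_RED + 0.2 * P_WHITE    # = 0.68

print(f"pigeon: {pigeon_reward_rate:.0%}, human: {human_reward_rate:.0%}")
```

Every white-key pick swaps an 80% chance of food for a 20% chance, which is how the human's overall rate falls to 68%.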
Hyperbolic discounting: We’ll take a short-term gain over a long-term guarantee.
- Credit Suisse
Mauboussin and Callahan pose this quiz: “Which would you prefer, $10 today or $11 tomorrow? How about a choice between $10 one year from now and $11 in one year and a day?”
Most people take the $10 today in the first case, but choose to wait for the $11 in the second. Mathematically, though, the two questions ought to be answered the same way: in both scenarios, waiting one extra day earns you a reward that is 10% greater.
The chart above shows that when we think about the future we’re more likely to be rational. But when confronted with immediate rewards – $10 today!!! – we’ll take less.
Confirmation bias: We seek out information we already agree with, and ignore the facts we disagree with.
Want proof that stock investor bulletin boards are filled with bad advice? Look no further than this study of Samsung investors. It turns out that people who clicked through to read message-board opinions they already agreed with lost more money than those who did not:
“An analysis of 502 investor responses from the largest message board operator in South Korea supports our conjecture. We find that investors exhibit confirmation bias when they process information from message boards. We also demonstrate that investors with stronger confirmation bias exhibit greater overconfidence. Those investors have higher expectations about their performance, trade more frequently, but obtain lower realized returns.”
Social conformity: We have an increased tendency to agree with wrong ideas if we know that everyone else agrees with them.
- Credit Suisse via Solomon Asch
The power of conformity was demonstrated in a famous 1955 experiment by the psychologist Solomon Asch. Each test subject joined an audience asked to select the longest line on a screen in front of them. Unbeknownst to the subject, all the other viewers were stooges who deliberately picked the wrong line. About 36.8% of subjects were so unnerved by the crowd's confidently wrong answers that they went along with them, even though the correct answer was right before their eyes.
“These subjects,” wrote Asch, “suffer from primary doubt.”
Anchoring: We have a tendency to stick with what we know, regardless.
Back in 2000, two academics – Andrew L. Zacharakis and Dean A. Shepherd – studied how 51 venture capitalists make decisions, rating them on their level of confidence, how much information they had, and whether they ended up being right or wrong. The results showed that the more information they got, the more confident they became – and the more often they ended up being wrong.
The truth is that although it feels as if more information should make your judgment sounder, it usually just makes the decision more complicated.
The disposition effect: Or, we’re all losers. People realise their gains too quickly and hold onto their losses for too long.
Are you more likely to sell a stock that goes up by 50% or one that goes down by 50%? The value at stake is the same in both scenarios. But a study of 10,000 brokerage accounts shows that investors lock in their winnings too soon. They will also hold on to losing stocks for too long.
People believe, seemingly rationally, that once they’ve experienced a gain they should take it off the table as a “win.” Similarly, they are reluctant to sell losing positions because, hey, maybe they’ll come back! But in both scenarios it turns out that the winners sell too early, forfeiting future gains, and the losers hold too long, extending their losses.
Loss aversion: The pain of losses is more keenly felt than the pleasure of winning, which makes you too risk-averse.
If you offer people a bet in which they can win $300 for a correct call of a coin toss but they will lose $200 if they guess wrong, most people will turn it down, Mauboussin and Callahan say. Who wants to risk losing $200 on the flip of a coin!?
But the maths says you should take the bet, and play repeatedly if you can: the expected value of each flip is (0.5 × $300) − (0.5 × $200) = $50 in your favour. Anyone who has suffered a recent loss, though, tends to become more risk-averse, because losing money hurts more than gaining the same amount feels good.
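A quick simulation (my own sketch of the coin-toss bet described above) makes the case for playing repeatedly:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# The bet from above: win $300 on a correct coin-toss call, lose $200 otherwise.
# Expected value per flip: 0.5 * 300 - 0.5 * 200 = $50.
expected_value = 0.5 * 300 - 0.5 * 200

# Play 100,000 times: the law of large numbers pulls the average
# payoff per flip towards the $50 expected value.
flips = 100_000
total = sum(300 if random.random() < 0.5 else -200 for _ in range(flips))

print(f"EV per flip: ${expected_value:.0f}, "
      f"average over {flips:,} flips: ${total / flips:.1f}")
```

A single flip can sting you for $200, which is why loss aversion makes people refuse it; over many flips, the $50-per-play edge dominates.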
This is important for stock portfolios, which tend to experience sudden short-term losses but slow long-term gains. “If you examine your portfolio often, you are more likely to see losses. Being loss averse, you will insist on higher returns to compensate for your suffering. However, if you check your portfolio infrequently, you are more likely to see gains and hence will not require returns as high,” Mauboussin and Callahan say.
The availability heuristic: We go with what we know, and assume what we don’t know is less significant.
In a classic 1973 study, Amos Tversky and Daniel Kahneman asked people to guess whether there are more English words that begin with K, or more that have K as the third letter. Words that begin with K (kangaroo, kitchen, etc) are easy to recall; thinking of words where K comes third is harder (ask, acknowledge). So people tend to guess that K-first words are more common. In reality, there are far more words with K in the third position than in the first.
Lesson: We tend to believe the information we have that comes readily to hand.
The clustering illusion: Random events do not happen evenly, so they can appear to be occurring in sync with each other when they're not.
Just because things happen in bunches does not mean that they’re causally linked, or even correlated. But it sure feels that way.
Imagine making a fruitcake, and you stir 100 raisins into the dough. Assume the raisins are perfectly randomly mixed into the cake. Once you slice the cake, some of the raisins will be bunched together and there will be empty spaces elsewhere. What definitely will not happen is that the raisins will be spaced mathematically evenly throughout the cake. That would be weird!
Yet people often interpret clustered events as though they are linked in a meaningful way. Math Lair has a good explanation of this.
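The fruitcake can be simulated in a few lines (a toy model of my own, not from the article): scatter 100 raisins uniformly at random into 100 "slices" and count how uneven the result is.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Drop 100 raisins uniformly at random into 100 cake slices.
SLICES, RAISINS = 100, 100
grid = [0] * SLICES
for _ in range(RAISINS):
    grid[random.randrange(SLICES)] += 1

empty = grid.count(0)                           # slices with no raisins
crowded = sum(1 for cell in grid if cell >= 3)  # slices with 3 or more

print(f"empty slices: {empty}, slices with 3+ raisins: {crowded}")
```

Even though placement is perfectly random, roughly a third of the slices typically end up empty while a handful get three or more raisins – clusters and gaps appear with no cause behind them at all.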
Motivated reasoning: Your gut informs your “logic.”
Once you have formed an opinion of something, your emotions will actively work to defend that opinion even when presented with facts that show it to be false.
In a study published by MIT Press, people with strong political opinions were asked to rate statements by the presidential candidates George W. Bush and John Kerry for hypocrisy, while their heads were in an fMRI machine.
The statements were rigged: both sides' statements were equally hypocritical and equally plausible. When subjects rated the candidate they agreed with, they used areas of the brain associated with logical reasoning. But when they rated the candidate they disagreed with, the emotional areas of their brains became more active – you literally process information you disagree with in a different part of your brain.
The denomination effect: You spend more money when you use small amounts of it.
If you've ever been on holiday somewhere like the Czech Republic, where the koruna runs at 38 to the pound (25 to the dollar / 27 to the euro), you'll know how easy it is to spend money quickly. So if you want to control your spending, carry large-denomination banknotes: people are reluctant to break a big note, while small notes and coins get spent faster. This effect has been demonstrated in both China and the US.
"The curse of knowledge": We are unable to see something from the point of view of a person who has less information than we do.
Investors usually trade with asymmetric, or unequal, knowledge. Investment banks often know more about a new stock than the investors they are hoping to sell it to on IPO day, for instance. So banks have to guess how much investors who know less than they do will pay for a stock. Yet banks consistently over-price deals because they cannot ignore their own expertise, which suggests the stock should be more valuable.
Colin Camerer demonstrated this effect elegantly with an experiment in which two sets of unequal traders are pitted against each other in a market, and the group with more knowledge is incentivised to correctly guess what the less-informed investment group will do. They still get it wrong.
The Edwards Quant Fallacy*: “There is a difference between having good data and applying judgment to good data.”
- Credit Suisse
In the 1990s the drug company Roche received hundreds of complaints of serious bowel problems associated with Accutane, an acne drug it sold. Roche was sued by 5,000 patients and was forced to stop selling the drug. Why did Roche ignore so many complaints? A legal ruling in the case hinted that the company became too dependent on quantitative statistical analyses of the drug’s adverse events, which systematically under-counted complaints. The analyses told Roche that the complaints were not statistically significant compared to the rates of reports in the general population. But even if the analyses were correct, that should not have overridden Roche’s curiosity about what might be wrong with the drug.
The notion that there is a difference between data and judgment was first suggested to me by Neal Soss, the chief economist at Credit Suisse, in a conversation about the influence of quants on investment banking. I later turned his proposition into the phrase, “There is a difference between having good data and applying judgment to good data,” and I’ve been trying to popularise it ever since.