In December of 2008, Stephen Greenspan, a developmental psychologist, published Annals of Gullibility, the first comprehensive academic treatment of why people are gullible.
A few days later Bernie Madoff’s Ponzi scheme was revealed, a crime for which he is now serving a 150-year prison sentence.
Unrelated? Not quite. Greenspan was one of Madoff’s victims. He had a third of his retirement savings invested with Madoff.
Smart people – those who should know better, including some of the smartest people around – make horrendous mistakes. That was the case with Greenspan. It was also the case with the Nobel Prize-winning economists who ran Long Term Capital Management and with NASA’s rocket scientists responsible for launching the space shuttle Challenger despite known problems with its O-ring seals.
Each of those failures was characterized by its own unique flaw – excessive hubris, narrow-mindedness, or insufficient research. Together, though, they illustrate the potentially devastating effects of a sub-optimal decision-making process.
“When faced with critical decisions, our minds naturally want to go down one path, when a better way to think about the problem is to go down another path,” said Michael Mauboussin. Mauboussin is the Chief Investment Strategist at Legg Mason Capital Management and the author of the newly released book Think Twice.
Mauboussin’s book is dedicated to stopping you from making costly mistakes, and, speaking at a luncheon in Boston on April 8, he discussed a number of specific mistakes applicable to investors and advisors.
Taking the inside view
After convincingly winning the Kentucky Derby and the Preakness Stakes in 2008, the thoroughbred Big Brown became an overwhelming favorite to win the third jewel in racing’s Triple Crown, the Belmont Stakes. His trainer claimed it was a “foregone conclusion” that he would win the race, and he went off at 3-to-10 odds, which implied he had a 77% chance of winning.
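As a quick check of that arithmetic: odds of 3-to-10 mean a bettor risks 10 to win 3, so the implied win probability (ignoring the track's take) is 10/(10 + 3), or roughly 77%. A minimal sketch in Python, with a helper name of my own choosing:

```python
def implied_probability(win: float, stake: float) -> float:
    """Convert fractional odds (win-to-stake) into an implied win probability.

    Ignores the track take, so quoted pari-mutuel odds slightly overstate it.
    """
    return stake / (win + stake)

# Big Brown went off at 3-to-10: risk 10 to win 3.
print(f"{implied_probability(3, 10):.0%}")  # -> 77%
```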
He didn’t win. In fact, he was the first Triple Crown contender to finish dead last in the Belmont.
Those who bet on Big Brown were taking the inside view, expecting success based on his performance in his prior two races. According to Mauboussin, when taking the inside view, one focuses on the specific problem at hand, making predictions based on that narrow and unique set of inputs.
A careful analysis, though, would have revealed important clues that Big Brown should not have been as heavily favored as he was. One taking the outside view would have asked how other horses did in similar situations.
In the last 130 years, approximately 40% of potential Triple Crown winners succeeded in winning the Belmont, but since 1950 only three of 20 have succeeded – a 15% success rate, far below the 77% probability implied by those wagering on Big Brown. Moreover, none have succeeded since 1978.
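Laying the two views side by side makes the gap explicit. The figures below are simply the ones cited above; the comparison itself is an illustrative sketch:

```python
implied = 10 / (10 + 3)   # inside view: the 3-to-10 odds imply roughly a 77% chance
base_rate = 3 / 20        # outside view: post-1950 Belmont base rate of 15%

print(f"crowd-implied: {implied:.0%}  historical base rate: {base_rate:.0%}")
```

A gap of more than 60 percentage points between the crowd's price and the historical base rate is exactly the kind of discrepancy the outside view is meant to surface.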
Perhaps Big Brown was a truly exceptional horse, though, like the Jamaican runner Usain Bolt who demolished sprinting records at the 2008 Olympics. Racing handicappers have a way to assess that – the Beyer Speed Figure, which computes a horse’s speed after adjusting for track conditions, weather, time of day, and other factors.
Big Brown’s Beyer Speeds were the worst of the last seven Triple Crown contenders.
Taking the outside view – a dispassionate and objective analysis – might have steered bettors away from a costly mistake in the 2008 Belmont Stakes.
It’s easy to see the analogy with the discipline of investing. Stock prices are based on expectations, like those reflected by a horse racing parimutuel system. Understanding equity valuations, though, requires a far more nuanced analysis, based on competitive positioning, management skills, strategy, and industry trends. That analysis can only be effectively performed by taking the outside view.
“What you are really looking for is mis-pricings – differences between the tote board and the underlying valuations,” Mauboussin said.
Our natural tendency to take the inside view stems from a number of well-documented biases, according to Mauboussin. A superiority bias causes us to have undue confidence in many skill sets (e.g., 80% of people believe they are better-than-average drivers) and an optimism bias causes us to over-weight the likelihood of positive outcomes (e.g., most people believe they will have better-than-average relationships and career earnings).
The mistake of taking the inside view is often amplified by a false sense of control. A morbid example of that behavior occurred following September 11, when many chose to drive rather than fly, because they believed they were in control of their fate. It was later found that 50% more people died in automobile accidents because of increased auto travel after the terrorist attacks than died as a direct result of September 11.
Why is it so hard for us to take the outside view? “We are profoundly human, and gravitate toward stories,” Mauboussin said. He cited an experiment in which people were presented with various treatments for a hypothetical disease. The researchers found the participants chose less effective treatments when presented with unpleasant (but statistically insignificant) anecdotes regarding the more effective treatment.
Stories and anecdotes conflict with objective analysis, and cause us to take the inside view.
Failure to acknowledge the decision-making context
Mauboussin, who also teaches at the Columbia Business School, asked a group of students to write down the last four digits of their phone number, and then he asked them to estimate the number of doctors in Manhattan. The answers revealed a strong correlation – those with lower phone numbers estimated fewer doctors, and vice versa.
Along the same lines, home appraisers in Tucson were asked to appraise a house. They were also shown the listing price, and the researchers found that a higher listing price led to a higher appraisal – despite the fact that 80% of the appraisers contended that the listing price had not swayed their judgment at all.
Those examples illustrate the power of the context in which a decision is made, and how subtle changes in the decision-making environment can lead to costly errors.
“Be extraordinarily mindful of what is going on around you when you make decisions,” Mauboussin warned. “We are not as objective and rational as we believe.”
Recognize that you are creating a decision-making context for those around you. “The tone you set by your actions and words will shape those decisions,” Mauboussin said. “Some environments are conducive to great decision making and others much less so.”
Undue belief in experts
“There is a role for experts in decision-making and, in some cases, they are absolutely vital,” Mauboussin said. “That said, we pay way too much attention to people in pin-striped suits and lab coats, and to the talking heads on television.”
Experts tend to do well with rules-based problems that have limited degrees of freedom, according to Mauboussin. The movie-recommendation system Cinematch, created by Netflix, is a prime example of the power of expert analysis. By matching user preferences to movie titles, Netflix was able to get people to rent from its full inventory of movies, instead of only new releases.
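Netflix has not published Cinematch's internals, so the sketch below is only a generic illustration of the underlying idea, recommending titles that users with similar tastes rated highly; the data and function names are invented, and this is not Netflix's actual algorithm:

```python
import math

# Toy ratings: user -> {movie: rating on a 1-5 scale}. Purely illustrative data.
ratings = {
    "ann":  {"Heat": 5, "Alien": 4, "Up": 1},
    "bob":  {"Heat": 4, "Alien": 5, "Brazil": 4},
    "cara": {"Up": 5, "Brazil": 2},
}

def similarity(a, b):
    """Cosine similarity over the movies two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[m] * b[m] for m in shared)
    norm_a = math.sqrt(sum(a[m] ** 2 for m in shared))
    norm_b = math.sqrt(sum(b[m] ** 2 for m in shared))
    return dot / (norm_a * norm_b)

def recommend(user, k=1):
    """Score titles the user hasn't seen by similar users' ratings, weighted by similarity."""
    seen = ratings[user]
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        weight = similarity(seen, theirs)
        for movie, rating in theirs.items():
            if movie not in seen:
                scores[movie] = scores.get(movie, 0.0) + weight * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ann"))  # -> ['Brazil'], lifted by the similar user "bob"
```

The contrast Mauboussin draws is the point: a rules-based problem with limited degrees of freedom, like matching tastes to titles, is exactly where this kind of expert system shines.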
Experts typically fail at problems with high degrees of freedom that are probabilistic. Phil Tetlock’s 2005 book, Expert Political Judgment, surveyed nearly 300 political experts and measured the accuracy of their nearly 30,000 predictions over a 15-year period.
The experts were dismal forecasters. “Experts are notoriously bad at predicting political and economic outcomes,” Mauboussin said.
Especially disconcerting was the revelation that the more frequently a pundit was mentioned in the media, the worse their predictions were. Those on whom we are most likely to rely are the least likely to offer accurate forecasts.
The alternative is to rely on ordinary people. Jim Surowiecki, who wrote The Wisdom of Crowds in 2004, has done some of the most prominent research illustrating how average people can be better at forecasting than the so-called experts. Surowiecki worked with the electronics retailer Best Buy and found that a group of its relatively uninformed employees could more accurately predict key variables (such as holiday-season sales) than the experts Best Buy was employing.
Even though crowds have proven remarkably good at providing vital information, Surowiecki said recently that firms such as Best Buy still have fundamental misgivings when it comes to throwing problems to the crowds. Management still doesn’t trust them.
For crowd-based forecasting to work, Mauboussin said three conditions must be met: the participants must be diverse, there must be a properly functioning mechanism for aggregating their individual inputs, and there must be a system for rewarding accuracy. The most likely of those conditions to be violated, Mauboussin said, is diversity because, as humans, we are naturally social and imitative.
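A toy simulation (my own illustration, not an example from the talk) shows why the diversity condition carries so much weight: averaging cancels errors that are independent, but it cannot remove a bias the whole crowd shares.

```python
import random

random.seed(0)
TRUE_VALUE = 1_000   # the quantity the crowd is trying to estimate
N = 500              # number of participants

# Diverse crowd: every participant's error is independent.
diverse = [TRUE_VALUE + random.gauss(0, 200) for _ in range(N)]

# Imitative crowd: everyone shares one common bias, plus a little personal noise.
shared_bias = random.gauss(0, 200)
imitative = [TRUE_VALUE + shared_bias + random.gauss(0, 50) for _ in range(N)]

def crowd_error(estimates):
    """Error of the crowd's average estimate."""
    return abs(sum(estimates) / len(estimates) - TRUE_VALUE)

def typical_error(estimates):
    """Average error of an individual participant."""
    return sum(abs(e - TRUE_VALUE) for e in estimates) / len(estimates)

for name, crowd in [("diverse", diverse), ("imitative", imitative)]:
    print(f"{name:9s} crowd: average estimate off by {crowd_error(crowd):6.1f}, "
          f"typical individual off by {typical_error(crowd):6.1f}")
```

With independent errors the aggregate beats nearly every individual; when participants imitate one another, the average simply inherits the shared mistake.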
The fallibility of experts has occasionally been amplified by the power of crowds. In 1870, a German scientist studying the nutritional value of green vegetables misplaced a decimal point and overstated the iron content of spinach. In the 1920s, when the movie house Paramount decided to help Americans improve their health and get them eating from cans, it created the character Popeye, whose power was enhanced by eating spinach. The myth that spinach is exceptionally rich in iron, propagated by crowds for nearly 140 years, still prevails today.
The role of intuition
Mauboussin is not convinced that the message of Malcolm Gladwell’s book Blink, which celebrates the value of intuition, is correct. “Intuition definitely has a role in decision making, and in many cases an important one,” he said. “But I believe intuition has been vastly over-glorified in terms of its significance.”
He said that psychologists have identified two systems of decision-making. One is an experiential system, which is fast, automatic, and very difficult to train. It’s what causes you to jump when you see a snake.
The other system is analytical, and is characterized by slow, purposeful, and deliberate analysis. Unlike the experiential system, it is malleable and trainable.
In most activities, we start in the analytical system and, with enough practice and experience, our decisions slip more into the experiential realm. For example, as one learns the game of chess, each move is carefully considered. A grandmaster, however, can quickly assess a position and determine which side has the advantage and what would be the best moves.
“Relying on your intuition makes sense when the system is stable and linear, such as in board games, sports, and some military settings,” he said. If a system is unstable and non-linear, though, then all bets are off with respect to your intuition.
An unanswered question, according to Mauboussin, is whether markets are stable and linear or unstable and non-linear. The answer dictates the degree to which you should rely on intuition in your investing decisions.
Some practical advice
Mauboussin offered several suggestions to facilitate better decision-making:
- Maintain a decision journal, such as an investment journal. Record what you decide, your thought process, what you expect will happen, and how you feel physically and emotionally. Mauboussin warned that this could be a humbling and embarrassing exercise, but the benefit is that you can prevent hindsight bias – the inclination to see events that have occurred as more predictable than they in fact were before they took place.
- Create a checklist of things to consider or tasks to perform before making a decision. In fields as disparate as medicine, aviation, and investing, studies have shown that the use of checklists can improve outcomes without improving one’s skill set. “You don’t need to be smarter or better trained, just better organized,” Mauboussin said.
- Perform a pre-mortem. Before you make a decision, put yourself in the future and write down why that decision didn’t work out. Mauboussin said this technique often results in documenting 30% more issues than one otherwise would and has proven to be very powerful by the organizations that use it.
“Recognize when you need to think twice,” Mauboussin advised. The key moments are when a decision is consequential and falls within one’s decision-making danger zone. “Slow down and bring in a friend, a family member, or a colleague,” he said.
Much of the literature suggests there is a rational way to make decisions, and yet most of us operate in a sub-optimal manner, making decisions that are, for example, predictably irrational, as Dan Ariely contends. Mauboussin did not dispute this view. Instead, he said, the prevalence of sub-optimal decision-making creates the opportunity for outperformance by reducing one’s “unforced errors” in decision-making. The result? Those who heed his lessons – betting against Big Brown, for example – can take advantage of those who don’t.