Think twice: an interview with Michael J. Mauboussin
Interview by: Alistair Craven
Michael J. Mauboussin is Chief Investment Strategist at Legg Mason Capital Management. Prior to joining LMCM in 2004, Michael was a Managing Director and Chief U.S. Investment Strategist at Credit Suisse.
Michael joined CS in 1992 as a packaged food industry analyst. He is a former president of the Consumer Analyst Group of New York and was repeatedly named to Institutional Investor’s All-America Research Team and The Wall Street Journal All-Star survey in the food industry group.
Michael is the author of Think Twice: Harnessing the Power of Counterintuition (Harvard Business Press, 2009) and More Than You Know: Finding Financial Wisdom in Unconventional Places – updated and expanded (New York: Columbia Business School Publishing, 2008). More Than You Know was named one of “The 100 Best Business Books of All Time” by 800-CEO-READ, one of the best business books by BusinessWeek (2006) and best economics book by Strategy+Business (2006). He is also co-author, with Alfred Rappaport, of Expectations Investing: Reading Stock Prices for Better Returns (Harvard Business School Press, 2001).
Michael has been an adjunct professor of finance at Columbia Business School since 1993 and is on the faculty of the Heilbrunn Center for Graham and Dodd Investing. In 2009, Michael received the Dean’s Award for Teaching Excellence. BusinessWeek’s Guide to the Best Business Schools (2001) highlighted Michael as one of the school’s “Outstanding Faculty,” a distinction received by only seven professors.
AC: Prior to your current role you were Managing Director and Chief US Investment Strategist at Credit Suisse. Can you tell us about this role?
I started at Credit Suisse as an equity analyst following the packaged food companies. I loved being an analyst, and was especially interested in the investment process – including how markets work, how to value businesses, and how to assess their competitive positioning. Eventually, the firm migrated my role from working on an industry to developing thoughts on investment process.
In this sense, I was unlike a traditional strategist. Most strategists spend time figuring out targets for the S&P 500, where earnings are going, or determining sector weighting. I did none of those things. I tried to focus my work on the essential building blocks of a successful investment process.
AC: Where did the idea for the Think Twice book come from?
In the early 1990s, I started teaching a course at Columbia Business School that is ostensibly about how to be a good investor. In the early years, I focused a great deal on the mechanics of investing – how to do a proper valuation, competitive strategy analysis, financial statement analysis, and the like. But over the years, I came to the clear realization that mastering those mechanics was not sufficient to be a great investor. What separated the good from the great investors had little to do with their analytical tools but a great deal to do with how they made decisions.
So over the past decade or so I turned my attention more to the psychological parts of investing. This coincided with the rapid growth of work in behavioural economics. My basic observation was that the successful participants in all probabilistic fields, including handicappers, poker players, sports team managers, and investors, had more in common with one another than they did with the average participant in their field. So I wanted to understand how those successful people thought differently than the rest of us.
So Think Twice is a consolidation of that inquiry. The fact is there is a big difference between intelligence, as measured by intelligence quotient, and good decision making. And while Think Twice is clearly applicable to investors and businesspeople, the lessons apply to all professionals who make decisions.
AC: How difficult is it for us to change our approach to decision making?
I think the evidence shows that it is really hard. Our minds are wired a certain way, and change does not come easily. In the book, I recommend three steps to improving your decision making. The first is to recognize situations where you’re likely to make a mistake. The idea is to build a mental database of potential problem areas. If you can do this effectively, you’re already a step ahead of the game.
The second step is to recognize the problem areas in context. It turns out these situations show up in different guises. But if you are versed in them and alert to their presence, you will see them everywhere. You’ll see them in your professional life, your personal life, and when taking in the news. I have enjoyed picking out these mistakes in various realms.
The final step is to take measures to mitigate these mistakes. Each chapter ends with some concrete tips on how to deal with the mistake discussed. So you need to understand the potential problem, pick it out, and deal with it properly.
AC: You note that “no one wakes up thinking ‘I am going to make bad decisions today’, yet we all make them.” Why is this so?
When faced with certain types of situations our minds naturally take us down one path when another path is a better way to think about the problem. By the way, this observation is true no matter how intelligent you are. We all come with the same software.
One way to illustrate this point is with a question from what is known as the cognitive reflection test. Here it is: “A bat and ball together cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost?”
The first answer that pops into almost everyone’s mind is $0.10. It’s basically automatic. But, of course, that answer is wrong. The ball must cost $0.05 ($0.05 + $1.05 = $1.10). Your mind takes you in one direction and you have to stop and go in another direction to get to the proper answer.
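The algebra behind the correct answer can be checked in a few lines (a minimal sketch, not from the interview):

```python
# Bat-and-ball problem: the bat costs $1.00 more than the ball,
# and together they cost $1.10. Let b be the ball's price:
#   b + (b + 1.00) = 1.10  =>  b = (1.10 - 1.00) / 2
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive answer of $0.10 fails the check: a $0.10 ball implies a $1.10 bat, for a $1.20 total.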
So that’s one of the reasons we make poor decisions even if we don’t set out to.
AC: In the introduction you state that our brains are not wired for the process of “moving from preparation to recognition.” Can you elaborate?
When we are faced with problems, we tend to spend much more time gathering information than we do considering the nature of the problem. Indeed, research shows that decision makers typically allocate only about one-quarter of their time to thinking properly about the problem. So we just don’t recognize the problem we’re up against.
Here’s the challenge. Lots of information is falsely empowering if you don’t understand the nature of the problem. In fact, it can create a false sense of confidence. The main point is that there is a lot of leverage in conceptualizing problems well.
AC: What are the risks and problems encountered when making decisions based on incomplete information?
There are lots of decisions in life that are based on incomplete information. The risk is that the information you lack turns out to be unfavourable, making the outcome of your decision unfavourable as well.
Here are some thoughts on how to cope with this. First, it is essential to focus on the decision-making process, not on the outcome, when evaluating such decisions. This is easier said than done, because outcomes are what ultimately matter. In probabilistic situations, however, a good decision-making process can lead to a bad outcome, and a bad process can lead to a good outcome. So you must operate under the belief that good decisions will ultimately lead to good outcomes.
Let me make this a little more concrete. A friend of mine describes playing blackjack in Las Vegas, when the guy sitting next to him is dealt a 17. Now, if you know standard blackjack strategy you know that the right thing to do is to stand on 17. But this man asked for a hit. The dealer dealt him a 4, making his hand. For good measure, the dealer said, “good hit, sir.” That’s an example of a bad process and a good outcome. If you pursue that strategy over time you are sure to lose. So focus on process.
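The gap between process and outcome in that story can be made concrete with a small simulation (a sketch assuming a single draw from an effectively infinite deck, not full blackjack rules):

```python
import random

# With a hard 17, any card worth 5 or more busts the hand.
# Only A, 2, 3, or 4 (4 of the 13 ranks) avoid a bust, so
# hitting busts roughly 9/13 of the time (about 69%).
random.seed(0)
RANKS = list(range(1, 14))      # 1 = Ace (counted as 1), 11-13 = face cards

def value(rank):
    return min(rank, 10)        # face cards count as 10

trials = 100_000
busts = sum(1 for _ in range(trials)
            if 17 + value(random.choice(RANKS)) > 21)
print(f"bust rate when hitting on 17: {busts / trials:.1%}")
```

The occasional lucky 4 does not change the expected result: the process loses on average, which is exactly the point of judging decisions by process rather than outcome.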
Second, when information is incomplete it is a good idea to defer your decision as long as you can. Sometimes you just have to decide, and if so you should use whatever information you have at your disposal. But if you have some flexibility and can put off a decision until more information comes in, you can improve your chances of success.
AC: You say that the best decisions often derive from “sameness.” Can you explain what you mean by this?
Yes. We tend to think of ourselves as different from others and, by the way, superior to them. So when we’re faced with certain problems, it doesn’t occur to us to consult the vast experience of others.
This is known as the problem of relying on the inside view versus the outside view. With the inside view, when we face a problem we gather lots of information, combine it with our own input, and project. This happens for everything from trivial issues, like how long it will take to complete a term paper, to more consequential items, like the probability of success of a large corporate merger.
The outside view suggests viewing a problem as an instance of a larger reference class. That allows you to pose a simple question: when others have been in this situation, what happened? While we all face decisions that may be unique or rare for us, many others have made similar decisions in the past. There’s a database of humanity out there, waiting to be tapped.
I want to stress that the outside view is an unnatural way to think about problems, precisely because it requires that you set aside all of the information you laboured to gather. The outside view doesn’t work in all situations, but it is underutilized in decision making because we don’t want to acknowledge our “sameness.”
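As a toy illustration (the numbers below are hypothetical, not from the interview), taking the outside view amounts to replacing your own projection with statistics from a reference class of similar past cases:

```python
import statistics

# Hypothetical reference class: how long similar projects
# actually took, in weeks. The outside view asks "when others
# were in this situation, what happened?" and uses that record.
reference_class = [6, 8, 9, 10, 11, 12, 14, 15, 18, 24]

inside_view_estimate = 5   # hypothetical optimistic self-projection
outside_view_estimate = statistics.median(reference_class)

print(f"inside view:  {inside_view_estimate} weeks")
print(f"outside view: {outside_view_estimate} weeks (median of past cases)")
```

The discipline is in the substitution itself: the base rate from the reference class anchors the estimate before any case-specific adjustments are made.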
AC: What are the dangers of “tunnel vision” as you refer to it in the book?
Our minds often want to get to the right answer, and to do it quickly. And, for the most part, our minds are really good at this.
But there are some problems with coming up with answers quickly, because to do so our minds have to use rules of thumb, or heuristics. The virtue of heuristics is, of course, that they work most of the time. But heuristics also have associated biases, which can cause us to fail to consider enough alternatives.
Another challenge is that our minds are quick to anchor on natural or arbitrary figures, distorting our ability to make good decisions. For example, I asked my students at Columbia Business School to write down the last four digits of their phone number and then estimate the number of doctors in the Manhattan borough of New York City. While all of the students acknowledged that there was no connection between their numbers and the population of doctors, the phone numbers anchored them such that students with low ending phone numbers guessed a much lower number than those students with high ending numbers.
AC: How does the well-documented “information overload” syndrome relate to your ideas on decision making?
Despite the trend toward multi-tasking, the evidence shows that we can really pay attention to only a few things at a time. So what you pay attention to will shape how you decide.
The way most people cope with this is to pay attention to the ideas they already believe and to basically ignore everything else. This shows up clearly in studies of political partisans.
So when faced with a consequential decision, you want to focus on the problem at hand. Think about the problem carefully. Then consider all information – both views that support your view as well as those that don’t. Finally, you can decide.
AC: You quote a survey which found that almost half of Fortune 1000 executives questioned said that they rely on intuition to make decisions. What was your reaction to this?
I found that statistic a little depressing. While I believe intuition has a definite role in decision making, I also think it has been over-glorified. People have an uncanny ability to recall when intuition served them well, and an equally uncanny ability to forget when it failed them.
Here’s how I think about it. You can develop intuition in situations that are stable and linear. But if the activity is unstable and non-linear, all bets are off regarding the virtue of intuition. So, for instance, would chess masters have intuition? Absolutely, because the basic rules of the game don’t change despite its great complexity. But are markets, or businesses, stable and linear? Maybe in some regards, but not for the most part.
Maggie Neale, a professor who has studied decision making, distinguishes between experts and people with experience. As she points out, many people think they are experts because they have experience. But experts have models that are predictive, while people with experience have models that may not predict well at all.
AC: At the end of the book you make a fascinating point: almost everyone realizes how important decision making is, yet very few of us practice in order to improve. What do you think can be done to change this?
Well, we might start teaching some of these ideas to young people – perhaps in the last couple of years of high school and definitely to those of college age. And in so doing we should strive to make the examples and tools as relevant as possible. The sooner an individual learns these lessons, the better off they will be as they go through life.
This can also be a focal point for leaders. If an organization’s leadership demonstrates an emphasis on good decision making, including proper training and decision support tools, the chance for success rises substantially. But truth be told, most leaders perceive themselves as too busy to allocate much time to learning how to improve their decisions.
AC: Finally, are there any closing comments you wish to make?
I would want to add that this book is really a story of opportunity. There are lots of great books about decision making, but many of them leave you with the same message: there is an optimal way to decide and your decisions fall short of that ideal. You walk away feeling somewhat deficient.
Now it is true that we are not optimal. But you can think of this as an opportunity that comes in two flavours. First, you can work on reducing your own mistakes. Like a tennis player who seeks to minimize unforced errors, you will find that making fewer mistakes leads to a better long-term outcome.
Second, you can be sure that others will continue to make these mistakes, and hopefully you will not be too proud to take advantage of the situations when you see them. My hope is that an emphasis on the opportunity inherent in these lessons will encourage more people to learn about the ideas.
You can order Think Twice: Harnessing the Power of Counterintuition from Amazon.com