Making Great Decisions When It Counts


Some friends and I watched the above talk by Dan Gilbert together, on the various ways humans make logical errors in decision making.  If you are a behavioral economist or are into the psychology literature, you are probably all too familiar with the experiments on this subject, but it’s worth watching anyway.

There was some criticism of the talk in that it ignores the fact that, given limited resources for making decisions, the heuristics we humans use (i.e. the rules of thumb, like price being a good indicator of quality) serve us very well most of the time.  It’s only under specific circumstances that these heuristics lead to logical errors and bad decisions.  Thus, the talk left some people thinking that Gilbert’s point was that we’re all pretty bad decision makers and should learn to transcend these error-prone heuristics.  The critics countered that no, we’re not bad decision makers; we are in fact really good 95% of the time, and furthermore it isn’t rational to spend our time trying to do better because the cost is too steep.  We’d waste every moment of our lives figuring out what a good price is for a bottle of wine.

My interpretation is slightly different.  The import of Gilbert’s thesis is not the 95% of the time when our rules of thumb lead us to a good or reasonable decision (all things considered).  Rather, it’s the 5% (or 1%, or 0.1%) of the time when our bad decisions have a hugely negative impact.  Consider for a moment that those in positions of great power (government leaders, CEOs of large corporations, etc.) are working with the same faulty decision-making apparatus as the rest of us.  So unless there are meta-apparatuses in place for making sound decisions on, say, whether and how to spend $800 billion of taxpayer money, we can expect the logical errors that Gilbert speaks about to translate into massive losses in real dollars that could otherwise easily be avoided.

Gilbert’s example of the Homeland Security people asking him what to do about terrorism is a poignant reminder that the evolutionary legacy of our analytical minds flounders today in ways it never could have on the savannas.  And if you agree that the potential consequences of individual decisions get greater with each passing decade, then you should understand how vital it is for us to acknowledge the limitations of our analytic minds and to go to extraordinary lengths to make great decisions when it really matters.

  • Probably beyond the scope of Gilbert’s talk, but the separation of probability from value doesn’t hold in a zero-sum game.

    The rational expectation over the states of nature, taken here to be the other player’s choices, depends only on the payoffs.

    Much of modern finance theory appears to ignore this, which is probably more worrying than any of the examples Gilbert raised in this talk, his papers, or his last book.  (A quick numerical sketch of this point appears below the comments.)

  • Michael, great comments.  I agree wholeheartedly.  Perhaps, though, you could elaborate on these points a little, as they deserve explication.  And do you know of good references or schools of thought that address these?
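
To make the zero-sum point in the first comment concrete, here is a minimal sketch, not from the original discussion, using a hypothetical 2x2 zero-sum game.  At equilibrium the opponent’s mixed strategy (the “probability of the state of nature”) is chosen to make you indifferent between your own options, so it is a function of the payoffs alone; probability and value cannot be estimated independently.  The payoff numbers and the helper name column_equilibrium are illustrative assumptions, not anything Gilbert or the commenter specified.

    # Hypothetical 2x2 zero-sum game; entries are the row player's payoffs
    # (the column player receives the negation).
    #
    #                  opponent: Left   opponent: Right
    #   row: Top             3               -1
    #   row: Bottom         -2                4

    def column_equilibrium(a, b, c, d):
        """Equilibrium probability that the column player plays Left in the
        zero-sum game with row-player payoffs [[a, b], [c, d]].

        The column player mixes so that the row player is indifferent:
            q*a + (1 - q)*b = q*c + (1 - q)*d
        which gives q = (d - b) / ((a - c) + (d - b)).  Valid when the game
        has no pure-strategy saddle point.
        """
        return (d - b) / ((a - c) + (d - b))

    # The "probability of the state of nature" is determined by the payoffs:
    print(f"P(opponent plays Left) = {column_equilibrium(3, -1, -2, 4):.2f}")  # 0.50

    # Change a single payoff and the probability moves with it, which is why
    # probability and value cannot be assessed separately in a zero-sum setting.
    print(f"P(opponent plays Left) = {column_equilibrium(5, -1, -2, 4):.2f}")  # 0.42

In a one-person decision against nature, the probabilities stay put and the expected-value separation Gilbert describes applies; against a strategic opponent, the “state of nature” responds to the stakes themselves.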