Friday, January 18, 2008

Risk versus uncertainty

Andrew Gelman pointed to this article by Nassim Taleb, which I found interesting:
The Irrelevance of "Probability"
I spent a long time believing in the centrality of probability in life and advocating that we should express everything in terms of degrees of credence, with unitary probabilities as a special case for total certainties, and null for total implausibility. Critical thinking, knowledge, beliefs, everything needed to be probabilized. Until I came to realize, twelve years ago, that I was wrong in this notion that the calculus of probability could be a guide to life and help society. Indeed, it is only in very rare circumstances that probability (by itself) is a guide to decision making. It is a clumsy academic construction, extremely artificial, and nonobservable. Probability is backed out of decisions; it is not a construct to be handled in a standalone way in real-life decision-making. It has caused harm in many fields.


Consider the following statement. "I think that this book is going to be a flop. But I would be very happy to publish it." Is the statement incoherent? Of course not: even if the book was very likely to be a flop, it may make economic sense to publish it (for someone with deep pockets and the right appetite) since one cannot ignore the small possibility of a handsome windfall, or the even smaller possibility of a huge windfall. We can easily see that when it comes to small odds, decision making no longer depends on the probability alone. It is the pair probability times payoff (or a series of payoffs), the expectation, that matters. On occasion, the potential payoff can be so vast that it dwarfs the probability — and these are usually real world situations in which probability is not computable.

Consequently, there is a difference between knowledge and action. You cannot naively rely on scientific statistical knowledge (as they define it) or what the epistemologists call "justified true belief" for non-textbook decisions. Statistically oriented modern science is typically based on Right/Wrong with a set confidence level, stripped of consequences. Would you take a headache pill if it was deemed effective at a 95% confidence level? Most certainly. But would you take the pill if it is established that it is "not lethal" at a 95% confidence level? I hope not.
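
To make this concrete, here is a quick sketch in Python with made-up numbers (mine, not Taleb's). It works through both of his examples: the book, where a tiny probability of a huge windfall dominates the expected payoff, and the pill, where the same 95% confidence level carries utterly different consequences.

```python
# Hypothetical numbers for illustration only; neither Taleb nor this
# post supplies any of these figures.

# --- The book: likely flop, small chance of a windfall ---
# outcomes: (probability, payoff in dollars)
book_outcomes = [
    (0.90, -50_000),     # flop: lose the advance and printing costs
    (0.09, 200_000),     # modest hit: a handsome windfall
    (0.01, 5_000_000),   # bestseller: a huge windfall
]

expected_payoff = sum(p * payoff for p, payoff in book_outcomes)
print(f"P(flop) = 0.90, yet E[payoff] = ${expected_payoff:,.0f}")
# The 1% tail dwarfs the 90% chance of a flop: it can pay to publish.

# --- The pill: same confidence level, very different consequences ---
# "Effective at 95% confidence" risks a lingering headache;
# "not lethal at 95% confidence" risks death.
p_fail = 0.05
cost_headache = -1          # mild downside if the pill does nothing
cost_death = -10_000_000    # catastrophic downside if it kills you

print("ineffective-pill downside:", p_fail * cost_headache)
print("maybe-lethal downside:    ", p_fail * cost_death)
# Identical probabilities, wildly different expected consequences.
```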

I would add another interpretation as well: when the probabilities of success and failure are known, we are dealing with a known risk. Yet even knowing the risk, I may not follow through on the associated action (say, within an expected utility maximization framework with risk aversion), even when the action is beneficial in expectation. This is what I think of as individual uncertainty -- I don't know how well or badly I can handle the consequences of the action. Perhaps this is just heterogeneity in risk aversion, and I am simply more risk averse than average. But regardless -- economists, following Frank Knight, treat risk as something quantifiable and uncertainty as something that is not -- at the individual level all risk is unquantifiable, so at this level, all risk is uncertainty.
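
To illustrate the heterogeneity point, here is a small sketch using a standard CRRA (constant relative risk aversion) utility function; the gamble and the risk-aversion coefficients are invented for illustration. The same known risk, with a positive expected payoff, is taken by a mildly risk-averse individual and declined by more risk-averse ones.

```python
import math

def crra_utility(wealth: float, gamma: float) -> float:
    """Constant-relative-risk-aversion utility; gamma=1 is log utility."""
    if gamma == 1.0:
        return math.log(wealth)
    return (wealth ** (1.0 - gamma) - 1.0) / (1.0 - gamma)

# A gamble with positive expected value: 50% chance of +60,000,
# 50% chance of -40,000, starting from wealth of 100,000.
wealth = 100_000.0
outcomes = [(0.5, 60_000.0), (0.5, -40_000.0)]
print("expected gain:", sum(p * x for p, x in outcomes))  # +10,000

for gamma in (0.5, 1.0, 3.0):  # increasingly risk-averse individuals
    eu_take = sum(p * crra_utility(wealth + x, gamma) for p, x in outcomes)
    eu_pass = crra_utility(wealth, gamma)
    decision = "take" if eu_take > eu_pass else "decline"
    print(f"gamma={gamma}: {decision} the gamble")
# gamma=0.5 takes the gamble; gamma=1 and gamma=3 decline it. The same
# quantified risk leads to different actions depending on the individual's
# risk aversion: the "known risk" alone does not settle the decision.
```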
