01 September 2012

Five Commandments of Decision Making Under Uncertainty

In a paper presented yesterday at the Jackson Hole Economic Policy Symposium (a history of the symposium is here in PDF), Andrew Haldane and Vasileios Madouros recommend "Five Commandments" of decision making under uncertainty. The paper is titled "The Dog and the Frisbee," and in it they describe these "five commandments":
These are “Five Commandments” of decision-making under uncertainty. That description is apt. Like disease detection, frisbee catching, sports prediction and stock-picking, living a moral life is a complex task. The Ten Commandments are heuristics to help guide people through that moral maze, the ultimate simple rules. They have proven remarkably robust through the millennia. Less has been more.
The "commandments" are summarized below, based on my distillation of the text of their paper, and frequent readers of this blog are going to find much in them that is familiar:

1. "Complex environments often instead call for simple decision rules"
The simplest explanation is that collecting and processing the information necessary for complex decision-making is costly, perhaps punitively so. Fully defining future states of the world, and probability-weighting them, is beyond anyone’s cognitive limits. Even in relatively simple games, such as chess, cognitive limits are quickly breached. Chess grandmasters are unable to evaluate fully more than 5 chess moves ahead. The largest super-computers cannot fully compute much beyond 10 moves ahead (Gigerenzer (2007)).

Most real-world decision-making is far more complex than chess – more moving pieces with larger numbers of opponents evaluated many more moves ahead. Herbert Simon coined the terms “bounded rationality” and “satisficing” to explain cost-induced deviations from rational decision-making (Simon (1956)). A generation on, these are the self-same justifications being used by behavioural economists today. For both, less may be more because more information comes at too high a price.
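To get a sense of why even supercomputers stall at around ten moves, here is a back-of-the-envelope sketch of my own (not from the paper; the branching factor of roughly 30 legal moves per position is my assumption, a common rough estimate):

```python
# Back-of-the-envelope: the chess game tree explodes exponentially.
# ASSUMPTION (mine, not the paper's): ~30 legal moves per position.
BRANCHING_FACTOR = 30

for depth in (5, 10, 15):
    positions = BRANCHING_FACTOR ** depth
    print(f"{depth} moves ahead: roughly {positions:.1e} positions")

# 5 moves ahead:  roughly 2.4e+07 positions
# 10 moves ahead: roughly 5.9e+14 positions
# 15 moves ahead: roughly 1.4e+22 positions
```

Each additional move multiplies the work by the branching factor, which is why "evaluate everything" stops being an option almost immediately.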
2. "Ignorance can be bliss"
Too great a focus on information gathered from the past may retard effective decision-making about the future. Knowing too much can clog up the cognitive inbox, overload the neurological hard disk. One of the main purposes of sleep – doing less – is to unclog the cognitive inbox (Wang et al (2011)). That is why, when making a big decision, we often “sleep on it”.

“Sleeping on it” has a direct parallel in statistical theory. In econometrics, a model seeking to infer behaviour from the past, based on too short a sample, may lead to “over-fitting”. Noise is then mistaken as signal, blips parameterised as trends. A model which is “over-fitted” sways with the smallest statistical breeze. For that reason, it may yield rather fragile predictions about the future.

Experimental evidence bears this out. Take sports prediction. . .
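The over-fitting point is easy to demonstrate for yourself. Below is a minimal sketch of my own (made-up data, not from the paper): a flexible model fit to a short sample chases the noise and typically predicts worse out of sample than a simple one.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    """Draw n points from a simple true process: linear trend plus noise."""
    x = np.linspace(0, 1, n)
    return x, 2.0 * x + rng.normal(scale=0.3, size=n)

x_train, y_train = sample(10)    # a short sample: the "past"
x_test, y_test = sample(200)     # fresh data: the "future"

for degree in (1, 8):            # simple vs. flexible polynomial
    coefs = np.polyfit(x_train, y_train, degree)
    mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {degree}: out-of-sample MSE = {mse:.3f}")

# The degree-8 fit parameterises blips as trends; it typically does
# markedly worse on the fresh data than the straight line does.
```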
3. "Probabilistic weights from the past may be a fragile guide to the future"
John von Neumann and Oskar Morgenstern established that optimal decision-making involved probabilistically-weighting all possible future outcomes (von Neumann and Morgenstern (1944)). Multiple regression techniques are the statistical analogue of von Neumann-Morgenstern optimisation, with behaviour inferred by probabilistically-weighting explanatory factors.

In an uncertain environment, where statistical probabilities are unknown, however, these approaches to decision-making may no longer be suitable. Probabilistic weights from the past may be a fragile guide to the future. Weighting may be in vain. Strategies that simplify, or perhaps even ignore, statistical weights may be preferable. The simplest imaginable such scheme would be equal-weighting or “tallying”.
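Tallying is simple enough to sketch in a few lines. The example below is my own illustration (made-up data, not from the paper): five predictors with similar modest true effects, the setting in which equal weights are known to do well.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: five predictors, all with similar modest true effects.
n_train, n_test, k = 20, 1000, 5
beta = np.full(k, 0.5)

def sample(n):
    X = rng.normal(size=(n, k))          # predictors on a common scale
    return X, X @ beta + rng.normal(size=n)

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

# Estimated weights: ordinary least squares on the short sample.
w_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

# Tallying: ignore the magnitudes, keep only the signs (equal weights).
signs = np.sign([np.corrcoef(X_tr[:, j], y_tr)[0, 1] for j in range(k)])

for name, w in (("OLS weights", w_ols), ("tallying", signs)):
    r = np.corrcoef(X_te @ w, y_te)[0, 1]
    print(f"{name}: out-of-sample correlation = {r:.3f}")

# With a short sample and predictors of similar importance, the
# equal-weight rule often matches or beats the estimated weights.
```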
4.  "Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies"
The choice of optimal decision-making strategy depends importantly on the degree of uncertainty about the environment – in statistical terms, model uncertainty. A key factor determining that uncertainty is the length of the sample over which the model is estimated. Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies.

Small samples increase the sensitivity of parameter estimates. They increase the chances of inaccurately over-fitting historical data. This risk becomes more acute, the larger the parameter space being estimated. Complex models are more likely to be over-fitted. And the parametric sensitivity induced by over-fitting makes for unreliable predictions about the future. Simple models suffer fewer of these parametric excess-sensitivity problems, especially when samples are short.
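This interaction between sample size and model complexity is easy to simulate. Here is a sketch of my own (the same made-up linear process as in the over-fitting example above), sweeping the sample size: as the sample shrinks, the complex model's out-of-sample error blows up while the simple one degrades gracefully.

```python
import numpy as np

rng = np.random.default_rng(2)

def avg_oos_mse(n_train, degree, trials=200):
    """Average out-of-sample MSE of a polynomial of the given degree,
    fit on n_train points, over many simulated samples."""
    errs = []
    for _ in range(trials):
        x = rng.uniform(0, 1, size=n_train)
        y = 2.0 * x + rng.normal(scale=0.3, size=n_train)
        coefs = np.polyfit(x, y, degree)
        x_new = rng.uniform(0, 1, size=100)
        y_new = 2.0 * x_new + rng.normal(scale=0.3, size=100)
        errs.append(np.mean((np.polyval(coefs, x_new) - y_new) ** 2))
    return float(np.mean(errs))

for n in (8, 20, 100):
    simple = avg_oos_mse(n, degree=1)
    complex_ = avg_oos_mse(n, degree=5)
    print(f"n = {n:3d}: simple MSE = {simple:.2f}, complex MSE = {complex_:.2f}")
```

The gap between the two models is largest at the smallest sample sizes, which is exactly the commandment's claim.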
5.  "Complex rules may cause people to manage to the rules, for fear of falling foul of them"
There is a final, related but distinct, rationale for simple over complex rules. Complex rules may cause people to manage to the rules, for fear of falling foul of them. They may induce people to act defensively, focussing on the small print at the expense of the bigger picture.

Studies of the behaviour of doctors illustrate this pattern (Gigerenzer and Kurzenhäuser (2005)). Fearing misdiagnosis, perhaps litigation, doctors are prone to tick the boxes. That may mean over-prescribing drugs or over-submitting patients to hospital. Both are defensive actions, reducing risks to the doctor. But both are a potential health hazard to the patient. For example, submitting patients to hospital increases significantly their risk of secondary infection. Hospitals are, after all, full of sick people.

Doctors unencumbered by a complex rulebook will have fewer incentives to act defensively. They may also be better able to form their own independent judgements when diagnosing medical problems, using their accumulated experience. That ought to more closely align a doctor’s risk incentives with their patient’s. The same is likely to be true of other professions, from lawyers to policemen to bank supervisors.
A focus on simple rather than complex analyses, and on decisions based on heuristics rather than optimization, cuts against the grain of conventional wisdom across many areas, from financial regulation to environmental protection.

One important point to note is that their paper uses two conflicting definitions of "uncertainty." One definition of uncertainty is equivalent to "risk": the odds of a particular outcome from a known distribution of outcomes. If I bet $1,000 that the next roll of a die will turn up 6, I am taking a risk on an uncertain outcome. A second definition of uncertainty ("Knightian uncertainty," after Frank Knight) is equivalent to what I typically call "ignorance," following the work of John Maynard Keynes, as discussed in The Honest Broker. These two definitions are obviously not referring to the same concept, and thus are subject to confusion unless care is taken in interpretation. (I discuss uncertainty-as-ignorance at length in this recent paper in PDF.)
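The die bet makes the distinction concrete. Under uncertainty-as-risk the distribution is known and the bet can be priced exactly; the payout odds in the sketch below are my own assumption, since I did not specify them above. Under uncertainty-as-ignorance no such calculation is even possible, because the distribution itself is unknown.

```python
from fractions import Fraction

# Uncertainty-as-risk: a fair die has a known distribution,
# so the bet can be priced exactly.
p_win = Fraction(1, 6)
stake = 1000      # the $1,000 bet on rolling a 6
payout = 5000     # ASSUMPTION: 5-to-1 odds, the actuarially fair price

expected_value = p_win * payout - (1 - p_win) * stake
print(f"P(win) = {p_win}, expected value = ${float(expected_value):.2f}")
# -> P(win) = 1/6, expected value = $0.00 at fair odds.

# Uncertainty-as-ignorance: if the die's bias is unknown, there is
# no distribution to plug in, and no expected value to compute.
```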

Academics and policy makers typically prefer to focus on uncertainty-as-risk rather than uncertainty-as-ignorance, because the former lends itself to quantification and manipulation. This focus reinforces the values of academia (where physics-envy runs rampant through the social sciences) and the desire of politicians to make concrete-looking claims backed by authoritative-sounding expertise. The result can be a zone of ignorance surrounding our decisions. Not surprisingly, bad decisions can follow.

Haldane and Madouros apply their analysis to financial regulation, but the heuristics that they introduce have a much broader applicability. The paper just scratches the surface of this important topic, but it is a readable and valuable contribution. Have a look (PDF).