01 September 2012

Five Commandments of Decision Making Under Uncertainty

In a paper presented yesterday at the Jackson Hole Economic Policy Symposium (a history of the symposium is here in PDF), Andrew Haldane and Vasileios Madouros recommend "Five Commandments" of decision making under uncertainty. The paper is titled "The Dog and the Frisbee" and in it they describe these "five commandments":
These are “Five Commandments” of decision-making under uncertainty. That description is apt. Like disease detection, frisbee catching, sports prediction and stock-picking, living a moral life is a complex task. The Ten Commandments are heuristics to help guide people through that moral maze, the ultimate simple rules. They have proven remarkably robust through the millennia. Less has been more.
The "commandments" are summarized below, based on my distillation of the text of their paper, and frequent readers of this blog are going to find much in them that is familiar:

1. "Complex environments often instead call for simple decision rules"
The simplest explanation is that collecting and processing the information necessary for complex decision-making is costly, perhaps punitively so. Fully defining future states of the world, and probability-weighting them, is beyond anyone’s cognitive limits. Even in relatively simple games, such as chess, cognitive limits are quickly breached. Chess grandmasters are unable to evaluate fully more than 5 chess moves ahead. The largest super-computers cannot fully compute much beyond 10 moves ahead (Gigerenzer (2007)).

Most real-world decision-making is far more complex than chess – more moving pieces with larger numbers of opponents evaluated many more moves ahead. Simon coined the terms “bounded rationality” and “satisficing” to explain cost-induced deviations from rational decision-making (Simon (1956)). A generation on, these are the self-same justifications being used by behavioural economists today. For both, less may be more because more information comes at too high a price.
2. "Ignorance can be bliss"
Too great a focus on information gathered from the past may retard effective decision-making about the future. Knowing too much can clog up the cognitive inbox, overload the neurological hard disk. One of the main purposes of sleep – doing less – is to unclog the cognitive inbox (Wang et al (2011)). That is why, when making a big decision, we often “sleep on it”.

“Sleeping on it” has a direct parallel in statistical theory. In econometrics, a model seeking to infer behaviour from the past, based on too short a sample, may lead to “over-fitting”. Noise is then mistaken as signal, blips parameterised as trends. A model which is “over-fitted” sways with the smallest statistical breeze. For that reason, it may yield rather fragile predictions about the future.

Experimental evidence bears this out. Take sports prediction...
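To make the over-fitting point concrete, here is a minimal simulation sketch (my own illustration, not from their paper). It draws a short sample from a made-up noisy linear process, fits a simple and a complex model, and scores both on fresh data:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    """Draw n points from a hypothetical process: a gentle trend plus noise."""
    x = np.linspace(0.0, 1.0, n)
    y = 2.0 * x + rng.normal(scale=0.5, size=n)
    return x, y

x_train, y_train = sample(12)   # a short historical sample
x_test, y_test = sample(200)    # fresh draws from the same process

for degree in (1, 9):           # simple model vs. complex model
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_out = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: in-sample MSE {mse_in:.3f}, "
          f"out-of-sample MSE {mse_out:.3f}")
```

The degree-9 polynomial hugs the twelve training points, so its in-sample error is tiny, but having parameterised the blips as trends it predicts the fresh data worse than the straight line does.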
3. "Probabilistic weights from the past may be a fragile guide to the future"
John von Neumann and Oskar Morgenstern established that optimal decision-making involved probabilistically-weighting all possible future outcomes (von Neumann and Morgenstern (1944)). Multiple regression techniques are the statistical analogue of von Neumann-Morgenstern optimisation, with behaviour inferred by probabilistically-weighting explanatory factors.

In an uncertain environment, where statistical probabilities are unknown, however, these approaches to decision-making may no longer be suitable. Probabilistic weights from the past may be a fragile guide to the future. Weighting may be in vain. Strategies that simplify, or perhaps even ignore, statistical weights may be preferable. The simplest imaginable such scheme would be equal-weighting or “tallying”.
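For the curious, here is a toy illustration of my own devising (a stylized setup, not from the paper) comparing estimated regression weights against unit-weight "tallying" when the estimation sample is short:

```python
import numpy as np

rng = np.random.default_rng(1)

beta_true = np.array([1.0, 0.8, 1.2])   # modest, roughly comparable true effects

def sample(n):
    X = rng.normal(size=(n, 3))
    y = X @ beta_true + rng.normal(scale=2.0, size=n)
    return X, y

X_tr, y_tr = sample(15)      # a short past to estimate from
X_te, y_te = sample(1000)    # the "future"

# Regression: probabilistically-weighted factors estimated from the past.
w_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

# Tallying: keep only the sign of each predictor's relationship with the
# outcome, and weight all predictors equally.
w_tally = np.array([np.sign(np.corrcoef(X_tr[:, j], y_tr)[0, 1])
                    for j in range(3)])

for name, w in (("estimated weights", w_ols), ("tallying", w_tally)):
    mse = np.mean((X_te @ w - y_te) ** 2)
    print(f"{name}: out-of-sample MSE {mse:.2f}")
```

The setup is deliberately friendly to tallying (the true weights are all close to one), but the broader finding, going back at least to Dawes's work on "improper linear models," is that equal weights are competitive whenever samples are short and the predictors all point the right way.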
4. "Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies"
The choice of optimal decision-making strategy depends importantly on the degree of uncertainty about the environment – in statistical terms, model uncertainty. A key factor determining that uncertainty is the length of the sample over which the model is estimated. Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies.

Small samples increase the sensitivity of parameter estimates. They increase the chances of inaccurately over-fitting historical data. This risk becomes more acute, the larger the parameter space being estimated. Complex models are more likely to be over-fitted. And the parametric sensitivity induced by over-fitting makes for unreliable predictions about the future. Simple models suffer fewer of these parametric excess-sensitivity problems, especially when samples are short.
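As a rough check on this claim, one can vary the sample length in a simulation (again a sketch of my own, with a made-up data-generating process) and watch the simple model's advantage shrink as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(2)

def avg_out_of_sample_mse(n_train, degree, trials=500):
    """Average prediction error of a polynomial of the given degree,
    refit on a fresh sample of n_train points each trial."""
    errors = []
    for _ in range(trials):
        x = rng.uniform(0.0, 1.0, n_train)
        y = 2.0 * x + rng.normal(scale=0.5, size=n_train)
        coeffs = np.polyfit(x, y, degree)
        x_new = rng.uniform(0.0, 1.0, 500)
        y_new = 2.0 * x_new + rng.normal(scale=0.5, size=500)
        errors.append(np.mean((np.polyval(coeffs, x_new) - y_new) ** 2))
    return float(np.mean(errors))

for n in (8, 20, 200):
    simple = avg_out_of_sample_mse(n, degree=1)
    complex_fit = avg_out_of_sample_mse(n, degree=5)
    print(f"n={n:3d}  simple: {simple:.3f}  complex: {complex_fit:.3f}")
```

With eight observations the degree-5 model is badly over-fitted; by two hundred the two are nearly indistinguishable. Other things equal, the shorter the sample, the better the simple rule looks.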
5. "Complex rules may cause people to manage to the rules, for fear of falling foul of them"
There is a final, related but distinct, rationale for simple over complex rules. Complex rules may cause people to manage to the rules, for fear of falling foul of them. They may induce people to act defensively, focussing on the small print at the expense of the bigger picture.

Studies of the behaviour of doctors illustrate this pattern (Gigerenzer and Kurzenhäuser (2005)). Fearing misdiagnosis, perhaps litigation, doctors are prone to tick the boxes. That may mean over-prescribing drugs or over-submitting patients to hospital. Both are defensive actions, reducing risks to the doctor. But both are a potential health hazard to the patient. For example, submitting patients to hospital increases significantly their risk of secondary infection. Hospitals are, after all, full of sick people.

Doctors unencumbered by a complex rulebook will have fewer incentives to act defensively. They may also be better able to form their own independent judgements when diagnosing medical problems, using their accumulated experience. That ought to more closely align a doctor’s risk incentives with their patient’s. The same is likely to be true of other professions, from lawyers to policemen to bank supervisors.
A focus on simple rather than complex analyses, and on decisions based on heuristics rather than optimization, runs against the grain of conventional wisdom across many areas, from financial regulation to environmental protection.

One important point to note is that their paper uses two conflicting definitions of "uncertainty." One definition of uncertainty is equivalent to "risk," or the odds of a particular outcome from a known distribution of outcomes. If I bet $1,000 that the next roll of a die will turn up 6, I am taking a risk on an uncertain outcome: the outcome is unknown, but the odds (1 in 6) are not. A second definition of uncertainty ("Knightian uncertainty") is equivalent to what I typically call "ignorance," following from the work of John Maynard Keynes, as discussed in The Honest Broker. These two definitions are obviously not referring to the same concept, and thus are subject to confusion unless care is taken in their interpretation. (I discuss uncertainty-as-ignorance at length in this recent paper in PDF.)

Academics and policy makers typically like to focus on uncertainty-as-risk rather than uncertainty-as-ignorance as the former is more readily subject to easy quantification and manipulation. This focus reinforces the values of academia (where physics-envy runs rampant through the social sciences) and the desire of politicians to make concrete-looking claims backed by authoritative-sounding expertise. The result can be to create a zone of ignorance surrounding our decisions. Not surprisingly, bad decisions can result.

Haldane and Madouros apply their analysis to financial regulation, but the heuristics that they introduce have a much broader applicability. The entire paper just scratches the surface of this important topic, but it is a readable and valuable contribution. Have a look (PDF).

14 comments:

Les Johnson said...

Roger: Rule 5 is my bugaboo. In my business, every time someone has an incident, anywhere in the world, someone designs a new form that needs to be checked off.

We actually have check lists to track the check lists we need to fill in.

Roger Pielke, Jr. said...

-1-Les Johnson

Checklists have their place, see:

http://www.johnkay.com/2012/08/29/how-i-learnt-the-power-of-checklists

Simplicity in excess is a problem too. Striking the right balance is where the action is, and easier said than done. Thanks!

Les Johnson said...

Roger: I agree that checklists are essential. I am more than happy that the pilot of my 777 needs to check that we have enough fuel to cross the Atlantic, and to see if all the engines are still on the wing.

I agree with John Kay on needing check lists, on keeping them focused on the important, and on eliminating the elementary. His example of not needing to put a suitcase on a travel checklist is a good one.

But our business confirms what Commandment 5 states, in that managers now manage rules, and government regulators do the same. So when the government manages the rules, managers then create new check lists (by creating check lists, they are managing the rules). Eventually, some poor sod at the bottom of the check lists forgets one, or pencil-whips it. He is then hung out to dry, and government and management create more check lists.

I am tempted to buy 20 copies of Atul Gawande's book and send them to all upper management.

Just before retirement, of course....

Unknown said...

-1 & -2 re: rule 5: More often than not these check lists end up as scripts that shouldn't be deviated from. That's when a check list goes bad.

From my world of software testing http://www.kaner.com/pdfs/ValueOfChecklists.pdf

stan said...

Roger,

Academics' need for quantification is a huge problem. It was a big part of the financial crisis. They have all these wonderful, sophisticated statistical tools and their impressive math skills. If only they could reduce everything to a number, they could make such wonderful music.

So they make these simple assumptions, construct the most amazing architectural wonders, and convince themselves that they have created a special insight into truth. If only those little bitty assumptions were real ...

MattL said...

In large part, it sounds like they've effectively applied Hayek's knowledge and cooperation problems to modern decision making techniques and identified some of the consequences of trying to defeat them.

Joshua said...

Not sure if it is subsumed by the five rules, but one problem I see with decision-making under uncertainty is a phenomenon where, in an attempt to reduce uncertainty through "objective" data, people have a tendency to overvalue some sort of numerical instrument as evidence.

Problem is, sometimes the numerical instrument isn't really valid in the sense of actually measuring what it is intended to measure. The best example I can think of is using standardized tests as a way to assess applicants for college or for jobs, even though the relationship between test results and school and/or job performance is not particularly strong, and even though less "objective" measures may actually be more useful.

So, don't use invalid measures and don't fall in love with numbers.

dhlii said...

Your synopsis seems openly hostile to the value of past information.

We live in a complex world, and past correlations are not the same as destiny; we should not treat them as rules of certainty. At the same time, ignoring solutions that have worked in the past, or adopting solutions that have failed, without careful consideration of what is different, is lunacy.

Roddy said...

dhlii, I don't think that's fair, 'openly hostile'.

Since Haldane is talking about financial regulation ....

The mortgage collapse was based, crudely, on past information indicating that it was all fine, hence ultimately collective reckless behaviour unobserved by ratings agencies, regulators, or Mr Haldane at the Bank of England for that matter. What else did the ratings agencies et al use but past information in forming their rating judgements?



Jonathan Gilligan said...

Many years ago Gilbert White, Paul Slovic, and Howard Kunreuther wrote a fine paper, "Decision Processes, Rationality, and Adjustment to Natural Hazards," which found that, contrary to the assertions of this "dogs and frisbees" paper, simple heuristics by people charged with managing natural hazards led to very poor decision making.

Simple heuristics in the face of uncertainty (uncertainty from ignorance as well as randomness) tended to lead risk managers to do such things as give excessive attention to the recent past when planning for the future and insufficient attention to the full historical record.

Another example of a simple heuristic: "Kates observed that three structures in different sites were elevated by 0.3 meter (1 foot) despite a wide variation in hazard among the sites. One foot is a convenient number and these decisions suggest that a crude approximation rule was used to determine the elevation changes. ... One also wonders about the depth of analysis that led to the selection of the 100-year flood as a standard criterion in the design of flood-protection structures."

We might learn from Einstein's advice to make things as simple as possible, but no simpler. As Roger says in Comment #2, finding that balance is easier said than done.

Finally, regarding rule #2:

Calvin: The more you know, the harder it is to take decisive action.

Once you are informed, you start seeing complexities and shades of gray.

You realize nothing is as clear as it first appears. Ultimately, knowledge is paralyzing.

Being a man of action, I cannot afford to take that risk.

Hobbes: You're ignorant, but at least you act on it.

Harrywr2 said...

#10 Jonathan Gilligan

Another example of a simple heuristic: "Kates observed that three structures in different sites were elevated by 0.3 meter (1 foot) despite a wide variation in hazard among the sites."

From a residential building code explaining when chemically treated wood must be used.
http://www.irccdd.com/building_division/R319.pdf

Wood joists or the bottom of a wood structural floor when closer than 18 inches (457 mm) or wood girders when closer than 12 inches (305 mm) to the exposed ground in crawl spaces or unexcavated area located within the periphery of the building foundation.

Building 12" above grade in the US is a fairly common defense against termites. Some jurisdictions only require 6".

Termite damage runs about $5 billion per year in the US.




eo said...

Decision making is a process. Before I retired I had to do a lot of decision making under uncertainty. The golden rule is: "The correctness of a decision made under uncertainty is uncertain." The first commandment is: don't get emotional or dogmatic about a decision made under uncertainty. It could be wrong. Second commandment: keep an open mind, listen to critics and other views, because they might be right, and be prepared to modify, make a complete U-turn on, or even abandon a decision as more facts become available and uncertainty is reduced.
Third commandment: keep enough in reserve so that if the decision has to be modified or altered, there are enough resources to work with. Fourth commandment: hedge, or take out insurance, so that if the decision turns out to be a complete failure, you can start again. Fifth commandment: execute the decision with political will and with confidence that the correct decision has been made. A correct decision can still fail for lack of confidence.

Duncan Stuart said...

Love Joshua's point. If in doubt, get a precise-looking metric... seems to be the rule of many managers who use KPIs and metrics to disguise complexity. My hero John Tukey, a giant among statisticians, was often quoted as saying something like: better to be approximately right than precisely wrong. I go crazy when clients sweat over decimal points when they're missing the big picture.

Peter Lang said...

A point I take from Andrew Haldane’s paper is that it’s easy to see why the World Economic Forum’s “Global Risks 2012” views systemic collapse of the banking sector as one of the greatest risks facing the world. http://www.weforum.org/reports/global-risks-2012-seventh-edition

It’s also easy to see that ever increasing regulation is a major concern. It can actually cause the crash.

My conclusion: let’s not increase the risk by imposing high risk policies like carbon pricing.

Each time the issue of carbon pricing comes up I recall something I was told long ago: “There are two fundamental inputs to everything mankind has. They are energy and human ingenuity.” (A third has been added recently.) To impose a carbon price, and risk subjecting energy prices to the increased political interference and volatility that is so apparent in the EU carbon price, is a high-risk strategy.
