
Taverner Blog

Thursday, 18 September 2008

The limits of statistics

"There are three kinds of lies: lies, damned lies, and statistics."
-Benjamin Disraeli

Statistics is a powerful and useful way of understanding the world. The advent of the modern computer, faster processing speeds, and software packages like SPSS and SAS have meant that statistics is easier than ever. Techniques and methods that were once the realm of professors and statisticians have become widespread and mainstream.

Is this a good thing? Not necessarily: as these techniques have become more and more widespread, the people who are using them know less and less about what they are doing. One of the biggest dangers is in the use of statistics where it does not belong.

Nassim Taleb argues in an essay at Edge.org that a misunderstanding of statistics helped cause the current credit crisis and the turmoil on financial markets. This is not hindsight wisdom: Taleb predicted the failure of Fannie Mae two years before it actually happened.

One industry that has been using statistics more and more, but where there is very little deep understanding of the methods, is finance. In particular, says Taleb, financial firms have been trying to calculate expected returns on thick-tailed distributions (such as power-law distributions), which is a recipe for disaster.
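To see why this goes wrong, here is a minimal sketch (my own illustration, not Taleb's). Take a Pareto distribution as a simple stand-in for a power law, and ask how much of the mean comes from outcomes below some cutoff. With a thin enough tail the running total settles down quickly; with a fat tail it keeps growing as ever-larger outcomes are included, so any finite sample of past returns understates the true average. The parameter values are chosen purely for illustration.

```python
import math

def pareto_mean_below(alpha, cap):
    # Mean contribution from outcomes between 1 and `cap`, for a Pareto
    # distribution with minimum value 1 and tail index `alpha`.
    # (This is the integral of x * alpha * x**(-alpha - 1) from 1 to cap.)
    if alpha == 1.0:
        return math.log(cap)
    return alpha * (cap ** (1.0 - alpha) - 1.0) / (1.0 - alpha)

# Thinner tail (alpha = 1.5): the truncated mean converges towards 3.
print([round(pareto_mean_below(1.5, cap), 3) for cap in (10, 1_000, 100_000)])

# Fat tail (alpha = 0.9): the "mean" grows without bound as rarer, larger
# outcomes are included -- no finite sample of history can pin it down.
print([round(pareto_mean_below(0.9, cap), 3) for cap in (10, 1_000, 100_000)])
```

A trader averaging ten years of returns from the fat-tailed case would see a tame number, because the draws that dominate the true mean simply have not happened yet.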

What this meant in practice was that financial markets did not take the effect of rare, extreme events into account. They simply assumed that such things did not need to factor into their decisions.

Consider this example: imagine you have bought a house that you are living in, and you are thinking about getting insurance in case your house burns down. You think back to last year. Your house did not burn down, and none of your friends' houses burned down. The odds are pretty small, so you decide not to get insurance. Over the next couple of years, you save several hundred dollars that you would otherwise have spent on house insurance. The risk you are taking is that you are betting against an unlikely extreme event, namely a catastrophic house fire.
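Putting hypothetical numbers on this bet makes the trap visible (none of these figures are from the post; they are invented for illustration):

```python
# Illustrative figures only: a $300,000 house, a $400 yearly premium,
# and a 1-in-500 chance of a total-loss fire in any given year.
house_value = 300_000
premium = 400
p_fire = 1 / 500

# Average yearly cost of going uninsured.
expected_annual_loss = p_fire * house_value
print(f"premium saved per year: ${premium}")
print(f"expected loss per year: ${expected_annual_loss:.0f}")
```

With these numbers the uninsured household loses even on average. But the deeper point holds regardless of the averages: in most years the bet pays off and looks clever, while a single bad draw is ruinous, which is exactly the asymmetry that tempts people to skip the insurance.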

Taleb also talks about increases in efficiency, which he says have been over-valued against the cost of increased vulnerability. When organisations become more efficient, they become more vulnerable to extreme events. A company with no backup server is more efficient than a company with one (it saves on salaries, equipment and so on), but it is more vulnerable to mishap. It's easy to see the error in these concrete examples, but harder when the mistake is buried in the abstractions of financial markets.
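The backup-server trade-off can also be put in numbers. These figures are invented for the sake of the sketch, but they show how an "efficiency saving" can be an expected loss in disguise:

```python
# Hypothetical figures: running a backup server costs $20,000 a year.
# Without one, assume a 5% yearly chance of an outage costing $1,000,000.
backup_cost = 20_000
p_outage = 0.05
outage_cost = 1_000_000

# Average yearly cost of skipping the backup.
expected_outage_cost = p_outage * outage_cost
print(f"backup cost:          ${backup_cost}")
print(f"expected outage cost: ${expected_outage_cost:.0f}")
```

In most years the company with no backup looks leaner, because the outage has not happened yet; the $20,000 saving is visible on the books, the $50,000 expected cost is not.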

The human brain does not compute statistics or probability well; these are domains prone to mistakes, misunderstandings and confusion. Computers have done wonders for modern society, but this is the downside: they have placed powerful tools in the hands of people who do not understand them.


UPDATE: An article came out in the Wall Street Journal today, saying roughly the same thing:

I called some old timers in the risk-management world to see what went wrong.

I fully expected them to tell me that the problem was that the alarms were blaring and red lights were flashing on the risk machines and greedy Wall Street bosses ignored the warnings to keep the profits flowing...

...it wasn’t quite that simple.

In fact, most Wall Street computer models radically underestimated the risk of the complex mortgage securities, they said. That is partly because the level of financial distress is “the equivalent of the 100-year flood,” in the words of Leslie Rahl, the president of Capital Market Risk Advisors, a consulting firm.
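The "100-year flood" framing itself invites a small back-of-the-envelope check (my own arithmetic, not the article's): reading it as a 1% chance per year, such an event is far from negligible over a working lifetime.

```python
# Read "100-year flood" as a 1% chance in any given year (an illustrative
# interpretation, not a figure from the article).
p_per_year = 0.01
years = 30

# Chance of seeing at least one such event over a 30-year career.
p_at_least_once = 1 - (1 - p_per_year) ** years
print(f"{p_at_least_once:.0%}")  # about 26%
```

An event a risk model treats as off the map is, on this reading, more likely than not to be merely uncommon, which is a very different thing.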


posted by Dave


© Taverner 2008