08 July 2011

The Social Construction of Black Swans

A "black swan" event, in the words of Nassim Nicholas Taleb, is characterized as follows:
What we call here a Black Swan (and capitalize it) is an event with the following three attributes. First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.

I stop and summarize the triplet: rarity, extreme impact, and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.
The Fukushima nuclear disaster certainly would seem to qualify.  However, a closer look at why the disaster occurred reveals that it was a black swan of our own making.  Here is an explanation of why this is so, from a recent WSJ story (thanks TB! Emphasis added.):
A design flaw at Japan's Fukushima Daiichi nuclear plant -- one that senior engineers had known about for years -- caused the cooling systems to fail at four reactors after the March 11 earthquake and tsunami, according to company records and interviews with current and former employees.

Tokyo Electric Power Co., the plant's operator, used two different designs for protecting the backup generators for its cooling systems at 10 reactors in the Fukushima region. The cooling systems at the reactors with the older design failed, causing fuel meltdowns and explosions.

The older Fukushima reactors, dating back to the 1960s and built with General Electric Co.'s Mark I design, housed their electric-switching stations in exterior buildings vulnerable to the tsunami's waves. Newer reactors, meanwhile, house these stations in their sturdy main buildings.

When the waves knocked out the switching stations at the older reactors, they rendered the backup generators useless. "Once water gets in there, the whole thing is kaput," said Katsuya Tomono, a former TEPCO executive vice president.

GE said its reactors are safe and that any design flaws are TEPCO's fault because the company was in charge of design changes. Current and former TEPCO engineers say that in retrospect, they should have done something about the flaw.
It is well understood that black swan events emerge from the murky unknown.  What is less appreciated is that the murky unknown is often a region of our own creation.


  1. My understanding is that the buildings that the generators and buses were in were supposed to be safe from a tsunami greater than the maximum historical tsunami in that region. Subsequent to the plants' construction, scientists discovered evidence of a prehistoric tsunami of a magnitude similar to that of March 11. A backfit designed with an appropriate degree of margin could have protected the generators and their electrical buses, and we never would have heard anything more than an initial report that the Fukushima reactors had suffered a loss of offsite power, but were operating on backup generators.

  2. From Stan by email:

    "I'm not sure I would call this a black swan. The flaw was known and its impact understood. The possibility of a tsunami was recognized. They simply decided (explicitly or implicitly) that the odds weren't worth making the changes. They rolled the dice and lost."

  3. Roger - I know you favor Taleb's Black Swan construct, but I've got problems with it. Biologists have long used the story and given it a practical usage that Taleb has stretched out of recognition. To biologists, the Black Swan story refers to definitions. The word 'swan' referred to a large, long-necked white water bird. When black swans were first seen, it was realized that 'white' was not essential to the definition of 'swan.'

    So for biologists, the moral of the black swan story is simply that there are categories of things you can't rule out without first making all possible observations. If you've seen 999 out of 1000 swans in the world, and they have all been white, that does not prove that all swans are white. In fact, whiteness was always a superficial attribution - the skeleton of a black swan would have been recognized by European naturalists as that of a swan immediately.

    The original error of the black swan was not that Europeans failed to properly take the known probability of finding black swans into account. The problem was that it never occurred to them that such a thing could exist. Their assumption at the time - 'All swans are white' - was a perfectly reasonable one.

    All of which is entirely irrelevant to engineering risk management. The risks inherent in complex systems are known ahead of time. The risks inherent in financial markets are known as well. A rare event with potentially catastrophic consequences has little or nothing to do with the lesson of the black swan, in which a previously unknown observation had a trivial consequence.