- Black Swans
- Bow Ties
- Further Information
Published in 2007, Nassim Nicholas Taleb's book The Black Swan achieved almost overnight fame (a circumstance that is, in itself, something of a black swan). He defined a black swan event as having the following attributes:
- It is unpredictable;
- It has a massive impact; and
- After the fact we develop explanations to make the event appear less random, and more predictable, than it actually was.
A black swan is not the same as bad luck. The fact that such an event can occur justifies neither a reduced effort to manage risk nor fatalism; it simply means that, no matter how good our risk management programs may be, bad events will still occur.
Taleb's book was written about financial markets, but its concepts can be applied to almost any type of system, including process facilities. For example, the three parameters listed above apply almost perfectly to the Deepwater Horizon/Macondo event:
- It was a surprise;
- Its impact on the offshore oil and gas industry was profound; and
- By now, pretty much everyone can explain what happened, thus removing the element of surprise from their thinking. We have all become experts, even Monday morning quarterbacks.
Taleb attributes our inability to anticipate black swans to our tendency to focus on those things that we know, and to pay relatively little attention to what we do not know. We cannot "think the unthinkable". We are also too prone to categorize, a failing of most incident investigations, as discussed by Dean Gano in his book Apollo Root Cause Analysis.
Taleb notes that the Japanese Nuclear Commission had, in the year 2003, set the following goal:
The mean value of acute fatality risk by radiation exposure resultant from an accident of a nuclear installation to individuals of the public, who live in the vicinity of the site boundary of the nuclear installation, should not exceed the probability of about 1/10^6 per year (that is, at most about once per million years).
The Fukushima-Daiichi nuclear power plant catastrophe occurred eight years later.
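Taleb's warning about relying on very small probabilities can be made concrete with a little arithmetic. The sketch below treats the commission's 1/10^6-per-year figure as a constant Poisson rate and asks how likely at least one such event would be over various horizons. This is an illustrative assumption only: the commission's goal concerns individual fatality risk near a site boundary, not plant failure as such, and the function name and time spans here are chosen purely for the example.

```python
import math

# Assumption for illustration: treat the stated goal, 1/10^6 per year,
# as a constant annual rate in a Poisson model of event occurrence.
RATE_PER_YEAR = 1e-6

def prob_at_least_one(rate, years):
    """Probability of at least one event over `years`, given a constant
    annual rate, under a Poisson (memoryless) model."""
    return 1.0 - math.exp(-rate * years)

# Over the eight years between the stated goal (2003) and Fukushima (2011),
# and over a nominal 40-year plant life, the model implies:
print(f"8 years:  {prob_at_least_one(RATE_PER_YEAR, 8):.1e}")
print(f"40 years: {prob_at_least_one(RATE_PER_YEAR, 40):.1e}")
```

The point of the exercise is Taleb's, not a risk calculation: with an effective track record of only about 60 reactor-years of relevant experience per plant design, there is no empirical basis for distinguishing a true 10^-6-per-year rate from one orders of magnitude larger, so the computed numbers carry far less meaning than their precision suggests.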
He goes on to state:
I spent the last two decades explaining . . . why we should not talk about small probabilities in any domain. Science cannot deal with them. It is irresponsible to talk about small probabilities and make people rely on them, except for natural systems that have been standing for 3 billion years (not manmade ones for which the probabilities are derived theoretically, such as the nuclear field for which the effective track record is only 60 years).
. . . . .