Government, Industry Can Better Manage Risks of Rare Catastrophic Events

November 16, 2012

A Stanford University engineer and risk management expert has analyzed why government and industry often wait for rare catastrophes to happen before taking risk management steps. She concluded that a different approach to these events would go far toward anticipating them, preventing them or limiting the losses.

Several potentially preventable disasters have occurred during the past decade, including the recent outbreak of rare fungal meningitis linked to steroid shots given to roughly 13,000 patients to relieve back pain. Before that, the 9/11 terrorist attacks in 2001, the loss of the Space Shuttle Columbia in 2003, the financial crisis that started in 2008, the Deepwater Horizon accident in the Gulf of Mexico in 2010, and the Fukushima tsunami and ensuing nuclear accident in 2011 were among rare and unexpected disasters that had been considered extremely unlikely or even unthinkable.

To examine the risk management failures discernible in several major catastrophes, the research draws upon the combination of systems analysis and probability used, for example, in engineering risk analysis. When relevant statistics are not available, it presents systemic risk analysis as a powerful alternative for anticipating and managing the risks of highly uncertain, rare events.

The paper by Stanford University professor Elisabeth Paté-Cornell recommends “a systematic risk analysis anchored in history and fundamental knowledge,” in contrast to the practice, seen in both industry and regulators, of waiting until after a disaster occurs to take safety measures, as was the case with the Deepwater Horizon accident in 2010.

Her paper, “On ‘Black Swans’ and ‘Perfect Storms’: Risk Analysis and Management When Statistics Are Not Enough,” appears in the November 2012 issue of Risk Analysis, published by the Society for Risk Analysis.

Paté-Cornell’s paper draws upon two commonly cited images representing different types of uncertainty, “black swans” and “perfect storms,” which are used to describe extremely unlikely but high-consequence events and often to justify inaction until after the fact. The uncertainty in “perfect storms” derives mainly from the randomness of rare but known events occurring together. The uncertainty in “black swans” stems from the limits of fundamental understanding of a phenomenon, including, in extreme cases, a complete lack of knowledge about its very existence.

Given these two extreme types of uncertainty, Paté-Cornell asks what engineering risk analysis has learned about rare events that can be carried over to other fields such as finance or medicine. She notes that risk management often requires “an in-depth analysis of the system, its functions, and the probabilities of its failure modes.” The discipline confronts uncertainties through systematic identification of failure “scenarios,” including rare ones, using “reasoned imagination,” signals (new intelligence information, medical alerts, near-misses and accident precursors) and a set of analytical tools to assess the chances of events that have not happened yet. A main emphasis of systemic risk analysis is on dependencies (of failures, human errors, etc.) and on the role of external factors, such as earthquakes and tsunamis, which can become common causes of failure.
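As a rough illustration of why such dependencies matter (a hypothetical sketch with invented numbers, not drawn from the paper), consider two redundant safety systems: assuming their failures are independent can understate the joint failure probability by an order of magnitude once a common cause such as an earthquake is modeled.

```python
# Hypothetical illustration: how a common-cause event (e.g., an earthquake)
# undermines the independence assumption for two redundant safety systems.
# All probabilities below are invented for illustration.

p_fail = 0.01    # assumed failure probability of each system on demand
p_quake = 0.001  # assumed probability of an external common-cause event

# Naive model: treat the two systems as independent.
p_both_independent = p_fail ** 2

# Common-cause model: the earthquake disables both systems at once;
# otherwise the systems fail independently.
p_both_common_cause = p_quake + (1 - p_quake) * p_fail ** 2

print(f"independence assumption: {p_both_independent:.2e}")   # 1.00e-04
print(f"with common cause:       {p_both_common_cause:.2e}")  # ~1.10e-03, about 11x larger
```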

The “risk of no risk analysis” is illustrated by the case of the 14-meter Fukushima tsunami that resulted from a magnitude 9 earthquake. Historical records showed that large tsunamis had occurred at least twice before in the same area. The first was the Sanriku earthquake of the year 869, estimated at magnitude 8.6, whose tsunami penetrated 4 kilometers inland. The second was the Sanriku earthquake of 1611, estimated at magnitude 8.1, which caused a tsunami with an estimated maximum wave height of about 20 meters. Yet those earlier events were not factored into the design of the Fukushima Dai-ichi nuclear plant, which was built for a maximum wave height of 5.7 meters, based simply on the tsunami caused in that area by the 1960 earthquake in Chile. Similar failures to capture historical data and various “signals” occurred in the cases of the 9/11 attacks, the loss of the Space Shuttle Columbia and the other examples analyzed in the paper.
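A back-of-the-envelope estimate (ours, not the paper’s) shows what even this sparse historical record implied. Treating design-basis exceedances as a Poisson process with two events in roughly 1,100 years of written records gives a far-from-negligible chance of another one during a plant’s operating lifetime.

```python
import math

# Back-of-the-envelope Poisson estimate (illustrative, not from the paper):
# how likely was a tsunami well above the 5.7 m design basis, given the two
# historical events (869 and 1611) in the written record?

events = 2                   # 869 Sanriku and 1611 Sanriku tsunamis
record_years = 2011 - 869    # ~1,142 years of historical record
plant_life = 40              # assumed operating lifetime in years

rate = events / record_years                 # ~1.8e-3 exceedances per year
p_exceed = 1 - math.exp(-rate * plant_life)  # Poisson: P(at least one event)

print(f"annual exceedance rate: {rate:.2e}")                   # ~1.75e-03
print(f"P(exceedance in {plant_life} years): {p_exceed:.1%}")  # ~6.8%
```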

The risks of truly unimaginable events that have never been seen before (such as the AIDS epidemic) cannot be assessed a priori, but careful and systematic monitoring, observation of signals and a concerted response are key to limiting the losses. Other rare events that place heavy pressure on human or technical systems result from convergences of known events (“perfect storms”) that can and should be anticipated. Their probabilities can be assessed using analytical tools that capture dependencies and dynamics in scenario analysis. Given the results of such models, there is no excuse for failing to take measures against rare but predictable events with damaging consequences, or for failing to react to signals, even imperfect ones, that something new may be unfolding.
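To make the scenario-analysis point concrete, here is a minimal sketch (with illustrative numbers of our own, not the paper’s) of how a “perfect storm” probability is built from conditional probabilities that capture dependence among known events, and how badly a naive independence assumption can understate it.

```python
# Minimal sketch of a "perfect storm" scenario probability. All numbers
# are invented for illustration; only the structure matters.

p_storm = 0.05                  # assumed annual probability of a severe storm
p_surge_given_storm = 0.30      # a major surge is far more likely given the storm
p_levee_fail_given_both = 0.20  # levee failure conditional on storm + surge

# Chain rule: P(storm AND surge AND levee failure), with dependence
# carried by the conditional probabilities.
p_scenario = p_storm * p_surge_given_storm * p_levee_fail_given_both
print(f"scenario probability with dependence: {p_scenario:.3%}")  # 0.300%

# Naive model: multiply unconditional (marginal) probabilities as if
# the three events were independent.
p_surge = 0.03       # assumed unconditional surge probability
p_levee_fail = 0.01  # assumed unconditional levee-failure probability
p_naive = p_storm * p_surge * p_levee_fail
print(f"naive independence estimate: {p_naive:.5%}")  # 0.00150%, 200x too low
```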

Source: Society for Risk Analysis
