How 'black swans' and 'perfect storms' become lame excuses for bad risk management

November 15, 2012
Stanford School of Engineering
Instead of reflecting on the unlikelihood of rare catastrophes after the fact, Stanford risk analysis expert Elisabeth Paté-Cornell prescribes an engineering approach to anticipate them when possible, and to manage them when not.

The terms "black swan" and "perfect storm" have become part of public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of September 11. But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.

Her research, published in the November issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies from engineering to make better management decisions, even in the case of once-in-a-blue-moon events where statistics are scant, unreliable or non-existent.

Paté-Cornell argues that a true "black swan" -- an event that is impossible to imagine because we've known nothing like it in the past -- is extremely rare. The AIDS virus is one of very few examples. But usually, there are important clues and warning signs of emerging hazards (e.g., a new flu virus) that can be monitored to guide quick risk management responses.

Similarly, she argues that the risk of a "perfect storm," in which multiple forces join to create a disaster greater than the sum of its parts, can be assessed systematically before the event: even though such conjunctions are rare, the events that compose them -- and the dependencies among them -- have been observed in the past.

"Risk analysis is not about predicting anything before it happens; it's just giving the probability of various scenarios," she said. She argues that systematically exploring those scenarios can help companies and regulators make smarter decisions before an event in the face of uncertainty.

Think like an engineer

An engineering risk analyst thinks in terms of systems, their functional components and their dependencies, Paté-Cornell said. In a power plant that requires cooling, for instance, generators, turbines, water pumps, safety valves and more all contribute to making the system work. The analyst must therefore first understand how the system works as a whole in order to identify how it could fail. The same method applies to medical, financial or ecological systems.

Paté-Cornell stresses the importance of accounting for dependent events, whose probabilities are intertwined, when building the complete list of scenarios a risk analysis must cover. It is therefore essential, she said, that an engineering risk analysis include external factors that can affect the whole system.

In the case of a nuclear plant, the seismic activity or the potential for tsunamis in the area must be part of the equation, particularly if local earthquakes have historically led to tidal waves and destructive flooding. Paté-Cornell explained that the designers of the Fukushima Daiichi Nuclear Power Plant ignored important historical precedents, including two earthquakes in 869 and 1611 that generated waves similar to those witnessed in March of 2011.

What some described as a "perfect storm" of compounding mishaps, Paté-Cornell sees as a failure to assess basic failure probabilities based on experience and elementary logic.
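The point about conjunctions of dependent events can be sketched with a toy calculation. Every number below is invented for illustration; the structure, not the values, is what matters: assuming independence between an earthquake, a tsunami and a flood-defense failure drastically understates the joint risk when, historically, one tends to trigger the next.

```python
# Hypothetical illustration of a "perfect storm" probability.
# All probabilities below are invented for the sketch.

p_e = 0.01            # P(large earthquake in a given year)
p_t = 0.004           # P(tsunami), marginal
p_f = 0.002           # P(flood-defense failure), marginal

p_t_given_e = 0.30    # P(tsunami | earthquake) -- strongly dependent
p_f_given_t = 0.50    # P(defense failure | tsunami)

# Naive independence assumption multiplies the marginals:
p_naive = p_e * p_t * p_f

# Chain rule using the observed dependencies:
p_dependent = p_e * p_t_given_e * p_f_given_t

print(f"independent: {p_naive:.2e}")    # independent: 8.00e-08
print(f"dependent:   {p_dependent:.2e}")  # dependent:   1.50e-03
```

Here the dependence-aware estimate is more than four orders of magnitude larger than the independence assumption, which is exactly why the conjunction can be "rare" yet still worth planning for.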

A versatile framework

Engineering risk analyses can get complex, but their components are concrete objects whose mechanisms are usually well understood. Paté-Cornell says that this systematic approach is relevant to human aspects of risk analysis.

"Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error. I do not believe this is true," she said.

In fact, Paté-Cornell and her colleagues have long incorporated "soft" elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system, and factor in any available information about past behaviors, training and skills. Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed.

"We look at how the management has trained, informed, and given incentives to people to do what they do and assign risk based on those assessments," she said.
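A minimal sketch of how such "soft" factors can enter a quantitative model, with all numbers and category names invented for illustration: the probability of operator error is conditioned on a management-influenced factor (here, training level), and system failure is then conditioned on whether an error occurs.

```python
# Hypothetical sketch: folding human error into a system risk model.
# Training levels and all probabilities are invented assumptions.

# P(operator error | training level) -- shaped by management decisions
p_error = {"well_trained": 0.001, "poorly_trained": 0.02}

# P(system failure | error) and P(system failure | no error)
p_fail_given_error = 0.5
p_fail_given_ok = 0.0001

def p_system_failure(training: str) -> float:
    """Total probability of system failure for a given training level."""
    pe = p_error[training]
    return pe * p_fail_given_error + (1 - pe) * p_fail_given_ok

print(f"{p_system_failure('well_trained'):.4f}")    # 0.0006
print(f"{p_system_failure('poorly_trained'):.4f}")  # 0.0101
```

The toy model makes the article's claim concrete: changing a management variable (training) shifts the failure probability by an order of magnitude, so human error is modeled, not treated as noise.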

Paté-Cornell has successfully applied this approach to the field of finance, estimating the probability that an insurance company would fail given its age and its size. She said the companies contacted her and funded the research because they needed forward-looking models that their financial analysts generally did not provide.

Traditional financial analysis, she said, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures -- like the financial crisis that began in 2008 -- by recognizing precursors and warning signs, and factoring them into a systemic probabilistic analysis.

Medical specialists must also make decisions in the face of limited statistical data, and Paté-Cornell says the same approach is useful for calculating patient risk. She used systems analysis to assess data about anesthesia accidents -- a case in which human mistakes can create an accident chain that, if not recognized quickly, puts the patient's life in danger. Based on her results, she suggested retraining and recertification procedures for anesthesiologists to make their system safer.

Professor Paté-Cornell believes that the financial and medical sectors are just two of many fields that might benefit from systems analysis in uncertain, dynamic situations. "Lots of people don't like probability because they don't understand it," she said, "and they think if they don't have hard statistics, they cannot do a risk analysis. In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system."

She hopes that her probabilistic approach can replace the notions of black swans and perfect storms, making the public safer and better informed about risks. Apparently, others have this same hope.

"It must have struck a chord," she said, "because I already get lots of comments, responses and ideas on the subject from people around the world."

Story Source:

The above story is based on materials provided by Stanford School of Engineering. The original article was written by Kelly Servick, science-writing intern at the Stanford University School of Engineering.

Journal Reference:

  1. Elisabeth Paté-Cornell. On “Black Swans” and “Perfect Storms”: Risk Analysis and Management When Statistics Are Not Enough. Risk Analysis, 2012; DOI: 10.1111/j.1539-6924.2011.01787.x
