Monday, June 14, 2010

Just How Risky Are Risky Businesses?

A post on Carl Bialik's 'Numbers Guy' blog, dated 11 June 2010, considers the role of quantified risk assessment in light of the Transocean Deepwater Horizon disaster.

Apparently "BP didn’t make a quantitative estimate of risk, instead seeing the chance of a spill as “low likelihood” bases on prior events in the gulf." Other industries such as aviation and nuclear tend to use more quantitative assessments, and clearly the question should be whether BP should have done this.

Jon Pack, described as a 'spokesman', is quoted as saying "If you look at the history of drilling in the Gulf, and elsewhere, blowouts are very low-likelihood, but obviously it's a high impact, and that's what you plan for." He added that industry "will need to take a second look at measures put in place to prevent hazards," but said this would likely focus on changing processes rather than on calculating risk.

Barry Franklin, a director in Towers Watson's corporate risk management practice, is quoted as saying "My recommendation to companies faced with low-probability and high-severity events would be to worry less about quantifying the probability of those events and focus on developing business continuity and disaster recovery plans that can minimize or contain the damage."

The post includes quite a bit about human error. Some sections are summarised below.

By observing people at work, Scott Shappell, professor of industrial engineering at Clemson University, has estimated that 60% of problems caused by human error involve skill failures, such as lapses of attention or memory, while 35% involve decision errors: poor choices based on bad information, incomplete knowledge or insufficient experience.

NASA has used similar techniques for decades. Among the biggest components of shuttle risk, according to Robert Doremus, manager of the NASA shuttle program’s safety and mission assurance office, are orbital debris — which has a one-in-300 chance of leading to disaster — main-engine problems (one in 650) and debris on ascent (one in 840), which felled Columbia. Human error is also a factor: There’s a 1 in 770 chance that human error in general will cause a disaster, and a 1 in 1,200 chance of crew error on entry.
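Out of curiosity, the quoted components can be combined into a rough overall figure. The Python sketch below assumes (the article does not say this) that the components are independent and non-overlapping, so it should be read as an illustration of how component risks aggregate rather than as NASA's own arithmetic.

# Per-mission loss probabilities quoted above.
components = {
    "orbital debris": 1 / 300,
    "main-engine problems": 1 / 650,
    "debris on ascent": 1 / 840,
    "human error (general)": 1 / 770,
    "crew error on entry": 1 / 1200,
}

# Chance the mission survives every component, assuming independence.
p_survive = 1.0
for p in components.values():
    p_survive *= 1 - p

p_loss = 1 - p_survive
print(f"combined loss probability: 1 in {1 / p_loss:.0f}")

This comes out at roughly 1 in 120 per mission from these components alone; since they are only the biggest contributors, a full assessment would give a somewhat higher figure.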

Human error adds to the imprecision. "Human reliability analysis is a challenge, because you could have widespread variability," said Donald Dube, a senior technical advisor who works on risk assessment for the US Nuclear Regulatory Commission (NRC). "But it is founded on real data."
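To see why that variability matters, here is a small Monte Carlo sketch (mine, not the NRC's, and with hypothetical numbers). If the human error probability per safety-critical task is only known to within an order of magnitude, the mission-level risk estimate it feeds into spreads enormously:

import random

random.seed(1)

def sampled_mission_risk(n_tasks=50):
    # Draw a human error probability (HEP) uniformly on a log scale,
    # i.e. somewhere between 1e-4 and 1e-2, then compute the chance
    # that at least one of n_tasks safety-critical actions fails.
    hep = 10 ** random.uniform(-4, -2)
    return 1 - (1 - hep) ** n_tasks

risks = sorted(sampled_mission_risk() for _ in range(10_000))
print(f"5th percentile:  {risks[500]:.4f}")   # approximate percentiles
print(f"median:          {risks[5000]:.4f}")
print(f"95th percentile: {risks[9500]:.4f}")

A factor-of-a-hundred uncertainty in the input becomes roughly a factor-of-fifty spread in the output, which is exactly the "widespread variability" Dube describes; the "real data" he mentions is what narrows that input range.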

In nuclear power plants that have been operating for some time, human errors are the most common ones, said Paul Barringer, a consulting engineer and president of Barringer & Associates Inc. “People are at the root” of many risks, Barringer said.

Doug Wiegmann, associate professor of industrial and systems engineering at the University of Wisconsin-Madison, has studied human error in cockpits, operating rooms and other contexts. "The general human-factors issues are the same whether you're in a cockpit or anywhere else": communications, technology design and checklists chief among them.

3 comments:

Jacku said...

Very interesting post, especially the part about human errors. I can understand that a lot of human errors are made due to lack of time to gather the information needed to make a good decision, but that 60% involve skill failures ... Is this because people are badly trained or because too much information is being processed?

Human factors in risk management said...

It is not entirely clear to me what was meant by 'skill failure'. I think it may be suggesting that most errors occur when people are doing tasks they are familiar with, or skilled in. Most of those errors do not have significant consequences because they are recovered. However, a key problem arises when people continue to do what they normally do without realising that something is different.

Unknown said...

Andy... I saw that the original article mentions risk estimation. I realize one of the biggest differences between UK and US regulations is the use of risk-based regulation.

However, the biggest barrier to applying risk-based principles is the uncertainty involved.

http://risk-safety.com/quantitative-risk-assessment-will-quantifying-risks-help-you-minimize-them/