Tuesday, July 20, 2010
Stumbled across this website from Australian company Mapwright. I don't know anything about the company, but the site gives some great advice for generating simple and effective quality systems. It also reminds us of some of the bad things done in the name of 'quality' over recent years.
Friday, July 16, 2010
The Nature of Human Error
Taken from The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries
Reason says that error is generally considered to be some form of deviation. These can be:
* From the upright (trip or stumble);
* From current intention (slip or lapse);
* From an appropriate route towards some goal (mistake);
* Straying from the path of righteousness (sin).
A variety of classifications and taxonomies have been developed over the years to explain error. These fall into four basic categories (a small illustrative sketch follows the list):
1. Intention - Was there an intention before the action, was the intention the right one, and were the actions performed correct for achieving that intention?
2. Action - Were there omissions, unwanted or unintended actions, repetitions, actions on the wrong object, misorderings, mistimings or merging of actions?
3. Context - Deviating because you are anticipating what is coming or echoing the past, because you have been primed to follow a pattern that does not fit the current circumstances, or because you are disrupted, distracted or stressed.
4. Outcome - Looking at errors according to their outcome, which can range from inconsequential 'free lessons', through exceedances when working near the edge of safe limits, to incidents and accidents.
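Purely as an illustration of how these four viewpoints might be used when recording errors, here is a small Python sketch of my own; the names and fields are assumptions, not something taken from Reason's book:

# Illustrative tagging scheme only - my own sketch, not Reason's.
from dataclasses import dataclass
from enum import Enum

class Viewpoint(Enum):
    INTENTION = "intention"  # was there an intention, and was it the right one?
    ACTION = "action"        # omission, repetition, wrong object, mistiming...
    CONTEXT = "context"      # priming, anticipation, distraction, stress
    OUTCOME = "outcome"      # free lesson, exceedance, incident, accident

@dataclass
class ErrorRecord:
    description: str
    viewpoint: Viewpoint
    detail: str

record = ErrorRecord(
    description="Closed valve A instead of the adjacent valve B",
    viewpoint=Viewpoint.ACTION,
    detail="action on the wrong object",
)
print(record)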
There are many myths about errors. Reason suggests the following:
1. Errors are intrinsically bad - trial-and-error learning is essential if we are to understand novel situations.
2. Bad people make bad errors - in fact it is often the best people who make the worst errors, because they tend to be in positions of responsibility and push the limits by trying out new techniques.
3. Errors are random and variable - we can actually predict the type of error likely to occur based on the situation.
The Human Contribution
Book by James Reason, The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries, published by Ashgate, 2008.
Some of this book is really interesting and useful. Equally, I did feel quite a lot of the content could have been cut to make it more readable and focussed on the real issues. I'd say it was not the writing that was the problem, but instead a lack of good editing.
It is certainly a book worth reading, and I will put some further posts here summarising some of the bits I thought were most useful. I'll start with some excerpts from the introduction.
"The purpose of this book is to explore the human contribution to both the reliability and resilience of complex and well-defended systems." However, instead of just taking the normal view that the human in a system is a 'hazard' because of its unsafe acts, the book also explores the role of the human as a 'hero' whose adaptations and compensations have "brought troubled systems back from the brink of disaster." The author says that he believes that learning more about 'heroic recoveries' will be "potentially more beneficial to the pursuit of improved safety in dangerous operations."
Wednesday, July 07, 2010
Overfill Protective Systems - Complex Problem, Simple Solution
Useful paper by Angela E. Summers, available from Sis-Tech.com
It considers the Buncefield, BP Texas City and Longford accidents and the issues related to high levels. The main conclusion is as follows (a rough worked illustration of steps 3 and 4 is sketched after the list):
"Catastrophic overfills are easily preventable. When overfill can lead to a fatality, follow these 7 simple steps to provide overfill protection:
1. Acknowledge that overfill of any vessel is credible regardless of the time required to overfill.
2. Identify each high level hazard and address the risk in the unit where it is caused rather than allowing it to propagate to downstream equipment.
3. Determine the safe fill limit based on the mechanical limits of the process or vessel, the measurement error, the maximum fill rate, and time required to complete action that stops filling.
4. When operator response can be effective, provide an independent high level alarm at a set point that provides sufficient time for the operator to bring the level back into the normal operating range prior to reaching a trip set point.
5. When the overfill leads to the release of highly hazardous chemicals or to significant equipment damage, design and implement an overfill protection system that provides an automated trip at a set point that allows sufficient time for the action to be completed safely. Risk analysis should be used to determine the safety integrity level (SIL) required to ensure that the overfill risk is adequately addressed. While there are exceptions, the majority of overfill protection systems are designed and managed to achieve SIL 1 or SIL 2.
6. Determine the technology most appropriate for detecting level during abnormal operation. The most appropriate technology may be different from the one applied for level control and custody transfer.
7. Finally, provide means to fully proof test any manual or automated overfill protective systems to demonstrate the ability to detect level at the high set point and to take action on the process in a timely manner."
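To make steps 3 and 4 more concrete, here is a minimal sketch with made-up numbers; the safe_fill_limit function and all of the figures are my own illustration, not taken from Summers' paper:

# Illustrative only: hypothetical tank, numbers and function name, not from the paper.
def safe_fill_limit(mechanical_limit_m3, measurement_error_m3,
                    max_fill_rate_m3_per_min, shutdown_time_min):
    """Volume by which filling must have stopped to avoid overfill (step 3)."""
    rise_while_stopping = max_fill_rate_m3_per_min * shutdown_time_min
    return mechanical_limit_m3 - measurement_error_m3 - rise_while_stopping

# Example: 5,000 m3 tank, +/-50 m3 level measurement uncertainty,
# maximum fill rate 20 m3/min, 10 minutes to complete the action that stops filling.
trip_point = safe_fill_limit(5000, 50, 20, 10)  # 4,750 m3

# Step 4: the independent high level alarm sits below the trip point,
# leaving the operator time (say 15 minutes) to act before the trip is reached.
alarm_point = trip_point - 20 * 15  # 4,450 m3
print(trip_point, alarm_point)

The point is simply that every term in step 3 pushes the trip set point lower, and the operator alarm has to sit lower still if operator response is to count as a layer of protection.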
Tuesday, July 06, 2010
When risk management goes bad
Article in the Risk Digest on 2 July 2010, which was itself based on an article by Robert Charette available to members of Enterprise Risk Management & Governance.
"According to BP PLC's 582-page 2009 spill response plan for the Gulf of Mexico, walruses along with sea otters, sea lions, and seals are among the "sensitive biological resources" that could be harmed by an oil discharge from its operations in the Gulf. The only problem is that walruses, sea otters, sea lions, and seals don't happen to live in the Gulf of Mexico, and haven't for a considerable period of time—like millions of years."
"The spill plan also lists a Japanese home shopping site as one of BP's primary providers of equipment for containing a spill, a dead professor as one of its wildlife experts to consult with in the event of spill, and other outrageous gaffes."
"BP was not alone in worrying about walruses. Chevron, ConocoPhillips, and ExxonMobil's oil discharge response plans in the Gulf of Mexico also listed those poor walruses as potential victims of a spill."
"The US government must have been worried about those walruses, too, since those in government accountable for reviewing and approving the oil companies' response plans didn't say a word about them."
The oil companies had actually outsourced the writing of their spill response plans to a consulting group. Either the organisations did not read the plans, or they read them but did not pick up the errors. The latter may be more worrying, because it suggests that the oil companies and the government lack the competence to manage risk.
"It is pretty clear that oil spill risk management wasn't taken seriously at all by BP, or by most of the other major oil companies drilling in the Gulf. In congressional hearings, oil industry officials admitted that the industry is poorly equipped to handle oil spills of any size in the Gulf, and that is why the industry tries to prevent spills from happening. The industry also viewed its oil-well blowout preventers as foolproof safety mechanisms, even though they fail regularly. However, the industry officials also admitted that less than 0.1% of corporate profits are spent on improving offshore drilling technologies, even as the risks of drilling offshore have increased significantly over the past decade."
The author suggests that in the future, whenever risk management is incompetently performed, done just to meet some requirement, isn't taken seriously, or is plain lackadaisical, it should be described by the phrase, "Jumping the Walrus."
"According to BP PLC's 582-page 2009 spill response plan for the Gulf of Mexico, walruses along with sea otters, sea lions, and seals are among the "sensitive biological resources" that could be harmed by an oil discharge from its operations in the Gulf. The only problem is that walruses, sea otters, sea lions, and seals don't happen to live in the Gulf of Mexico, and haven't for a considerable period of time—like millions of years."
"The spill plan also lists a Japanese home shopping site as one of BP's primary providers of equipment for containing a spill, a dead professor as one of its wildlife experts to consult with in the event of spill, and other outrageous gaffes."
"BP was not alone in worrying about walruses. Chevron, ConocoPhillips, and ExxonMobil's oil discharge response plans in the Gulf of Mexico also listed those poor walruses as potential victims of a spill."
"The US government must have been worried about those walruses, too, since those in government accountable for reviewing and approving the oil companies' response plans didn't say a word about them."
The oil companies had actually outsourced the writing of their oil response plans to a consulting group. Either the organisations did not read the plans or they read them but did not pick up the errors. The latter may be more worrying because it suggest oil companies and the government lack the competence to manage risk.
"It is pretty clear that oil spill risk management wasn't taken seriously at all by BP, or by most of the other major oil companies drilling in the Gulf. In congressional hearings, oil industry officials admitted that the industry is poorly equipped to handle oil spills of any size in the Gulf, and that is why the industry tries to prevent spills from happening. The industry also viewed its oil-well blowout preventers as foolproof safety mechanisms, even though they fail regularly. However, the industry officials also admitted that less than 0.1% of corporate profits are spent on improving offshore drilling technologies, even as the risks of drilling offshore have increased significantly over the past decade."
The author suggests that in the future, whenever risk management is incompetently performed, done just to meet some requirement, isn't taken seriously, or is plain lackadaisical, it should be described by the phrase, "Jumping the Walrus."
Friday, July 02, 2010
Airline pilot vowed to improve NHS safety culture after his wife's death
Articles at Wales Today by Madeleine Brindle of the Western Mail, on 28 June 2010
Airline pilot Martin Bromiley is now helping the NHS in Wales to put patient safety at the forefront of everything it does and prevent future fatal mistakes after his wife Elaine died because of clinical errors in her hospital care.
Elaine, aged 37, was admitted for a routine sinus operation, but never regained consciousness and died 13 days later.
After her death, Mr Bromiley was told by the ENT surgeon that they couldn’t have foreseen the complication and that they’d made all the right decisions, but that "it just didn’t work out." When he realised the death would not be investigated unless he decided to sue, he reflected on the aviation industry, where all accidents are considered avoidable and investigations are thorough and routine, not to place blame but so that lessons can be learned to ensure it doesn’t happen again.
He persuaded the director of the hospital unit where his wife died that an investigation was necessary. This showed that two minutes into the procedure his wife had turned blue and was struggling to breathe. Four minutes in, she was taking in only 40% oxygen. Six minutes in, the team tried to put a tube down her throat. After 10 minutes they still couldn’t get the tube in.
Guidelines stated this was an emergency, but the theatre staff continued with their attempts to intubate. This was a very experienced team: "In many ways they were the dream team to deal with something going wrong. So why didn’t they?"
The communication process seemed to have dried up. The lead anaesthetist lost control. Many of the nursing staff seemed to know what needed to be done but were ignored.
Mr Bromiley believes that inadvertent human error caused Elaine’s death and that systems need to be developed and people trained to reduce harm.
He said the NHS needs to look at how humans behave in the system and manage the structure around them to make it as easy as possible for the best service to be delivered.
“In aviation we accept that error is normal – it’s not poor performance and it’s not weakness. If you accept this then you can start to catch error; not hide and deny it. Then you can make a difference.
“If you work in healthcare and you feel something is going wrong you have to speak up. If the team in Elaine’s case had taken a minute to get as many views as possible from the team present, maybe it would have helped. Maybe she would still be here. We will never know.”
Mr Bromiley is now involved in the "1,000 Lives Plus" campaign, which involves patients in ensuring that NHS Wales works together to deliver a safe, quality, productive service.
The new UK government is asking the public to participate in restoring Britain’s traditions of freedom and fairness, and to free our society from unnecessary laws and regulations – both for individuals and businesses. One area that seems likely to receive some attention is health and safety.
They have set up a website where anyone can post ideas and comment on others. Those flagged as health and safety can be viewed at http://yourfreedom.hmg.gov.uk/@@search?Subject=healthandsafety
I think the challenge is that many of the problems are not with laws themselves, but the way they are interpreted. People are frightened to do things because they think they may be breaking a law or will be held liable if something goes wrong. This is perpetuated by a press that seems to love to use health and safety as a convenient excuse for many things. A government can change laws, but I am not sure how they can change perceptions.
Fines over taxi firm fatal blast
Article from BBC website on 24 February 2010
A taxi firm owner has been fined £2,400 for failing to protect his employees in relation to the storage of petrol and for failing to protect the public, and a petrol station has been fined £7,500 for breaching its petroleum licence, after an explosion in Immingham in which two people died.
Sue Barker, 43, and Ann Mawer, 52, died in the blast at Fred's Taxis in 2007 when petrol on the premises ignited.
Mr Barker, owner of the company and husband of Sue, bought nearly 25 litres of petrol from the service station, using an unapproved container.
He then carried it into the taxi firm's office, which also contained a gas heater and electrical appliances.
The container broke and the petrol spilled and ignited, causing an explosion.