Thursday, June 25, 2009

Use of shock horror pictures to promote safety

This subject is debated quite often. I know that studies have shown that showing horrific pictures to people actually causes denial rather than promoting safe behaviour, but people still believe they work. The same debate has recently come up in the LinkedIn EHSQ Elite Group. Patrick Hudson provided some great information, which is shown below.

The use of vivid, scary or bloody pictures is often believed to be a guaranteed way to make people behave safely. The evidence is clear that this is wrong, yet the police like to use them for road traffic and industry likes to use them to get workers to toe the line. What does work is the suggestion of possibility, as used in the best horror movies, where the threat is always implied, the blood is in the next frame, and the viewer's engagement is not switched off by the sight of blood and mangled limbs. The turn-off factor means that people move to denial - it will not happen to them, only to others - and this, if you think about it, is likely to produce exactly the opposite behaviour in your workforce to the one desired (it won't happen to me, so I will carry on as I always do). The people who propagate the use of scary material are already won over, or are in a senior position where they are not personally confronted with the problems.

What you CAN do is involve the workforce in learning to spot and recognise hazards, and get them to convince themselves (not by preaching from a superior pulpit) that those hazards are worth avoiding. This is part of the Working Safely model in one of the Hearts and Minds tools, based on trying to understand what it takes to work safely rather than asking why people work dangerously. The empirical evidence shows that many accidents result from either failing to perceive the hazard in the first place or failing to regard it as sufficiently dangerous to do anything about, in a world with many hazards that require people to prioritise. If an experienced worker hasn't been hurt in 35 years, a 5-minute lecture by an outside consultant is unlikely to convince them that those 5 minutes are more valid than their 35 years of on-the-job experience (even if, statistically, an individual's experience is too small a sample).

By the way, this approach works in South East Asian cultures too. In my experience no one is so fatalistic that they, personally, are happy to die to get the job done for someone else. People's behaviour in the face of hazards, seen as their response to risks, is complicated (as highly paid bankers have shown us), and sometimes I get frustrated that authorities feel they have an adequate knowledge of these factors in people, while they would never dare have the same presumptions about financial or technical issues. If they know about Prospect Theory and why Daniel Kahneman won the Nobel prize for his work with Amos Tversky in this area, then I might be prepared to give them some more listening time. The use of shock tactics to force people into passive compliance is an indicator that the proposers are amateurs who have come up with a solution that, in the words of H. L. Mencken, is "simple, neat and wrong".

Wednesday, June 24, 2009

Human Error Now The Big Killer

Article on Strategy Page, 19 June 2009.

Pilot error is being identified more and more often in military aircraft accidents. This contrasts with commercial aviation, where accident rates have declined 90 percent since World War II, mainly through the introduction of more safety devices and more reliable aircraft.

Apparently Spain has lost four military aircraft to mid-air collisions this year (two Mirage F-1s in January, and two F-18s this month).

In India, the crash of a Su-30 was initially attributed to engine or electronic problems. But the investigation team found that the pilot had inadvertently shut down the automated flight controls, was not aware of it, and believed the aircraft was, for some unknown reason, out of control. The pilot and weapons system operator ejected (the back-seater was killed when his safety harness broke).

Andy Brazier

Accident risk up due to stress

Article in Gulf News by Carol Spiers on 17 June 2009.

The claim is that many industrial accidents involving human error have more to do with stress and less to do with personal failings. Apparently this is confirmed by a recent report by one of the UK's largest insurance groups, which concluded that the risk of accidents at work is increasing as stress levels are driven up by the effects of the recession.

Fatigue is part of this, but Spiers places "more significance on the psychological factors - disorientation and fear, which cannot accurately be measured." She summarises two accidents:

1. A methanol fire occurred because a technician left open a tap on a drum that was being filled. An investigation tried to ascertain the cause of this "inexplicable and serious breach of the factory safety rules." It concluded that, because the technician had been having problems paying his mortgage, he had not been sleeping well and had been suffering high levels of stress. Apparently "It was clear that the accident was a result of human error that could have been avoided had the man sought help from his employer and taken medical advice."

2. An "over-stressed accountant" duplicated the keying-in routine when making a large electronic payment to a supplier, paying them twice.

Whilst I agree stress can make a significant contribution to human error, both examples quoted seem more related to poor design. It seems to me that someone has developed a theory and is looking for evidence to back it up, rather than carrying out any reliable evaluation of the available data.
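
The payment example, in particular, looks like a design failure: a system that will execute the same keyed-in payment twice has no defence against a predictable slip. A minimal sketch of the kind of defence I mean, in Python; the function and field names are my own invention, and a real system would keep the record of processed requests in persistent storage:

    import hashlib

    processed = set()  # in a real system this would be a persistent store

    def payment_key(payee, amount, reference):
        """Derive a deterministic idempotency key from the payment details."""
        return hashlib.sha256(f"{payee}|{amount}|{reference}".encode()).hexdigest()

    def make_payment(payee, amount, reference):
        """Execute the payment once; an exact duplicate request does nothing."""
        key = payment_key(payee, amount, reference)
        if key in processed:
            return "duplicate ignored"  # the double keying-in is absorbed by design
        processed.add(key)
        # ... call the actual payment system here ...
        return f"paid {payee} {amount} (ref {reference})"

Keyed in twice, the second call returns "duplicate ignored" rather than paying the supplier again - no extra vigilance required from the stressed accountant.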

Andy Brazier

Wednesday, June 17, 2009

Adequate Time Off Between Shifts a Key to Reducing Fatigue Risks

Article on Ergoweb by Jennifer Anderson, June 15, 2009.

Circadian Technologies, a London consultancy, says the answer to reducing the risk of fatigue from shift work is to allow a minimum of 11 hours off between shifts. This increases the chance of people achieving the recommended 8.4 hours of sleep.

Circadian generally recommends limiting eight-hour shifts to a maximum of seven in a row, and 12-hour shifts to four or five in a row.
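
Those limits are simple enough to check mechanically against a roster. A minimal sketch in Python; the 11-hour rest and consecutive-shift limits come from the article, while the roster format and the 24-hour rule for breaking a run of shifts are my own assumptions:

    from datetime import datetime, timedelta

    MIN_REST = timedelta(hours=11)   # minimum time off between shifts
    MAX_RUN = {8: 7, 12: 5}          # maximum consecutive shifts, by shift length in hours

    def check_roster(shifts):
        """shifts: chronologically sorted list of (start, end) datetime pairs.
        Returns human-readable breaches of the fatigue guidance."""
        breaches = []
        run = 0
        for i, (start, end) in enumerate(shifts):
            hours = round((end - start).total_seconds() / 3600)
            rest = start - shifts[i - 1][1] if i else None
            if rest is not None and rest < MIN_REST:
                breaches.append(f"shift {i + 1}: only {rest} off beforehand")
            # a gap of 24 hours or more is treated as breaking the run of shifts
            run = run + 1 if rest is not None and rest < timedelta(hours=24) else 1
            limit = MAX_RUN.get(hours)
            if limit and run > limit:
                breaches.append(f"shift {i + 1}: more than {limit} {hours}-hour shifts in a row")
        return breaches

    # example: two 12-hour shifts with only 10 hours off in between
    roster = [(datetime(2009, 6, 1, 7), datetime(2009, 6, 1, 19)),
              (datetime(2009, 6, 2, 5), datetime(2009, 6, 2, 17))]
    print(check_roster(roster))  # flags the short rest period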

A study by the company, reported in Business Week in April 2005, concluded that rates of obesity, diabetes and heart disorders are higher for night workers, that they have a 20 percent greater chance of being involved in a severe accident, and that they make five times as many serious mistakes as their daytime counterparts.

Andy Brazier

Monday, June 08, 2009

Medication Safety Tools and Resources

A number of documents are available at the Institute for Safe Medication Practices website.

They include:
* Error-Prone Abbreviations List
* FMEA Process (with Sample FMEA)
* IOM Report on Medication Errors
* ISMP Assess-ERR - a medication system worksheet to assist with error report investigation
* ISMP Confused Drug Name List

Prevention of Medical Errors - Leveraging the power of science and compliance to prevent catastrophe

Article on the Advance for LPNs website by Barbara L. Olson.

A lot of useful information in this article about human factors in general, and particularly medical error. I have to say that for a publication that claims to be for practical nurses, this article is so full of unnecessary human factors and psychological jargon that the key messages may well be lost, which is a real shame. Why can't people write clearly?

One of the messages seems to be that healthcare does not act in the same way as 'high reliability organisations.' In particular, health care professionals are "more likely to accept broadly stated goals as the functional unit performance, rather than process-orientated steps that contribute to a larger outcome." I think this means that more responsibility is put on individuals to act safely, and less emphasis on improving systems to reduce the risk of error.

The article suggests a risk-reduction hierarchy that I think is good (a sketch of the first item follows the list):
* Forcing functions and constraints
* Automation and computerisation
* Standardisation and protocols
* Checklists and double-check systems
* Rules and policies
* Education and information
* Suggestions to be more careful.
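
To make the top of the hierarchy concrete: a forcing function makes the unsafe act impossible rather than warning against it. A minimal sketch in Python, with drug names and dose limits invented purely for illustration:

    # Illustrative hard dose limits in mg; the drug names and values are invented.
    MAX_SINGLE_DOSE_MG = {"examplol": 100, "samplomycin": 500}

    def enter_order(drug, dose_mg):
        """Forcing function: an order over the hard limit cannot be entered at
        all, unlike a rule, warning or policy that a busy person can bypass."""
        limit = MAX_SINGLE_DOSE_MG.get(drug)
        if limit is None:
            raise ValueError(f"{drug} is not on the formulary")
        if dose_mg > limit:
            raise ValueError(f"{dose_mg} mg exceeds the {limit} mg limit for {drug}")
        return f"order accepted: {dose_mg} mg of {drug}"

Everything below it in the hierarchy relies, to a greater or lesser degree, on the individual remembering to comply; the forcing function does not.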

Andy Brazier

‘Human error’ blamed for 134 incorrect promotion notifications

Article in the Stars and Stripes by Jeff Schogol on 8 June 2009

134 petty officers second class were incorrectly notified that they had been promoted to E-6 before the Memorial Day weekend.

The process seems to involve calculating scores, with the highest scorers earning promotion. In this case the quota for the number being promoted was exceeded due to human error, and those incorrectly promoted will have to be demoted. According to the article, "The Navy has since added an extra check to make sure the human error in question does not happen again."
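
In code terms, that fix amounts to enforcing the quota inside the selection step rather than trusting the surrounding process. A minimal sketch in Python; the data structures are my own assumption, since the article gives no detail of the Navy's system:

    def select_for_promotion(scores, quota):
        """scores: mapping of name -> advancement score.
        Returns the top scorers, never exceeding the promotion quota."""
        ranked = sorted(scores, key=scores.get, reverse=True)
        selected = ranked[:quota]
        # the 'extra check' the Navy describes: fail loudly rather than over-promote
        assert len(selected) <= quota, "promotion quota exceeded"
        return selected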

Andy Brazier