Friday, November 27, 2009

Five ways ergonomics has shaped your life

Article from the BBC Website by Megan Lane on 18 November 2009

They are:

1. Behind the wheel - making sure people of all shapes and sizes can get into a car's driving seat
2. Nuclear power stations - design of control rooms
3. Pretty objects - things that are simple and intuitive to use are usually fun to use
4. Under wraps - making packaging that does its job whilst being easy to open
5. In the office - ergonomic keyboards, mice etc.

Not quite sure how these came out as the top five, but a useful illustration.

Andy Brazier

BBC Shipping forecast error

Article in The Register by Lewis Page on 25 November 2009

Parts of the first shipping forecast transmitted on Sunday November 15 - specifically the storm force winds - were actually the same ones sent out early on the previous day. This led to confusion for ships and boats in UK and adjacent waters, as the Force 11 winds predicted off the West Country had actually passed, while other regions were being hard hit.

According to the BBC, "The late night announcer at the end of the shift pulled out an email of what she thought was the right shipping forecast and read it out completely unaware it was the wrong forecast." The BBC added: "All I can say is that I am most terribly sorry we got that wrong. This was a big error on our part."

Apparently professional seafarers receive the shipping forecast by other, automated means (e.g. NAVTEX text receivers) and yachtsmen and small craft mostly tend to use the inshore-waters Marine VHF broadcasts made by coastguard stations.

The article makes the point that this "does reinforce the lesson that one does well to rely on the professionals - Met office, Coastguard etc, not the BBC or any other media/entertainment organisation - when lives are at stake." Fair enough, but those organisations can also make mistakes.

Andy Brazier

1989 changes to Tyson plant still benefit workers today

On Siouxland's News website 24 November 2009

Changes to ergonomics made at the Tyson Fresh Meats plant in Dakota City, Nebraska, nearly two decades ago to resolve a labour dispute are still reaping benefits, and are now the standard for plants across the company.

The tangible benefits have been "a decrease in the injury and illness rate by 67%." The changes also reduced worker turnover, which was typically high in the meat packing industry in the 70s.

Tyson Foods COO Jim Lochner says, "workplace safety and health did not have the attention and focus. It was more of a productivity game without consideration for the health and well-being of the line worker. And what we had to do was really change that whole philosophy."

The Sioux City Journal also covered this story on 24 November 2009.

It adds to the story by saying "Since 1991, OSHA-recorded injuries and illnesses at the Dakota City plant have dropped 67 percent. The rate of instances requiring the care of a physician is 73 percent below 1991 levels." The company and unions agree the changes created a much safer workplace, with the key to the program's success being the workers themselves, who serve as safety and ergonomics monitors and suggest changes to improve worker safety.


Andy Brazier

Tuesday, October 06, 2009

Fears over care-home drug errors

Article on BBC website by Nick Triggle 5 October 2009

Having carried out half-day snapshot inspections of 55 homes, University of London researchers found that seven in 10 elderly people living in care homes were victims of drug errors. They blamed "inadequate information, over-worked staff, poor teamwork and often complex courses of medication."

The study gathered data on 256 residents from a morning's medication. In total, mistakes were made in 178 cases, with many residents the victims of more than one error.

The most common mistakes involved wrong dosages, insufficient monitoring of residents after medication had been taken and people being given the drug at the wrong time. Most were only minor, although one resident did suffer a thyroid complication.

Care home residents are being given more complex courses of medication. In this study the residents received an average of eight different pills a day.

Andy Brazier

Driving factors and challenges in the development of unmanned combat air vehicles

Article on the Defence Professionals website on 5 October 2009. Discusses the increased use of unmanned military planes.

One point made is that these aircraft are becoming more autonomous, being left to fly themselves whilst the person who would have been in control in the past now monitors remotely. The article says "This, argue some, will result in fewer accidents as the piloting task is de-skilled and the UAV operator becomes a “supervisor”, with the majority of UAV losses now down to human error." Interestingly, there may also be a technical necessity for higher degrees of autonomy: bandwidth is limited, so if more unmanned aircraft are going to be used it may not be possible to control them all remotely at the same time.

The article summarises a number of future challenges for increasing the role of unmanned aircraft. One is the difficulty of finding places to conduct flight trials that will not interfere with other military and civil air traffic. Another is that, whilst more is technically possible, cultural acceptance may not be so forthcoming. "For instance, despite autopilots and advanced flight management systems in today’s glass-cockpit airliners which control the flight almost from beginning to end, the majority of passengers would be markedly reluctant to fly in a pilotless aircraft, even if told that the elimination of human error might actually result in a safer aircraft"

Andy Brazier

Monday, October 05, 2009

Electoral Commission rebuffs GLA plans for e-counting

Article in The Guardian by Charles Arthur 2 October 2009

The Greater London Authority (GLA) has carried out a cost benefit analysis for using electronic vote counting instead of manual counting in forthcoming elections. The Electoral Commission has raised a number of concerns with the analysis, including the suggestion that e-counting is "free from human error."

The basic finding is that e-counting will be 40% more expensive, but more accurate and quicker. But the commission report points out that if the extra money was paid into manual counting it might be able to achieve similar results.

It seems that this is a classic case of the benefits of using technology being over-stated whilst being compared with the most pessimistic view of low-tech but well-established techniques.

Andy Brazier

Monday, September 28, 2009

Research proves MEWPs are most effective tool - report from Europlatform

Article on Access International by Maria Hadlow on 17 Sep 2009

A review of work at height procedures up to 4.5 m for mechanical and electrical installation, carried out by Crown House Technologies (part of the Laing O'Rourke Group), has found push-around powered access equipment to be far more time efficient than traditional solutions such as scaffold towers and podium steps.

Gerry Mulholland, health and safety leader at Crown House said, "Following our study we were able to provide evidence that MEWPs (mobile elevated work platforms) are safer, more productive, ergonomic and avoid unnecessary strain injury. MEWPs are also easier for site management to maintain the appropriate safe standards on site as there are fewer options offered, therefore, fewer opportunities to make the wrong choice; their automation reduces accidents caused by human error. General site opinion from our workforce is that MEWPs get the job done."

Falls from height are the primary cause of serious injury in the construction industry. CHt's detailed review found that just under a fifth (19%) of all accidents on site are related to access equipment, and only a small proportion of these are caused by MEWPs compared with podiums, mobile towers, ladders and A-frames.

Andy Brazier

Lighting upgrades enhance North Sea helicopter safety

Article at Flight Global by Kieran Daly 22 September 2009

Helideck lighting has come to prominence as an unexpectedly high priority for pilots in a UK Civil Aviation Authority safety survey. Issues identified include use of yellow and white lights that do not stand out from the rest of the rig lighting; the touchdown spot in the middle of the deck is effectively a "black hole"; and floodlighting used to illuminate the helideck is too bright, with even slight misalignment making things markedly worse.

An extensive series of flight trials over the last five years has allowed the design to evolve, including a switch to green lighting of the perimeter and removal of floodlights from helidecks. This is now receiving consistently strong reviews from pilots who have experienced it, or elements of it. A final configuration has been developed with green perimeter lights; a single, broken, yellow touchdown marker circle; and a green hollow-H for the touchdown point itself.

Accidents where lighting could have played a part include the loss of a Eurocopter AS365N Dauphin on approach to a gas platform in Morecambe Bay off the west coast of England in December 2006, and the loss of a Super Puma in February 2009 during a visual night-time approach to the ETAP platform in poor visibility in the central North Sea.

Andy Brazier

Commonwealth Bank ATM in Queen St gave free cash to lucky customers

Article in the MacArthur Chronicle by Ben Pike 22 September 2009

A bank's cash machine dispensed $50 notes instead of $20 notes because of an error when loading the machine. The bank estimates about $20,000 was taken over a 26 hour period.

A bank spokesman said "This is an extremely rare occurrence and what is simply a case of human error. It’s unfortunate that these things happen ... we’re not perfect. We will be contacting people who used the ATM during that time to discuss recovery."

The bank is confident they can identify everyone who used the machine through their ATM cards and CCTV footage.

Their solution to this is that "Staff will again be instructed on how to load the canisters correctly."

Andy Brazier

Computers can't replace pilots - yet, say experts

Article taken from Flight International by David Learmont 24 September 2009

The term "pilot error" is greatly over-used, especially given that on many occasions technically troubled flights are saved by ordinary airline crews. According to US Federal Aviation Administration's chief scientific and technical adviser, Dr Kathy Abbott, and Capt John Cox of the RAeS's operations committee "records showed about 30% of all system failure modes that led to accidents had not been anticipated by designers, so there were no checklists to deal with them. The corollary was that pilots successfully dealt with 70% of unanticipated failures, let alone the failures for which there was a checklist, she said."

Since an incident in September 2007, when the crew of a Thomsonfly Boeing 737-300 allowed the aircraft's speed to drop to a dangerously low level on approach to Bournemouth airport, eye-tracking tests of crews have taken place.

"The tests have revealed that a few pilots' instrument scans are seriously deficient, even when their performance would have been judged as good by an examiner on the flightdeck. The implication is that some airline crews, possibly at all airlines, are getting by simply because nothing goes technically wrong on their watch. The worry, says Thomson, is that this pattern may not be correctable because, even with retraining, the pilots concerned tend to revert to their natural patterns later."

Andy Brazier

UK mother relives horror of Venezuelan plane crash that killed son

Article at VH Headline, Venezuela 24 September 2009.

A woman told an inquest of the terrifying moment when a plane she was flying in crashed on take-off and killed her six-year-old son. Jane Horne was with her son, Thomas, and husband, David, as they boarded the flight at Canaima, in the south of Venezuela, in heavy rain during a two-week holiday.

Air crash expert Tim Atkinson, from the Air Accidents Investigation Branch, based in Farnborough, Hampshire, said that witness accounts had made him very confident he knew what had happened. The plane had stalled soon after take-off because it had not had sufficient time to gather enough speed and therefore lift to stay in the air.

The "human error" by the pilot of the aircraft was that he first decided to abort the take-off but then changed his mind, leaving insufficient runway to get safely into the air. Flooding on the runway had also slowed the plane.

Mr Atkinson makes a very good observation that the pilot had a real incentive to go ahead with the flight even though it was raining heavily. The plane could only fly in good visual conditions and was not allowed to fly at night. Taking off at 3.30pm would have been the last time it could have flown that day before a substantial delay.

Andy Brazier

Tuesday, August 04, 2009

Human error to blame in child's death

Reported on a number of US news sites including Why Lincoln in an article by Matt McFarland on 25 July 2009.

It took nearly half an hour for an ambulance to reach the scene of an accident where a child cyclist was hit by a car. The child probably had little chance of survival, but the slow response 'outraged' the city's mayor, who set up a task force to investigate.

The task force report says the cause was "human error, mainly a mistake by a Mohawk Ambulance dispatcher" who did not realise there was not an ambulance available and so failed to initiate a mutual aid arrangement with neighbouring counties.

What is interesting is that despite an individual being identified as making an error a number of quite significant system changes are being made as a result. They include:

* increasing staffing at ambulance dispatch for evening and overnight shifts.
* purchasing two additional Albany fire department radios so those in the ambulances can speak with EMTs on scene.
* each ambulance will now be equipped with GPS navigational units.
* investment in computer software which when a call comes in, can identify where the closest ambulance is, give directions and even estimate a response time.

Following the incident the dispatcher was "re-trained and counseled."

Andy Brazier

Monday, July 06, 2009

Problems with testing

Article in the Palm Beach Post by Tom Blackburn 6 July 2009

Reports a case where fire fighters were selected for promotion based on a written test. No African-American passed; 18 white firefighters did. The case went to court due to alleged discrimination. "The white guys won."

The article includes some interesting insight into testing. One of the first tests was for Intelligence Quotient (IQ), which the author claims was "misused almost at once." The test writers looked at how the different sexes answered questions and edited out those which showed a bias. People have since used the results of IQ tests to show men and women have the same IQ, without realising that the question set had been specifically designed to achieve that result.

"The misuse of testing fills a need to believe that we can take the human factor out of judgement calls. Human error sneaks back in as part of the design of each test, but we act surprised every time we find it." We all seem to be reassured that tests give numerical, 'objective' results. The reality is that most tests have little evidence to show the results are anywhere near correlated to what supposedly was being tested.

"It's easy enough to devise a test that will tell you whether someone has memorized the times tables." But many factors such as leadership (as required by the firefighters in the case cited) is immeasurable.

"It's no surprise that we do a poor job of testing for leadership. After 100 years of testing for intelligence, we still are not clear about what IQ numbers really mean."

Friday, July 03, 2009

It's Management's Fault

Article in Forbes by Kenneth G. Brill on 1 July 2009

Author claims that 10 years ago he "developed Brill's Law of Catastrophic Failure: Catastrophic failure is never the result of a single event or interaction, such failures are the result of at least five, and as many as 10, interactions. Any single interaction or several in combination might be bad, but not catastrophic. But when the right combination of interactions occurs (typically seven are required), they will produce a domino effect and devastation will occur 100% of the time."

I think the idea of multiple causes of accidents was well established by this time, but the article does give some good examples of how situations develop. They include:

* Falling off a ladder while doing DIY - (self) management failures of working alone, rushing to complete before dark, and reducing the number of trips up and down the ladder by carrying more stuff each time. Add the human nature of risk taking when activities are repeated without incident. All combine with events that occur, including strong winds at the top of the ladder. The result is loss of balance whilst hands are full, leading to the catastrophic failure of a fall.
* Failure of Manhattan air traffic control - management failures included agreeing to run on local electrical generators at times of high demand on the grid to receive a financial incentive from the electrical supplier, sending technicians on an offsite course so that no one was available to monitor, failing to ensure an alarm was visible to staff when a control centre was moved, and failing to provide a properly redundant backup. There was not an immediate failure because battery backup worked for several hours. Luckily manual procedures worked and a catastrophe was averted.

The author has applied his analytical approach to other events and concludes that "The results are always the same. It doesn't matter what the underlying physical, mechanical or electrical portions of the event are, management error or inaction contributes up to half of the interactions resulting in catastrophic failure."

Based on data from The Uptime Institute for data centres, one third of failures are caused by equipment. Of the remaining two-thirds of availability failures, more than 70% are caused by intentional management decisions or by management inaction. This means only 20% of data centre failures are caused by human error.
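
The arithmetic behind that 20% figure is worth making explicit. A quick check in Python, using only the fractions quoted above (a sketch of the stated breakdown, not from the article):

    # Rough check of the Uptime Institute breakdown quoted above
    equipment = 1 / 3                      # failures caused by equipment
    non_equipment = 1 - equipment          # remaining availability failures
    management = 0.70 * non_equipment      # management decisions or inaction
    human_error = non_equipment - management
    print(f"management: {management:.0%}, human error: {human_error:.0%}")
    # prints "management: 47%, human error: 20%"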

He claims that "Systemically addressing management issues is the quickest and ultimately cheapest path to reducing catastrophic failure."

Thursday, June 25, 2009

Use of shock horror pictures to promote safety

This subject is debated quite often. I know that studies have shown that showing horrific pictures to people actually causes denial rather than promotes safe behaviour, but people still think they will work. The same debate has recently come up in the LinkedIn EHSQ Elite Group. Patrick Hudson provided some great information that is shown below.

The use of vivid, scary or bloody pictures is often believed to be a guaranteed way to make people behave safely. The evidence is clear that this is wrong, yet the police like to use them for road traffic and industry likes to use them for getting workers to toe the line. What does work is the possibility, as used in the best horror movies, where the threat is always implied, the blood in the next frame and the engagement of the viewer is not turned off by the sight of blood and mangled limbs. The turn-off factor means that people move to deny that it will happen to them, it only happens to others, and this is, if you think about it, likely to produce exactly the opposite behaviour in your workforce to the one desired (it won't happen to me so I will carry on as I always do). The people who propagate the use of scary material are already won over, or are in a senior position so they aren't personally confronted with the problems.

What you CAN do is involve the workforce in learning to spot and recognise hazards and get them to convince themselves (not by preaching from a superior pulpit) that they are worth avoiding. This is part of the Working Safely model in one of the Hearts and Minds tools, based upon trying to understand what it takes to work safely as opposed to asking why people work dangerously. The empirical evidence then shows that many accidents are the result of either failing to perceive the hazard in the first place or failing to regard it as sufficiently dangerous to do something about, in a world with many hazards that require people to prioritise. If after 35 years you haven't been hurt, a 5 minute lecture by a consultant is unlikely to convince an experienced worker that 5 minutes by someone from outside are more valid than their 35 years on the job experience (even if statistically their experience is too little on an individual basis).

By the way this approach works in South East Asian cultures. In my experience no one is so fatalistic that they, personally, are happy to die to get the job done for someone else. People's behaviour in the face of hazards, seen as their response to risks, is complicated (as high-paid bankers have shown us) and sometimes I get frustrated that authorities feel that they have an adequate knowledge of these factors in people, while they would never dare have the same presumptions about financial or technical issues. If they know about Prospect Theory and why Daniel Kahneman won the Nobel prize for his work with Amos Tversky in this area then I might be prepared to give them some more listening time. The use of shock tactics to force people into passive compliance is an indicator that the proposers are amateurs who have come up with a solution that, in the words of H. L. Mencken, is "simple, neat and wrong".

Wednesday, June 24, 2009

Human Error Now The Big Killer

Article on Strategy Page, 19 June 2009.

Pilot error is being identified more and more in military aircraft accidents. This is compared to commercial aviation where accident rates have declined 90 percent since World War II, mainly through the introduction of more safety devices and more reliable aircraft.

Apparently Spain has lost four military aircraft to mid-air collisions this year (two Mirage F-1s in January, and two F-18s this month).

In India, the crash of a Su-30 was initially thought to be due to engine or electronic problems. But the investigation team found that the pilot had inadvertently shut down the automated flight controls, was not aware of it, and believed the aircraft was, for some unknown reason, out of control. The pilot and weapons system operator ejected (the back-seat crew member was killed when his safety harness broke).

Andy Brazier

Accident risk up due to stress

Article in Gulf News by Carol Spiers on 17 June 2009.

The claim is that many industrial accidents which involved human error had more to do with stress and less to do with personal failings. Apparently this is confirmed by a recent report by one of the UK's largest insurance groups, which concluded that the risk of accidents at work is increasing as stress levels are driven up by the effects of the recession.

Fatigue is part of this, but Spiers places "more significance on the psychological factors - disorientation and fear, which cannot accurately be measured." She summarises two accidents:

1. A methanol fire occurred because a technician left a tap on a drum open whilst it was filling. An investigation tried to ascertain the cause of this "inexplicable and serious breach of the factory safety rules." They concluded that because the technician had problems paying his mortgage he had not been sleeping well and had been suffering high levels of stress. Apparently "It was clear that the accident was a result of human error that could have been avoided had the man sought help from his employer and taken medical advice."

2. An "over-stressed accountant" duplicated the keying-in routine when making a large electronic payment to a supplier, paying them twice.

Whilst I agree stress can make a significant contribution to human error, both examples quoted seem to be more related to poor design. It seems to me that someone has developed a theory and is looking for evidence to back it up, rather than any reliable evaluation of the data available.

Andy Brazier

Wednesday, June 17, 2009

Adequate Time Off Between Shifts a Key to Reducing Fatigue Risks

Article on Ergoweb By Jennifer Anderson June 15, 2009

Circadian Technologies, a London consultancy, says the answer to reducing the risk of fatigue from shift work is to allow a minimum of 11 hours off between shifts. This increases the chance of people achieving the recommended 8.4 hours of sleep.

Circadian generally recommends limiting eight-hour shifts to a maximum of seven in a row, and 12-hour shifts to four or five in a row.

A study by the company reported in Business Week in April 2005 concludes that obesity, diabetes and heart disorders are higher for night workers, that they have a 20 percent greater chance of being involved in a severe accident and make five times more serious mistakes than their daytime counterparts.
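
These limits are simple enough to check automatically against a roster. A minimal sketch, assuming shifts are given as (start, end) datetime pairs; the function name and roster format are my own illustration, not Circadian's:

    from datetime import timedelta

    MIN_REST = timedelta(hours=11)  # Circadian's recommended minimum time off

    def rest_violations(shifts):
        """Return the gaps between consecutive shifts that are under 11 hours.
        `shifts` is a list of (start, end) datetimes sorted by start time."""
        violations = []
        for (_, end1), (start2, _) in zip(shifts, shifts[1:]):
            if start2 - end1 < MIN_REST:
                violations.append((end1, start2))
        return violations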

Andy Brazier

Monday, June 08, 2009

Medication Safety Tools and Resources

A number of documents at The Institute of Safe Medication Practices website.

They include:
* Error-Prone Abbreviations List
* FMEA Process (with Sample FMEA)
* IOM Report on Medication Errors
* ISMP Assess-ERR - A medication system worksheet to assist with error report investigation.
* ISMP Confused Drug name List

Prevention of Medical Errors - Leveraging the power of science and compliance to prevent catastrophe

Article on Advance web for LPNs website by Barbara L. Olson

A lot of useful information in this article about human factors in general, and particularly medical error. I have to say that for a publication that claims to be for practical nurses, this article is so full of unnecessary human factors and psychological jargon that the key messages may well be lost, which is a real shame. Why can't people write clearly?

One of the messages seems to be that healthcare does not act in the same way as 'high reliability organisations.' In particular health care professionals are "more likely to accept broadly stated goals as the functional unit performance, rather than process-orientated steps that contribute to a larger outcome." I think this means that there is more responsibility put on individuals to act safely and less emphasis on improving systems to reduce the risk of error.

The article suggests a risk-reduction hierarchy that I think is good:
* Forcing functions and constraints
* Automation and computerisation
* Standardisation and protocols
* Checklists and double-check systems
* Rules and policies
* Education and information
* Suggestions to be more careful.
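
To make the top of that hierarchy concrete, here is a minimal sketch of a forcing function in software: the system refuses to proceed rather than relying on vigilance. The drug names, limits and function are hypothetical illustrations, not from the article:

    MAX_DOSE_MG = {"warfarin": 10, "methotrexate": 25}  # hypothetical limits

    def enter_dose(drug, dose_mg):
        """Forcing function: reject an out-of-range dose outright,
        instead of warning and letting the user continue anyway."""
        limit = MAX_DOSE_MG.get(drug)
        if limit is None:
            raise ValueError(f"{drug}: no dose limit on file, entry blocked")
        if dose_mg > limit:
            raise ValueError(f"{drug}: {dose_mg} mg exceeds the {limit} mg limit")
        return dose_mg

Lower levels of the hierarchy (rules, education, suggestions to be more careful) leave the error path open; a forcing function closes it.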

Andy Brazier

‘Human error’ blamed for 134 incorrect promotion notifications

Article in the Stars and Stripes by Jeff Schogol on 8 June 2009

134 petty officers second class were incorrectly notified that they had been promoted to E-6 before the Memorial Day weekend.

The process seems to involve calculating scores, with those scoring highest earning the promotion. In this case the quota for the number being promoted was exceeded due to human error, and those involved will have to be demoted. According to the article, "The Navy has since added an extra check to make sure the human error in question does not happen again."
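
The selection step itself is simple to express in code, which makes the missing quota check all the more striking. A minimal sketch, assuming candidates are (name, score) pairs; the function and its handling of the quota are my own illustration, not the Navy's system:

    def select_for_promotion(candidates, quota):
        """Promote the highest-scoring candidates, never exceeding the quota."""
        ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
        return ranked[:min(quota, len(ranked))]

The reported failure was, in effect, notifying more people than the quota allowed - exactly the condition the final line guards against.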

Andy Brazier

Friday, May 22, 2009

Death by Human Error Trumps Technology Again

Article on Live Science by Benjamin Radford 18 May 2009

I don't know if the author is trying to be controversial on purpose, but this is a terrible article.

The author says "human error causes far more damage and kills far more people than computer or technological failures."
Also "In these cases and many others, accidents happen when inexperienced people intentionally ignore important safety systems designed by scientists and engineers. Safety guidelines, regulations, and devices are there for a reason, and though many people distrust technology, history shows us that we pay a terrible cost for human error, instinct, and overconfidence."

He does quote some examples but I won't repeat them because they simply show how little this person understands. The idea that technology may cause or contribute to errors is completely missed.

I'm not impressed.

Andy Brazier

Tuesday, May 12, 2009

Wheels-up C-17 crash caused by pilot error

Article in Airforce Times by Bruce Rolfsen 8 May 2009

A C-17 Globemaster military transport plane crash-landed at Bagram Airfield in Afghanistan in January 2009 because the crew failed to lower the landing gear. Whilst no one was hurt, it took the efforts of more than 200 people, a 120-ton crane and airbags to lift the plane high enough to lower the landing gear so that it could be moved. Even so the repair bill was $19 million and the airport was out of action for about 30 hours.

Accident investigation board president Col. Richard D. Anderson is quoted as saying "Had they lowered the gear, the mishap would not have occurred." The report says the aircrew failed to follow checklist procedures, which is a "basic Air Force rule." But it also says the crew were distracted and others, including air traffic control, made errors.

Events identified in the report include:

* The automated 'ground proximity warning system' that would have instructed the crew to lower the wheels was apparently accidentally turned off;
* The airfield’s approach radar was not working, so the crew were using visual flight rules, requiring more focus on plane speed, altitude and other aircraft.
* To help spot mountain ridges surrounding Bagram and other aircraft, the pilots put on night-vision goggles.
* When the plane was about three miles out the crew radioed their “short final” to prompt the control tower for clearance to land, but got no response.
* 28 seconds before landing they radioed the tower again “short final,” this time getting clearance to land.
* Controllers failed to make the required reminder call — “Check wheels down.”
* The plane was flying at 172 mph on approach, 42 mph faster than approach rules called for, which should have prompted an aborted landing (see the sketch after this list).
* None of the three pilots realised that they had failed to go through the “before landing checklist.”
* With the landing gear still up, the plane’s ground warning system should have sounded out “too low gear.” However, it didn’t activate because the pilots had accidentally turned it off (according to investigators).
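
Several items in that list amount to automated cross-checks that were absent or inhibited. A minimal sketch of what such a monitor might look like; the 130 mph threshold is simply 172 minus the 42 mph excess quoted above, and everything else is a hypothetical illustration:

    MAX_APPROACH_MPH = 130  # implied by "42 mph faster than approach rules called for"

    def approach_alerts(speed_mph, gear_down, gpws_enabled):
        """Return the warnings a monitoring system could raise on final approach."""
        alerts = []
        if speed_mph > MAX_APPROACH_MPH:
            alerts.append("speed high - abort landing")
        if not gear_down:
            alerts.append("check wheels down")
        if not gpws_enabled:
            alerts.append("ground proximity warnings inhibited")
        return alerts

    print(approach_alerts(speed_mph=172, gear_down=False, gpws_enabled=False))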

I will look out for more information on this accident. It may be that the crew are being lined up as the main culprits, but clearly there were errors by others and technical failures. Also, there may be some systemic and/or cultural problems, as evidenced by the fact the crew were prepared to land when the speed was too high, which is especially significant given that one of the crew was an instructor monitoring one of the others.

Also, it is noted that this is not the first time this type of accident has happened.

Andy Brazier

Thursday, April 30, 2009

Report shows human error almost led to aircraft crash

Article from 3 News in New Zealand, 30 April 2009

An Emirates Airbus A340, which had originated in Auckland, was forced to make an emergency landing at Melbourne Airport after its tail struck the runway while taking off. Flight data shows the plane's wheels were still on the ground 115 metres past the end of the runway, and two antennas and a strobe light were hit as the plane struggled into the air.

A preliminary report by the Australian Transport Safety Bureau has found that somebody in the cockpit entered a takeoff weight 100 tonnes below actual into the performance calculation computer. This meant the aircraft computer applied less thrust than was needed for the plane to take off.

Human error is the most likely cause of the near-miss.

"It should have been picked up during cockpit management checks between one checking the other - it wasn't," he says. "But I would have thought also in the head of one of guys loading it in, they would have thought, 'ah this plane is 100 tonnes too light to be operating from Melbourne to Dubai'."

It is not the first time incorrect loading figures have caused problems. In 2003 a Singapore Airline Boeing 747 had to make an emergency landing at Auckland Airport after the plane's tail also hit the runway on takeoff. Pilot error was also found to be the cause.

Emirates says it has now installed a second computer in the cockpit to double-check the weight's been properly entered.
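
The article doesn't describe how the second computer works, but the principle is independent dual entry with a cross-check before the figure is used. A minimal sketch under that assumption; the tolerance value and function are hypothetical:

    TOLERANCE_TONNES = 1.0  # hypothetical acceptable disagreement between entries

    def confirmed_takeoff_weight(entry_a, entry_b):
        """Accept a takeoff weight only if two independently entered values
        agree; otherwise force both to be re-entered."""
        if abs(entry_a - entry_b) > TOLERANCE_TONNES:
            raise ValueError(f"entries differ: {entry_a} t vs {entry_b} t - re-enter both")
        return (entry_a + entry_b) / 2

A 100-tonne discrepancy of the kind reported would fail this check immediately, provided the two entries really are made independently.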

Wednesday, April 29, 2009

Deadly business - who pays?

Part 12 of a 'Special investigation' from Hazards magazine April 2009.

It continues from previous parts looking at workplace accidents and ill health, blaming a "hands off approach to safety regulation" and "an absence of oversight." This part examines figures quoted by the British Chambers of Commerce for the cost of health and safety legislation and proposes alternative figures for the cost of injury and ill health to society.

According to the report, when the British Chambers of Commerce (BCC) published its ‘2009 Burdens Barometer’ in March 2009, it put the cumulative cost to business of workplace safety regulations covering working time, chemicals, asbestos, explosives, biocides, work at height, vibration and noise, as well as occupational exposure limits and the corporate manslaughter act at over £21.5bn, 28 per cent of the total burden of regulation.

Hazards feels this figure totally ignores the potential benefits of having fewer dead, sick or injured workers which can result in reduced sick leave, retention of trained and productive staff and, potentially, avoidance of safety fines, compensation payouts and spiralling employers’ liability insurance costs. And it ignores entirely the human cost of poorly regulated workplaces.

Hazards has looked for figures showing the cost of accidents. They include:

* A May 2006 government regulatory impact assessment put the total cost of non-asbestos occupational cancer deaths each year at between £3bn and £12.3bn.
* A 2008 Health and Safety Executive (HSE) economics briefing put the total cost of each occupational fatality (and there are hundreds every year) at £1.5 million.
* A 2004 HSE report, using 2001/02 figures, put the cost to society of occupational ill-health and injury at between £20bn and £31.8bn.

The main concern of Hazards seems to be summed up by the following statement regarding the cost of accidents to society where "less than a quarter was borne by employers, although they were by and large responsible for the workplace conditions that led to the injury or ill-health."

I do think it is right that publications like Hazards do challenge organisations like the BCC. But I don't see any cause and effect being demonstrated in the article between the way regulations are developed and enforced vs safety performance. And I think bashing employers is likely to be counterproductive.

Andy Brazier

Monday, April 27, 2009

We need the facts to win the ‘war on error’ in hospitals

Article in The National from the United Arab Emirates by Justin Thomas on 26 April 2009

Thomas is assistant professor in the Department of Natural Science and Public Health at Zayed University. He presented at a conference for health care professionals on the topic of "patient safety."

He says "patient safety is an important and complex issue that touches on the psychology of human error, risk management, medical negligence, freedom of information and corporate manslaughter litigation" and gives two examples from the UK:

In one case an elderly woman had been admitted for a fairly routine overnight stay. She choked on some toast. The ward staff did not know that the phone number for contacting the hospital's resuscitation team had recently been changed. Also, the resuscitation team did not know the number needed to open the new electronic keypad lock on the ward door. Finally, none of the staff on duty that night had been trained in basic life-support techniques. The result was that the woman died.

In another case an elderly man received the wrong medication and died. He had been quite healthy, but had the misfortune to be admitted on New Year’s Eve when staff are typically thin on the ground and often made up of temporary staff supplied by agencies.

At least, as a result of incidents like the first, the NHS has now standardised the emergency number in all hospitals to 2222.

Thomas poses the question of which hospital you would rather attend: one that knows it has a 5 per cent rate of medical errors, which may sound like "scary stuff", or one that does not know its error rate at all. The latter gives you much more to fear.

According to Thomas, you cannot know something is improving if you don’t have reliable quantitative data relating to past and present conditions. "The first step to improving patient safety and knowing we have improved patient safety is the adoption of a common incident classification and reporting system. The mandatory and centralised reporting of all patient safety incidents not only allows us to quantify progress in our “war on error” – but also helps us to identify themes and patterns in the types of errors that are occurring, thus allowing us to propose solutions that can be adopted across the health service, sharing the learning, and preventing future tragedy."
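
A common classification and reporting system of the kind Thomas describes is, at its core, a shared taxonomy plus a central store that every hospital feeds. A minimal sketch; the field names and categories are hypothetical illustrations:

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical shared taxonomy - the point is every hospital uses the same one
    CATEGORIES = {"medication", "resuscitation", "equipment", "communication"}

    @dataclass
    class Incident:
        hospital: str
        occurred: datetime
        category: str      # must come from the shared taxonomy
        harm_level: int    # e.g. 0 = no harm ... 4 = death
        description: str

        def __post_init__(self):
            if self.category not in CATEGORIES:
                raise ValueError(f"unknown category: {self.category}")

With every incident recorded against the same categories, the themes and patterns Thomas mentions become a matter of counting rather than guesswork.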

Andy Brazier

Human error leaves Torrens Transit bus commuters stranded

Article on Adelaide Now by Tom Zed 24 April 2009

Bus commuters were left stranded on bus routes because an employee forgot to assign drivers on the company roster. The error was not picked up until the company received a phone call. It was fixed as soon as they found out but over an hour went by without buses.

Apparently there was a "unique set of circumstances" involving the school holidays and an AFL game between Port Power and St Kilda at AAMI stadium happening that evening.

Andy Brazier

Human error likely cause of botched airlift rescue

Article on Haaretz.com by Anshel Pfeffer, 24 April 2009

A hiker, Ala Aghbariya, wandered into a minefield in Israel near the frontier with Jordan. He set off a mine, was injured and a military rescue helicopter was sent to lift him out. Unfortunately, as the helicopter was climbing he fell to his death.

An internal air force committee found that there was no technical error or malfunction in the equipment used or with the helicopter's operational systems. Rather, it is believed that the rescue crew's misjudgment during the operation was the cause of the accident.

It is a bit difficult to understand the misjudgment in this case. Does it mean a physical judgment, meaning an attachment was not made properly, or that a decision was made not to make the attachment?

Human Error Contributed to Freeway Complex Fire

Article in KTLA news on 25 April 2009

Human error, in addition to adverse weather conditions and poor brush clearing, resulted in the Freeway Complex Fire, which burned thousands of homes across six cities and four counties. Costs are estimated at $150 million, although no one was seriously injured.

A report issued by Orange County Fire Authority revealed that actions taken by four Corona firefighters trapped in flames at the beginning of the fire last November may have delayed the response to the firestorm.

Other firefighters ordered to report to Yorba Linda to stand guard defied orders and went to save the Corona crew, which may have allowed the fire to spread faster into Orange County.

Radio confusion delayed a request for air tankers. One fire chief asked for them whilst another did not. This meant they were not ordered for over an hour.

Different fire departments were using different radio frequencies and so could not communicate directly.

Andy Brazier

Human error caused fatal accident: officials

Article in the China Post 26 April 2009

Two tourists from China were killed in Taipei, Taiwan, when an overloaded crane fell and crushed the tour bus taking them to the world's tallest building.

The small crane was operating a mid-sized boom on the 31st floor of a planned MRT transit hub when the boom broke off and fell to the street below.

The crane was hauling a weight of four tons at the time of the accident, above its 3.2-ton capacity. The operators also failed to take the wind into consideration.

Labor officials from the city determined it was a case of human error and denied the city government might have failed to oversee the construction site saying many fines had been issued for various violations and saying they had "done all that needed to be done."

Six crane operators have been released on bail after hours of interrogation. Two representatives from the construction companies involved in the project were also questioned, but they were later freed without bail.

Andy Brazier

Tuesday, April 21, 2009

Human error is a business risk you are willing to assume

I use Google Alerts to point me to interesting articles for this blog. I was a bit confused when articles kept popping up with the sentence "Human error is a business risk you are willing to assume" referenced, but where the articles did not seem to be very relevant.

Going to one of the articles I noticed that it was part of the "Legal disclaimer and risk disclosure." Basically it is saying use information taken from the fxstreet.com website at your own risk.

Andy Brazier

Doing less with more

Article in Hoist magazine by Jim Galante 3 April 2009. Suggests there are "double dividends of integrating lean thinking and ergonomics."

Managers today face more challenges than ever before. They are being told to cut costs whilst maintaining or even increasing production rates, and maintaining quality. Staffing levels are being reduced, so there is the need to "do more with less."

There are further issues when you consider "the American workforce is aging and many of the next generation of workers are looking more toward white collar jobs."

Many think good applied ergonomics can help. Whereas in the past it was seen as “a nice thing to do” because it made the worker’s job easier and made them happier, the benefits of improved quality and productivity, and a reduced possibility of injuries, are now being recognised. "Ergonomics today has become an essential and fundamental part of a well run business."

According to Galante, ergonomics can play a significant role in achieving the goals of lean thinking. "Improving productivity by reducing or even eliminating waste is a core lean value. Good ergonomics eliminates excessive body motions and limits the number of repetitions in most work tasks. Good ergonomics will reduce mistakes and will improve quality - more lean values."

Galante references the Ergonomic Guidelines for Manual Material Handling, published by the EASE Council. This considers four applications in which lean thinking and ergonomic principles are related and essential to creating effective, sustainable programmes. They are:

1 Removing waste - Removing wasted, unnecessary motion can have a significant positive impact on systems and processes as well as decrease lead times and inventory, increase quality and substantially increase productivity.

2 Flexible processes - Understanding the whole organisational system requires all business processes to be flexible. This will significantly aid a company’s ability to respond to changes which are occurring in the marketplace. This can be flexibility in set-up/change-over, the type of assist device, inventory controls or linkages (transporting or storing materials).

3 The negative impacts of fatigue - Ergonomic assist devices can dramatically reduce or even eliminate the forces required to perform a task as well as reduce the associated reaching, bending or stretching. They will reduce fatigue and stress that would be experienced by the worker. These symptoms are often a precursor to a lost time injury.

4 The needs of the office and the service sector - By focusing on strategic placement of parts, products, tools and equipment and reviewing the layout of the work area, human stress and ergonomic related injuries can be reduced. The white paper discusses these changes and presents practical solutions and improvements.

The bottom line

In today’s demanding work environment companies need to take every advantage, and a good ergonomics programme complements a good lean initiative. The two together, with all their tools, techniques and philosophies, will prove to be vital contributors to success both in the short- and long-term.

About the author

James J. Galante is the chairman of the EASE Council. Ergonomic Assist Systems and Equipment (EASE) is the resource for trends, information, practices, equipment and organisations that focus on ergonomics and on improving the working interface between people and the materials they must move and use, with the aim of reducing injury and increasing productivity while providing a significant return on investment. Visit the EASE website for these many resources at www.mhia.org/ease.

Mobile phone as a source of ignition

I have been aware of several stories over the years about mobile (cell) phones starting fires at petrol stations. However, I have never seen any real evidence of them being true.

The website Snopes.com, which looks at rumours and urban legends, has studied it, and concludes there is no evidence to support any of the stories.

A useful reference for the next time the story is circulated.

Andy Brazier

Friday, March 20, 2009

Total liable for Buncefield blast

BBC website, 20 March 2009

More than three years after the explosion at the Buncefield oil depot in Hertfordshire, a judgement has been reached on which company was liable for the damages.

The depot was owned by Total and Chevron in a joint venture called Hertfordshire Oil Storage Ltd (HOSL), but was operated by Total.

The ignition of the vapour cloud which followed the spillage of 300 tons of petrol caused an explosion which measured 2.4 on the Richter scale.

The court's view was that "Total had failed to discharge the burden of establishing that HOSL was responsible for the negligence of the supervisor." This was based on the facts that:

* All those working at the site had contracts with Total;
* The terminal manager who was the most senior member of staff on site was appointed by Total and line managed by Total.
* All safety instructions were developed by Total.
* Total's head office staff failed to develop an adequate system for preventing the overfilling of a tank.

In a statement Total said: "We still believe... our joint venture partner should accept their share of the responsibilities for the incident.

"As a consequence we will be considering our grounds for appeal."

The Guardian was more damning of Total, blaming sloppy practices and inadequate risk assessment. Judge David Steel described the events leading up to the blast as "remarkable".

Des Collins, representing a number of claimants, said: "This judgment is a shocking indictment of the way in which this ultra-hazardous operation was conducted by Total. What is equally shocking is the degree of irresponsibility demonstrated by Total over the past three years in its failure to recognise the ultimate futility of the series of defences which it adopted."

The court listed various reasons for the explosion, including the negligence of supervisors and a series of failures in risk assessment and prevention. The judge was also critical of a "near miss" at the plant in August 2003.

"I am left with the clearest impression that practices within the control room were at best sloppy," said the judge.

The Financial Times quoted more from Justice Steel, including that "overall want of planning and monitoring all contributed to the disaster".

Also, that Total declined to call several key witnesses during the civil trial, including the two supervisors on duty at the time and its operations manager.

The only director of the depot’s operating company who did testify was found by the judge to be "somewhat evasive and unwilling to face up to the difficulties of reconciling his evidence with the contemporary material".

The court was told during trial that there had already been a “near-miss” at the site when a tank gauge stuck in August 2003.

The supervisors working on the night of the 2005 explosion were, "somewhat ironically", awarded certificates of competency less than a week before, the judgment notes.

Councils get banned jargon list

Widely reported in the press including BBC on 18 March 2009

The Local Government Association (LGA) has published a list of words they consider to be jargon and not suitable for use in documents issued to the general public.

LGA chairman Margaret Eaton said: "The public sector must not hide behind impenetrable jargon and phrases."

According to the BBC the 200 banned words are:

Across-the-piece

Actioned

Advocate

Agencies

Ambassador

Area based

Area focused

Autonomous

Baseline

Beacon

Benchmarking

Best Practice

Blue sky thinking

Bottom-Up

CAAs

Can do culture

Capabilities

Capacity

Capacity building

Cascading

Cautiously welcome

Challenge

Champion

Citizen empowerment

Client

Cohesive communities

Cohesiveness

Collaboration

Commissioning

Community engagement

Compact

Conditionality

Consensual

Contestability

Contextual

Core developments

Core Message

Core principles

Core Value

Coterminosity

Coterminous

Cross-cutting

Cross-fertilisation

Customer

Democratic legitimacy

Democratic mandate

Dialogue

Direction of travel

Distorts spending priorities

Double devolution

Downstream

Early Win

Edge-fit

Embedded

Empowerment

Enabler

Engagement

Engaging users

Enhance

Evidence Base

Exemplar

External challenge

Facilitate

Fast-Track

Flex

Flexibilities and Freedoms

Framework

Fulcrum

Functionality

Funding streams

Gateway review

Going forward

Good practice

Governance

Guidelines

Holistic

Holistic governance

Horizon scanning

Improvement levers

Incentivising

Income streams

Indicators

Initiative

Innovative capacity

Inspectorates

Interdepartmental

Interface

Iteration

Joined up

Joint working

LAAs

Level playing field

Lever

Leverage

Localities

Lowlights

MAAs

Mainstreaming

Management capacity

Meaningful consultation

Meaningful dialogue

Mechanisms

Menu of Options

Multi-agency

Multidisciplinary

Municipalities

Network model

Normalising

Outcomes

Output

Outsourced

Overarching

Paradigm

Parameter

Participatory

Partnership working

Partnerships

Pathfinder

Peer challenge

Performance Network

Place shaping

Pooled budgets

Pooled resources

Pooled risk

Populace

Potentialities

Practitioners

Predictors of Beaconicity

Preventative services

Prioritization

Priority

Proactive

Process driven

Procure

Procurement

Promulgate

Proportionality

Protocol

Provider vehicles

Quantum

Quick hit

Quick win

Rationalisation

Rebaselining

Reconfigured

Resource allocation

Revenue Streams

Risk based

Robust

Scaled-back

Scoping

Sector wise

Seedbed

Self-aggrandizement

Service users

Shared priority

Shell developments

Signpost

Single conversations

Single point of contact

Situational

Slippage

Social contracts

Social exclusion

Spatial

Stakeholder

Step change

Strategic

Strategic priorities

Streamlined

Sub-regional

Subsidiarity

Sustainable

Sustainable communities

Symposium

Synergies

Systematics

Taxonomy

Tested for Soundness

Thematic

Thinking outside of the box

Third sector

Toolkit

Top-down

Trajectory

Tranche

Transactional

Transformational

Transparency

Upstream

Upward trend

Utilise

Value-added

Vision

Visionary

Welcome

Wellbeing

Worklessness

Tuesday, March 17, 2009

Deadly rules

Article in The Guardian by Cath Janes on 14 March 2009

A refreshing article that explores some of the issues about health and safety being allowed to go over the top. Some excerpts below



"Our office has fire doors which we actually prop open with fire extinguishers. We know we shouldn't - but we do it anyway." These are the words of an office manager who wishes to remain anonymous.

The Health and Safety at Work Act (HSWA) celebrates its 35th anniversary this year, so health and safety should be second nature by now. But it's not. Employees continue to complain about the inconvenience of fire drills and computer monitor adjustments. Yet experts continue to point at the Health and Safety Executive's (HSE) reports of 2.1 million people suffering from illnesses they believe to have been caused or worsened at work.

"Health and safety should be a powerful unifying agenda between employers and the workforce, not a matter for confrontation," says Judith Hackitt, chair of the HSE. "The problem tends to be the misinterpretation of what is actually required." One recent initiative tries to dispel the idea that risk assessments need to be 10 pages or more for every task. "We have shown what's 'good enough' and that's all you have to do," says Hackitt.

David Symons, director at WSP Environment & Energy, a consultancy firm says "The problem is that health and safety is applied by people who don't have a deep understanding of what needs to be done. It's no wonder that it is seen as an impediment to the day job. It's not the legislation that's an issue, it's the implementation of it." "We are all adults," he says. "Let's just communicate the principles well. Communicate badly and it comes off as patronising. And if that's the case, and health and safety isn't being achieved, something has to be done about it."

"Paperwork is a sign of bad health and safety management," claims Lawrence Waterman, chairman of another consulting firm, Sypol. "If you are not rigorous in reviewing procedures you get a lot of bureaucracy and lose track of what you are asking people to do. Yes, it can be sensible to jot things down but there's a fine line between risk management and bureaucratic obstruction.

"That's why health and safety is a job for professionals. They can weave safety procedures through good business practice and not have it hanging about as a separate dynamic."

Duff, a business psychologist at Pearn Kandola, says: "Humans want to fight against those rules though. We like to be free and intuitive and follow our emotions.

"There's also a reason why health and safety isn't second nature. It's because humans are risk-takers. We are not naturally safe and don't like health and safety, or the people who implement it, because we perceive them to be rule-bound and boring. While their role is essential it is never going to appeal to us, because we don't like rules and regulations."

Which, in the fight against the ministry of the bleedin' obvious, is a snag. Are employees ever going to prove they don't need to be warned against sticking their wet fingers in plug sockets? Surely what lies at the heart of health and safety is common sense, and we all have that ... don't we?

"You hear people saying that it is all about common sense," agrees Duff, "The problem is, they don't use it. We are not rational beings and accidents are often the result of irrational behaviour. We think we are great at making our own rules, but we are not."

Is this still a reason to treat employees like children, though? Problems in the workplace often lead to demotivation, low productivity and withering loyalty. Health and safety is no exception. On one hand you are considered savvy enough to close a deal with a client, yet on the other you are considered a prime candidate for a box-lifting demonstration. It's little wonder health and safety rankles. It's almost a reminder that you are not as in control as you thought you were.

"Which is why risks should be managed in a proportionate way rather than wrapping people up in cotton wool and taking the fun out of life," warns Derek Draper, senior consultant at Connaught Compliance. "The bonkers conkers stories just trivialise health and safety and detract attention from the task of keeping people safe at work. Risk assessment needn't be complicated though. After all we do a subconscious risk assessment every time we so much as cross the road."

Hackitt, of the HSE, has a final suggestion. "Challenge your employer but do it constructively," she says. "Don't turn health and safety into a management versus workforce confrontation issue. Offer solutions or more common sense ways of approaching the problem. Remember, it's about doing what is sensible, reasonable and practical to reduce risk, not eliminating it, and still getting on with your job."

Andy Brazier

Monday, March 16, 2009

Oops, we did it again - Why we make mistakes

Book review in The Independent on 18 March 2009 by Sophie Morris

The book is Why We Make Mistakes by Joseph T Hallinan, an American Pulitzer Prize-winning journalist.

The book has attracted glowing reviews, with one critic predicting that it would change the face of mainstream behavioural science. Subtitled "How We Look Without Seeing, Forget Things in Seconds, and Are All Pretty Sure We Are Way Above Average", Hallinan's book is, according to its author, "a field guide to human error. People can look at it and see the mistakes they make, and find some of the reasons behind those mistakes."

The book says that error is not a personality or intelligence issue, but simply something to do with the way humans are designed. The very way we think, see and remember sets us up for mistakes. We are subconsciously biased, quick to judge by appearances and overconfident of our own abilities. Most of us believe we are above average at everything - a statistical impossibility that leads to slip-ups.

Until I have read the book I can't tell whether there is anything new here.

Andy Brazier

Night shifts spark cancer pay-out

Article on BBC website by Kenneth Macdonald on 16 March 2009

The Danish government has begun paying compensation to women who have developed breast cancer after long spells working nights. It follows a ruling by a United Nations agency that night shifts probably increase the risk of developing cancer.

For years there has been growing evidence that night shifts are bad for you. Effects include disturbed sleep, fatigue, digestive problems and a greater risk of accidents at work. Cancer is now being added to the list because there is a 'probable' link.

Dr Vincent Cogliano of the IARC said they reached their conclusion after looking at a wide range of studies of both humans and animals.

He said there was evidence to support the hypothesis that alterations in sleep patterns could suppress the production of melatonin in the body.

"Melatonin has some beneficial effects in preventing some of the steps leading to cancer," he said.

"The level of evidence is really no different than it might be for an industrial chemical."

What is not clear from this article is how big a risk factor night work is compared to others.

Andy Brazier

Friday, March 13, 2009

New computer to help cut hospital mistakes

Article on the Beattie Group website on 17 February 2009

A new computer has been launched in the UK that "could be the key to eliminating some of the 40,000 mistakes made each year in NHS hospitals".

The Panasonic CF-H1 Toughbook Mobile Clinical Assistant (MCA) has been developed in conjunction with NHS nurses to give them wireless access to patient notes at the bedside - spelling the end of the clipboard at the end of each bed.

It will:

* give nurses access to up-to-the-minute electronic patient records - spelling the end of the clipboard at the end of each bed;

* provide a series of security features aimed at minimising the room for human error on high-pressure wards;

* enable other clinical staff, such as doctors and pharmacists, to check up-to-date medical history at patients' bedsides, leading to quicker and better-informed decisions;

* read barcodes, cutting out any room for misreading labels and further ensuring that the right treatment is given to the correct patient at all times (a simple sketch of this kind of barcode check appears after the quotes below).

Jon Tucker, product head for the MCA at Panasonic, said: "Hundreds of mistakes are estimated to be made in hospitals every week at the moment, either through poor communication or basic human error.

"Nurses are often required to memorise information, like changes to medication, then input the details into a computer off the ward afterwards which can result in delays in data input or forgotten information.

"Disjointed communication with other departments and between shift workers has also been a cause of mistakes in treatment.

"This computer enables nurses to update a patient's central records at the bedside during ward rounds, dramatically reducing the potential for mistakes."

Quiet cars may need alert for pedestrians

Article by Tom Greenwood in Detroit News on 17 February 2009

The National Federation for the Blind is concerned that electric and hybrid cars are so quiet the blind and visually impaired could be killed or seriously injured by walking unknowingly in front of them.

Tom agrees and tells a story from the recent North American International Auto Show where he was investigating "green" technology. He says "Believe me when I say they were absolutely silent; my vision and hearing are fine, but I found myself looking over my shoulder to see if a vehicle was creeping up on me."

The NFB is advocating for quiet vehicles to be equipped to emit a continuous sound and wants additional research on the problem.

Scientists from the Human Factors and Ergonomics Society tested a number of visually impaired individuals and asked them which of six types of sounds -- engine, horn, hum, siren, whistle and white noise -- they preferred as warnings.

By far the most preferred sound was that of an automobile engine, followed by white noise and hum.

Monday, February 16, 2009

Incident Investigation: Rethinking the Chain of Events Analogy

Article on EHS Today by Allan Goldberg on 17 November 2003. It disputes the often-used notion that incidents occur due to a 'chain of events', suggesting the logic behind the chain may be its weakest link.

The safety profession often refers to a chain of events and then looks for the weak link as a means of identifying what went wrong that allowed the incident to occur. We then very often go further and identify a specific human error that was made, and the person who made it. That person, and/or what they did or didn't do, is thought of as a weak link in the sense of a "performance" chain. Rigid adherence to this way of thinking can lead to some significant errors in improving safety performance. We can and should avoid them.

There are three main problems that this traditional thinking about the chain of events analogy can lead to:

1. Incidents are not linear sequences; they are multivariable, meaning there are many different possible paths to an incident.

2. The "weakest link" approach implies that there is only one "main" cause for a given incident whereas most incidents have multiple causes

3. Looking for the weak link creates a focus on the point of failure, which is usually well removed from the best point of control. This leads to overemphasis on behavioral approaches and misses the true root causes.

Every link in a physical chain is in fact only connected to one other on each end. The real world chain of events, however, has many more "options" in terms of inputs and outputs. Breaking a single "link" will not necessarily preclude the end event from occurring.

Human actions are a combination of attitudes, beliefs, moods, training, awareness and many other factors. The point being, we may not respond to a given situation today the same way we did yesterday. The key idea here is that many sets of inputs and outputs are possibilities in incident causation. We must be very careful to avoid thinking about causation in a purely linear manner.
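
To make the contrast with a physical chain concrete, here is a minimal Python sketch (with purely illustrative contributing factors) that models causation as a directed graph and enumerates the distinct routes to the incident. Removing a single "link" still leaves other routes open:

# Sketch: incident causation as a graph rather than a chain.
# Edges point from a contributing factor to what it enables.
# (Illustrative factors only.)
causes = {
    "time pressure":     ["procedure skipped", "fatigue"],
    "procedure skipped": ["incident"],
    "fatigue":           ["error"],
    "poor design":       ["error"],
    "error":             ["incident"],
}

def paths_to_incident(node, graph, path=None):
    """Enumerate every route from a root factor to the incident."""
    path = (path or []) + [node]
    if node == "incident":
        return [path]
    routes = []
    for nxt in graph.get(node, []):
        routes += paths_to_incident(nxt, graph, path)
    return routes

for root in ("time pressure", "poor design"):
    for p in paths_to_incident(root, causes):
        print(" -> ".join(p))

# Breaking the single "link" 'procedure skipped' still leaves two
# other paths to the incident, via fatigue and via poor design.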

Root causes are likely to apply to a whole series of potential incidents, not just one event. These root causes are in fact the key to prevention of future incidents. And contrary to what all too many people may think, human error is not one of them! Human error itself is a symptom that there are other problems in the management of the work that is taking place. These error problems themselves have root causes. When a worker makes an error or fails to follow a procedure, there are reasons that set up the situation. These are the root causes that must be found.

Avoiding Pitfalls

1. Recognize the multivariable nature of incident causation.

2. Understand the Principle of Multiple Causes.

3. Realize the point of failure and the point of control are not necessarily the same. Seek to understand the problem as part of the overall system, and identify where the system itself can be best controlled.

Andy Brazier

Monday, February 09, 2009

Human error 'doing more harm than enemy'

Western Morning News on 26 January 2009

Air Chief Marshal Sir Jock Stirrup, the Chief of the Defence Staff (CDS) and head of the armed forces, is reported as saying that accidents and mistakes in combat zones do more to undermine British troops' fighting abilities than attacks by the enemy.

Sir Jock said that more than half of "accidents and incidents" which have led to troops being killed or injured on operations were down to human error.

The CDS also admitted that troops who make mistakes were too afraid to own up to their failings because of concerns they would be unfairly punished.

In an article for a Ministry of Defence publication, Sir Jock said that the absence of a "just culture" in the forces meant the military had failed to learn valuable lessons from its mistakes.

Since March 2003, 320 troops have died on operations and several thousand have been injured. And while the vast majority have been killed by enemy action, the CDS said that errors made by British troops had played a significant part.

In an article for "Desider", a magazine for the defence, equipment and support arms of the military, Sir Jock wrote: "Evidence shows that more than half our accidents and incidents are down to human factors. In other words, it is our people who are causing the most damage to our fighting capability. We must do something to drive down the number of accidents and incidents.

"One of the most effective ways of doing this is to promote a culture that encourages open and honest reporting that allows for a structured investigation of errors.

"This action should address all individual, systemic and environmental issues relating to an incident and allow us to learn from what took place.

"The actions and feedback will prevent us making the same mistakes again. It is the justness of what we do that gives rise to a just culture."

The CDS added: "To me, such a culture is based on trust. It suggests a working environment where individuals are encouraged to contribute to providing essential safety information and where they are commended for owning up to mistakes."

Sir Jock then asked: "Do we have a just culture in place? Is there a tolerant and non-punitive environment where mistakes can be admitted freely before they can cause an accident?

"My sense is that it is not as well established as it might be, nor as comprehensive as I would wish. The greatest challenge for senior leaders and those with command responsibility, including me, is to make a just culture a fact, not just an aspiration."

Sir Jock's comments come soon after the publication of a document in which General Sir Richard Dannatt, the Chief of the General Staff, revealed that 10 of the 89 soldiers killed in 2007 died in "entirely avoidable accidents".

New Study Shows Patient Safety Benefits of Ensuring Rest for Doctors

Article by Jennifer Anderson 2 February 2009 on ergoweb.com website.

The University Hospitals Coventry and Warwickshire NHS Trust conducted the new study, which was reported by the BBC.

Nineteen junior doctors working on the endocrinology and respiratory wards at the hospital participated in the 12-week study. Nine were put on a 48-hour per week pattern that met the conditions of the European Working Time Directive (EWTD) and 10 were on a traditional pattern, where they worked up to 56 hours.


Two senior doctors, who were unfamiliar with the shift patterns of both groups, reviewed their errors by checking case notes.

Doctors working to the EWTD pattern made 33 percent fewer errors than their colleagues on the traditional pattern, and there were fewer potentially life-threatening events.

Andy Brazier

'Human error' kills Google search

Article by David Walker 2 February 2009 at T3 gadget website

Apparently someone at Google added the single symbol '/' to the list of web addresses to be flagged as harmful. Since '/' appears in every web address, this left all sites in the Google universe classed as unsafe for a portion of Saturday afternoon (UK time), with the warning 'This site may harm your computer' appearing under the website name - even if you went to Google, Google Maps, Gmail or indeed any other Google site.

Initially, Google blamed the non-profit website StopBadware.org, but later changed tack and offered up a new statement taking full responsibility.
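
The reported mechanism is easy to reproduce in miniature. A minimal Python sketch, assuming (as reported) that entries in the flag list were matched against URLs, so that '/' matched everything - Google's real checking code is of course not public:

# Sketch of how a single '/' entry can flag every site
# (assumed substring matching; illustrative only).
badware_patterns = ["evil-downloads.example", "/"]  # the mistaken '/' entry

def is_flagged(url):
    return any(pattern in url for pattern in badware_patterns)

for url in ("http://www.bbc.co.uk/", "http://mail.google.com/"):
    if is_flagged(url):
        print(url, "- This site may harm your computer")
# Every URL contains '/', so every result gets the warning.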

Andy Brazier

Thursday, January 15, 2009

Prosecuting doctors won’t stop them making mistakes

Article in The National (Abu Dhabi) by Justin Thomas on 4 January 2009.

It appears that there are draft proposals for criminal sanctions to be brought against negligent doctors in Abu Dhabi.

Justin says that whilst he agrees healthcare professionals need to be held accountable, "such accountability and possible culpability, should only be a small part of a system-wide approach to reducing errors and improving patient safety. In fact, individually punitive measures are actually more likely to lead to a deterioration in patient safety rather than improvements."

All human beings make errors. Punishing people for slips and lapses does not improve safety or performance; if anything, it breeds resentment and fear, which in turn can lead to the development of a blame-culture, scapegoating and cover-ups.

The article says that several studies in the UK have estimated that at least 10 per cent of all hospital admissions result in adverse events, with 50 per cent of these mishaps being viewed as preventable. Also, the UK government estimates that annually there are 10,000 adverse drug reactions, 400 deaths involving medical devices, 28,000 complaints about medical care, and £400 million (Dh2.1 billion) paid out in clinical negligence settlements.

"In terms of improving patient safety, the dismissal or even incarceration of errant professionals will have little impact, and in some cases may even make the situation worse, especially if practitioners become defensive, risk averse and demoralised."

"The answer to preventing error and improving patient safety lies in the development of organisational safety cultures, where staff have an active awareness of the potential for things to go wrong and know about things that have gone wrong previously, as well as the circumstances and causes leading up to such incidents. Such a culture should actively encourage people to speak up about mistakes, with a view to learning from them and minimising the likelihood of a recurrence."

Andy Brazier

100 years of flight safety advances

A very interesting article from Flight International by David Learmount, published 5 January 2009

Well worth reading the whole article, but some of the key messages are summarised below.

Wilbur Wright wrote to his father: "In flying I have learned that carelessness and overconfidence are usually far more dangerous than deliberately accepted risks." In the 1930s, First World War pilot Capt A G Lamplugh described the risks of flying thus: "Aviation in itself is not inherently dangerous. But to an even greater degree than the sea, it is terribly unforgiving of any carelessness, incapacity or neglect." Both had clearly learnt that no activity can be completely risk-free, but that risk should be managed so as to remain within acceptable bounds.

Most safety lessons are learnt through experience. Father of the Flight Safety Foundation Jerry Lederer said in 1939 that "strange as it may seem, a very light coating of snow or ice, light enough to be hardly visible, will have a tremendous effect on reducing the performance of a modern airplane". The challenge has always been to disseminate learning: in January 2002 a Bombardier Challenger 604 business jet crashed on take-off from Birmingham. It had been left on the ramp overnight and had not been de-iced before take-off was attempted.

Airframes, engines and aircraft systems have continually become stronger and more reliable, but as these improved the aircraft could also fly faster, perform a greater variety of tasks, and operate in worse weather conditions.

As the machinery became more reliable it caused fewer accidents. The role of the human became the focus of those who would improve aviation safety, really starting in the 1970s and covering both on-board crew and maintenance.

Cockpit or flightdeck ergonomics started to improve in the 1960s, and really stepped up in the 1980s when cathode ray tube instrument displays (later replaced by liquid crystal displays) started to appear. This provided opportunities to improve crew situational awareness because data regarding performance and navigation could be integrated rather than being displayed as disparate pieces of data. This not only reduced the potential for individual confusion, but provided both pilots with the same picture of what was going on rather than allowing each to develop their own pictures that may not be identical.

In the 1970s KLM invented the concept of crew resource management (CRM) with the objective of improving the way crew communicated and worked together. This is now officially accepted globally as a critical part of multi-crew pilot training.

Technology alone has rarely eliminated a serious risk, but since the mid-1990s real progress has been made in reducing what had been the worst killer accident category - controlled flight into terrain. The ground proximity warning system (GPWS) has been replaced by Enhanced GPWS (EGPWS), which provides pilots with a graphic picture of their position and height relative to terrain, plus audio alerts. It is stated that there have been no incidents of controlled flight into terrain involving aircraft fitted with EGPWS, although 5% of the world's big jet airline fleet still does not have it.

The windshear alert was developed in the late 1980s after meteorologists improved their understanding of phenomena such as windshear and microbursts associated with storm cells, and how these can affect aircraft close to the ground just after take-off and on approach. Pilots' awareness of the risk has also been improved.

Information technology has allowed company and global databases of safety data to be developed. Downloading data from aircraft allows engineers to recognise where operational best practice has been breached and to spot the technical signs of impending equipment failure.

In addition, the adoption of safety management systems and global auditing of airlines has made its contribution. But it may be argued that liberalisation of the market has allowed greater competition and therefore greater passenger choice. Where there is a choice of another airline to fly with, a carrier that has suffered an accident also suffers commercially.

Andy Brazier

Formula 1's virtual reality

Article from grandprix.com by Joe Saward on 14 January 2009

Apparently Formula 1 has been using simulators for some time for testing technical components, and this will become more useful now that circuit testing has been banned in a bid to save the teams money. Highly-advanced rolling-road wind tunnels, transient dynos, seven-post rigs, Computational Fluid Dynamics (CFD) and computers that "crunch away to work out every conceivable race strategy" are being used, along with "driver-in-the-loop simulators" where the F1 drivers sit in "virtual" F1 cars and drive them.

It is suggested that use of simulator technologies in Formula 1 started when teams recognised that they could make money by working with computer gaming companies to create entertainment for the public. The first racing computer game was Gran Trak 10, released by Atari in 1974.

The article gives an interesting summary of the history of simulation.

"Modern simulation techniques can be traced back to the 1920s when an American engineer called Edwin Link, who had begun his career as a builder of organs and nickelodeons, used his knowledge of pneumatic pumps and valves to create the first flight simulator" He developed a device which became known as the Blue Box. It was an aircraft cockpit that the pilot sat in and was able to 'fly' using instruments alone - until this time learning to fly in cloud was done in the air and was known to be rather dangerous. The Blue Box produced pitch, roll and yaw motions which were controlled by the pilot. The Army Airforce made the first purchases in 1934 after a number of trainee pilot fatalities, and in the end 10,000 were sold with more than half a million aircrew from different nations using them to train.

The boom in civil aviation after World War II created a greater need and pneumatics were replaced by hydraulics in simulators by the 1960s. They incorporated "six degrees of freedom", which meant that the platforms on which the cockpits were mounted were able to generate roll, pitch, yaw motion plus surge (longitudinal), heave (vertical) and sway (lateral). Visuals were introduced, with the earliest versions using cameras that filmed models of the ground. By the 1970s wide-angled screens with film footage came in, to be followed by curved mirrors and ultimately plasma screens with virtual imagery.
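
For reference, those six motion axes are easy to write down explicitly. A minimal sketch of a platform state in Python - purely illustrative, not any simulator manufacturer's actual interface:

# The six degrees of freedom of a motion platform: three rotations
# and three translations. (Illustrative sketch only.)
from dataclasses import dataclass

@dataclass
class PlatformState:
    roll: float    # rotation about the longitudinal axis (degrees)
    pitch: float   # rotation about the lateral axis (degrees)
    yaw: float     # rotation about the vertical axis (degrees)
    surge: float   # longitudinal translation (metres)
    heave: float   # vertical translation (metres)
    sway: float    # lateral translation (metres)

# e.g. a braking cue: nose-down pitch combined with forward surge
print(PlatformState(roll=0.0, pitch=-3.5, yaw=0.0,
                    surge=0.15, heave=0.0, sway=0.0))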

Other uses of simulators have included army ground vehicles and automotive simulators used to understand how drivers behave in different situations. Today there are reckoned to be 1,200 professional flight simulators in the world.

Back in Formula 1: McLaren is believed to have spent as much as $40m on its system, using technology developed for the Eurofighter aircraft. The driver sits in a full-size F1 monocoque in front of a large, curved plasma screen. The whole device is mounted on a hexapod which moves around an area about the size of a professional basketball court in response to the driver's steering and pedal inputs. Conversely, Williams has used a fixed simulator which has been "amazingly cost-effective, with a budget of probably a tenth of what has been spent at McLaren." Apparently Williams can download data from practice sessions at the track into the simulator to try out different set-ups, which can then be applied to ensure the cars have the optimum set-up.

Andy Brazier

Surgical checklist 'saves lives'

Article on BBC website 14 January 2009.

A one-page checklist devised by the World Health Organization (WHO) has been tested in eight cities around the globe (Seattle, Toronto, London, Auckland, Amman, New Delhi, Manila and Ifakara, Tanzania). It focuses on basic good practice before anaesthesia is administered, before a patient is cut open, and before a patient is removed from the operating theatre, and is designed to promote effective teamwork and prevent problems such as infection and unnecessary blood loss.

Data was collected from 7,688 patients: 3,733 before the checklist was implemented and 3,955 afterwards. The rate of major complications fell from 11% to 7%, and the rate of inpatient deaths following surgery fell by more than 40%, from 1.5% to 0.8%. Findings were similar across all the hospitals in the study.
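
Those relative reductions are easy to verify from the quoted rates. A quick check in Python, using the rounded figures in the article:

# Quick check of the relative reductions from the quoted rates.
complications_before, complications_after = 0.11, 0.07
deaths_before, deaths_after = 0.015, 0.008

for label, before, after in [
    ("major complications", complications_before, complications_after),
    ("inpatient deaths", deaths_before, deaths_after),
]:
    drop = (before - after) / before
    print(f"{label}: {before:.1%} -> {after:.1%} ({drop:.0%} relative fall)")

# inpatient deaths: 1.5% -> 0.8% is a fall of about 47%,
# consistent with the article's "more than 40%".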

Dr Alex Haynes, who led the study, said the checklist had a significant impact at every hospital site in the study. "Even many clinicians who were initially sceptical of the idea became advocates once they saw the benefits to safety and consistency of care."

Dr Kevin Cleary, NPSA medical director, said: "The results of the study give clear evidence that a simple intervention leads to dramatic improvement in outcome for patients undergoing surgery."

UK Health Minister Lord Darzi is quoted as saying "The beauty of the surgical safety checklist is its simplicity" and "Operating theatres are high-risk environments. By using the checklist for every operation we are improving team communication, saving lives and helping ensure the highest standard of care for our patients."

The checklist is already in use in Scotland and the National Patient Safety Agency (NPSA) has ordered all hospitals in England and Wales to use it across the board by February 2010.

I'm a strong advocate of checklists for certain tasks, although their overuse can be counter-productive. What I don't understand is why it will take so long to get this implemented. The checklist is readily available for use. You can download it from the BBC website.

Andy Brazier

Monday, January 12, 2009

Take a Nap! Change your Life

A very interesting book by Sara Mednick. Available from Amazon

The book talks about how napping during the day to supplement night-time sleep is part of human nature. It makes specific reference to health and safety, including the role of fatigue in the Exxon Valdez accident.

A few snippets.

* In the 1950s studies were done where subjects were kept in small flats without windows or clocks. After a short transitional phase people would sleep six to seven hours at a time that would represent night, and roughly 12 hours later would return to bed for a shorter period. It is suggested that this is a natural sleep pattern.

* Before the light bulb was invented adults would typically get as much as 10 hours rest during the average weeknight. Today the average (in the USA, I presume) is 6.7 hours.

The book explains when to nap and for how long, but the basic message I take from it is that napping can have great benefits, with even short naps greatly reducing fatigue. I can't say I will follow the detailed advice particularly closely, but it has encouraged me to take more naps, whereas in the past I may have felt it a slightly silly thing to do. Also, it backs up advice I have given in the past for shift workers to have the opportunity to take short naps at work, especially when working nights.

Andy Brazier