Friday, December 05, 2008

The Fun Police

Cutting Edge documentary on Channel 4 last night (4 December 2008).

I don't think anyone in the health and safety profession ever expected this documentary to be particularly informative, and it wasn't. And it is no surprise that people on the internet today are saying they are disappointed, or even angry, about how the profession was represented.

But I have found five newspaper reviews of the programme, and I don't think any of them said it showed how ridiculous health and safety is. All noticed that the start of the programme laboured some of the issues, making it look as though health and safety people see danger everywhere. But all recognised that there are many serious issues, and that the people shown were doing the right thing.

The following quotes are taken from the various papers:

The Telegraph - "If the popular press is to be believed, they’re full-time killjoys and agents of an increasingly spineless culture. Yet watching them as they trudge from one premise to the next, harangued, mistrusted, occasionally shouted down, it’s quite heartbreaking. They may be meddling and almost neurotically preoccupied with catastrophic scenarios, yet on the evidence of this documentary, presented by health and safety expert Ed Friend, their intentions are noble. Perhaps it’s worth considering too, whether they are any more to blame for our risk-obsessed culture than insurers and an increasingly litigious public?"

The Guardian - "This film is nicely non-judgmental. It simply shows these health and safety people, going about their business, doing what they believe is right. And Ed certainly believes it. He's passionate about health and safety, if that's possible. He's not going to shut up about it until there are no more accidents. And even though he's clearly the most annoying and ridiculous man in the world, there's also something quite admirable in that."

The Times - "One of the inspectors said he gets angrier and angrier at the “absolute waste of human life” presided over by lazy companies. His job meant he was an “expert in human misery”. There would never be a recession in “health and safety” — sadly. "

The Independent - "For anybody approaching the film in a Littlejohn state of mind, there was plenty here to confirm any prejudices; being of a nervous, risk-averse disposition, I was less sure about the message. Most of what Mr Friend had to say about the dangers of everyday life wasn't entirely stupid; the joke lay in his bothering to point it out, and in his somewhat pedantic manner. This being TV, it strikes me as entirely possible that in pointing out danger on every hand, he was only doing what he had been asked to do ("Go on, Ed, show us a risk"). Even if he was as neurotic as the film made out, that hardly amounts to an argument about health-and-safety regulation in general. I don't suppose, either, that Ms McIlravey's anxieties about glue would seem quite so petty if you'd found that the glue on the back of your falsies was eating through your actual nails, which is apparently one of the possibilities."

The Herald - "'There is no health and safety in this country,' Ed stated, 'only accidents and ill health.' The Fun Police convinced you we need more Ed Friends, not fewer."

My view is that the programme was pretty boring and a missed opportunity. I doubt it will merit further consideration, unless Ed Friend becomes a TV celebrity as a result.

Andy Brazier

Thursday, December 04, 2008

How to influence people at work

Article in the Times by Carly Chynoweth on 3 December 2008. It gives 10 points, as follows:

1. Build rapport - people enjoy doing business with people they like. Take time to ask about family, holidays etc. It smooths the way, just like WD40;

2. Earn respect - this is better than being liked. Showing you know what you are doing and are a good leader, and able to bring a sense of order and cohesion will earn you respect;

3. Get your message right - clear and concise. When under pressure we tend to add lots of extraneous words (e.g. 'I hope you don't mind');

4. Get things in context - understand the cultural and professional constraints that others are working under, so you can avoid the barriers to progress they create;

5. Open your ears - listening to others means you will know what motivates them and hence how best to influence them;

6. Reciprocity matters - doing something helpful for someone means they will want to help you back (e.g. a Big Issue seller sold far more copies when he held the door open for people);

7. Get the timing right - talk to people when they are least likely to be pressured;

8. Give up the glory - let others take the credit and do not get emotionally attached to your ideas;

9. The element of style - the way you look and act has a big influence, but beware of changing your style to try to suit the circumstances;

10. Don't manipulate - direct people but let them work out the path for themselves.

Andy Brazier

OECD-CCA Workshop on Human Factors in Chemical Accidents and Incidents

Another long report from the Organisation for Economic Cooperation and Development (OECD), published in 2008 and available free from the OECD website.

The report presents the main output of the OECD-CCA Workshop on Human Factors in Chemical Accidents and Incidents, which took place on 8 and 9 May 2007, in Potsdam, Germany. The overall objective of the workshop was to explore human factors related to management and operation of a hazardous installation, and to share information on assessment tools for analysis and reduction of human error in the chemical industry, including small and medium size enterprises (SMEs).

Seems to provide a good summary of issues, but I can't see anything particularly new.

Andy Brazier

Guidance on Developing Safety Performance Indicators

Guidance published by the Organisation for Economic Cooperation and Development (OECD), specifically related to chemical accident prevention, preparedness and response. The second edition was published in 2008 and is available free from the OECD website.

It is a very long document, and so will take some studying. But it lays out why you want Safety Performance Indicators (SPI) and how to develop them. Also, there seem to be lots of examples of indicators to use.

The introduction reads "Safety Performance Indicators (“SPIs”) provide important tools for any enterprise that handles significant quantities of hazardous substances (whether using, producing, storing, transporting, disposing of, or otherwise handling chemicals) including enterprises that use chemicals in manufacturing other products. Specifically, SPIs help enterprises understand whether risks of chemical accidents are being appropriately managed. The goal of SPI Programmes is to help enterprises find and fix potential problems before an accident occurs.

By taking a pro-active approach to risk management, enterprises not only avoid system failures and the potential for costly incidents, they also benefit in terms of business efficiency. For example, the same indicators that reveal whether risks are being controlled can often show whether operating conditions are being optimised."

The Guidance divides SPI into two types: "outcome indicators" and "activities indicators."

* Outcome indicators are designed to help assess whether safety-related actions (policies, procedures and practices) are achieving their desired results and whether such actions are leading to less likelihood of an accident occurring and/or less adverse impact on human health, the environment and/or property from an accident. They are reactive, intended to measure the impact of actions that were taken to manage safety and are similar to what are called “lagging indicators” in other documents. Outcome indicators often measure change in safety performance over time, or failure of performance. Thus, outcome indicators tell you whether you have achieved a desired result (or when a desired safety result has failed). But, unlike activities indicators, they do not tell you why the result was achieved or why it was not.

* Activities indicators are designed to help identify whether enterprises/organisations are taking actions believed necessary to lower risks (e.g., the types of policies, procedures and practices described in the Guiding Principles). Activities indicators are pro-active measures, and are similar to what are called “leading indicators” in other documents. They often measure safety performance against a tolerance level that shows deviations from safety expectations at a specific point in time. When used in this way, activities indicators highlight the need for action when a tolerance level is exceeded.

Thus, activities indicators provide enterprises with a means of checking, on a regular and systematic basis, whether they are implementing their priority actions in the way they were intended. Activities indicators can help explain why a result (e.g., measured by an outcome indicator) has been achieved or not.
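As a rough illustration of how an activities indicator works in practice, here is a minimal sketch. The maintenance-completion metric and the 95% tolerance level are hypothetical examples of my own, not figures from the OECD Guidance:

```python
# Hypothetical activities indicator: the fraction of safety-critical
# maintenance tasks completed on time, checked against a tolerance level.
# When the tolerance is breached, the indicator highlights the need for action.

def activities_indicator(completed_on_time: int, total_due: int,
                         tolerance: float = 0.95):
    """Return (indicator value, whether action is needed)."""
    value = completed_on_time / total_due
    return value, value < tolerance

value, action_needed = activities_indicator(completed_on_time=90, total_due=100)
print(f"indicator = {value:.2f}, action needed = {action_needed}")
```

Here the indicator value (0.90) has fallen below the tolerance (0.95), so the check flags that action is needed. An outcome indicator, by contrast, would track whether accidents or near-misses actually fell over time.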

Andy Brazier

Simulation will increasingly be used to train pilots for optimum operations

Very interesting article by David Learmount of Flight International, published on the Flight Global website on 25 November 2008.

It refers to an analysis of global airline safety data by the UK Civil Aviation Authority (CAA) that said "pilot judgement, decision-making and/or handling are key factors in 75% of catastrophic accidents, whereas technical failures tend to be causal in the less serious events." This is despite advancing technology and improved aircraft, and is expected to remain the case for the foreseeable future. However, the article does say "It is important to note that this [statistic] does not imply that the pilot was at fault or to blame, because it is now well-established that 'pilot error' cannot continue to be the scapegoat for the many and various factors that can lead to the error occurring."

Quality pilot training at all levels remains the critical factor in preventing the most serious accidents. Whilst some airlines are beginning to use simulation to improve crews' wider operational and flight management skills, there is a danger that this comes at the expense of "raw" flying practice, which pilots get less of operationally because of the high degree of automation. Apparently ongoing studies "show that handling skills degrade with time, while cognitive skills are less time-sensitive, and that recent manual flying practice does improve manual flying performance."

The CAA data "highlights the crucial importance of pilot performance in safety, and therefore reminds us to invest resources in anything that might support it - [for example] training and simulation facilities - and to minimise influences that might adversely contribute, [like] time pressure, fatigue, and distraction."

An interesting development is that airlines are using simulators not only to train pilots to fly and manage aircraft, but also to fly procedures specific to their own requirements, to reduce costs and increase the operational efficiency of the airline. An example is Emirates, who are looking beyond using simulators to meet regulatory requirements, employing them also to hone crew decision-making in situations where the options are equivalent purely from a safety point of view, but where one of the outcomes will be the more efficient.

The article suggests "tier-two" airlines are bringing more training in-house, and are getting quite sophisticated in the business case analysis they are undertaking with respect to their operations: "They don't have this barrier to change that the tier-one airlines do. I'm sorry but I haven't seen, for example, British Airways, innovate when it comes to this stuff - they just won't."

Andy Brazier

AWG slashes picking errors with Voice

I'm always interested in claims of quantifiable reductions in human error. In this press release BCP, a systems and software house specialising in the retail and wholesale distribution markets, says one of its clients has reduced picking errors by 97.3%, saving over £100,000 per year.

The picking they refer to is people taking items from shelves in a warehouse to fulfil orders. In the past these people were given a written list of items. This has been replaced by automated voice instructions delivered to headphones via a wireless system. Pickers have to confirm their understanding by repeating each instruction back, which is checked using voice recognition.
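The read-back check can be sketched roughly as follows. This is only an illustration of the idea; the function names and the plain string comparison (standing in for real speech synthesis and recognition) are my assumptions, not details of BCP's system:

```python
# Illustrative sketch of a voice-picking confirmation loop. In the real
# system, text-to-speech and voice recognition replace the plain strings.

def build_instruction(item: str, quantity: int) -> str:
    """The instruction spoken to the picker's headset."""
    return f"pick {quantity} of {item}"

def pick_confirmed(instruction: str, read_back: str) -> bool:
    """Accept the pick only when the picker's read-back matches the
    instruction (here a simple normalised string comparison)."""
    return read_back.strip().lower() == instruction.strip().lower()

instruction = build_instruction("item 4711", 3)
print(pick_confirmed(instruction, "Pick 3 of item 4711"))   # True: matching read-back
print(pick_confirmed(instruction, "pick 2 of item 4711"))   # False: mismatch, re-issue
```

The point of the design is that the error check happens at the moment of picking, rather than relying on a later visual check of a paper list.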

The following quote regarding user acceptance gives an insight into why this has been successful - "Pickers, initially sceptical, have adapted to Voice quickly, finding it simple to learn and adopt. Assemblers - even those who’ve been with us for years and very settled into the old system - really like the new technology. They’re keen to extend it to other warehouse activities as they’ve found it makes their jobs much easier."

Andy Brazier

Thursday, November 20, 2008

The Hawthorne Effect

I had cause to refer to this recently, and struggled to remember the name. Hopefully a summary here will help me remember in future.

Studies carried out at the Hawthorne Works (outside Chicago) between 1924 and 1932 showed that changes in the working environment could improve productivity. But the improvement was only short-lived, leading to the conclusion that people were responding to the fact that something had changed (and that they were being observed), and not to the change in the environment itself.

There is an article on Wikipedia.

Tuesday, November 18, 2008

Floods, fire and theft rank with ill-advised cost-cutting

Article in the Risk Management supplement of the Financial Times by Andrea Felsted on 18 November 2008.

As the economic outlook for the UK becomes more gloomy, the risks that businesses face multiply because, according to Peter Jackson, sales and marketing director for Aon Risk Services in the UK, "when times are tougher, the worst is more likely to happen."

One reason the risks are higher is that businesses hold more stock, because it is difficult to shift. Financial constraints may tempt them to cut back on insurance, but they would actually lose more if there were a fire or a theft.

Other cost savings may include cutting back on maintenance, replacing equipment less regularly, or repairing plant rather than replacing it. Directors need to be particularly aware of the risks, as they may be held personally responsible if someone dies as a result of equipment failing. Companies should have a clear idea of the minimum level of maintenance spend they can live with, rather than just cutting incrementally and seeing what goes wrong.

John Scott, head of risk insight at Zurich Financial Services UK, says that when companies cut costs “management often take their eye off the ball on the simple, really key stuff around health and safety. We often see an increased trend of workplace injuries. That is something that is a consequence of tightening belts. Well-managed companies try to do both. They try to cut back but without jeopardising staff safety.”

Research commissioned by FM Global, an insurer of commercial and industrial property, suggests that risk management is not an area where costs should be cut.

Some 71 per cent of UK investment analysts it polled believed that companies should pay more attention to their risk management activities during the next five years.

Thursday, November 13, 2008

Just culture

A flow chart for determining an individual's culpability following an unsafe act has been around for some time. I think it was developed by Professor James Reason, and I know a few companies use it (or at least claim to) to guide their disciplinary processes.

Anyway, I was looking for a copy of the chart, which was proving difficult. I eventually found it in a paper titled 'A Roadmap to a Just Culture: Enhancing the Safety Environment', published in September 2004. It was prepared by the Global Aviation Information Network (GAIN) working group for safety information sharing.

I think the paper's foreword by Reason gives a very good account of the issues:

"The term ‘no-blame culture’ flourished in the 1990’s and still endures today. Compared to the largely punitive cultures that it sought to replace, it was clearly a step in the right direction. It acknowledged that a proportion of unsafe acts were ‘honest errors’ (the kinds of slips, lapses and mistakes that even the best people can make) and were not truly blameworthy, nor was there much in the way of remedial or preventative benefit to be had by punishing their perpetrators. But the ‘no-blame’ concept had two serious weaknesses. First, it ignored - or, at least, failed to confront - those individuals who wilfully (and often repeatedly) engaged in dangerous behaviours that most observers would recognise as being likely to increase the risk of a bad outcome. Second, it did not properly address the crucial business of distinguishing between culpable and non-culpable unsafe acts."

Andy Brazier

Thursday, November 06, 2008

Report into Morecambe offshore helicopter crash

The accident occurred on 27 December 2006 whilst the helicopter was approaching a gas platform in Morecambe Bay. The two pilots and five passengers were killed.

Report No: 7/2008. Report on the accident to Aerospatiale SA365N, registration G-BLUN, near the North Morecambe gas platform, Morecambe Bay on 27 December 2006.
Report published 23 October 2008 by the Air Accidents Investigation Branch and available at their website.

The report suggests the co-pilot who was flying the helicopter on its approach to the gas platform became disorientated, probably due to darkness and weather. The handover of control to the pilot was not precise, and the pilot himself was not ready to take control. This meant there was not enough time to work out what to do before the helicopter hit the water at a speed that was not survivable.

This is a difficult accident to comment on. We usually look for root causes in systems and organisations so that we can make good recommendations. This accident is an example of how individuals can fail, and sometimes those failures will result in accidents. In other words, if we accept risk we sometimes have to accept tragedy.

The report makes six recommendations, but none to my mind are particularly earth-shattering: they ask for reviews of standard operating procedures and suggest some areas for research. A comment is made that a simulator was available for training but had not been used. No recommendation is made on this, and given the level of experience of the pilots in this case it is difficult to see whether such training would have had much impact on the risks.

Andy Brazier

Monday, October 13, 2008

Legal Professional Privilege

An issue came up with a client of mine recently where there appeared to be some confusion regarding legal professional privilege. Luckily I stumbled upon a document on the HSE website which seems to sum things up quite nicely. From this I understand that the only documents anyone can claim to be privileged are those where a client asks their solicitor for legal advice. Any other document (e.g. an accident investigation report, audit report etc.) will have been produced for a different purpose, and so an HSE inspector could demand a copy if it was considered essential for their investigation.

The text from HSE, aimed at HSE inspectors, is shown below.

33. Your powers under section 20 cannot compel the production of documents which are entitled to be withheld on grounds of legal professional privilege.
34. Legal professional privilege extends to communications, statements, reports and information created during the course of a solicitor-client relationship, the broad purpose of which is the obtaining and giving of legal advice.
35. If a document was created for several purposes, it will attract privilege only if the dominant purpose was obtaining legal advice. Such advice may relate to criminal or civil proceedings, actual or contemplated. Legal privilege attaches to communications between solicitors and expert witnesses, but not to the expert's opinion on the case or the documents or objects on which the expert based the opinion.
36. The privilege is that of the client, so that the client is entitled to waive privilege and show you the document or use it in evidence.
37. If a company has prepared a report on an accident, for example, this will be privileged if the dominant purpose was the obtaining of legal advice, but it will not be privileged if it is prepared simply because there has been an accident, or for avoidance of further accidents. You should remember, however, that you would only be entitled to see such a report if it is 'necessary'. An engineer's report that was obtained for the purpose of deciding whether to contest proceedings (civil or criminal) would be privileged.

Monday, September 29, 2008

Technology that eliminates error

I am always on the lookout for claims of technology that eliminates the chance of error. What they all seem to overlook is that, just because one type of error may be eliminated, a new type is usually introduced that may actually be worse. I'd say in most cases the chances of recovering from an error are greatly reduced. These downsides of technology have been known about for many years, but still seem to be overlooked. Also, I very much doubt many of the bold claims are ever properly checked against actual experience.

Here are a couple I have found recently.

Air Products Uses Masternaut Satellite Tracking, 18 September 2008. Peter Birdsall, UK Transport Manager, sees this product as "eradicating any chance of human error." I presume he means that satellite tracking combined with customer order information means delivery drivers cannot turn up at the wrong location. However, what about errors entering the data? If the wrong information is entered into the system, I would say it is very unlikely anyone would notice.

Kelsius wireless monitoring, 15 September 2008. Apparently this "monitoring to internet solution removes human error." It relies on wireless sensors placed in fridges which send information to a centralised database. The information can be used to prove compliance, and an alarm is raised if there is a problem with a fridge. However, how do you know the sensor is in the right place, or that the alarm points are set correctly? And will this stop the visual checks?

Hospitals purchase blood tracking system, 13 September 2008. Use of bar codes on blood used for transfusion is seen as "eliminating the sources of human error." But you still need to make sure the right bar codes are attached and the correct information stored; and with this system in place I am pretty sure most visual checks of data will stop, or become far less effective.

Andy Brazier

Full Disk Encryption should be a legal requirement

Article by PR Artistry at Source Wire on 9 September 2008

It relates to the many recent stories of sensitive data going missing. This would not be such a problem if the data had been encrypted.

Marc Hocking, Chief Technology Officer of BeCrypt, is quoted as saying "If security is too cumbersome people will find a workaround." In other words, it is no good simply telling people they must encrypt data, or providing a technical solution that takes time and effort.

Hocking goes on to say "encryption technology is now available that is easy to roll out to all computers and data storage devices within an organisation, it can be centrally managed and it is transparent to the end user, so it does not affect their ability to do their job."

Andy Brazier

Formula sickens New Zealand babies

Article by Catherine Woulfe in the Sunday Star Times on 21 September 2008

Heinz changed the supplier of its branded formula baby milk, which meant the ingredients changed. A sudden change to a baby's diet can make them unwell, and this happened in quite a number of cases.

Heinz published an unreserved apology, stating that "human error had let the 400 cans of changed formula slip onto shelves with no warning." But Heinz's idea of a warning was a leaflet under the can lid. When someone is buying a brand they have been buying for some time, and the packaging remains the same (as it did in this case), would anyone actually read the leaflet? Pretty poor management of change, especially given the sensitivity of the product.

Andy Brazier

"Worst ever decision"

Widely reported, including by Tom Cary in the Telegraph on 22 Sep 2008

Reading were awarded a goal in their match at Watford on 20 September. Unfortunately everyone except the referee and his assistant could see the ball went several metres wide. It seems the result will stand and there will be no replay.

Andy Brazier

The economics of ergonomics

Article by Mike Kind, published on September 26, 2008.

The article states that over half of employees who use computers for at least 15 hours per week report musculoskeletal disorder (MSD) issues in the first year of a new job. MSDs account for:
* 50 per cent of all lost work days;
* over $61 billion per year in lost productivity for U.S. companies;
* approximately $20 billion annually in benefits paid out for these issues;
* an average cost of $27,700 per work-related MSD.

MSDs are injuries to muscles, tendons, ligaments, joints, cartilage, nerves, blood vessels and the intervertebral discs of the spine. They vary from simply annoying to crippling and disabling.

The five main ergonomic risk factors are:

* High rate of movement repetition
* High forces
* Poor, deviated work postures
* High contact stress
* High vibration of part of the body, especially in cold conditions

Andy Brazier

Monday, July 07, 2008

UK: Slow Mailing Invalidates 500 Speed Camera Tickets

Article published 7 June 2008.

Police in Sussex, England have been forced to cancel 500 speed camera citations after an employee loaded the wrong envelopes into the postal machine. Delivery was slowed enough that a court could have invalidated many of the tickets for lack of proper notice.

"Some notices were sent by second-class post as a result of human error in loading the postal machine... we've canceled all relevant tickets," Chief Superintendent Peter Coll explained.

Using second-class post saved £45 in postage costs, but the cancelled tickets were worth £30,000 in revenue.
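Putting the reported figures side by side makes the imbalance obvious:

```python
# Figures as reported in the article
postage_saved = 45           # £ saved by sending notices second class
tickets_cancelled = 30_000   # £ of ticket revenue lost when tickets were cancelled

net_cost = tickets_cancelled - postage_saved
print(f"Net cost of the error: £{net_cost:,}")  # Net cost of the error: £29,955
```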

Seems a bit strange to me. The police have 28 days to get the notices to drivers. They must be sending them late for this error to have caused a problem.

Andy Brazier

Friday, June 20, 2008

A crisis of enforcement: The decriminalisation of death and injury at work

A paper written by Professor Steve Tombs and Dr David Whyte, June 2008. It is available from the Crime and Justice website.

According to the paper, "At least twice as many people die from fatal injuries at work than are victims of homicide." The figure of 1,300 work-related fatalities was calculated from HSE data for workplace fatalities (241 for the year 2006-07) plus an estimate of the road deaths that are work-related (considered to be about 1,000 of the 3,500 killed on UK roads each year). This is compared with 765 homicide victims.
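For what it's worth, the quoted components sum as follows (1,241, close to the 1,300 figure the paper uses):

```python
# Figures as quoted in the paper
hse_workplace_fatalities = 241    # HSE workplace deaths, 2006-07
work_related_road_deaths = 1000   # estimated work-related share of ~3,500 UK road deaths
homicide_victims = 765

total_work_related = hse_workplace_fatalities + work_related_road_deaths
print(total_work_related)  # 1241
print(total_work_related > homicide_victims)  # True
```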

The report argues that the recent trend towards 'light touch' regulation of business has in effect 'decriminalised' death and injury at work. Also, a reduction in the capacity of bodies such as the Health and Safety Executive to inspect businesses and take appropriate action, due to budget and job cuts, has led to a situation where the vast majority of the most serious injuries, as well as many deaths, are not subject to any form of investigation. This raises questions about whether the current policy preoccupation with 'conventional' crimes such as homicide, street violence and theft should be complemented by a much greater focus on workplace crimes and harms.

Professor Steve Tombs said "Violent street crime consumes enormous political, media and academic energy. But, as hundreds of thousands of workers and their families know, it is the violence associated with working for a living that is most likely to kill and hospitalise."

Dr David Whyte said: "HSE enforcement notices fell by 40% and prosecutions fell by 49% between 2001/02 and 2005/06. The collapse in HSE enforcement and prosecution sends a clear message that the government is prepared to let employers kill and maim with impunity."

The report is described as being part of a project that aims to "stimulate debate about what crime is, what it isn’t and who gets to decide." That being the case I hope the authors are not offended by my comments that follow.

I find the logic of the report very difficult to follow. I just can't see a comparison between work-related accidents and murder being valid or useful. Equally, if I were arguing such a case I would have thought road accidents, or even work-related health issues, would be far more interesting to investigate, as both kill far more people than workplace accidents do.

I think the argument the report is trying to make is that more HSE inspectors are required so that more accidents can be investigated and more companies prosecuted. But there is no attempt to show that falling numbers of inspectors have actually resulted in more accidents.

Also, I think the authors feel that prosecuting more companies will inevitably improve safety. I am not sure this is the case. I believe our aim in safety is to learn from accidents. Any suggestion of prosecution immediately creates an adversarial situation. Hence, rather than sharing information and learning, the company has to construct a defence, which will almost always reduce the information made available.

I'd say the article is interesting and could create a good debate. But personally, I do not agree with the general theme.

Andy Brazier

Monday, June 16, 2008

Overzealous health and safety consultants

The Select Committee on Work and Pensions Third Report examines the interpretation of health and safety legislation. It is available on the Parliament website and was published 2 April 2008.

Paragraph 75 reads as follows:

"A number of witnesses suggested that a key issue for employers was that the risk assessment process was often over-burdensome and it was argued that this could be exacerbated by the approach of some health and safety consultants and advisers. Mr Richard Jones, Policy and Technical Director at IOSH, the professional body that represents health and safety consultants, explained that IOSH has had informal discussions with HSE and was told that inspectors had raised concerns about the credibility of the evidence used by some consultants to form the basis of risk assessments. [74] Lord McKenzie of Luton, Parliamentary Under Secretary of State at DWP also acknowledged this was an issue saying, "I think it is certainly a fact that this happens and there is a lot of evidence and information to suggest that it does."

I am sure it is true there are some consultants out there who are overzealous, but that sounds like scapegoating to me. The most overzealous people I come across are in-house advisors who quote regulations and fail to provide any practical advice. Consultants only get paid when they deliver, so I really don't believe they are the root cause of the problem. It seems to me that clients get what they ask for, and often consultants are given little scope to drive improvement.

Andy Brazier

Judith Hackitt - Closing remarks at Major Hazards conference

Judith is HSE Chair. She was speaking at the end of HSE's major hazards conference on 29 April 2008 at the QEII Centre, London. The conference was attended by senior managers from a number of large companies. I believe it was partly in response to the BP Texas City accident.

Judith's key messages are shown below. They are available from the HSE website:

1. Process Safety cannot be managed or led from the comfort of the Boardroom. Real leaders have to demonstrate their commitment by walking the talk – which means going out and seeing for themselves. All too often senior managers and directors are far too detached from the reality of what is actually taking place on the ground.
2. If the people on your Board don’t know about/understand process safety, then they must learn. We cannot assume that Board members understand the concept. This is not something which can be delegated. You are responsible and you must lead, and to lead you must understand.
3. This is not about glossy volumes of procedures and management systems - it’s about listening to the people at the coalface who really know what’s going on. Procedures which look wonderful but are not being followed in practice are no use. Whatever system is in place has to be geared to ensuring safe operation – not to creating good impressions – whether that be for the senior management of the organisation or indeed your regulators.
4. We have heard also that every Board needs to consider what the real vulnerabilities are and address them – and they also need to know that it is OK to seek help and advice from others – that’s also part of real, honest leadership.

We’ve heard about the importance of consistency – leadership credibility takes a long time to build but an instant to lose with one inconsistent decision – “production comes before safety, just this once” simply will not do – the whole culture will be destroyed.

Errors in Medicine Administration: How Can They Be Minimised?

Article on the Red Orbit website, 14 June 2008, by Ramya Venkatraman and Rajaraman Durai.

Errors in medicine administration can be lethal. Neonates, patients receiving chemotherapy and confused elderly patients who are receiving more than five medicines seem to be most vulnerable.

The route of administration and dosage of medicines are of vital importance. Common causes of errors in medicine administration include:

* inattention
* haste
* medicine labelling error
* communication failure
* fatigue (Abeysekera et al 2005).

The Department of Health's document An Organisation with a Memory (DH 2000) reports that 850,000 adverse events may occur each year in the NHS, costing more than £2bn.

To reduce the risk of error, medicines should be prepared for only one patient at a time. Intravenous (IV) medicines should not be prepared at the same time as medicines to be administered via other routes (for example, nasogastric (NG), oral or intrathecal). All medicines whether they are administered via NG or IV should always be clearly labelled with the patient's details including name of the medicine, the dose and the route of administration to avoid confusion. To reduce errors, high risk medicines should be checked with a second qualified person and signed on the prescription chart. This second person should check that it is the correct medicine, the correct dose, the correct frequency, the correct route of administration and the correct patient.
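The second-person check described above amounts to comparing five fields between what has been prepared and what is on the prescription chart. A minimal sketch of that check, with all names and data hypothetical rather than taken from any clinical system:

```python
from dataclasses import dataclass

@dataclass
class Prescription:
    patient: str
    medicine: str
    dose: str
    route: str
    frequency: str

def five_rights_check(prepared: Prescription, chart: Prescription) -> list[str]:
    """Return any of the five 'rights' where preparation and chart disagree."""
    fields = ("patient", "medicine", "dose", "route", "frequency")
    return [f for f in fields if getattr(prepared, f) != getattr(chart, f)]

# Mirrors Case 1 below: right patient, medicine and dose, but the wrong route
chart = Prescription("A. Patient", "paracetamol", "1 g", "nasogastric", "4-6 hourly")
prepared = Prescription("A. Patient", "paracetamol", "1 g", "intravenous", "4-6 hourly")
print(five_rights_check(prepared, chart))  # ['route']
```

The point of the second checker is exactly this kind of field-by-field comparison: a single mismatch is enough to stop the administration.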

A Spoonful of Sugar (Audit Commission 2001) discussed medicine management in NHS hospitals. The Audit Commission suggests:
* Induction and training of junior doctors regarding medication prescribing and error reporting.
* A focus on near misses to avoid repetition.
* The use of computer technology for avoiding errors from illegible prescribing.
* The integration of clinical pharmacists into clinical teams.

Recommendations from An Organisation with a Memory (DH 2000):
* Avoiding the use of unsafe abbreviations.
* Reducing polypharmacy.
* Periodic medication reviews.
* Inclusion of the indication for all medications.
* Reading out the prescription and explaining the need to patients.

Electronic prescription systems ('e-prescribing') are a new concept that may help to avoid administering wrong medicines and wrong doses.

Errors in medicine administration can be minimised by applying a systematic approach to administration. Safe administration requires that the correct patient is identified against the prescription (noting allergies and sensitivities), checking the dose with BNF or a pharmacist when there is any doubt, double checking the medications with another qualified staff member together with regular education of staff about the importance of reporting all near misses and adverse events.

Case 1 Route of administration error

A 16 year-old boy presented with polytrauma including a pelvic fracture. A nurse gave soluble paracetamol (1 gram) intravenously by error instead of the nasogastric route. Fortunately, the patient recovered after a few hours without any intervention.


The temporary (bank) staff member was tired. The registrant was unfamiliar with medicines handling and administration. The patient and family were informed fully about the incident, and the bank staff member was cautioned. A decision was made to ensure that all qualified bank staff had undertaken appropriate medicines management training and were competent to administer medicines within that clinical setting.

Case 2 Dosage error

A ventilated 27 week-old premature baby suddenly deteriorated. On examination the baby was found to be inadequately sedated and was trying to take breaths against the ventilator. The ventilator was set to volume control rather than pressure control.


Under stress, the nurse did not label the medications, even though she had completed an IV study day. All nursing and non-nursing qualified staff should receive training and complete competencies before administering any medicines by any route. Wherever possible two registered staff should check medicines for intravenous administration - one of whom should also be the registrant who administers the medication (NMC 2007). After this incident, the nurse was cautioned. The nurse and her colleague who countersigned the CD register had to undergo further training on IV medicine administration and successfully complete their drug administration competency (administration under supervision) and medicine calculations before they were allowed to administer medicines without supervision.

Case 3 Error in frequency of administration

A 30 year-old female, who underwent fixation of a fractured toe with a K-wire, vomited twice in the postoperative period. She had received two doses of cyclizine 50mg at eight hour intervals, as recommended in the Special Product Characteristics (SPC). The nurse on the night shift gave a third dose of cyclizine one hour after the second dose, without checking the time of the previous dose.


Even though the staff member was fully trained, tiredness and stress caused her to make an error. The staff member was cautioned.

The article concludes by saying "Errors can be minimised by applying a systematic approach to administration." It is interesting to note that the case studies all talk about "cautioning" staff and retraining. I am not sure this shows any systems were improved.

Andy Brazier

Red Faced Council Got Its Sums Wrong

Article in the Press and Journal on 10 June 2008 by Jamie Bachan.

Aberdeenshire Council has had to apologise after figures it released in response to a freedom of information request were dramatically wide of the mark. When asked to provide details of the amount spent on agency and temporary workers in the last financial year, it came up with a figure of just over £27.3 million.

The actual figure was £4,036,922. The council said "This was down to human error where the cumulative figures for each month were added together, rather than individual monthly figures."
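The council's explanation, adding the year-to-date cumulative figures instead of the individual monthly ones, is easy to reproduce. A sketch with hypothetical flat monthly figures (the real monthly breakdown was not published):

```python
from itertools import accumulate

# Hypothetical flat monthly spend on agency staff; illustrative only,
# not the council's actual figures
monthly = [336_000] * 12

correct = sum(monthly)                  # sum the twelve monthly figures
cumulative = list(accumulate(monthly))  # year-to-date total at each month end
wrong = sum(cumulative)                 # the reported mistake: summing the cumulatives

print(f"correct: £{correct:,}")  # correct: £4,032,000
print(f"wrong:   £{wrong:,}")    # wrong:   £26,208,000
```

With roughly flat spend, summing the cumulatives inflates the total by about 6.5 times, which is in the same ballpark as the £27.3 million reported versus the £4 million actually spent.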

This sounds like another spreadsheet error, which is ironic as I only posted an article on this very subject about a week ago.

On receiving the first figure, the GMB union, who asked for the information, described it as a “horrific abuse of the public purse.” I wonder how many times the incorrect figure will be quoted in years to come by people looking for evidence of overspending with a quick search of the internet.

Andy Brazier

Monday, June 09, 2008

Lewis Hamilton's pit lane crash

The incident is summarised on the BBC website.

Canadian Grand Prix on 8 June 2008. The safety car had been sent onto the circuit following a crash. Several cars used this as an opportunity to go to the pits for new tyres and fuel. When rejoining the race, Hamilton crashed into the back of Kimi Raikkonen, who was stopped at a red light. Both cars were too badly damaged to continue.

Hamilton clearly made an error. He didn't see the red light and didn't realise the cars in front had stopped until it was too late. But my question is why did he make the error.

I noticed on the TV highlights that another car also went into the back of Hamilton, showing others made the same mistake. This leads me to wonder whether the lights are located correctly: the drivers at the front can see them, but the people behind can't. Clearly Hamilton did not crash on purpose, as he was having a great race. The crash allowed Kubica to win and take Hamilton's place at the top of the championship standings.

Of course Hamilton was punished and everyone says he is stupid. No one asks why two drivers made such a fundamental error, so it will happen again.

Andy Brazier

Friday, June 06, 2008

Workers’ safety fears at Fawley refinery

Article from The Southern Daily Echo on 3 May 2008 by Peter Law. It has sparked quite a lively debate on the newspaper's website. The HSE report that prompted the article is also available.

Excerpts from the article are shown below.

"Anxious staff at the giant Fawley oil refinery have revealed their fears of a major accident at the plant in a shocking new report obtained by the Daily Echo. The workers highlight the refinery's ageing infrastructure and lack of maintenance among their major concerns. Other staff at the complex - the largest of its kind in Britain - also admit under-reporting minor incidents, accidents and near-misses for fear of losing their cash bonuses received for their safety record, says the document."

This all comes from a report by inspectors from the Health and Safety Executive (HSE) in which they conclude they "had never encountered such a prominent and pervasive blame culture at any other refining and chemical complex in the country. Of particular concern was the extremely high numbers of staff stating that they would not be surprised if a major incident were to occur in the near future,"

In a statement the company said "Esso and ExxonMobil Chemical at Fawley strongly reject any claims that the Fawley site is unsafe. Fawley is the safest refinery in the UK for both personal safety and process safety, according to the latest figures from UKPIA (UK Petroleum Industries Association)." Also, "We take the safety of our people and of the local community extremely seriously. We have rigorous safety procedures in place and are regularly inspected by the Health and Safety Executive as to the safety of our plant and processes."

The HSE's Human Factors Inspection Report was the result of a two-day audit held with about 78 employees on January 8 and 9 and a feedback meeting on January 29.

The report also claims that although people were encouraged to report accidents or incidents, it seemed some staff were under-reporting because their bonuses were linked to safety. "Since the reward scheme is linked to safety eg lack of incident, it appears to have provided individuals with an incentive to cover up and not report minor incidents, accidents and near-misses as otherwise they (and their team) will be blamed for an incident and lose safety bonuses."

"As minor incidents/accidents are not being reported, the site may be missing precursors to something significant.

"The prevailing view is that when something goes wrong, the search is on for someone (and their supervisor) to blame; the fact that systems may be at fault appears not to feature." Workers' complaints ranged from lack of morale to inadequate staffing levels, endemic overtime and ad hoc training. A fire, which occurred late last year, was put down partially to fatigue as a result of excessive working hours.

The report claims the organisation's blame culture stops some employees from raising issues and taking on additional responsibilities or overtime.

"Some participants felt uncomfortable raising issues, even with managers higher than the shift leader, but others felt that the culture is such that people don't want to raise problems and there would be repercussions if they did," the report states.

Senior staff expressed concern that trainees were not being given sufficient time to consolidate their training and that they may not have enough experience of the plant to deal with emergency situations.

There was a view that staffing levels were adequate on paper but in practice areas were badly staffed. This was partially attributed to stress-related sickness absence brought about by overtime, fatigue and the blame culture.

"There was a general lack-lustre feeling amongst staff, a lack of motivation compounded by fatigue and lethargy. Employees are beginning not to care about their roles or jobs being down to the required standard," the report states.

"The inspectors concluded that significant work needed to be undertaken to achieve full compliance with legal duties."

Andy Brazier

Unacceptable error

Article on This is Total Essex 4 June 2008

This website seems to be related to the local newspaper, so I guess that explains the slightly bizarre reporting.

Apparently when Brentwood Borough Council implemented a new computer system they managed to take council tax payments five days early, affecting 21,000 people "leaving many in the red, short of cash and with the prospect of hefty bank charges looming."

The article calls this a "shocking revelation" that "is just unacceptable" and "was a mistake which quite simply should never have occurred in the first place." It says the council has put this down to human error and it has "promised to refund any charges incurred."

What is bizarre is that the article goes on to say that if a resident paid their tax bill five days late, the council would probably "hit them with a substantial fine and the threat of legal action." Which I doubt completely.

The article suggests the council "should launch an extensive investigation into this appalling error, take the appropriate action to ensure it never happens again, and do everything in its power to regain the trust of those residents affected." I would guess the council has already done this, and has promised to pay refunds.

All I can guess is that the un-named reporter has never made an error and, if he/she did, would simply sweep it under the carpet, which he/she seems to have assumed the council has done.

Andy Brazier

The Spreadsheet Love Affair

Article on ZDNet 3 June 2008 by Dennis Howlett.

Dennis discusses the errors that occur in spreadsheets and how difficult they are to detect. It is suggested that 95% of spreadsheets contain errors. This is because the error rate per cell is a "few percent", so for any large spreadsheet at least one error is almost inevitable.
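The arithmetic behind that claim is simple: if each cell independently has a small chance of containing an error, the probability that a sheet contains at least one grows quickly with size. A sketch, where the 2% per-cell rate is an assumption for illustration rather than a figure from the article:

```python
# P(at least one error) = 1 - P(no errors) = 1 - (1 - p)^n,
# assuming errors occur independently cell by cell
p_cell = 0.02  # assumed per-cell error rate, for illustration only
for n_cells in (10, 100, 1000):
    p_any = 1 - (1 - p_cell) ** n_cells
    print(f"{n_cells:>5} cells: P(at least one error) = {p_any:.1%}")
```

At 2% per cell, ten cells already give roughly an 18% chance of at least one error, a hundred cells about 87%, and a thousand cells effectively 100%, which is consistent with the "95% of spreadsheets" claim for anything beyond a trivial sheet.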

He says examples of the consequences of spreadsheet errors range from "a mortgage provider that overpaid some $270 million for a debt book, through to energy futures overpaid by $9 billion down to the $2 million a month interest calculation error."

Dennis' view is that "the spreadsheet was never designed for the sophisticated uses to which companies continue to put it. At best it is a development environment that is rarely documented because users are not trained as developers. The net result is that when things go wrong, errors are notoriously difficult to find. What’s more, there seems to be a fundamental lack of awareness around the extent of spreadsheet error."

He questions why companies continue to use spreadsheets given the risk, but then seems to answer this by saying "the spreadsheet is seen as convenient in a way that other applications are not and that the learning curve is sufficiently shallow for anyone to pick up the basics and do something useful. It’s also cheap, often pre-installed on user machines at low cost in bulk deals."

What I don't understand is what Dennis is proposing as an alternative.

Andy Brazier

Tuesday, June 03, 2008

What do ergonomists do?

Article entitled "Ergonomists: Light relief for desk-bound employees" in The Independent Career Planning Section on 22 May 2008 by Caroline Roberts.

Suzanne Heape, an ergonomist with experience in a wide variety of consultancy work, is quoted. She "relishes the problem-solving aspect of her career, and also enjoys helping people."

"In workplace assessments, you spend time watching people at their desks or at manual work stations, looking at their posture, adjusting equipment and assessing the general environment, such as heating and lighting."

"Collaborating with workers in other professions can be challenging, because some lack awareness of the importance of ergonomics."

Liz Butterworth, a principal ergonomist with consultancy Human Engineering, is also quoted. "Almost every major accident has some element of human error involved, so the emphasis is on supporting people in the tasks that they do and reducing the chances of them making mistakes," she says. "We help establish the requirements and ensure they are captured by the people doing the design."

"To be a successful ergonomist, you need to be methodical, and good at listening to people and gathering information. Communication skills are also important, because you must be able to convey complex information in a way that clients can understand."

Andy Brazier

Approaching Safety and Ergonomics Strategically

Article on the Occupational Hazards website 20 December 2005 by Robert Pater

Strategy entails both vision and action. Part of this is looking at what you are already doing, what works and what does not. Which interventions are working, and which are in the domain of diminishing returns? Which organizational forces currently block improvements, and which support them? What are the current leadership strengths and limitations? But you need to go beyond what has been done previously if you want to see different results.

Keep in mind that how you initially look at a problem can funnel you into a limited set of solutions. For example, defining ergonomics as making work fit the worker limits intervention to engineering the environment to fix selected problems. But taking ergonomics literally means the science (-nomics) of work (from the Greek word "ergon"), giving a more "strategic" definition: improving the fit between people and their work (to improve safety, productivity and morale). This paradigm opens up three different approaches:

* Bringing work "closer" to people (through design, redesign, positioning, etc.);
* Bringing people "closer" to their work (through improving mental skills of attention control, risk assessment, judgment, team focus, etc., and physical skills of improved coordination, leverage, balance, flexibility, range of motion and more); and
* Bringing work closer to people as well as people closer to their work.

This last approach is the most preferred. For example, it is more efficient, whenever possible, to take a stuck lid off a jar by twisting the bottom clockwise and the lid counterclockwise (rather than just holding the bottom stationary while working on the top).

There are limitations beyond initial costs. Ergonomics improvements can deteriorate into safety hazards (think of worn-down non-skid mats with curling-up edges) and might require workers to change. In one place, introducing recoilless rivet guns actually exacerbated the hand and arm injuries they were purchased to prevent, until riveters were trained to gauge the different kinesthetic feel of setting a rivet with the new tool.

A strategic human factors approach relies on effective communication and training to motivate the use of new skills and to transfer them; requires a work force able to receive communication (are there language or other blockages?); necessitates time away from job tasks for training and reinforcement; can be logistically challenging for multiple sites (especially where facilities have few employees); and is not automatically in place for new hires. But it has the advantages that it:

* Improves situations where engineering solutions have been exhausted in difficult-to-control environments;
* Is portable to wherever people are - in multiple locations and environments, at work and at home; and
* Can boost involvement and morale while heightening worker abilities that transfer to other needed arenas.

Andy Brazier

The crash-proof car is coming

Article in The Times 11 May 2008 by Emma Smith

"Imagine a world in which parents could nonchalantly hand over their car keys to their teenage son, safe in the knowledge that the car would look after him. A future in which human error is eliminated by electronic systems capable of foreseeing smashes and taking preventative action; a world in which car crashes almost never happen."

I am always concerned when people say human error will be eliminated by some form of automation. Yes, the opportunity for some operator errors may be reduced, but what about maintenance errors and how does it affect operator behaviour?

In this case the proposal from Volvo is for a system that monitors what is going on around the car and applies the brakes to avoid collisions.

In fact the article goes on to quote Peter Rodger, chief examiner for the Institute of Advanced Motorists. "We have to be very careful not to ‘underload’ the driver. There is an issue in the airline industry that if the pilot is inadequately involved and something goes wrong, it takes them a long time to actively take over.

“There needs to be adequate involvement so the driver isn’t allowed to switch off in this way, so that they are ready to react if something goes wrong. We also need to be confident that these systems have the power to work in myriad real-life situations.”

Volvo refer to some interesting research they have done. They claim that "about 50% of drivers don’t brake at all before a crash – perhaps because they are paralysed by fear or simply distracted. The other 50% may brake, but probably not as effectively as they could do."

Andy Brazier

Tuesday, April 29, 2008

Cargo flight near catastrophe

Article from the BBC website 29 April 2008. A summary of the air accident investigation report into the crash landing of a Belgian Boeing 737 cargo plane operated by TNT Airways (full report here). The plane tried to land at East Midlands airport, damaged its undercarriage and then made an emergency landing at Birmingham. None of the crew were injured.

The report says that at a critical moment in its approach to East Midlands airport, air traffic control passed a message to the pilot from his company instructing a change of destination. This should not have been done at this time; it confused the pilot, who inadvertently turned off the autopilots. The plane lost height but, instead of aborting the landing, he continued while trying to re-engage the autopilots. The plane came down on the grass alongside the runway but then became airborne again. By the time of the landing in Birmingham the plane had no right-hand landing gear and its flaps were jammed.

Failures identified by the report include:
* The weather forecast did not warn of the mist or fog that caused an unexpected diversion from Stansted;
* Air traffic control passed on a message at an inappropriate time;
* The captain lost situational awareness when he inadvertently disconnected the autopilots;
* Neither the captain nor the co-pilot called a "go around", even though they knew they had problems during the approach.

The report recommends that TNT Airways review its standard operating procedures. They actually sacked the pilot a month after the accident saying the incident was down to human error.

It seems pretty poor to me that the pilot was sacked. Even the company said he showed skill in handling the situation. Whilst errors were made, there were a number of contributory factors. TNT claim to have a "zero accident tolerance level", but clearly no understanding of human factors.

Andy Brazier

Friday, April 25, 2008

Awareness training is not enough

Article on the Secure Computing website by Paul Fisher, 11 April 2008. An interview with Bret Hartman, Chief Technology Officer at RSA.

Hartman is asked about preventing human error where data security is concerned. "Less and less information is actually under the control of central IT these days. Information is created everywhere, it's out on everybody's laptops, it's outsourced, it's developed all over the world." He believes that having the right technologies in place to maintain control is the only solution, and that training, whilst crucial, is of limited value.

Everybody goes to training, is told about the company security policy, reads it and then ignores it, happily sending out data on USB sticks and web-based email, because they are under pressure to get things done and achieve results.

"The world is too complicated and, frankly, it's too difficult to be able to follow those policies under strain. I'm a believer that the right technologies have to be in place to be able to control and enforce that."

To him then, no matter how much training people have or how often you remind them of the importance of security, they will go on making mistakes. Security is typically down the list in terms of priorities. Most people view barriers as an impediment.

A nice turn of phrase: clever people doing stupid things - it could be the title of a self-help book for information security professionals. Hartman believes that all the technology needed already exists, but that the real problem is a failure of application.

I can see Hartman's point of view, but I am not sure how it works in practice. There is a danger that you put all the technical controls in place but they then make the job too difficult. Rather than making errors, people then have to implement far more sophisticated workarounds that can ultimately be more risky. Equally, I agree that training, in all domains, often fails to achieve its objectives.

Andy Brazier

Oldies but goodies

Article from the April 2008 edition of DC Velocity by Tobey Gooley.

Populations in the US (and I guess the UK) are getting older. In many cases this is currently being counteracted by an influx of younger immigrants, but this may not continue. Whilst it is easy to focus on the negatives of the ageing population, there are many positives with some studies showing employing older people can result in improved productivity and safety.

The article lists 16 steps to a safer workplace with older people in mind (although they probably help for all ages).

1. Improve illumination and add colour contrast.
2. Eliminate heavy lifts, elevated work from ladders and long reaches.
3. Design work floors and platforms with smooth and solid decking while still allowing some cushioning.
4. Reduce static standing time.
5. Remove clutter from control panels and computer screens, and use large video displays.
6. Reduce noise levels.
7. Install chain actuators for valve hand wheels, damper levers, or other similar control devices. This brings the control manipulation to ground level, which helps to reduce falls.
8. Install skid-resistant material for flooring and especially for stair treads.
9. Install shallow-angle stairways in place of ladders when space permits and where any daily, elevated access is needed to complete a task.
10. Utilize hands-free, volume-adjustable telephone equipment.
11. Increase task rotation, which will reduce the strain of repetitive motion.
12. Lower sound-system pitches, such as on alarm systems, as they tend to be easier to hear.
13. Lengthen time requirements between steps in a task.
14. Increase the time allowed for making decisions.
15. Consider necessary reaction time when assigning older workers to tasks.
16. Provide opportunities for practice and time to develop task familiarity.

Andy Brazier

Report on the loss of the "Bourbon Dolphin"

Article in The Norway Post by Rolleiv Solholm 29 March 2008.

"It is not possible to show that an individual error, whether technical or human, led to the loss of the anchor-handling vessel “Bourbon Dolphin” on 12 April 2007." 8 people died and 7 survived the accident.

The Commission's report concludes that a series of circumstances acted together to cause the loss of the vessel. The proximate causes were the vessel’s change of course to port (west), so as to get away from mooring line no. 3, at the same time as the inner starboard towing pin was depressed, causing the chain to rest against the outer port towing pin. The chain's altered point and angle of attack on the vessel, combined with its load condition and the fact that the roll-reduction tank was probably in use, caused the vessel to capsize.

A combination of weaknesses in the design of the vessel, and failures in the handling of safety systems by the company, by the operator and on the rig, are major contributory factors. System failures on the part of many players caused necessary safety barriers to be lacking, were ignored or were breached.

Recommendations include requirements for the preparation of stability calculations subject to approval by the authorities, formal training of winch operators, a review of requirements for survival suits, and the placement and installation of rescue floats. Safety management systems and risk assessments must be improved; there must be routines for overlap of new personnel and identification of the necessary crew qualifications, plus the preparation of vessel-specific anchor-handling procedures.

Operators’ rig move procedures must be made specific for every operation and be simple to understand for those operating under them. Operator and rig must prepare risk assessments for the entire operation before it is commenced. When the operation is executed, safety and coordination must be continuously evaluated. The Commission also proposes that an attention zone be introduced along the anchor line, indicating a maximum distance within which the vessel shall remain when running out anchors.

Andy Brazier

Let's dehumanise management

Article by John Pope, 8 April 2008.

It starts: "We all know that managers can make mistakes in selecting, managing and promoting staff." HR often has the role of bailing them out, which is "an enormous waste of time." Adverts suggest that the science of management allied to the power of IT can solve these problems. The author goes on to examine the reality.

* Recruitment - selection can be done online, and an interview is not necessary. This may well avoid employing mavericks or those who do not fit in. It may be safe, but a bit boring.
* Induction - Initial induction can be at the gate, watching a video and completing an online questionnaire. If they pass they can move on to their department where something similar will happen. They can then learn about their new co-workers online.
* Problems with pay and allowances - again online or via a call centre
* Attitude and retention surveys, 360° appraisals - all online.

So everything can be done without any personal contact between employee and supervisor/manager. But this means opportunities to find out what is really happening, and to form relationships, are missed. Machines don't make mistakes (although the people who instruct them do), but they have no imagination, nor the ability to take a chance on someone who may be a bit of a maverick but could be innovative or have the ability to create new business.

All too often we focus on the negatives of having people in the system by talking about the errors they make, and somehow overlook the positives. This even applies to accident investigations, where we are always quick to identify who did something wrong, yet many events would have had far worse consequences if people had not intervened when they did.

Andy Brazier

Human factors integration: cost and performance benefits on army systems

Paper from the US Army Research Laboratory by Harold R Booher published July 1997. Examines cost savings from human factors integration in design of army systems. The basic premise is that the solider is an integral part of the system and not an add on. Primary objectives are to assure that:

1. An adequate number of personnel, with the right skills and proper training, are accounted for in the design
2. The system being designed will adequately perform the missions it is designed to do
3. The system will perform safely with a minimum potential for health hazards or soldier casualties.

Cost savings or avoidance are expected, but as a secondary objective.

Case studies

Comanche lightweight helicopter:
* Workload study showed that a one-person crew would be overloaded during critical events. A two-person crew design was therefore adopted, and the analysis showed the change was justifiable.
* Task analysis was used to prioritise information to crew at specific points during missions. As an example, procedural steps required during target reporting were reduced from 34 to only 5.
* Standard rotor design met government specifications. But a new design taking into account human factors resulted in a rotor that could be maintained by fewer people requiring a lower skill level, and was less prone to maintenance error and damage during transport. The design changes required 395 man-hours (estimated cost $50,000), but lifecycle savings were estimated at $150 million.
* Engine maintenance simplified. Torque wrenches not required. Connectors unique to prevent improper installations. Training burden reduced by 40%.
* Use of graphite-epoxy composite materials allows 50% of exterior skin to have access doors and panels. Identifying tasks to be performed in field allowed these to be conveniently located for access to required parts. Also, some act as work platforms eliminating the need for ladders etc.
* Projected 12,200% return on cost of human factors on the project
* Predicted that 91 soldier fatalities and 116 disabling injuries will be avoided over 20 years of use of the helicopter due to improved outside visibility, improved situational awareness, better warning of engine problems, avoidance of ground accidents during maneuvers and maintenance.

Apache Helicopter
* Original design of a control panel meant it interfered with seats during a crash and reduced how they absorbed energy and hence crew injury. Human factors analysis allowed the panel to be redesigned so that it was smaller and hence did not interfere with the seat.
* A review of maintenance practices showed that personnel habitually stood on engines, supports and hinges to gain access, which can all cause damage and injury. Support structures were redesigned to incorporate a work platform that avoided the problems.
* Analysis of maintenance task showed that unrelated components had to be removed by additional personnel to gain access. Redesign removed this problem.
* $600,000 costs for human factors gave a predicted lifecycle saving of $16.8 million, which equates to a 2,800% return.

Fox reconnaissance vehicle for nuclear, biological and chemical sample pick up and analysis
* Workload assessment showed predicted four person crew would be overloaded. Redesign of workstations allowed crew to be reduced to three
* Improved interface with sample probe reduced mission time by 12%

Overall, the case studies showed that human factors allows new technology to be exploited so that greater benefits are achieved. Using an iterative design-test-evaluate model allows systems to be evaluated before they are built. In one case (the Fox vehicle) human factors turned a failing project facing cancellation into a success.

Andy Brazier

Thursday, April 24, 2008

Manchester Patient Safety Framework

A number of resources are available on the National Patient Safety Agency website to allow NHS organisations to assess their progress in developing a safety culture.

The principles are based on Westrum's typology of organisational communication (1992), which was expanded by Parker and Hudson (2001) to describe five levels of increasing organisational safety culture, as follows:

A. Pathological - prevailing attitude is "why waste our time on safety" so that there is little or no investment in improving
B. Reactive - only think about safety after an incident
C. Bureaucratic - paper-based approaches involving ticking boxes to show to auditors and assessors
D. Proactive - place a high value on improving, actively invest in continuous improvement and reward staff who raise safety-related issues
E. Generative - the nirvana where safety is an integral part of the organisation
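Because the five levels form an ordered scale, a minimal sketch (Python, with hypothetical names of my own, not part of the framework) can represent them as an ordered enum so that assessment results can be compared across teams or organisations:

```python
from enum import IntEnum

# The five maturity levels as an ordered enum (illustrative naming);
# higher values indicate a more mature safety culture.
class SafetyCulture(IntEnum):
    PATHOLOGICAL = 1  # "why waste our time on safety"
    REACTIVE = 2      # only think about safety after an incident
    BUREAUCRATIC = 3  # paper-based, tick-box approaches
    PROACTIVE = 4     # actively invest in continuous improvement
    GENERATIVE = 5    # safety is an integral part of the organisation

# Ordering lets two assessments be compared directly
assert SafetyCulture.PROACTIVE > SafetyCulture.BUREAUCRATIC
```

The point of an IntEnum rather than plain strings is that "proactive is better than reactive" becomes a comparison the program can make for itself.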

Andy Brazier

Medical device use-safety: Incorporating human factors engineering into risk management

Guidance for industry from the US Food and Drug Administration (FDA). Written by Ron Kaye and Jay Crowley and issued 18 July 2000

Hazards related to medical device use should be addressed during device development as part of the risk management process. Human factors engineering is a key element in achieving the goal of ensuring users are able to use medical devices safely and effectively throughout the product lifecycle. This requires an understanding of the interaction between users, devices and the use environment.

Evidence suggests that the frequency and consequence of hazards resulting from medical device use might far exceed those arising from device failures. Some of these errors will result in fatality. Whilst direct hazards associated with devices (i.e. chemical, mechanical, electrical etc.) are often well understood, use errors can cause medical problems through misdiagnosis (i.e. assigning the wrong cause to a condition and hence providing inappropriate treatment), failure to recognise and act on information (e.g. information from a monitoring device) or provision of improper treatment (i.e. device set up incorrectly when implementing a therapy).

Use-related hazards occur for one or more of the following reasons:
* Devices used in ways that were not anticipated in design;
* Devices used in ways that were anticipated but inadequately controlled for;
* Device use requires personal abilities (e.g. physical, perceptual, cognitive) that exceed those of the user;
* Device use is inconsistent with users' expectation or intuition about device operation;
* The use environment affects device operation and this effect is not understood by the user;
* The user's capacities are exceeded when using the device in a particular environment.

Whilst user trials of devices are an important part of development, it is important to be aware of their limitations. For example, unusual circumstances often represent the greatest threat to safe and effective use of a medical device because users are less able to react appropriately to situations that occur infrequently, but these are difficult to predict during early development or to test during user trials. Also, users will often have a preference for ease of use and aesthetics, whereas the safest arrangement may require some design features that can slow down use or affect aesthetics (e.g. shields over critical controls, mechanical or software-based interlocks).

Human factors engineering shows us the following apply:
* Use environment - light, noise, distraction, motion/vibration, workload;
* User - knowledge, abilities, expectations, limitations;
* Device - operation requirements, procedures, device complexity, specific user interface characteristics

User characteristics include
* General health and mental state (stressed, relaxed, rested, tired, affected by medication or disease) when using the device,
* Physical size and strength,
* Sensory capabilities (vision, hearing, touch),
* Coordination (manual dexterity),
* Cognitive ability and memory,
* Knowledge about device operation and the associated medical condition,
* Previous experience with devices (particularly similar devices or user interfaces),
* Expectations about how a device will operate,
* Motivation, and
* Ability to adapt to adverse circumstances.

A good example for the user is diabetics. They are required to monitor their blood sugar levels on a regular basis, and electronic devices are available to do this. However, diabetics often suffer from retinopathy, which affects eyesight. Blood monitoring devices have in the past been provided with small displays that many of the intended users cannot read reliably.

The following list of questions can help identify potential scenarios that could result in a hazard:
1. Why have problems occurred with the use of other similar products?
2. What are the critical steps in setting-up and operating the device? Can they be performed adequately by the expected users? How might the user set the device up incorrectly and what effects would this have?
3. Is the user likely to operate the device differently than the instructions indicate?
4. Is the user or use environment likely to be different than that originally intended?
5. How might the physical and mental capabilities of users affect their use of the device?
6. Are users likely to be affected by clinical or age-related conditions that impact their physical or mental abilities and could affect their ability to use the device?
7. How might safety-critical tasks be performed incorrectly and what effects would this have?
8. How important is user training, and will users be able to operate the device safely and effectively if they don’t have it?
9. How important are storage and maintenance recommendations for proper device function, and what might happen if they are not followed?
10. Do any aspects of device use seem complex, and how can the operator become "confused" when using the device?
11. Are the auditory and visual warnings effective for all users and use environments?
12. To what extent will the user depend on device output or displayed instructions for adjusting medication or taking other health-related actions?
13. What will happen if necessary device accessories are expired, damaged, missing, or otherwise different than recommended?
14. Is device operation reasonably resistant to everyday handling?
15. Can touching or handling the device harm the user or patient?
16. If the device fails, does it "fail safe" or give the user sufficient indication of the failure?
17. Could device use be affected if power is lost or disconnected (inadvertently or purposefully), or if its battery is damaged, missing or discharged?

Having identified potential hazards, it is then necessary to implement a combination of mitigation and control strategies. The following should be considered in the stated order:
1. Modify device design to remove hazard or reduce its consequence;
2. Make user interface, including operating logic, error tolerant
3. Alert users to the hazard
4. Develop written procedures and training for safe operation.
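As a minimal sketch of that ordered strategy (Python, with function and strategy names of my own for illustration), a design team could walk down the list and take the first option judged feasible for a given hazard:

```python
# The guidance's mitigation strategies, in the stated priority order
MITIGATION_ORDER = [
    "modify device design",      # 1. remove hazard or reduce consequence
    "error-tolerant interface",  # 2. operating logic tolerates error
    "alert users",               # 3. warn users of the hazard
    "procedures and training",   # 4. administrative controls, last resort
]

def select_mitigation(feasible):
    """Return the highest-priority strategy judged feasible."""
    for strategy in MITIGATION_ORDER:
        if strategy in feasible:
            return strategy
    raise ValueError("no feasible mitigation identified")

# A design change beats relying on warnings or training
print(select_mitigation({"alert users", "modify device design"}))
```

The ordering encodes the familiar hierarchy-of-controls idea: engineering the hazard out always takes precedence over warning or training people around it.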

Andy Brazier

Friday, April 11, 2008

Developing an effective safety culture

James Roughton has been in touch with me having seen my blog. He runs a couple of blogs himself, including GotSafety which provides an ever expanding set of articles about accidents and safety news, mostly US based.

Having looked around James' sites I can see he has some really useful content. I particularly like his presentation on Developing an Effective Safety Culture. Some key points include
* Safety should not be a priority because priorities change. Instead it needs to be a core value instilled in all parts of the organisation
* We are still using traditional approaches relying on signs and rules, lagging indicators and holding individuals responsible. A more proactive, integrated approach is required.
* A culture is about building relationships with employees.
* If you do what you've always done, you'll get what you've always got (Deming).
* To improve your culture you need to know where you want to get to. "If you don't know where you are going, the chances are you will get somewhere else" (Yogi Berra)
* Good leaders are accessible, available, believable, creative, reliable, and trustworthy.
* Sometimes we make things look so easy, we tend to forget the risks and their related hazards.
* Sometimes it’s the trivial things that result in a major “event.”
* Sometimes we do things wrong so often they become right….
* Accident statistics are basically a measure of luck.

James' page here is well worth a look

Andy Brazier

Monday, April 07, 2008

Video refs embrace technology but some caution is needed

Article in the Marlborough Express website on 3 April 2008.

It talks about the role of the video referee in sports such as Rugby and tennis. The concern is that they are becoming too involved and this may not be in the interests of the game because it slows it down and can undermine the referees on the ground. The article concludes "We have to accept that sometimes it is better for the game to include some human error, rather than letting technology run rampant."

I can see parallels here with industry. If we are going to have people involved in our processes we need to let them get on with their job. We have to accept there will be errors, but also recognise that the ability of people to detect problems, adapt to situations and improvise solutions is highly beneficial. We can't have one without the other.

Andy Brazier

Ergonomic tools: Science or fiction?

John Hedbor, the marketing manager at L.S. Starrett, posted the following checklist in March 2008. I thought it was a very good summary of issues related to hand tools.

A good tool should reduce the risk of direct injury. It should:
* not have any sharp edges on the handle
* minimize wear and tear on the skin
* reduce the risk of users' hands getting caught in tight spots
* reduce the risk of users' hands coming into contact with sharp edges and shoulders
* be slip-resistant

A good tool should reduce the risk of long-term injury. It should:
* have the optimal weight for its purpose
* have a grip that protects the user from hot/cold temperatures
* minimize the build-up of muscular tension during lengthy jobs
* have a large gripping surface that exerts low, even pressure across the hand
* deliver the greatest power with the least possible effort
* vibrate as little as possible
* be perfectly balanced

A good tool should make the tool user's job easier. It should:
* be the correct size and design for its purpose
* be able to be used in different positions
* not require the user to change grip, if possible
* be adjustable in many different positions
* be adjustable even when wearing gloves
* be designed for use with both hands, if required
* be easy to hold, with the right degree of friction against skin
* be available in different sizes, suitable for different tasks
* tolerate oil and grease

Andy Brazier

Thursday, March 13, 2008

Constructing Excellence

Article on the Contract Journal website on 12 March 2008.

It starts "Give a man a fish and you feed him for a day. Teach a man to fish and you feed him for a lifetime, or so the saying goes. The same may well be true of health and safety. While it's great to have the latest safety equipment, it's irrelevant if nobody shows you how to use it properly."

The article is actually about an organisation called 'Constructing Excellence' that aims to improve safety in the industry. "Through this initiative, companies put forward current projects that in one way or another are demonstrating innovation or best practice in their development." Constructing Excellence then "work alongside these leading-edge projects to capture the knowledge, benchmark their performance and use the resulting case studies to demonstrate the business case."

I know nothing about the scheme or whether it is successful. But it very much fits with my view of the world. Too often I hear about centralised initiatives where everyone is expected to follow what is considered to be best practice. However, I never understand how a single approach can really be best for everyone. In fact, the best things are done at a local level, where people are prepared to take a risk to be innovative. Hence, rather than running things centrally, I feel it is better to give local groups some scope to be innovative and take some risks, but in a context where learning (about success and failure) is shared.

The article is a bit strange because it does not provide a link to the organisation's website. It also gets the name wrong on several occasions, calling it 'Construction Excellence' rather than 'Constructing Excellence.'

Andy Brazier

Monday, March 10, 2008

Q&A: UK forces equipment failures

Interesting article on BBC website 15 February 2008 by Paul Adams.

Inquests into servicemen's deaths in Afghanistan have highlighted that shortages of equipment have put people in danger. But this article also references board inquiries that cited "poor tactical decision-making" and a "lack of SOPs" (Standard Operating Procedures) on the ground.

The value of some of the missions (high risk for minimal benefit), and whether commanding officers could have taken more time to organise and prepare their men before sending them out, is questioned. The unpredictable nature of 21st century counter-insurgency operations is another factor.

I often hear "we don't have enough equipment" or "we don't have enough people" in the industries I work in. It always strikes me that, with the exception of very small cash-strapped companies, this is an unhelpful comment. It is not the absolute number that matters, but whether it is enough for what you want to do. The option is always to scale back activity. The trouble is companies seem to aim to still do everything, instead of focussing on what is important. The consequence is that everything gets done rather poorly, and this introduces risk.

Andy Brazier

Monday, February 18, 2008

Process Safety Leading and Lagging Metrics

Published by Center for Chemical Process Safety December 2007.
Available at the AIChE website

Seems to provide a good insight into the types of indicators companies can use.

Andy Brazier

Friday, February 15, 2008

Tube driver in strain injury claim

Article in the Scotsman on 14 February 2008.

Latona Allison, a train driver on the London Underground, developed tenosynovitis in her right wrist because she was not given adequate training in the use of the "dead man's handle" safety brake.

Although the claim was initially dismissed, it was upheld at the Court of Appeal hearing on 13 February. The case came about because the design of the dead man's handle was changed, but no assessment was made and no training given to drivers on how to use it safely.

Lord Justice Smith said in the judgment that London Underground "should not have introduced a new design for the safety device without taking advice from an expert. Had it done so, it would have identified the need for the drivers to be trained in the way in which they held the handle in order to minimise the risk of strain injury." Because no advice was taken, the company was in breach of health and safety laws as the training was not adequate.

I believe this could have wide implications. We seem to be particularly bad at managing change in the workplace, and this gives a great example of what can go wrong as a result.

Andy Brazier

Tuesday, January 15, 2008

Corporate manslaughter - homicide

I attended a very interesting talk today given by John Dyne from Dyne Solicitors

The Corporate Manslaughter and Corporate Homicide Act 2007 is due to come into force on 6 April 2008. It will mean that a company or organisation can be found guilty of corporate manslaughter if its activities are managed or organised by its senior managers in a way that
1. causes a person’s death, and
2. amounts to a gross breach of a relevant duty of care owed by the organisation to the deceased.

The offence will be Corporate Manslaughter in England, Wales and Northern Ireland and Corporate Homicide in Scotland.

It will lead to prosecutions of companies where gross deficiencies in management lead to fatalities. The penalties will include

1. Unlimited fines
2. Remedial orders - the court can tell the company how to improve systems etc. (this seems to overlap with HSE's remit to a certain extent)
3. Publicity orders - not entirely clear but it could require companies to advertise the fact they have been prosecuted. This may be in the paper or even posters at company sites.

John suggests that to avoid prosecution companies need to:
1. Determine where health and safety management responsibilities lie
2. Determine how responsibilities are delegated and monitored
3. Ensure all senior managers are in a position to control risks
4. Increase health and safety training for senior management
5. Review policies
6. Maintain constant review
7. Consult with employees and give them the opportunity to raise issues
8. Ensure they can demonstrate correct attitudes, policies and systems.

John made the point that an investigation would involve interviewing employees. They may say a lot of things about the company that management did not know themselves before the event. Management can no longer afford to bury their heads in the sand, as the defence of ignorance will not apply.

The introduction of the act will make organisations liable for Corporate Manslaughter if a fatality results from the way in which its activities are managed or organised. This approach is not confined to a particular level of management within an organisation. The test considers how an activity was managed within the organisation as a whole. However, it will not be possible to convict an organisation unless a substantial part of the organisation’s failure lay at a senior management level.
Corporate manslaughter will continue to be an extremely serious offence, reserved for the very worst cases of corporate mismanagement leading to death. The offence is concerned with the way in which an organisation’s activities were managed or organised. Under this test, courts will look at management systems and practices across the organisation, and whether an adequate standard of care was applied to the fatal activity. Juries will be required to consider the extent to which an organisation was in breach of health and safety requirements, and how serious those failings were. They will also be able to consider wider cultural issues within the organisation, such as attitudes or practices that tolerated health and safety breaches.

The threshold for the offence is gross negligence. The way in which activities were managed or organised must have fallen far below what could reasonably have been expected.

More information is available at the Ministry of Justice website

Andy Brazier

Tuesday, January 08, 2008

Using consultants

Just discovered "HSE statement to the external providers of health and safety assistance" on HSE website

It explains, very briefly, the duties of consultants and other people who provide advice to companies regarding health and safety. It says "You can help employers to manage risk sensibly, ie, focussing on reducing real risks, both those which arise more often and those with serious consequences. As the provider you must be competent, give a good quality service and deliver help that is fit for purpose."

In general terms advice needs to be

1. Correct
2. Tailored
3. Sensible

I think this is really useful. It appears that some consultants concentrate on number 1, which results in masses of generic paperwork. This rarely, in my opinion, helps the client.

Andy Brazier

Medical negligence due to lack of NHS funds

Negligence worries - Article by Ken Thomas (specialist medical negligence lawyer with South Wales solicitors Harding Evans) writing in the Western Mail on 7 January 2008

Ken says that in the course of his work as a medical negligence lawyer he routinely speaks to medical experts. Over the years, many have hinted strongly, or have said – sometimes quite bluntly and expressly – that medical errors can in part be attributed to lack of money in the NHS. Some of those errors can be gross and even fatal.

But is lack of financial resources a root cause of clinical negligence? Ken thinks it is fair to say that it may well be a factor in some medical mistakes. However, lack of resources is rarely, if ever put forward as an outright excuse or explanation for a failure of care. Put simply, lack of money would not be an attractive defence in court.

Ken makes the point that even in a well-funded healthcare system, mistakes will occur. But over-stretched resources and under-staffed teams cannot help in this regard.

My opinion is that an organisation the size of the NHS cannot say they don't have resources. They may not have enough to do everything they want to, but that is different. It is not a lack of resources that causes errors, but it may well be poor prioritisation or organisation.

It is probably quite correct for the NHS to say they would like more money, but that will always be the case. I am pretty sure there is a lot of waste in the system at present, and reducing this should be a priority. I think the NHS can learn a lot from other industries, but they seem unable or unwilling to do this to any great extent.

Andy Brazier

Thursday, January 03, 2008

Human factor investigations

Presentation by John Chappelow from his website

John describes a taxonomy that he uses in training non-human-factors specialists on accident investigation courses. Once a narrative description of an incident has been broken down into discrete events, each event is examined to determine the type of error involved, according to a simple classification based on the cognitive elements of any task cycle. They are:

1. Perception,
2. Intention,
3. Action.

This is achieved by asking questions as follows.

1. Did you perceive the situation correctly? If no, was it
a. Detection failure
I didn’t see it
I didn’t hear it
I’m sure it was green when I looked
It appeared to be locked when I checked
b. Misjudgement
The gap looked big enough
It didn’t seem to be going that fast
c. Communication failure
I thought he said…

2. Were your intentions appropriate?
a. Inappropriate model
I hadn’t appreciated that…
I obviously misunderstood what was required
In retrospect, the briefing could have been clearer
Suddenly, the plan went pear-shaped
b. Inappropriate evaluation of risk
I saw a simple way to solve the problem
We thought it would work
To save time, I used a different tool/method
We always do it this way on this unit
The laid down procedure takes too long
c. Responsibility management
I thought someone else would…
d. Malicious intent

3. Did you do what you intended to do?
a. Lapse
I forgot to…
b. Slip
I intended to do A but did B instead
c. Skill
I applied too much force
d. Response time
I was too slow/too quick
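The taxonomy above lends itself to a simple data structure. The sketch below (Python, illustrative names only) shows how an investigator might record and validate a classification for each discrete event in an incident narrative:

```python
# John Chappelow's three-stage taxonomy as a nested structure
ERROR_TAXONOMY = {
    "perception": ["detection failure", "misjudgement",
                   "communication failure"],
    "intention": ["inappropriate model", "inappropriate evaluation of risk",
                  "responsibility management", "malicious intent"],
    "action": ["lapse", "slip", "skill", "response time"],
}

def classify(stage, error_type):
    """Validate a (stage, type) pair against the taxonomy."""
    if error_type not in ERROR_TAXONOMY.get(stage, []):
        raise ValueError(f"{error_type!r} is not a recognised {stage} error")
    return f"{stage}: {error_type}"

# e.g. "I intended to do A but did B instead"
print(classify("action", "slip"))
```

Holding the categories in one structure keeps classifications consistent across investigators and makes the error counts easy to tally afterwards.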

According to John "this approach has proved robust and easy to use, and, importantly, it can facilitate investigation of possible causal factors by identifying the more likely candidates."

Andy Brazier