Tuesday, April 29, 2008

Cargo flight near catastrophe

Article from BBC website 29 April 2008. Summary of the air accident investigation report into the crash landing of a Belgian Boeing 737 cargo plane operated by TNT Airways (full report here). The plane tried to land at East Midlands airport, damaged its undercarriage and then made an emergency landing at Birmingham. None of the crew were injured.

The report says that at a critical moment in its approach to East Midlands airport, air traffic control passed a message to the pilot from his company instructing a change of destination. This should not have been done at that time; it confused the pilot, who inadvertently turned off the autopilots. The plane lost height but, instead of aborting the landing, the pilot continued whilst trying to re-engage the autopilots. The plane came down on grass alongside the runway but then became airborne again. By the time of the landing at Birmingham the plane had no right-hand landing gear and its flaps were jammed.

Failures identified by the report include:
* The weather forecast did not warn of the mist or fog that caused an unexpected diversion from Stansted;
* Air traffic control passed on a message at an inappropriate time;
* The captain lost situational awareness when he inadvertently disconnected the autopilots;
* Neither the captain nor the co-pilot called a "go around" even though they knew they had problems during the approach.

The report recommends that TNT Airways review its standard operating procedures. In fact, the company sacked the pilot a month after the accident, saying the incident was down to human error.

It seems pretty poor to me that the pilot got sacked. Even the company said he showed skill in handling the situation. Whilst errors were made, there were a number of contributory factors. TNT claim to have a "zero accident tolerance level" but clearly have no understanding of human factors.

Andy Brazier

Friday, April 25, 2008

Awareness training is not enough

Article on Secure Computing website by Paul Fisher, 11 April 2008. An interview with Bret Hartman, Chief Technology Officer at RSA.

Hartman is asked about preventing human error where data security is concerned. "Less and less information is actually under the control of central IT these days. Information is created everywhere, it's out on everybody's laptops, it's outsourced, it's developed all over the world." He believes that having the right technologies in place to maintain that control is the only solution, and that training, whilst crucial, is of limited value.

Everybody goes for training, is told about the company security policy, reads it and then ignores it, happily sending out data on USB sticks and web-based email, because they are under pressure to get things done and achieve results.

"The world is too complicated and, frankly, it's too difficult to be able to follow those policies under strain. I'm a believer that the right technologies have to be in place to be able to control and enforce that."

To him, then, no matter how much training people have or how often you remind them of the importance of security, they will go on making mistakes. Security is typically well down their list of priorities, and most people view security barriers as an impediment.

A nice turn of phrase: clever people doing stupid things - it could be the title of a self-help book for information security professionals. Hartman believes that all the technology needed already exists, but that the real problem is a failure of application.

I can see Hartman's point of view, but I am not sure how it works in practice. There is a danger that you put all the technical controls in place but they then make the job too difficult. Rather than making errors, people then have to implement far more sophisticated workarounds that can ultimately be more risky. Equally, I agree that training in all domains often fails to achieve its objectives.

Andy Brazier

Oldies but goodies

Article from the April 2008 edition of DC Velocity by Tobey Gooley.

Populations in the US (and I guess the UK) are getting older. In many cases this is currently being counteracted by an influx of younger immigrants, but this may not continue. Whilst it is easy to focus on the negatives of an ageing population, there are many positives, with some studies showing that employing older people can result in improved productivity and safety.

The article lists 16 steps to a safer workplace with older people in mind (although they probably help for all ages).

1. Improve illumination and add colour contrast.
2. Eliminate heavy lifts, elevated work from ladders and long reaches.
3. Design work floors and platforms with smooth and solid decking while still allowing some cushioning.
4. Reduce static standing time.
5. Remove clutter from control panels and computer screens, and use large video displays.
6. Reduce noise levels.
7. Install chain actuators for valve hand wheels, damper levers, or other similar control devices. This brings the control manipulation to ground level, which helps to reduce falls.
8. Install skid-resistant material for flooring and especially for stair treads.
9. Install shallow-angle stairways in place of ladders when space permits and where any daily, elevated access is needed to complete a task.
10. Utilize hands-free, volume-adjustable telephone equipment.
11. Increase task rotation, which will reduce the strain of repetitive motion.
12. Lower sound-system pitches, such as on alarm systems, as lower pitches tend to be easier to hear.
13. Lengthen time requirements between steps in a task.
14. Increase the time allowed for making decisions.
15. Consider necessary reaction time when assigning older workers to tasks.
16. Provide opportunities for practice and time to develop task familiarity.

Andy Brazier

Report on the loss of the "Bourbon Dolphin"

Article in The Norway Post by Rolleiv Solholm 29 March 2008.

"It is not possible to show that an individual error, whether technical or human, led to the loss of the anchor-handling vessel “Bourbon Dolphin” on 12 April 2007." 8 people died and 7 survived the accident.

The Commission's report concludes that a series of circumstances acted together to cause the loss of the vessel. The proximate causes were the vessel's change of course to port (west) to get away from mooring line no. 3 at the same time as the inner starboard towing pin was depressed, causing the chain to rest against the outer port towing pin. The chain's altered point and angle of attack on the vessel, combined with the vessel's load condition and the fact that the roll-reduction tank was probably in use, caused it to capsize.

A combination of weaknesses in the design of the vessel, and failures in the handling of safety systems by the company, the operator and the rig, were major contributory factors. System failures on the part of many players meant that necessary safety barriers were lacking, ignored or breached.

Recommendations include requirements for the preparation of stability calculations subject to approval by the authorities, formal training of winch operators, a review of requirements for survival suits, and the placement and installation of rescue floats. Safety management systems and risk assessments must be improved; there must be routines for the overlap of new personnel and identification of the necessary crew qualifications, plus the preparation of vessel-specific anchor-handling procedures.

Operators’ rig move procedures must be made specific for every operation and be simple to understand for those operating under them. Operator and rig must prepare risk assessments for the entire operation before it is commenced. When the operation is executed, safety and coordination must be continuously evaluated. The Commission also proposes that an attention zone be introduced along the anchor line, indicating a maximum distance within which the vessel shall remain when running out anchors.

Andy Brazier

Let's dehumanise management

Article on HRZone.co.uk by John Pope, 8 April 2008.

It starts "we all know that managers can make mistakes in selecting, managing and promoting staff. " HR often have the role of baling them out, which is "an enormous waste of time." Adverts suggest that the science of management allied to the power of IT can solve these problems. The author goes on to examine the reality.

* Recruitment - selection can be done online, and an interview is not necessary. This may well avoid employing mavericks or those who do not fit in. It may be safe, but a bit boring.
* Induction - Initial induction can be at the gate, watching a video and completing an online questionnaire. If they pass they can move on to their department where something similar will happen. They can then learn about their new co-workers online.
* Problems with pay and allowances - again online or via a call centre
* Attitude and retention surveys, 360° appraisals - all online.

So everything can be done without any personal contact between employee and supervisor/manager. But this means opportunities to find out what is really happening and to form relationships are missed. Machines don't make mistakes (although the people who instruct them do), but they have no imagination or ability to take a chance on someone who may be a bit of a maverick but may be innovative or have the ability to create new business.

All too often we focus on the negatives of having people in the system by talking about the errors they make, and somehow overlook the positives. This even applies to accident investigations, where we are always quick to identify who did something wrong, but the vast majority of incidents would have had far worse consequences if people had not intervened when they did.

Andy Brazier

Human factors integration: cost and performance benefits on army systems

Paper from the US Army Research Laboratory by Harold R Booher, published July 1997. It examines cost savings from human factors integration in the design of army systems. The basic premise is that the soldier is an integral part of the system and not an add-on. Primary objectives are to assure that:

1. An adequate number of personnel, with the right skills and proper training, are accounted for in the design;
2. The system being designed will adequately perform the missions it is being designed to do;
3. The system will perform safely, with minimum potential for health hazards or soldier casualties.

Cost savings or avoidance are expected, but as a secondary objective.

Case studies

Comanche lightweight helicopter:
* A workload study showed that a one-person crew would be overloaded during critical events. A design for a two-person crew was adopted, which the study showed to be justifiable.
* Task analysis was used to prioritise information to crew at specific points during missions. As an example, procedural steps required during target reporting were reduced from 34 to only 5.
* The standard rotor design met government specifications, but a new design taking human factors into account resulted in a rotor that could be maintained by fewer people with a lower skill level, and was less prone to maintenance error and damage during transport. The design changes required 395 man-hours (estimated cost $50,000), but lifecycle savings were estimated at $150 million.
* Engine maintenance simplified. Torque wrenches not required. Connectors unique to prevent improper installations. Training burden reduced by 40%.
* Use of graphite-epoxy composite materials allows 50% of the exterior skin to have access doors and panels. Identifying the tasks to be performed in the field allowed these to be conveniently located for access to required parts. Also, some act as work platforms, eliminating the need for ladders etc.
* Projected 12,200% return on the cost of human factors across the project.
* Predicted that 91 soldier fatalities and 116 disabling injuries will be avoided over 20 years of use of the helicopter, due to improved outside visibility, improved situational awareness, better warning of engine problems, and avoidance of ground accidents during maneuvers and maintenance.

Apache Helicopter
* The original design of a control panel meant it interfered with the seats during a crash, reducing how well they absorbed energy and hence increasing crew injury. Human factors analysis allowed the panel to be redesigned to be smaller, so that it did not interfere with the seats.
* A review of maintenance practices showed that personnel habitually stood on engines, supports and hinges to gain access, which can all cause damage and injury. Support structures were redesigned to incorporate a work platform that avoided the problems.
* Analysis of maintenance tasks showed that unrelated components had to be removed by additional personnel to gain access. Redesign removed this problem.
* $600,000 of costs for human factors gave a predicted lifecycle saving of $16.8 million, which equates to a 2,800% return (see the sketch below).
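
As a back-of-the-envelope check on these return figures, here is a minimal Python sketch (my own illustration, not from Booher's paper), assuming return is expressed as lifecycle savings divided by human factors cost:

```python
def percent_return(cost: float, savings: float) -> float:
    """Return on investment, expressed as a percentage of the cost."""
    return savings / cost * 100

# Apache: $600,000 of human factors work vs $16.8 million lifecycle saving.
print(f"Apache: {percent_return(600_000, 16_800_000):,.0f}%")  # 2,800%

# Comanche rotor redesign alone: $50,000 of design effort vs an estimated
# $150 million in lifecycle savings. (The 12,200% figure quoted above is
# for the Comanche project as a whole, presumably on a larger cost base.)
print(f"Comanche rotor: {percent_return(50_000, 150_000_000):,.0f}%")  # 300,000%
```

The Apache numbers reproduce the 2,800% figure quoted above.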

Fox reconnaissance vehicle for nuclear, biological and chemical sample pick up and analysis
* A workload assessment showed that the predicted four-person crew would be overloaded. Redesign of the workstations allowed the crew to be reduced to three.
* An improved interface with the sample probe reduced mission time by 12%.

Overall, the case studies showed that human factors allows new technology to be used in ways that achieve more benefits. Using an iterative "design, test and evaluate" model allows systems to be evaluated before they are built. In one case (the Fox vehicle) human factors turned a failing project facing cancellation into a success.

Andy Brazier

Thursday, April 24, 2008

Manchester Patient Safety Framework

A number of resources are available on the National Patient Safety Agency website to allow NHS organisations to assess their progress in developing a safety culture.

The principles are based on Westrum's typology of organisational communication (1992), which was expanded by Parker and Hudson (2001) to describe five levels of increasing organisational safety culture, as follows:

A. Pathological - prevailing attitude is "why waste our time on safety", so there is little or no investment in improving safety
B. Reactive - only think about safety after an incident
C. Bureaucratic - paper-based approaches involving ticking boxes to show to auditors and assessors
D. Proactive - place a high value on improving, actively invest in continuous improvement and reward staff who raise safety-related issues
E. Generative - the nirvana where safety is an integral part of the organisation

Andy Brazier

Medical device use-safety: Incorporating human factors engineering into risk management

Guidance for industry from the US Food and Drug Administration (FDA), written by Ron Kaye and Jay Crowley and issued 18 July 2000.

Hazards related to medical device use should be addressed during device development as part of the risk management process. Human factors engineering is a key element in achieving the goal of ensuring users are able to use medical devices safely and effectively throughout the product lifecycle. This requires an understanding of the interaction between users, devices and the use environment.

Evidence suggests that the frequency and consequence of hazards resulting from medical device use might far exceed those arising from device failures. Some of these errors will result in fatalities. Whilst the direct hazards associated with devices (e.g. chemical, mechanical, electrical) are often well understood, use errors can cause medical problems through misdiagnosis (i.e. assigning the wrong cause to a condition and hence providing inappropriate treatment), failure to recognise and act on information (e.g. information from a monitoring device), or provision of improper treatment (e.g. a device set up incorrectly when implementing a therapy).

Use-related hazards occur for one or more of the following reasons:
* Devices used in ways that were not anticipated in design;
* Devices used in ways that were anticipated but inadequately controlled for;
* Device use requires personal abilities (e.g. physical, perceptual, cognitive) that exceed those of the user;
* Device use is inconsistent with users' expectations or intuition about device operation;
* The use environment affects device operation and this effect is not understood by the user;
* The user's capacities are exceeded when using the device in a particular environment.

Whilst user trials of devices are an important part of development, it is important to be aware of their limitations. For example, unusual circumstances often represent the greatest threat to safe and effective use of a medical device, because users are less able to react appropriately to situations that occur infrequently; but these are difficult to predict during early development or to test during user trials. Also, users will often have a preference for ease of use and aesthetics, whereas the safest arrangement may require design features that can slow down use or affect aesthetics (e.g. shields over critical controls, mechanical or software-based interlocks).

Human factors engineering shows us that the following all apply:
* Use environment - light, noise, distraction, motion/vibration, workload;
* User - knowledge, abilities, expectations, limitations;
* Device - operational requirements, procedures, device complexity, specific user interface characteristics.

User characteristics include:
* General health and mental state (stressed, relaxed, rested, tired, affected by medication or disease) when using the device,
* Physical size and strength,
* Sensory capabilities (vision, hearing, touch),
* Coordination (manual dexterity),
* Cognitive ability and memory,
* Knowledge about device operation and the associated medical condition,
* Previous experience with devices (particularly similar devices or user interfaces),
* Expectations about how a device will operate,
* Motivation, and
* Ability to adapt to adverse circumstances.

A good example on the user side is diabetics. They are required to monitor their blood sugar levels on a regular basis, and electronic devices are available to do this. However, diabetics often suffer from retinopathy, which affects eyesight. Blood monitoring devices have in the past been provided with small displays that many of the intended users cannot read reliably.

The following list of questions can help identify potential scenarios that could result in hazards:
1. Why have problems occurred with the use of other similar products?
2. What are the critical steps in setting-up and operating the device? Can they be performed adequately by the expected users? How might the user set the device up incorrectly and what effects would this have?
3. Is the user likely to operate the device differently than the instructions indicate?
4. Is the user or use environment likely to be different than that originally intended?
5. How might the physical and mental capabilities of users affect their use of the device?
6. Are users likely to be affected by clinical or age-related conditions that impact their physical or mental abilities and could affect their ability to use the device?
7. How might safety-critical tasks be performed incorrectly and what effects would this have?
8. How important is user training, and will users be able to operate the device safely and effectively if they don’t have it?
9. How important are storage and maintenance recommendations for proper device function, and what might happen if they are not followed?
10. Do any aspects of device use seem complex, and how can the operator become "confused" when using the device?
11. Are the auditory and visual warnings effective for all users and use environments?
12. To what extent will the user depend on device output or displayed instructions for adjusting medication or taking other health-related actions?
13. What will happen if necessary device accessories are expired, damaged, missing, or otherwise different than recommended?
14. Is device operation reasonably resistant to everyday handling?
15. Can touching or handling the device harm the user or patient?
16. If the device fails, does it "fail safe" or give the user sufficient indication of the failure?
17. Could device use be affected if power is lost or disconnected (inadvertently or purposefully), or if its battery is damaged, missing or discharged?

Having identified potential hazards, it is then necessary to implement a combination of mitigation and control strategies. The following should be considered in the stated order (a simple sketch of this priority search follows the list):
1. Modify device design to remove the hazard or reduce its consequence;
2. Make the user interface, including operating logic, error tolerant;
3. Alert users to the hazard;
4. Develop written procedures and training for safe operation.
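
To make that ordering concrete, here is a minimal Python sketch (my own hypothetical illustration, not part of the FDA guidance) that treats the list as a priority search: for each identified hazard, apply the highest-priority strategy that is feasible.

```python
# The four mitigation strategies, in decreasing order of priority.
MITIGATION_ORDER = [
    "modify design",        # remove the hazard or reduce its consequence
    "error-tolerant UI",    # user interface and operating logic tolerate errors
    "alert user",           # warn the user of the hazard
    "procedures/training",  # written procedures and training for safe operation
]

def select_mitigation(hazard: str, feasible: set) -> str:
    """Pick the highest-priority strategy that is feasible for this hazard."""
    for strategy in MITIGATION_ORDER:
        if strategy in feasible:
            return strategy
    raise ValueError(f"No feasible mitigation identified for: {hazard}")

# Hypothetical example: a blood-glucose meter display too small for users
# with retinopathy. A design change is feasible, so it wins over an alert.
print(select_mitigation("display too small to read reliably",
                        feasible={"alert user", "modify design"}))
# -> modify design
```

In practice each hazard may end up with a combination of strategies, as the guidance says; the point of the sketch is only that design changes are considered before warnings, procedures or training.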

Andy Brazier

Friday, April 11, 2008

Developing an effective safety culture

James Roughton has been in touch with me having seen my blog. He runs a couple of blogs himself, including GotSafety, which provides an ever-expanding set of articles about accidents and safety news, mostly US-based.

Having looked around James' sites I can see he has some really useful content. I particularly like his presentation on Developing an Effective Safety Culture. Some key points include:
* Safety should not be a priority, because priorities change. Instead it needs to be a core value instilled in all parts of the organisation.
* We are still using traditional approaches relying on signs and rules, lagging indicators and holding individuals responsible. A more proactive, integrated approach is required.
* A culture is about building relationships with employees.
* If you do what you've always done, you'll get what you always got (Deming).
* To improve your culture you need to know where you want to get to. "If you don't know where you are going, the chances are you will get somewhere else" (Yogi Berra)
* Good leaders are accessible, available, believable, creative, reliable, and trustworthy.
* Sometimes we make things look so easy, we tend to forget the risks and their related hazards.
* Sometimes it's the trivial things that result in a major "event."
* Sometimes we do things wrong so often they become right.
* Accident statistics are basically a measure of luck.

James' page here is well worth a look.

Andy Brazier

Monday, April 07, 2008

Video refs embrace technology but some caution is needed

Article on the Marlborough Express website, 3 April 2008.

It talks about the role of the video referee in sports such as rugby and tennis. The concern is that video referees are becoming too involved, and this may not be in the interests of the game because it slows play down and can undermine the referees on the ground. The article concludes: "We have to accept that sometimes it is better for the game to include some human error, rather than letting technology run rampant."

I can see parallels here with industry. If we are going to have people involved in our processes we need to let them get on with their job. We have to accept there will be errors, but also recognise that the ability of people to detect problems, adapt to situations and improvise solutions is highly beneficial. We can't have one without the other.

Andy Brazier

Ergonomic tools: Science or fiction?

John Hedbor, the marketing manager at L.S. Starrett, posted the following checklist on ReliablePlant.com in March 2008. I thought it was a very good summary of issues related to hand tools.

A good tool should reduce the risk of direct injury. It should:
* not have any sharp edges on the handle
* minimize wear and tear on the skin
* reduce the risk of users' hands getting caught in tight spots
* reduce the risk of users' hands coming into contact with sharp edges and shoulders
* be slip-resistant

A good tool should reduce the risk of long-term injury. It should:
* have the optimal weight for its purpose
* have a grip that protects the user from hot/cold temperatures
* minimize the build-up of muscular tension during lengthy jobs
* have a large gripping surface that exerts low, even pressure across the hand
* deliver the greatest power with the least possible effort
* vibrate as little as possible
* be perfectly balanced

A good tool should make the tool user's job easier. It should:
* be the correct size and design for its purpose
* be able to be used in different positions
* not require the user to change grip, if possible
* be adjustable in many different positions
* be adjustable even when wearing gloves
* be designed for use with both hands, if required
* be easy to hold, with the right degree of friction against skin
* be available in different sizes, suitable for different tasks
* tolerate oil and grease

Andy Brazier