Publication 770 from the American Petroleum Institute (API), full title 'A Manager's Guide to Reducing Human Errors: Improving Human Performance in the Process Industries', was published in March 2001 and written by D.K. Lorenzo.
Nine years since publication, it would be easy to assume that things have moved on and that the guide is somewhat out of date. However, I don't think this is the case. I particularly like Section 3.1, which lists "examples of error likely situations." They include:
1. Deficient procedures - good procedures help ensure qualified people can operate correctly and safely
2. Inadequate, inoperative or misleading instrumentation - poor instrumentation means workers have to 'fill in the blanks' by deduction and inference when working out what is going on
3. Insufficient knowledge - not just knowing the 'what' and 'how' but also the 'why'
4. Conflicting priorities - particularly between production (which usually has more tangible rewards) and safety
5. Inadequate labelling - useful for new workers, for workers who only use the system infrequently, and for frequent users in stressful situations
6. Inadequate feedback - if feedback on actions is not prompt, people tend to over-react
7. Policy/practice discrepancies - once any discrepancy is tolerated between what is written and what happens in practice, workers will use their own judgement to decide which policies apply
8. Disabled equipment - either means workers are unaware of what is happening, or that they are distracted by spurious alarms and trips
9. Poor communication - use two-way verbal communication to confirm understanding, backed up in writing where it is critical
10. Poor layout - if an instrument, control or other item is not located conveniently, it is unlikely to be used as intended or required
11. Violations of population stereotypes - the way people respond without thinking, based on what they are used to in everyday life
12. Overly sensitive controls - people are not very precise and so controls have to be designed with that in mind
13. Excessive mental tasks - the more demanding, the greater chance of error
14. Opportunities for error - if there is the chance for error, even if likelihood is considered low, it will eventually occur
15. Inadequate tools - proper tools expand human capabilities and reduce the likelihood of error
16. Sloppy housekeeping - appearance of a facility is usually perceived as a reflection of management's general attitude.
17. Extended, uneventful vigilance - if people need to monitor something for a long period (more than 30 minutes) when nothing is happening, their ability to detect an event becomes very low
18. Computer control failure - computers are prone to errors in software and operator input, so whilst they have the potential to improve efficiency, safety etc., the risks need to be understood and managed properly
19. Inadequate physical restriction - as long as they do not impede normal operations, interlocks, unique connections and other physical characteristics can reduce the likelihood of error
20. Appearance at the expense of functionality - during design it is easy to concentrate on aesthetic factors such as consistency and order, whereas how the person will actually use the system is what matters most.
I believe the publication is still available from the API website.