Report by Lord Young of Graffham to the Prime Minister, published 15 October 2010 and available from the Number 10 website.
Lord Young was given the job of reviewing the operation of health and safety laws and the growth of the compensation culture. The main concern is that the standing of health and safety in the eyes of the public has fallen dramatically.
According to the report, over 800,000 compensation claims were made in the UK during 2009. Some of these were for trivial matters, and the rise of claims management companies seems to have been a driving factor. The problem is that companies, voluntary organisations, schools, emergency services and others are becoming overly risk-averse and bureaucratic because of their fear of compensation claims.
There are quite a number of recommendations, but taken as a whole they cover:
* Reviewing the way that compensation claims can be made, including the role of companies that assist in these claims, to get society to move away from a compensation culture;
* Making sure well-intentioned volunteers (i.e. good Samaritans) are not held liable for consequences that may arise;
* Simplifying compliance processes for low-hazard workplaces, and providing more help so that organisations can easily check and record their compliance;
* Implementing an accreditation scheme for health and safety consultants, with the aim of improving professionalism and raising standards;
* Encouraging insurance companies to take a more reasonable approach to minimise the burden on their customers;
* Simplifying processes for schools;
* Providing a means for citizens to challenge local authorities that want to ban events on health and safety grounds.
The report suggests some aspects of health and safety legislation should be reviewed, but does not appear to advocate significant change. It is the application that is of most concern.
One suggestion that I think is particularly powerful is to "Shift from a system of risk assessment to a system of risk–benefit assessment." I am disappointed that this is only directed towards education establishments rather than every organisation, because I think this fundamental shift could significantly improve the way risks are managed in practice.
Friday, October 15, 2010
Wednesday, October 13, 2010
Similar to Snail Mail
Article at the Daily WTF (curious perversions in Information Technology) by Remy Porter on 20 July 2010.
A story relating to a company that sells addresses for use with direct marketing (junk mail). Maintaining lists of addresses is relatively easy, but to have names of residents at those addresses requires more work to acquire and update. Therefore, it is cheaper to use a generic name such as "The resident" or "The car owner." Apparently these generic terms are known as "slug names."
The company in question provided a web-based service whereby direct marketers could go online and download the address lists for a fee. They updated the service so that customers could choose the cheaper, unnamed list.
The code used by the website checked whether the 'Slug' option was selected, and knew not to include names. However, an oversight meant that an alternative to the name was not supplied, and instead it was labelled "slug." This meant that when mail was sent it was addressed "To the Slug." This was only discovered after several mailings had been sent out.
The underlying cause of this error was in the specification for the code. It simply required the option to be provided to the customer, and did not say anything about how that option was to be handled. The code went through full quality assurance, but this simply checked against the original specification.
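To make the failure mode a bit more concrete, here is a minimal sketch in Python of the kind of branch involved. All field and function names are hypothetical assumptions of mine (the article does not show the actual code); the point is simply that the unnamed option removes the real name without defining what should replace it.

```python
# Minimal sketch of the defect described above. All field and function
# names are hypothetical; they are not taken from the article.

def salutation(record: dict, unnamed_list: bool) -> str:
    """Build the addressee line printed on the mailing."""
    if unnamed_list:
        # Defect: the specification only said "offer the unnamed option".
        # Nothing defines what should replace the resident's name, so the
        # internal placeholder leaks straight onto the envelope.
        addressee = record.get("placeholder", "the Slug")
    else:
        addressee = record["resident_name"]
    return f"To {addressee}"

def salutation_fixed(record: dict, unnamed_list: bool) -> str:
    """One possible fix: map the unnamed option to an explicit generic term."""
    addressee = "the Resident" if unnamed_list else record["resident_name"]
    return f"To {addressee}"

if __name__ == "__main__":
    record = {"resident_name": "J. Smith", "placeholder": "the Slug"}
    print(salutation(record, unnamed_list=True))        # To the Slug
    print(salutation_fixed(record, unnamed_list=True))  # To the Resident
```

A test written only against the specification would pass both versions, because the specification never said what the unnamed addressee line should look like.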
Family's Titanic secret revealed
BBC Website 22 September 2010
According to new information from novelist Louise Patten, granddaughter of Titanic's Second Officer Charles Lightoller, the ship hit the iceberg because the helmsman turned the wrong way when ordered to change direction.
The explanation of why such a fundamental error occurred is that the accident happened at a time when steering orders on ships were in transition from sail-era to steam-era conventions. Two different systems were in operation at the time: Rudder Orders (used for steam ships) and Tiller Orders (used for sailing ships). Crucially, Mrs Patten said, the two steering systems were the complete opposite of one another, so a command to turn 'hard a-starboard' meant turn the wheel right under one system and left under the other.
It just so happened that the helmsman had been trained under sail, but was steering a steam vessel.
Mrs Patten claims that only a very small number of people knew about this mistake, but they kept quiet because if the White Star Line had been found to be negligent, it would have gone bankrupt and everyone would have lost their jobs.
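To illustrate the point, here is a small Python sketch of how one spoken order can map to opposite wheel movements under the two conventions. Which convention corresponds to which wheel direction is my own simplification for illustration; the essential claim in the report is only that the two systems were opposites.

```python
# Simplified illustration: the same spoken order implies opposite wheel
# movements under the two conventions described in the article. The exact
# directions below are an assumption for illustration only.

ORDER_TO_WHEEL = {
    # Tiller Orders (sailing tradition): the order names the tiller side,
    # so the wheel is turned the opposite way.
    "Tiller Orders": {"hard a-starboard": "turn wheel to port",
                      "hard a-port": "turn wheel to starboard"},
    # Rudder Orders (steam ships): the order names the rudder side,
    # so the wheel is turned the same way.
    "Rudder Orders": {"hard a-starboard": "turn wheel to starboard",
                      "hard a-port": "turn wheel to port"},
}

def wheel_action(order: str, convention: str) -> str:
    """Look up what the helmsman should do with the wheel for a given order."""
    return ORDER_TO_WHEEL[convention][order]

if __name__ == "__main__":
    order = "hard a-starboard"
    for convention in ORDER_TO_WHEEL:
        print(f"{convention}: '{order}' -> {wheel_action(order, convention)}")
```

A helmsman trained under one convention but steering under the other would, without thinking, do exactly the opposite of what was intended, which is the same 'violation of population stereotypes' issue that appears in the API list below.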
Tuesday, October 12, 2010
A manager's guide to reducing human error
Publication 770 from the American Petroleum Institute (API). Full title 'A manager's guide to reducing human errors. Improving human performance in the process industries.' Published March 2001 and written by D.K. Lorenzo
Nine years since publication, it is easy to think that things have moved on and that this may be somewhat out of date. However, I don't think this is the case. I particularly like section 3.1, which lists "examples of error-likely situations." They include:
1. Deficient procedures - good procedures help ensure qualified people can operate correctly and safely
2. Inadequate, inoperative or misleading instrumentation - poor instrumentation means workers have to 'fill in the blanks' by deduction and inference when working out what is going on
3. Insufficient knowledge - not just knowing the 'what' and the 'how,' but also the 'why'
4. Conflicting priorities - particularly between production (which usually has more tangible rewards) and safety
5. Inadequate labelling - useful for new workers, workers who only use the system infrequently or frequent users when in stressful situations
6. Inadequate feedback - if feedback on actions is not prompt, people tend to over-react
7. Policy/practice discrepancies - once any discrepancy is tolerated between what is written and what happens in practice, the workers will use their own judgement to decide which policies are to be applied
8. Disabled equipment - either means workers are not aware of what is happening or are distracted by spurious alarms and trips
9. Poor communication - use two-way verbal communication to confirm understanding, backed up in writing where it is critical
10. Poor layout - if an instrument, control or other item is not located conveniently, it is unlikely to be used as intended or required
11. Violations of population stereotypes - the way people respond without thinking, based on what they are used to in everyday life
12. Overly sensitive controls - people are not very precise and so controls have to be designed with that in mind
13. Excessive mental tasks - the more demanding, the greater chance of error
14. Opportunities for error - if there is the chance for error, even if likelihood is considered low, it will eventually occur
15. Inadequate tools - proper tools expand human capabilities and reduce the likelihood of error
16. Sloppy housekeeping - appearance of a facility is usually perceived as a reflection of management's general attitude.
17. Extended, uneventful vigilance - if people need to monitor something for some time (more than 30 minutes) when nothing is happening, their ability to detect anything becomes very low
18. Computer control failure - computers are prone to errors in software and operator input, so whilst they have the potential to improve efficiency, safety etc., the risks need to be understood and managed properly (see the sketch after this list)
19. Inadequate physical restriction - as long as they do not impede normal operations, interlocks, unique connections and other physical features can reduce the likelihood of error
20. Appearance at the expense of functionality - during design it is easy to concentrate on aesthetic factors such as consistency and order; whereas how the person will use the system is actually most important.
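Items 18 and 19 lend themselves to a simple illustration. The Python sketch below is my own hypothetical example, not something taken from the API publication: operator input is validated against defined limits rather than trusted, and a software interlock refuses to start a transfer unless the physical prerequisites are confirmed.

```python
# Hypothetical illustration of items 18 and 19 (not from API 770):
# validate operator input and enforce a simple software interlock.

SETPOINT_LIMITS = {"reactor_temp_c": (20.0, 180.0)}  # assumed safe range


def validated_setpoint(name: str, requested: float) -> float:
    """Reject operator input outside the defined safe range (item 18)."""
    low, high = SETPOINT_LIMITS[name]
    if not low <= requested <= high:
        raise ValueError(f"{name}={requested} is outside safe range {low}-{high}")
    return requested


def start_transfer(valve_lined_up: bool, tank_level_pct: float) -> bool:
    """Interlock: only start if the physical prerequisites are met (item 19)."""
    if not valve_lined_up:
        return False  # valve lineup / unique connection not confirmed
    if tank_level_pct > 90.0:
        return False  # receiving tank too full to accept the transfer
    return True


if __name__ == "__main__":
    print(validated_setpoint("reactor_temp_c", 150.0))               # accepted
    print(start_transfer(valve_lined_up=True, tank_level_pct=75.0))  # True
    try:
        validated_setpoint("reactor_temp_c", 400.0)
    except ValueError as err:
        print("Rejected:", err)
```

Neither check prevents every error, but both reduce the opportunities for error (item 14) without impeding normal operation.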
I believe the publication is still available from the API website.