Following a comment I received on my previous post on the topic, I have given some further thought to leading indicators.
When you talk about performance indicators, people often say they need to be S.M.A.R.T - specific; measurable; attainable; realistic; time-based.
But there is a counter-argument (I am sure I read this as a quote somewhere once) that says 'not everything that is important can be measured, and not everything that can be measured is important.'
The comment made to my earlier post says that lagging indicators need to be based on actual consequences, so that they are precise, accurate, difficult to manipulate and easily understood. On that basis, near misses and high-potential incidents cannot be used as indicators. This is an interesting point, and now that I have had time to think about it I am pretty sure it is correct. I still maintain that a huge amount can be learnt from near misses, but I agree that this is not the same as using them to provide performance indicators.
The comment also said that leading indicators are certainly more difficult than lagging ones, but if we ask the people who are close to the risk and working with it every day, we will very quickly get a good indication of which of our systems are weak. Then we can hang indicators on those systems to drive improvement.
So from this I conclude that:
1. Our traditional lagging indicators are useful, and there is probably no need to look for many new ones.
2. Leading indicators can be identified, but they need to be fluid in order to reflect the issues most relevant to an organisation at any time.
3. Near misses are an excellent source of important information but do not provide data that we can use to measure performance.