A simple case of human error

Accounts of human error have appeared in the press quite a lot in New Zealand over the past three to four years. A 'simple case of human error', or 'she'll be alright', are terms used on many occasions. Human error seems to be becoming a bit of a buzzword whenever organisations face negative media attention.

Now, at the risk of making my peers baulk and throw their organisational resilience and systems engineering literature at me, in my opinion this attention is not such a bad thing. On the one hand, using the term human error as a fait accompli (as a known fact) potentially places unfair culpability on the individual who made the error, thus avoiding the organisational issues that could have set that person up to fail in the first place.

On the other hand, the frequent usage of the term human error is bringing it more into the limelight for discussion. As we face a future of increasingly complex organisational systems (IT, policy, project management, process etc.), we also have to face the fact that humans, by their very nature, are prone to making mistakes, particularly under pressure and in complex environments. As a result, if an employee is disciplined or dismissed for making a mistake, the chances of someone else making a similar mistake in the same organisation in the future are significant. Ask any large organisation that has frequently been in the press for all the wrong reasons.

So, I was on a flight back home the other week and got the comment, "So you work with human error? That's to do with health and safety, right?" Wrong! Although the key people who have researched human error generally focus on health and safety incidents, it appears to me that errors in everyday organisational operation can have equally damaging effects. Granted, no one has been injured or worse, which is a good thing, but the damage to the organisation's reputation can be just as severe. Again, it is not hard to think of some organisations that have been put through the media mill lately.

Already, in 2014, we are seeing a spate of human errors. The most eye-catching for me was the estimated loss of $810,000 after a market trader entered the wrong figures, causing a 10% spike in the shares of one of the world's largest banking institutions. The loss occurred over a period of 30 seconds.

Human error is part of the human condition and it will never go away. Even when we add automation, we still require a human to design, build, monitor and maintain the system. Human error is not just related to health and safety. The key is to know the right questions to ask when it occurs, learn from it, and use those learnings to change and ensure it does not happen again. Of course, if you are more proactive, then taking a risk- and awareness-based approach to any significant organisational change is the way to go. Unfortunately, in my time as a Human Factors professional in New Zealand, I have yet to see a design, investigation, analysis, prevention or risk assessment in action that adequately embraces human reliability and human behavioural response to the complexities of the work environment.

If you have any comments or would like to know more, please email us at info@hfex.co.nz. We would love to hear from you.