OEE Consulting's Managing Director Mark Palmer explores how we can apply Black Box Thinking from the aviation industry for robust operations improvement.
Before you start reading, I want to ask you a question: are you someone who believes things can be better, different, improved - or do you believe the die is cast?
I believe there is a correlation between those people who have ‘the growth mindset’ (in other words - people who have a genuine desire to learn from mistakes) and the way in which a business can build a robust quality system which delivers very low error rates.
As Matthew Syed points out in his book, ‘Black Box Thinking’, there is a huge contrast between the aviation sector and virtually any other sector. Air travel feels instinctively dangerous, and yet it is surprisingly safe - in an environment where the consequences of failure are terrible.
Why is that?
A culture of transparency
DuPont research shows that the lowest error rate in the simplest task (in the best environment, with the highest-skilled people who are highly motivated) is 5 errors in 1,000. Applied to airline safety, that would mean five thousand of every one million flights encountering a problem.
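The scaling in that figure can be checked with a line of arithmetic. This is a minimal sketch, with the error rate and flight count taken straight from the paragraph above:

```python
# Best-case human error rate from the DuPont research cited above:
# 5 errors per 1,000 tasks, applied illustratively to 1,000,000 flights.

best_case_error_rate = 5 / 1_000      # 0.005 errors per task
flights = 1_000_000

expected_problem_flights = best_case_error_rate * flights
print(int(expected_problem_flights))  # 5000
```

Commercial aviation's actual accident rate is orders of magnitude below that best-case human figure - which is precisely the gap the industry's learning culture has to account for.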
Aircraft are incredibly complex, and are designed, built, maintained and operated by people who will make mistakes - so how do planes stay in the sky?
The introduction of the black box on every aircraft provided an objective means of conducting root-cause analysis and research into near misses and crashes. It has driven a culture of safety and improvement which has been championed by industry leadership - holding transparency and honesty as priorities, above all else.
When there has been an incident or near miss leading to procedural, engineering or systemic change, learning has been universally applied across the industry. For those who have never worked in this environment, the amount of collaboration between competitors, with regulators and the supply base comes as a real surprise.
The right culture, and the ability to remove cognitive dissonance from post-event analysis, has made learning from mistakes very powerful.
Learning from failure
Syed contrasts aviation with medicine in the UK and the USA, where there are many avoidable deaths. Successful adoption of the aviation approach to 'learning from failure' has been pioneered by the Virginia Mason hospital group in the US. This approach to analysing failure, coupled with a leadership regime which encourages honesty and transparency, has delivered a far lower error rate, and consequently a much lower avoidable death rate.
A failure to learn from failure is very much apparent in other industries: financial services, utilities, telecoms - the list goes on.
The problem is systemic. All regulators are driven by a genuine desire to improve quality across an industry, yet most apply outdated thinking and build an improvement regime based on competency and inspection. This is a surprising response, since we know that some of the most competent people in society (such as doctors) are part of a system where preventable deaths still occur. The system has been set up in such a way that people are not encouraged to learn from mistakes - in fact, honesty and transparency are more likely to see medical professionals punished or sued.
Time to change
Regulators, organisations and leaders need to abandon outdated practices and realise that the route to a dramatic improvement in quality and consistent service delivery is to create environments which accept that outcomes depend on the whole process, not just on an individual's ability.
Process thinking in service operations means that the big picture must be integrated - the ‘how’ or the method, the inputs, the teams and the robots all play an important part.
What you measure and how you interpret failure sets the tone and helps create an environment where we can learn from mistakes.
If we can all understand, and believe, that when things go wrong it is more complex than just an opportunity for individual blame, then we can remove a fear of failure, learn from our mistakes and allow room for real change.
So let me ask you another question: are you open to thinking differently?