“Insanity: doing the same thing over and over and expecting different results” – attributed to Albert Einstein.
You could be forgiven for thinking that “safety crusaders” are the glass-half-empty type – those who believe that “what can go wrong, will go wrong” (Murphy’s Law).
I’m not a pessimist, but I do believe in being prepared for the potential for error, and for when things genuinely do go wrong. We need to avoid Einstein’s insanity: repeating the same thing and expecting different results the next time (when the next time might bring a catastrophic outcome). As HEMS clinicians we have a responsibility to get our patients from the pre-hospital to the hospital environment without harm, to the best of our abilities. A culture of safety and forethought, identifying and mitigating potential hazards (threat and error management), is a prominent facet of our work.
Both aviation and medicine involve teams of highly skilled people working together in high-stakes environments involving people’s lives. The obvious major difference is that human beings are not planes: they are not computer-controlled, interactions with people are multifaceted, and the human body is so complex that events cannot always be predicted. We therefore need to mitigate, as far as we can, for that unpredictability – which is where threat and error management comes in.
Aviation has bred a culture of safety for many years now and, although a little way down the track, medicine (especially anaesthetic and surgical services) has been learning from this and applying appropriately modified safety techniques, including:
- Human factors and (team) training
- Ergonomic design and usability testing
- Checklists
- Communication techniques
- Pre-procedural briefings
- Simulation training (muscle memory)
- Debriefing
- Error/incident reporting
Auckland HEMS has, from the outset, actively promoted the already robust culture of patient safety at ARHT. We have introduced RSI checklists, human factors training, dedicated medical simulation training and (thanks to generous funding from ARHT and ADHB) now a fully functioning clinical simulation lab. Furthermore, we now have access to a multidimensional computerised safety management system (SMS), which ARHT has recently introduced. Originally developed for aviation, the system is flexible enough to expand into the clinical role of the HEMS service. This includes safety and quality reports (which can be anonymous), inventory management (circulation, maintenance and ordering of drugs and equipment) and personnel currency – ensuring all crew take personal responsibility for being “flight-ready”. There is also scope for a clinical risk register, which we are currently developing.
This brings me to the value and role of Safety Management Systems (SMS) from a clinical perspective.
The Civil Aviation Authority NZ defines an SMS as a “formal organisational framework to manage safety. Under an SMS, organisations will need to have systems for:
- Error, threat and hazard identification
- Risk management
- Safety targets
- Reporting processes
- Procedures for audit, investigations, remedial actions
- Safety education…
… and to be effective it must be part of everyday practice”.
All of these facets have an application to clinical care, not just aviation. Although it is arguable that we already do all of the above, a systems approach means that the formerly isolated clinical components of risk, safety, and quality of care are brought together in everyday practice in an integrated manner. This approach requires a strong safety culture within an organisation, together with consistent managerial support. It also requires individual accountability, and personnel who are empowered to speak out. It is in this complex environment that the integration offered by an SMS comes into its own.
The system and practice of safety not only relies on people putting their hands up and reporting incidents; it also depends on staff on the ground having access to outcomes, thereby facilitating behavioural and organisational change. Auckland HEMS safety reports give all operational crew visibility of current and past events, the ability to read comments from experts on a real-time forum, and the opportunity to comment themselves. This whole-crew approach enables transparency and has been invaluable in providing 360-degree scrutiny of each incident from both clinical and aviation perspectives, as well as tracking outcomes, trends, and the potential for change.
In the initial stages, encouraging reporting can be difficult. Apart from the medico-legal aspect, other barriers to reporting include:
- Time
- Personnel buy-in
- Lack of “champions” in the organisation
- The thought that it is “somebody else’s job”
Reporting of errors, near misses and incidents has not translated as well into medicine as it has in aviation. Humans in general have an innate distrust of any Orwellian “Big Brother” watching their every move looking for mistakes. This also stems from a prominent medico-legal “blame” culture, which medicine has been slow to overcome. Patient safety depends on open disclosure of errors and near misses, primarily to avoid the same thing happening to someone else. Open disclosure would be the ideal – perhaps naively so – but it may never be fully feasible while individuals can be blamed for system faults.
To quote the Institute for Healthcare Improvement (IHI): “the focus must shift from blaming individuals for past errors to a focus on preventing future errors by designing safety into the system”.
The next question: would it be feasible to introduce an integrated SMS into the healthcare setting? A one-stop systematic shop for inventory, maintenance of equipment, personnel, rostering (shift-work and fatigue), clinical risk registers and safety reporting – all made part of everyday practice? This has been explored by Toney in his paper (free online access to the PDF here).
Finally, I would be interested to hear how other HEMS/EMS services have developed their clinical risk registers. Feel free to comment below.