Checklists – Part 1

By Damjan Gaco, MD, ARHT HEMS Fellow

History

The origin story of checklists goes as follows: in the 1930s, a pilot stepped off a newly built bomber and said something along the lines of “that is too much plane for one person to handle”. In an ever more complicated world, those words ring true today – especially in the field of medicine. For example, the act of intubation carries many steps, all of them important: pre-treatment, induction, intubation, back-up plans, confirmation of tube placement, post-tube sedation, and post-intubation care. A post written two years ago by then Auckland HEMS Fellow Dr. Robert Gooch outlines this ever more complicated environment and the ultimate goal of reducing the burden on clinicians.


Patient safety in helicopter emergency medical services (HEMS): The safety management system

“Insanity: doing the same thing over and over and expecting different results.” – Albert Einstein

You could be forgiven for thinking that “safety crusaders” are the glass-half-empty type: those who believe that “what can go wrong, will go wrong” (Murphy’s Law).

I’m not a pessimist, but I do believe in being prepared for the potential for error, and for when things do genuinely go wrong. We need to avoid Einstein’s insanity: repeating the same thing and expecting different results next time (when the next time might be a catastrophic outcome). As HEMS clinicians we have a responsibility to get our patients from the pre-hospital to the hospital environment without harm, to the best of our abilities. A culture of safety and forethought, in which potential hazards are identified and mitigated (threat and error management), is a prominent facet of our work.

Both aviation and medicine involve teams of highly skilled people working together in high-stakes environments where lives are at risk. The obvious major difference is that human beings are not aircraft or computer-controlled systems: interactions with people are so multifaceted, and the human body so complex, that events cannot always be predicted. We therefore need to mitigate, as far as possible, for that unpredictability – this is the essence of threat and error management.

Aviation has bred a culture of safety for many years now and, although some way behind, medicine (especially anaesthetic and surgical services) has been learning from this and applying appropriately modified safety techniques, including:

  • Human factors and (team) training
  • Ergonomic design and usability testing
  • Checklists
  • Communication techniques
  • Pre-procedural briefings
  • Simulation training (muscle memory)
  • Debriefing
  • Error/incident reporting

Auckland HEMS has, from the outset, actively promoted the already robust culture of patient safety at ARHT. We have introduced RSI checklists, human factors training, dedicated medical simulation training and (thanks to generous funding from ARHT and ADHB) now a fully functioning clinical simulation lab. Furthermore, we now have access to a multidimensional computerised safety management system (SMS) which ARHT has recently introduced. Originally developed for aviation, the system is flexible enough to expand into the clinical role of the HEMS service. This includes safety and quality reports (which can be anonymous), inventory management (circulation, maintenance and ordering of drugs and equipment) and personnel currency – ensuring all crew take personal responsibility for being “flight-ready”. There is also scope for a clinical risk register, which we are currently developing.

This brings me to the value and role of safety management systems (SMS) from a clinical perspective.

The Civil Aviation Authority of New Zealand defines an SMS as a “formal organisational framework to manage safety. Under an SMS, organisations will need to have systems for

  • Error, threat and hazard identification
  • Risk management
  • Safety targets
  • Reporting processes
  • Procedures for audit, investigations, remedial actions
  • Safety education…

… and to be effective it must be part of everyday practice”.

All of these facets have an application to clinical care, not just aviation. Although it is arguable that we already do all of the above, a systems approach means that the formerly isolated clinical components of risk, safety and quality of care are brought together in everyday practice in an integrated manner. This approach requires a strong safety culture within an organisation, together with consistent managerial support. It also requires individual accountability, and personnel who are empowered to speak out. It is in this complex environment that the integration offered by an SMS comes into its own.

The system and practice of safety relies not only on people putting their hands up and reporting incidents, but also on staff on the ground having access to outcomes, thereby facilitating behavioural and organisational change. Auckland HEMS safety reports give all operational crew visibility of current and past events, the ability to read experts’ comments on a real-time forum, and the opportunity to comment themselves. This whole-crew approach enables transparency and has been invaluable in providing 360-degree scrutiny of each incident from both clinical and aviation perspectives, as well as visibility of potential changes, outcomes and trends in incidents.

In the initial stages, encouraging reporting can be difficult. Apart from the medico-legal aspect, other barriers to reporting can include:

  • Time
  • Personnel buy-in
  • Lack of “champions” in the organisation
  • The thought that it is “somebody else’s job”

Error, near-miss and incident reporting has not translated as well into medicine as it has into aviation. Humans in general have an innate distrust of any Orwellian “Big Brother” watching their every move looking for mistakes. This also stems from a prominent medico-legal “blame” culture, which medicine has been slow to overcome. Patient safety depends on open disclosure of errors and near misses, primarily to avoid the same thing happening to someone else. Open disclosure would, perhaps naively, be the ideal; however, it may never be fully feasible due to the potential for individuals to be blamed for system faults.

To quote the Institute for Healthcare Improvement (IHI): “the focus must shift from blaming individuals for past errors to a focus on preventing future errors by designing safety into the system”.

The next question: would it be feasible to introduce an integrated SMS into the healthcare setting? A one-stop systematic shop for inventory, equipment maintenance, personnel, rostering (shift work and fatigue), clinical risk registers and safety reporting – and making this part of everyday practice? This has been approached by Toney in his paper (free online access to the PDF here).

Finally, I would be interested to hear how other HEMS/EMS services have developed their clinical risk registers. Feel free to comment below.

A Military Aviation model for Patient Safety?

In the September 2013 edition of the British Medical Journal, Robyn Clay-Williams published a thought-provoking article on the modelling of clinical risk management on civil aviation practices, questioning whether a military aviation model may be more appropriate when assessing and managing risk in the healthcare environment. The abstract can be found HERE.

The author questions the appropriateness of translating sometimes rigid civil aviation processes (and a zero tolerance for risk) into healthcare, as some healthcare settings (such as emergency departments and intensive care units) need more flexibility and autonomy in their workings and risk management. She suggests that managing risk in high-stakes clinical environments such as these would be better suited to a military aviation model – the parallels being teams with limited resources who routinely deal with unpredictable situations and complex, time-critical operations (as happens frequently in the pre-hospital environment or the ED resus room).

Suggestions derived from a military model for improving the adaptability and resilience of healthcare organisations in the realm of risk management include:

  • planning for the unexpected
  • training for the worst: simulation of worst-case scenarios allows decision-making under pressure and can help develop spare capacity
  • training disparate teams together: multidisciplinary and inter-departmental simulation training
  • learning about the limits of human performance
  • supported simulation, allowing development of:
    • self-awareness
    • contingency planning
    • communication skills.

At Auckland ED we have begun multi-disciplinary simulation afternoons with other clinical departments; our first event included HEMS, the Emergency Department, Trauma Surgery, Cardiothoracics, Anaesthetics and Operating Theatres. This was invaluable in “testing the system”, covering handover, clinical management, resourcing (labs, radiology, blood bank, theatre) and, most especially, inter-departmental communication and teamwork. Our first simulation garnered resoundingly positive feedback from all involved.

I would be interested in comments from others who are doing inter-departmental simulation and team training.

Click HERE for the full version of the article discussed above (secure area limited to ADHB staff).