Long before I became known for the Hudson River landing of US Airways Flight 1549, I had spent my professional life becoming expert at the science of safety. Decades in the cockpit, combined with years of airline safety work as an accident investigator and an airline crew instructor, taught me that good outcomes are the result of reliable systems, good leadership, consistent use of best practices, clear communication – and years of preparation. It doesn’t matter if your domain is the cockpit or the operating room: safety requires a system and a culture that must be learned and practiced by every member of the team. And that is why I so strongly believe that there is much our health care system can learn from the impressive system and culture of safety that have been developed in the airline industry.
How can two seemingly disparate worlds be connected? Consider that aviation and health care are both high-risk, complex, evidence-based domains that require high-level human performance. Now contrast the safety records of these two fields. In commercial aviation, the last passenger fatality on a large U.S. jet was in November 2001, more than a decade ago. Not so in health care. As we know from the Institute of Medicine reports and others, there may be as many as 200,000 preventable deaths each year in this country alone, including deaths resulting from what are considered to be medical errors – but are really system failures – and health-care-associated conditions. That’s the equivalent of 20 large jetliners crashing each week with no survivors – nearly 3 a day. After about the second day, we would see what we had after September 11, 2001: a nationwide ground stop. There would be a Presidential commission and Congressional hearings; the National Transportation Safety Board (NTSB) would search out causes. No one would fly until we had solved the problems. Because airline accidents are very rare, involve many people at once, and are noteworthy and newsworthy, we have achieved in aviation the public awareness and the political will to act. That is what is currently lacking in medicine – along with leadership, direction, and a real sense of urgency – to address a problem that is systemic, huge, and immediate. There are many who still think of these deaths as an unavoidable consequence of providing care. We must stop thinking of them as unavoidable, and instead think of them as unimaginable.
One remedy would be the establishment of an entity like the NTSB to investigate select, representative medical failures. (See An NTSB for Health Care — Learning From Innovation: Debate and Innovate or Capitulate.) This, I believe, would help move medicine from the current blame-based system to a learning-based system in which accountability and learning are fairly and accurately balanced, and people feel free to report not only their own mistakes but also system deficiencies that might lead to an accident. Through the NTSB, the aviation industry has a formal lessons-learned process: the board determines probable causes and contributing factors, and it makes recommendations to the rule makers and the industry about how to prevent a recurrence. This information is globally disseminated, but locally actionable.
Another remedy is to change the culture involving what I call human skills. In the old days of aviation, captains could be gods and cowboys. They often ruled their cockpits by whim, according to idiosyncrasies and preferences, with little consideration of best practices. Anyone who spoke to a captain about an unsafe practice put their job on the line. Thankfully, those days are long gone. We have achieved much better standardization; we have taught captains that they have to be the builders and leaders of teams. We set the tone; we create an environment of psychological safety, where there are no stupid questions and where there is a shared sense of responsibility for the outcome. It’s not about who’s right; it’s about what’s right. And paradoxically, it’s this reliability, this standardization of processes, that becomes the firm base on which we can innovate when we face the unexpected. That’s what my crew and I did on Flight 1549. It was something we had never trained for and had never envisioned, and we had 208 seconds to solve a life-threatening problem we had never seen before.
For more than a hundred years now, we have been learning important lessons at great cost, many of them literally bought in blood. Almost everything we know in aviation, every procedure, every rule, we have because people have died. All these lessons that have finally made aviation so ultra-safe, we are now offering up to medicine for the taking.
What would it take for health care systems to adopt some of the practices of aviation? If there were a national reporting agency for medical errors and near misses, would you be more likely to report? Tell us what you think in the Comment box below.
Best known as the hero pilot from the “Miracle on the Hudson,” Chesley B. “Sully” Sullenberger III has been dedicated to the pursuit of safety for his entire adult life. An aviation safety expert and accident investigator, Mr. Sullenberger serves as a CBS News Aviation and Safety Expert, as well as founder and chief executive officer of Safety Reliability Methods, Inc., a company dedicated to management, safety, performance, and reliability consulting. He is also on the editorial board of the Journal of Patient Safety and a member of the Greenlight Group, a team of world-class experts supporting a number of global health care research and development initiatives.