ORLANDO—A patient with a history of a “difficult airway” is under the care of an intensive care unit resident who is under pressure to move patients out of the ICU. The patient develops an air leak around the endotracheal tube, which the team takes as a sign to extubate, and the extubation is performed without additional airway expertise from anesthesiology or otolaryngology. The airway rapidly obstructs, the ICU team cannot resolve the obstruction, an emergency tracheostomy has to be performed, and the delay in securing the airway leads to anoxic brain injury.
The incident, discussed during the meeting’s panel session “Human Error and Patient Safety,” shows how errors can result not from a lack of knowledge but from biases that have nothing to do with how much you know, said Karthik Balakrishnan, MD, MPH, a pediatric otolaryngologist at Mayo Clinic in Rochester, Minn.
The session explored the cognitive biases physicians can easily fall prey to if they’re not careful, as well as system changes that can support clinicians and prevent errors from occurring.
In the extubation example, the resident and the ICU team fell victim to several distinct biases. There was anchoring bias: The team relied too heavily on the air leak as a reason to extubate and discounted the patient’s history of a difficult airway. There was motivational bias: They wanted to keep the attending physician happy by moving patients along. And there was confirmation bias, Dr. Balakrishnan said. “There was intense pressure to get the patient extubated, so they found reason to extubate,” he said.
Since then, an airway checklist has been developed at the institution. Any ICU patient extubated at the bedside must first go through a checklist identifying possible complications, and ancillary staff and equipment are put in place to address those complications should they arise.
Ellis Arjmand, MD, PhD, director of otolaryngology at Texas Children’s Hospital in Houston and the session moderator, said most decisions that lead to errors “are not intentional or due to a knowledge deficiency, but rather are due to the quality of the decision—thus, awareness of the factors that can bias decision quality is essential.”
Jo Shapiro, MD, chief of otolaryngology at Brigham and Women’s Hospital in Boston and director of the Center for Professionalism and Peer Support, said much of that center’s purpose is to help doctors move away from a natural inclination to distance themselves from colleagues who’ve made errors. “We want to say to ourselves, ‘I would never have done that,’” she said.
An atmosphere in which errors lead to sadness, shame, and fear makes for a system in which errors can’t be deconstructed so that lessons can be learned from them, she said. So, at the Center for Professionalism, physicians receive support from fellow physicians who are separate from those actually deconstructing what went wrong. “The purpose is to have people feel and experience a culture, a system, where we acknowledge the devastation that most of us feel after things have gone wrong,” she said.