Reducing Medical Errors Requires Leadership, Culture of Learning
How do you create a caregiving culture where employees are free to speak up if they see a problem and know what to do if there’s an issue? Two physicians addressed this topic at Clinical Center Grand Rounds on Mar. 22.
First, Clinical Center CEO Dr. James Gilman described how a military hospital responded to a large-scale adverse event that affected more than 2,000 patients. Then, Dr. Peter Pronovost, senior vice president for patient safety and quality at Johns Hopkins Medicine and director of Johns Hopkins’ Armstrong Institute for Patient Safety and Quality, chronicled efforts to improve patient safety using principles of high-reliability organizations.
In 2009, Gilman had oversight responsibility for a military hospital. That year, a young nurse alerted the hospital’s chief nurse to the improper use of insulin pens. The pens are intended for single-patient use because of the risk of transmitting blood-borne diseases.
Although nurses changed the needle for every patient, they used the same pen on multiple patients.
The chief nurse alerted hospital leadership, who determined that 2,114 patients with diabetes had received treatment at the hospital. There was no way to find out who had received insulin from a pen used on another patient.
“There is really no way to keep this quiet with 2,114 patients,” said Gilman. “There was a lot of interest in the media.”
The hospital alerted the Food and Drug Administration, let the public know what happened and set up a call center to answer questions from patients. The hospital also contacted affected patients and offered testing and treatment for HIV and hepatitis B and C.
Once the incident went public, local press and elected officials called for someone to be held accountable. The hospital commander suggested reprimanding the chief nurse, but Gilman advised that this approach wouldn’t help: it would discourage other nurses from speaking up when they saw a problem. Although many nurses were involved, they hadn’t realized they were violating safety protocols. In the end, the commander reprimanded no one.
“This was clearly a systems error,” Gilman said. The hospital did everything it needed to do—from building a culture where young nurses can speak up without fear to offering free treatment. Scapegoating a nurse wouldn’t have helped the situation.
Next, Pronovost recounted his efforts to reduce the frequency of medical errors at Johns Hopkins Hospital and Health System, which consists of 6 hospitals, 10 surgery centers, 90 primary care sites and 3 home care companies.
In 2001, a young girl named Josie came into the Hopkins pediatric intensive care unit with burn wounds. Although she was treated, she contracted a catheter-related bloodstream infection and died. At the time, Pronovost thought, “When you care for sick people, sometimes little girls like Josie are going to die.”
On the anniversary of Josie’s death, her mother asked Pronovost if things [at the hospital] were better. He couldn’t say yes, but promised improvements.
He introduced a 5-item checklist for preventing bloodstream infections. Soon after, the infections disappeared at Hopkins. “We took this program and spread it state by state across the U.S. and, now, to 15 countries,” he said. These types of infections are now down by 80 percent across the country.
The hospitals most successful at fighting infections create an “infrastructure to improve.” Pronovost said these organizations did several things. They declared a goal of zero infections, organized training programs, gave clinicians relevant data and helped with project management. Doctors regularly met with leadership. The organizations also transparently reported results.
Pronovost said clinicians at the hospitals believed a kind of mantra: “Infections are preventable and I am capable of doing something about it.” They thought this because their bosses trusted them and, in turn, clinicians talked among themselves about how to improve. There was a culture of learning, not judging.
“We felt pretty good about that story until I had to look into the eyes of another mother who lost a child,” he said. “Nothing humbles you more than looking into the eyes of a parent who had a child die needlessly from medical errors.”
The girl succumbed to cardiac arrest after undergoing elective orthopedic surgery. She had been given pain medication that stopped her breathing. In the operating room, a drug infusion pump didn’t communicate with the monitor that measures breathing. No one knew what happened until it was too late.
That prompted Pronovost to study high-reliability organizations: systems that routinely perform complex, dangerous work without catastrophe.
He discovered that such organizations standardized wherever they could and fostered resiliency. They also respected every employee, each of whom was taught that they have two jobs: the one they were hired to do and the job of making the workplace better.
“These organizations don’t see quality as this whack-a-mole game,” Pronovost explained. “Rather, it’s an integrated structure of governance, leadership and management, capacity-building and informatics, all aimed towards zero harm.”
At Hopkins, a computer program keeps track of 1,300 measures across the entire system. The analytics focus on safety, patient experience, best practices and health care equality.
“Too often, accountability means ‘I shame you by starving you of resources,’” Pronovost concluded. “What we created in this shared accountability system is a very simple model that says ‘Any higher level in the organization is only allowed to hold a lower level accountable if they hold themselves accountable to maximally set those under them up to succeed.’”