During two months starting in November 2010, more than 200 patients at seven hospitals in the MedStar Health system in the Maryland and Washington, D.C., region received double their prescribed concentration of potassium acetate.
In response to a shortage of the usual stock dose of the drug (2 mEq/mL), MedStar filled the gap by procuring vials containing a 4-mEq/mL solution. The IV compounder software, however, was never updated with the new concentration, so it continued to identify the vials as containing the lower one, and the drug was administered to patients receiving total parenteral nutrition. No patients experienced injuries or medical abnormalities, which is one reason why the mistake went undetected for so long.
That the error occurred is disturbing, but unfortunately no longer surprising at a time when medication errors are rampant. Yet something beyond its scope and duration sets this event apart: MedStar chose to shine a light on the incident, allowing colleagues outside the system to observe its failings, its investigation and the changes it implemented to increase medication safety.
“We knew this error could happen anywhere because the compounding device we used is common and the problems with drug shortages are everywhere,” said Bonnie Levin, PharmD, MBA, the corporate assistant vice president of pharmacy services at MedStar Health. “We didn’t want this happening at another hospital; we want others to learn about the approach we took to identify and fix the problem and prevent it from happening again.”
The incident was central to a session at the 2012 American Society of Health-System Pharmacists (ASHP) Midyear Clinical Meeting (MCM) that explored the link between drug shortages and medication errors and how hospitals can mitigate their negative impact.
In addition to Dr. Levin, the speakers at the MCM session included Michael R. Cohen, RPh, MS, ScD, the president of the Institute for Safe Medication Practices (ISMP), and Terry Fairbanks, MD, MS, FACEP, an associate professor of emergency medicine at Georgetown University, in Washington, D.C., and director of the National Center for Human Factors Engineering in Healthcare (NCHFEH), a division of MedStar Health.
Rapid Response Team
“This is a rare situation where an entire health system allowed its staff to step forward and teach others what they went through,” Dr. Cohen told Pharmacy Practice News. “I’m sure it was embarrassing to go public with this event and I hope they get credit for stepping up like this.”
Dr. Cohen and other ISMP representatives, along with Dr. Fairbanks and other consultants from the NCHFEH, were part of a response team that MedStar summoned within a day or two of discovering the error. They worked closely with hospital clinicians and administrators to determine the cause of the error, examine IV compounding policies and procedures, and offer recommendations to decrease the risk for future errors.
The ISMP reviewed every step of the IV admixture process, from drug procurement, labeling and packaging, to communication issues, staffing competency and education. Among their findings:
Communication lapses played a prominent role. For instance, Dr. Cohen and his ISMP colleagues found that staff members were not always aware of the reasons for changing processes within the IV admixture service, and that a systematic approach to communication about shortages and product changes with staff was lacking.
“Communication always turns out to be a major issue when the ISMP conducts consultations,” Dr. Cohen said. “Everything from the way orders are communicated by computers, the way drugs are listed on a screen, look-alike drug names, handwriting problems, abbreviations, verbal orders, telephone orders, the way we speak with one another—all that falls under communication. In this case, the health system’s purchasing department knew it was buying the higher concentration because that’s all they could obtain, but that information never reached the people who make the adjustments in the automated compounder system.”
Dr. Cohen urged health systems to use the ISMP’s “Ten Key Elements” of the medication-use system when evaluating pharmacy processes (http://www.ismp.org/faq.asp#Question_3). “The Ten Key Elements offers a template for designing an interlocking system of safety checks and balances that is very helpful when alternative drugs are brought in,” Dr. Cohen said. “Hospitals will want to know what it has to teach the medical and nursing staffs about a drug in order to monitor it because it’s different. Do we need to adjust equipment because the concentration is different, for instance? Do we need to collect additional patient information that allows safer use of the substitute? What about differences in how to store the drug and the nomenclature we use? All these things come into play when using alternative therapies.”
Dr. Cohen also discussed the role of “inattentional blindness” in medication errors (http://www.visualexpert.com/Resources/inattentionalblindness.html). The phenomenon occurs when someone performing a task fails to see what should have been plainly visible. Later, the individual cannot explain the lapse. Inattentional blindness certainly played a part in the MedStar incident, because clinicians repeatedly looked at the stock drug vial but did not register the new drug concentration on the label, Dr. Cohen said.
Inattentional blindness has been the culprit in many errors documented by the ISMP, such as when a nurse pulled a vial of heparin from an automated dispensing cabinet, read the label, prepared the medication and administered it intravenously to an infant, despite the heparin concentration being 1,000 times greater than prescribed. The child died. In another event, a pharmacist entered a prescription for methotrexate into the pharmacy computer daily. A dose warning appeared on the screen, which the pharmacist read and bypassed, leading to a fatal overdose. According to a 2009 ISMP Medication Safety Alert, “In many cases, people involved in the errors have been labeled as careless and negligent. But these types of accidents are common—even with intelligent, vigilant and attentive people. The cause is usually rooted in inattentional blindness, a condition all people periodically exhibit” (http://www.ismp.org/newsletters/acutecare/articles/20090226.asp).
Dr. Fairbanks identified root causes of the MedStar incident from the vantage point of safety engineering. He found, for example, that the staff exhibited decreased sensitivity to changes after long-term drug shortages; to facilitate bar-code scanning, vials at the compounding station were hung upside down with labels turned away from the operator; and nonformulary drug concentrations were stored on formulary drug shelves. He also noted that a change in a drug’s concentration is a rare and, therefore, unexpected event—one that individuals are unlikely to detect unless forewarned. His group’s recommended corrective actions included restricting access to the IV compounder database; defining a formulary list in the drug purchasing system, with purchases outside it requiring the buyer to involve a pharmacist; reviewing and revising procedures; retraining and re-educating staff; and promoting a culture of continual feedback among staff.
More broadly, he emphasized that the success of medication safety hinges on designing systems that compensate for inevitable human error—a tenet that is central to human factors engineering. “The key to designing safe systems is to design them around human performance and failings,” he said. “You can’t teach humans to not make mistakes; they’re an underlying fact of life, so we design systems to mitigate for inevitable errors.”
Lessons From Other Industries
For decades, human factors engineering has been a fixture in high-risk, high-consequence fields like aviation, nuclear power and the military. Only recently has it begun to gain ground in health care, Dr. Fairbanks said. “Health care has not become any safer since the 1999 Institute of Medicine report, To Err is Human, probably because we continue to take the misguided approach that if there’s a human error involved in an adverse event we can prevent it from happening again by telling other clinicians not to make that same mistake, or by disciplining or punishing the person who made the mistake.” A far more realistic and effective strategy, he noted, is to approach safety breaches with the attitude that if it happened once it will happen again, then take steps to make sure that when the error is repeated it won’t affect the patient.
“It’s striking to me, coming from a background as a safety engineer who then went into health care, that these concepts are so foreign in medicine,” Dr. Fairbanks said.
To make the case that safety improvements need to be targeted toward systems and not individuals, Dr. Fairbanks, a licensed pilot, noted that an average of four communications errors occur between pilots and air traffic controllers every hour on every commercial airliner. “Yet we hardly ever crash a plane anymore” because aviation safety adjusts for human imperfection, he noted.
But changing the entrenched way of thinking about safety will take some time. “It’s hard to change the culture in health care, where we really want to blame someone and punish them when something goes wrong,” Dr. Fairbanks said. “If our answer to patient safety is to stop humans—even those who are very well trained—from making mistakes, we’re going to fail.” The inertia within health care can be overcome by health system leadership and changes implemented hospital-wide or system-wide, he added.
A Just Culture—Just Do It
One approach is to adopt the concept of Just Culture, in which individuals are provided with the knowledge to act safely and are encouraged—even rewarded—for upholding safety standards, including reporting their own errors. In other words, people must feel secure that there will be no disproportionate consequences for doing the right thing. Yet they will still be held accountable for their actions, and ongoing infractions or purposefully ignoring policies can still result in disciplinary action.
“If the leadership changes the way they approach safety, then front-line people will change their approach as well,” Dr. Fairbanks said. “It’s a huge change of culture and it allows people to be more comfortable talking about their errors, which allows for learning and mitigation of those errors. It alters the entire tone of discussion and the environment. Even more importantly, it helps the people on the front line to recognize that they can mitigate errors in the future.”
It was the intersection of Just Culture and transparency within MedStar that empowered the people involved in the incident to openly discuss errors and system shortcomings with managers without fearing unfair repercussions, according to Dr. Fairbanks. “Our hospital isn’t different from any other, and this could happen anywhere,” he said. “We didn’t injure a patient, but someone else could injure a patient, so we have a duty to report it and talk about it.”
Another reason that human factors are so important in health care, noted MedStar Health’s Dr. Levin, is because “we have cobbled together over the years a collection of different technologies that don’t integrate with each other very well. At each point, there is an opportunity for a human intervention, and, therefore, an error. Human factors looks at the whole picture and tries to understand all of it, identify where the weak points are and strengthen them.”