A colonoscopy may be the routine medical test that people fear the most. But for more than 10,000 Veterans Affairs patients, the most alarming part came long after the test was complete. In February, they each received a letter saying, "You received a colonoscopy with an instrument that may have been contaminated with hepatitis and/or HIV." The letter confessed that those serpentine, fiber-optic cameras used to perform routine screening tests for colon cancer had been improperly sterilized in three hospitals in Florida, Georgia, and Tennessee for the past four years. Although we may never know how many patients actually became infected by the procedure, at last count, 38 patients tested positive for hepatitis or HIV after their colonoscopies.
While upsetting, this story could be seen as just another preventable error in our dangerous health care system. In the past year, we have heard lots about medical mistakes: Dennis Quaid’s twins and their blood thinner overdose at Cedars-Sinai. Esmin Green, who died from a blood clot while waiting for a bed in the E.R. Before that there was Jessica Santillan, the child who died from a mismatched lung transplant at Duke. Infected instruments aren’t new news, either: Two patients died at Johns Hopkins in 2003 after being exposed to deadly bacteria in contaminated bronchoscopes, which are used to look into the lungs.
Medical horror always makes headlines. But one of the biggest headlines of all was the 1999 Institute of Medicine report To Err Is Human, which announced that up to 98,000 preventable deaths occur each year in U.S. hospitals. Since then, health care improvement organizations such as the Leapfrog Group have invested copious resources in reducing preventable errors. But a key issue has been overlooked in this movement: The original estimate—the 98,000 deaths—may have been way off. In fact, some of the researchers who conducted the original studies used in the IOM report re-evaluated their data in 2002 and reported that had they used a different calculation method, the number of estimated deaths would have been less than 10 percent of the original. Oops.
Another example of fuzzy math in patient-safety calculations: This article points to 5,000 preventable deaths per year in intensive care units. But other patient-safety researchers estimate that 53,000 lives could be saved annually by staffing intensive care units exclusively with specialists. So how many ICU patients are really dying unnecessarily? Five thousand? Fifty thousand? It is hard to square numbers that differ by a whole order of magnitude.
The problem is that it’s really difficult to measure medical errors. Distinguishing avoidable injuries from expected complications is a particular challenge. Patients who are in the hospital getting high-quality care are often there precisely because they are quite sick. They sometimes get worse despite excellent care by the best nurses and doctors. And while some hospitals may be better than others, it turns out that monitoring medical mistakes can’t differentiate the “good” ones from the “bad.” In fact, many of the items used by Leapfrog to measure patient safety—such as encouraging a “culture of safety” to promote an atmosphere in which staff members can discuss safety concerns freely or requiring hand-washing to prevent the spread of germs—have been shown to have no real effect on your chance of leaving the hospital alive.
Because it’s so tricky to calculate deaths and adverse reactions caused by medical errors, tabulating mistakes can’t reliably define high-quality medical care. Clearer metrics include measures of care coordination, such as how long you wait before an appointment or to be seen in the E.R., or whether care is in line with evidence-based guidelines. In this context, let’s revisit the VA debacle. Believe it or not, the VA excels at delivering high-quality, well-coordinated, and evidence-based care, as documented by Dr. Philip Longman in his recent book (the foreword was written by Slate’s Timothy Noah). But the VA’s success doesn’t make the contaminated colonoscope fiasco any easier in the (proverbial) end.
A well-run hospital can still have system failures. In the case of the VA colonoscopies, the problem was unlikely due solely to incompetence. An injury-control or human-factors analysis, which we have written about on these pages, would likely find that either the scope or the sterilizing equipment was designed improperly. When it comes to medical errors, the best solutions don’t focus on retraining or firing the workers who bungled the job. Instead, they come from initiatives to invest in equipment designed so that the mistake couldn’t be made in the first place. In this case, the scope manufacturer claims that an auxiliary water tube wasn’t cleaned as instructed. The VA should demand a safety mechanism to make it difficult or impossible to use an unsterile scope. The same principle applies to the Quaid twins: The bottles should have been designed to make it hard, not easy, to give an adult dose of the medication.
Ultimately, what’s a patient to do when medical care is needed? When high-profile medical errors occur at the U.S. News & World Report No. 1 hospital and cultures of safety don’t save lives, it might be difficult to trust that you’ll leave the hospital healthier than when you arrived. The answer is to have realistic expectations and recognize that things are improving. (The fact that trusted journals and health care organizations report preventable death rates that may be overestimated by a factor of 10 should also be some small comfort.) Medical ranking systems are imperfect, but strides are being made to find better ways to differentiate high-quality care. Designing (and using) processes that make mistakes hard to commit is a step in the right direction, even if individual safety initiatives don’t change aggregate mortality rates. But you don’t need big numbers to demand improvement. Whether 98,000 or 9,800 preventable deaths occur in hospitals each year, it is still way too many.
And realize that dramatic medical numbers (even not entirely correct ones) can be a call to action that saves lives.