At the end of yet another 36-hour shift, just one task lay between me and blessed sleep: threading a central venous catheter from my fortysomething patient’s neck into the blood vessels near his heart. Bob (not his real name) was suffering from liver failure, caused by the hepatitis C virus he’d contracted from shooting heroin with contaminated needles. In most cases, an ordinary IV does the job of ferrying fluids and medications. But Bob was a tough IV stick. A “central line” was essential.
Hands shaking, Bob signed a consent form after I outlined the risks of the procedure–bleeding, collapsed lung, infection. Bob was all too familiar with the risks of infection, thanks to me, an inexperienced, haggard medical intern. A day earlier, Bob had been the recipient of my first central-line placement. The incision became infected–a potentially deadly complication for Bob because liver damage weakens the immune system–and I almost had to wheel him into the intensive-care unit.
My supervising resident walked me through the central-line procedure after we gowned up. I felt the spot on Bob’s throat, nestled under muscle, where I would pierce the skin and hit the internal jugular vein. My fourth jab with a “finder” needle filled the syringe with a reassuring flashback of blood. After making certain I hadn’t nicked an artery, I inserted a larger needle into the skin just above the first needle. After the large needle’s flashback, I threaded the long, coiled catheter into the blood vessel and on into the larger vessels near the heart.
Finished, we ordered a chest X-ray to ensure the catheter was sitting in the right location. I left the hospital relieved. But while on rounds 12 hours later, after my blessed sleep, a nurse told me about the yellowish liquid oozing from Bob’s neck near the catheter site. I had punctured his thoracic duct, the main vessel carrying the lymphatic fluid that ferries infection-fighting cells throughout the body.
When my father began his internship in pediatrics 34 years ago, most interns had, as medical students, practiced catheter placement and surgery on dogs for months in a vivisection laboratory. Today, thanks to protests by anti-vivisectionist groups and a queasy public, few medical students practice on animals. A 1994 study by the Association of American Medical Colleges found that only 23 of the nation’s 125 medical schools required vivisection in physiology courses and just eight required it on surgery rotations.
That means that someone’s grandfather is the first living thing most medical interns stick with a needle, cut open, or sew back up. Studies have found that rookie doctors’ patients suffer no more complications than experienced doctors’ patients do. But these studies are flawed. Most were either conducted under well-controlled circumstances in the most prestigious teaching hospitals or were too small to yield statistically significant results. Also, some of these studies don’t account for “near miss” complications such as Bob’s infection and punctured thoracic duct. Although such mishaps may not have long-term effects, they do add to patients’ stress. I know that Bob would have benefited if I had performed more line placements, on either dogs or patients.
Some hospitals avoid teaching central-line placement by employing teams of non-doctors who specialize in the task. Such teams work well, but most hospitals can’t afford them. Doctors should learn the technique themselves, especially so they can act in emergencies.
Many medical schools offer computer simulations and other alternatives such as “Stan” for students who refuse to operate on animals. Manufactured by Medical Education Technologies Inc., Stan (short for “Standard Man”) is a realistic dummy upon which doctors practice resuscitation. But despite all the sophisticated medical monitors attached to him, Stan has his limitations: He’ll never experience a life-threatening emergency that forces you to work in less-than-ideal conditions. If you fail Stan, his family won’t be waiting in the hall for the terrible news. And you can’t practice central-line placement on him.
Doctors take their practice where they can find it. Last March, the Israeli army provoked public outrage when it admitted to using the bodies of dead soldiers to teach complex medical procedures to doctors-in-training. The practice was revealed after one fallen soldier’s father, himself a doctor, noticed an incision on his son’s neck. His son had been killed in the line of duty, and no resuscitation attempt could explain the surgical wound. Evidently, the army doctors had long condoned an unwritten rule allowing medical practice on soldiers whose families had given permission for autopsies. (But this soldier’s family hadn’t given autopsy permission.)
Practicing on dead patients is nothing new. In Robin Cook’s semi-autobiographical book The Year of the Intern (1973), the protagonist takes aim with an epinephrine-filled syringe at the still heart of a patient whom, moments before, he had unsuccessfully tried to resuscitate. Next he shoves a breathing tube down the dead patient’s throat. Neither exercise is intended to revive the patient. Last year, medical student David Cook wrote in the journal The Pharos about his own experience watching medical students and doctors practice placing a breathing tube in a freshly deceased patient.
Exploiting the newly dead sounds ghoulish, but the medical establishment rationalizes the practice, at least in private, by saying that it’s better than letting interns fumble on live patients. The practice is common enough that at the hospital where I completed my internship, a medical ethicist raised no eyebrows when he surveyed residents about how many had placed central lines in comatose or on-the-verge-of-death patients. Likewise, the Israeli army’s chief medical officer, responding to the public outrage, said that if authorities ban his doctors from practicing on fallen soldiers, it is the injured who will suffer.
So what’s worse: defiling cadavers or practicing on live patients or animals? My twofold answer, even if it makes us all uneasy, is to reintroduce vivisection on a limited basis at all medical schools and to allow surgical practice on the donated bodies of the recently dead.
Each year, about 16,000 students begin medical school in the United States. One animal for every four students would mean the sacrifice of 4,000 dogs per year. Hundreds of dogs are euthanized every day. Might some of them be used for this purpose before meeting their ends, provided we make their deaths as humane as possible? Students who object to animal experimentation should be allowed to decline but should be required to spend extra time with Stan and his friends.
As for human surgical practice, we should create a system akin to organ donation: Individuals could give consent on their driver’s licenses or when they’re admitted to the hospital. It shouldn’t be hard to convince people that donating a body to a hospital for practice is much like donating it to a medical school for dissection. Still, there won’t be enough dead bodies to go around. If we won’t sacrifice a few dogs, we’ll have to sacrifice some of our own comfort and safety, a sacrifice most patients and hospitals don’t really want to make.
Even though Bob survived my amateur central-line technique, I’m guessing that he would have preferred a more experienced intern. I know how he feels. I dread the day when they strap me down for my inevitable bypass operation. As I drift off into unconsciousness, the last thing I want to hear the senior surgeon say to a junior associate is “So this is your first bypass? Well, the very first step is the incision. Cut him there. No! Not there! There!”