Soon after Roe v. Wade was overturned, a neonatal nurse took to a local Ohio newspaper to share how strongly she agreed with the Supreme Court’s opinion. Instead of explicitly expressing religious views or personal beliefs, she shared that in her “professional experience” the 1973 cementing of national abortion rights “led to the utter demise of respect for humanity at any lifestage and has, singlehandedly, led to a demise in our societal culture and ethical values.” She noted that 99 percent of people seeking abortions are doing so “as a birth control method.”
The newspaper piece is a startling artifact of the anti-choice movement. The American College of Obstetricians and Gynecologists is firm in its own stance: “Abortion is an essential component of comprehensive, evidence-based health care.” Research shows that the reasons why people seek abortions are myriad—including because they want to be able to adequately care for their current children.
A nurse straying from expert consensus isn’t so unusual. A surprising amount of anti-science drivel is spread by health care workers themselves, whether from dubious professional associations of OB-GYNs falsely claiming that abortion raises suicide rates (it doesn’t), from fringier types who tell legislatures that vaccines leave people magnetized, or from skilled merchants of doubt with dozens of employees and formidable financial war chests who profit off all sorts of unproven treatments.
Now, calls are growing to sanction doctors who weaponize their white coats to undermine science, in particular when it comes to one of the biggest health concerns of our time: the pandemic. (Doctors are usually only disciplined for egregious misconduct, and boards have been reluctant to punish doctors who make false statements outside of health care settings—that Dr. Oz still has a license is telling.) The Federation of State Medical Boards warned that spreading information about COVID-19 vaccines or treatments that wasn’t “consensus-driven” could lead to suspension or revocation of a physician’s medical license. Accrediting societies followed suit with a sternly worded email to members warning that peddling misinformation could put their board certifications at risk. Even lawmakers have jumped into the fray: The California Legislature is considering a bill to strip doctors who spout misinformation to their patients of their medical licenses.
These efforts are well-meaning and might seem like common sense, particularly in a world where lawmakers display confusion over the female reproductive system even as they regulate it, and where a considerable slice of Americans are still hesitant to get vaccinated against COVID. But they are all too likely to backfire.
Defining “medical misinformation” is much more difficult than you’d think. Sure, a doctor who claims vaccines cause autism is wrong, plain and simple. There’s no good-faith way to engage with the facts and come to that conclusion, because decades of high-quality research justify a scientific consensus. But in a fast-moving pandemic where “consensus” is sometimes elusive, it’s far more difficult to distinguish between misinformation and legitimate differences of opinion. Remember when everyone was torn on whether the coronavirus could spread through the air? In hindsight, the answer might seem obvious, but the debate included plenty of nuanced discussions.
The misinformation fighters struggle to come up with a solid definition of the term misinformation themselves. The Federation of State Medical Boards defines the term as “health-related information or claims that are false, inaccurate or misleading, according to the best available scientific evidence at the time.” (It defines the related term disinformation as misinformation spread intentionally and maliciously for, by way of example, financial or political gain.) But as objective as these definitions strive to be, there’s simply no way for them to avoid subjectivity. Take one particularly slippery word: misleading. Many claims that might be deemed misinformation aren’t outlandish falsehoods but are rather sophisticated assemblages of observations, data, and speculation. Whether a particular assemblage is “misleading,” a thin argument, or a good argument that just deviates from other arguments, is often in the eye of the beholder.
A recent presentation shown at a meeting of the Centers for Disease Control and Prevention’s vaccine advisory committee illustrates how slippery the term misinformation can be. At the meeting, a scientist presented data stating that COVID was one of the top five causes of deaths for young children, and the statistic was widely circulated by scientists. Among other problems, the presenter compared cumulative COVID deaths over two years of the pandemic to annual deaths for other causes—thus artificially inflating the toll of COVID. After adjusting, COVID deaths dropped to a top 10 cause of death. Don’t get me wrong; “top 10” is nothing to celebrate. But bumping COVID up to the “top five” raises the stakes of authorizing the vaccine for young children and provides an ear-catching, cable-news-ready talking point. This sleight-of-hand, intentional or not, was misleading. It failed the fact-checking test. And while it wasn’t part of the kind of coordinated disinfo campaign that drives vaccine hesitance, it did spread widely. Was it misinformation? Hard to say.
Even the term scientific evidence is tricky to define. The federation says evidence must come from “peer-reviewed journals, methodologically-sound clinical trials, nationally or internationally recognized clinical practice guidelines, or other consensus-based documents.” But so much important scientific information during the pandemic—whether real-time data from health departments or preprints—has not undergone peer review. If you’re among the first in your field to propose a new idea, or to blow the whistle on a harmful trend, you’re by definition going against consensus. And, as we’ve seen during this pandemic, what is misinformation today can be consensus tomorrow.
Doctors aren’t alone in struggling with how to define misinformation. As neuroscientist and author Erik Hoel pointed out in a recent newsletter, philosophers of science call this the “demarcation problem.” Some of the top minds of the century—think Thomas Kuhn or Karl Popper—have tried to determine, philosophically, what the difference is between “science” and “pseudoscience.” They’ve failed. Social media companies have tried to determine what counts as misinformation and what doesn’t; though they may not be incentivized to truly tackle the problem, they have poured in significant resources. And they too have failed. It strikes me as a bit hubristic to think that a panel of doctors can succeed where brilliant minds and deep-pocketed corporations have not. To be sure, the boards are aware of these complications—see this insightful article by Richard Baron, president and CEO of the American Board of Internal Medicine—but history suggests the problem of defining misinformation is a bigger roadblock than they’d like to admit. Even Baron admitted to Politico that he’s not sure of the most effective way to act.
It’s easy to avoid these philosophical quandaries when sanctioning a doctor for recommending orange juice instead of insulin—which goes against decades of empirical evidence and runs counter to biological understanding. It’s another thing if medical authorities were to sanction a doctor for contradicting the CDC on boosters for healthy young adults. The CDC urges everyone eligible for a vaccine to get a booster shot, but it’s reasonable to make a case that an extra dose confers little benefit for healthy vaccinated males between 16 and 29 years of age, who are at low risk for COVID but at increased risk for vaccine-caused myocarditis. Or take this example from reproductive science: The Food and Drug Administration maintains that Plan B could prevent implantation of a fertilized egg when research shows that’s actually not the case (Plan B seems to work not by preventing implantation but by preventing ovulation and fertilization). This means that a doctor who accurately explains that Plan B does not prevent a fertilized egg (which is, in some places, a political “life”) from growing could be accused of peddling misinformation for contradicting the FDA.
Punishing doctors for medical misinformation won’t occur in a political vacuum—it will make martyrs of the outcasts and could end up undermining the medical licensing system. Already, Tennessee and North Dakota have passed laws prohibiting medical licensing boards from taking action based on prescribing ineffective COVID drugs like hydroxychloroquine or ivermectin, and similar bills have been introduced in at least 20 other states. It’s unlikely the involvement of the medical powers that be would even solve much, though it could come with a steep political cost. When it comes to the most harebrained ideas, the typical physician isn’t plucking them out of thin air. According to a recent report by the Center for Countering Digital Hate, only 12 people are responsible for almost two-thirds of COVID vaccine–related mis- and disinformation on social media. Doctors peddling misinformation are a symptom, not a cause of the problem.
The urge for medical boards to do something is understandable, but the hard truth is that misinformation may be a social problem that boards and certifying organizations aren’t equipped to handle. Setting aside the problems in defining what is or isn’t true, at the heart of many misinformation wars are not facts but values. Whether or not parents should be able to opt out of vaccines for their children, whether or not pregnant people should be able to access abortions—those are not scientific questions, but legal and moral ones. Regardless of the anti-choice Ohio nurse’s misuse of the term birth control, her central contention is that abortion shouldn’t be a part of routine health care. The American College of Obstetricians and Gynecologists’ contention is that it should. This simply isn’t resolvable with facts.
This isn’t to say that medical authorities should let themselves be bullied, but they should understand that prosecuting practitioners over facts, when the real problem is differing values, could backfire. After all, if the onslaught of Republican bills attempting to defang the medical licensure system is any hint, medical boards would be well served by taking a light touch to discipline, or they could find their hands tied for years to come.