Elizabeth Holmes—once the world’s youngest self-made female billionaire, known for her Steve Jobs–style black turtleneck—has gotten a lot of press in her short career, but these days it focuses more on her legal trouble than her acumen. Her startup Theranos was supposed to revolutionize health care by creating medical tests that required only a single drop of blood. Instead, it has closed its last blood-testing facility, and Holmes is giving away her own shares in the company, following investigative journalist John Carreyrou’s series of articles showing that its technology didn’t work as promised—in fact, it didn’t really work at all.
As a brilliant Stanford undergraduate, Holmes could have followed a conventional path to assess her blood-testing idea, one that runs through the low-paying corridors of academia. She could have applied to graduate school in chemical engineering, gotten a Ph.D., spent another few years in a postdoc, and then—with luck on her side—she might have landed an academic position as the principal investigator of her own lab. In that role, she could have competed with other researchers for cramped lab space and meager federal grants. She could have published her results in scientific journals, but only after getting past a phalanx of peer reviewers demanding additional analyses and experiments. In the best-case scenario, after several decades of toiling away at the lab bench, Holmes could have achieved tenure, gotten a high citation count, amassed a decent 401(k), and had the satisfaction of having contributed to the public good.
Instead, Holmes went the “disruptive” route. The billionaire route. She dropped out of college, started Theranos, and raised millions in venture capital. She assembled a board of high-flying advisers—George Shultz, Sam Nunn—with more political expertise than scientific know-how. She inked a lucrative partnership with Walgreens. She enlisted Gen. James Mattis to pilot Theranos testing within the Department of Defense, even though its Food and Drug Administration approval was still pending.
Along the way, Holmes became a poster child for a certain kind of free-market thinking about science. Why stifle innovation within the plodding, punishing ivory tower? Why not test good ideas in the marketplace? That thinking now shapes federal policy: Trump’s proposed budget includes steep cuts to the National Institutes of Health and other federally funded research agencies and programs. In a section on energy spending cuts, it sums up the philosophy quite succinctly: “[T]he private sector is better positioned to finance disruptive energy research and development and to commercialize innovative technologies.” Translation: If a new experimental idea is promising enough, it’ll garner support from investors and customers. Research that isn’t marketable will fall by the wayside, and that’s a good thing. Picky regulations just slow down progress.
But Theranos illustrates the flaws in this argument. Holmes kept the company’s blood-testing technology closely guarded, essentially treating it as a trade secret. As a result, it never withstood the pressure tests that peer review and academic publication might have provided. Holmes and her staff had a massive profit motive—an all-too-powerful incentive to push a faulty product. Although scientists on her payroll tried to alert her to problems with Theranos’ tests, she didn’t heed their warnings.
In contrast, academia works hard to manage investigator conflicts of interest. Publication, grants, and tenure are all adjudicated by committees of fellow scholars, not shareholders. Sure, the incentives can get warped within universities and publicly funded laboratories as well. Getting tenure often rides on getting research results; I’m an early-career researcher myself, and the pressure to publish is a constant drumbeat in the back of my skull. But, though these pressures can lead to problematic research practices, the scientific enterprise, at its best, is self-critical and self-correcting.
Moreover, academia is designed so that research careers don’t rest on a single significant finding. Tenure protects academic freedom and makes sure that researchers’ jobs are secure even if their studies and projects don’t always pan out. Federal grant awards are based on the soundness of an investigator’s theory and hypotheses, not on whether their ideas will yield bankable results. Basic science, which lays the building blocks for applied science, is supported by agencies like the National Science Foundation even if its “usefulness” within the commercial marketplace is not yet obvious.
Academic careers can be a long slog. But the careful, peer-reviewed pace of public science ultimately provides a set of checks and balances that helps ensure that research is solid before it reaches the public. When we favor “disruptive” biomedical innovation over iterative, careful science, we risk the public’s safety and trust. No patients died because of faulty Theranos testing, but thousands of people paid for a technology that didn’t work.
The private sector can help jump-start promising research and translate it into treatment. But at best, it works in dialogue with the basic research base generated by publicly funded science. If you think you can slash NIH, NSF, NASA, and other research agencies, and expect the free market to move science forward, I’ve got some discounted shares of a blood-testing startup to sell you.