When the National Institutes of Health issued new ethics regulations earlier this month, its doctors and scientists squawked. The rules ooze zero tolerance: They bar all NIH employees from consulting for outside entities (read: drug companies) either for money or for free. They forbid all forms of stock ownership over $15,000, even by secretaries, in a company with a medical tie. Any entanglement other than teaching classes and writing textbooks is off limits. Violators risk losing their jobs and getting slammed with civil or criminal penalties. The NIH does “anticipate” that it will allow Nobel Prize winners to go to Stockholm to collect their cash as well as their citations, but that hasn’t mollified the docs, who were given 90 days to sell off all their holdings in drug or biotech companies.
If the protests sound to you like the howls of a greedy biomedical elite, consider that the NIH’s policy is likely to drive qualified physicians and scientists out of the national labs where they’re most needed—and slow the pace at which treatments and cures come to market. To be sure, the previous policy, put in place in the mid-1990s by then-NIH head Harold Varmus, may have gone too far by lifting all the restrictions on outside income that senior NIH researchers could collect from consulting jobs. The enticement of greener pastures appears to have led to serious lapses in judgment by at least two senior NIH officials. Dr. Bryan Brewer Jr. published an article extolling the virtues of Crestor, an AstraZeneca * drug used to control cholesterol, that, according to a Los Angeles Times investigation, failed to disclose some of its known potential safety problems. And Dr. P. Trey Sunderland didn’t disclose the $500,000-plus in consulting fees that he received from Pfizer while working both for the NIH and with the company in studying patient response to Alzheimer’s disease—raising, though not establishing, the possibility that he’d shown favoritism toward the company.
These two incidents and others like them flatly violated the pre-existing version of the NIH rules, which required full disclosure of potential conflicts. The private sector often follows a similar model. The NIH’s main response to the miscreants should have been to boost enforcement of the sensible rules it already had in place rather than piling on new ones. Disclosure gives the management team at NIH or anywhere else an opportunity to respond to danger before something untoward happens. Well-devised rules would allow that response to take a number of forms: Scientists could be relieved of a particular assignment; they could be asked to share responsibility with an outsider; they could be required to file reports about their activities; or they could be told to give up the outside gig in order to keep their NIH position. According to the NIH’s own account, 100 of its scientists and officials may have failed to file these critical disclosure forms. That points to an inexcusable breakdown in NIH discipline. But the institute would be better off letting a few heads roll at the top than crippling the collaborative efforts of thousands of underlings.
My solution probably won’t satisfy members of Congress like Colorado Democrat Diana DeGette, who is clamoring for “restoring public confidence in our nation’s premier research institution.” But DeGette doesn’t get what the NIH and the nation are giving up because she doesn’t understand why so many conflicts of interest arise. The explanation lies in the extremely thin nature of certain critical markets. A quick online search might identify dozens of researchers and docs who work on heart disease or diabetes. But only a small fraction of them will be up to speed on the particular exotic problem that faces a large biotech firm working on alloys, say, for a new cardiac stent. Strict ethical firewalls avert conflicts of interest, but they also slow down the rate at which valuable insights produced by basic research are brought to market. That delay translates not only into lost profits but into lost lives for the next generation of cardiac patients.
In my experience as a member of a conflict-of-interest committee at the University of Chicago, it is par for the course for the leaders in cutting-edge technologies to take consulting jobs or acquire stock in outside ventures that harness the results of their academic research. Rules that aim to manage conflicts of interest are likely to produce more innovation at a faster pace than efforts to sever government science from commerce. Allowing scientists to moonlight for industry makes it easier for them to stay in government laboratories precisely because they can supplement their income and broaden their experience by working both sides of the street. Private universities, for their part, often deal with conflicts on a case-by-case basis, especially since the 1980 passage of the Bayh-Dole Act, which actively encourages universities and their researchers to patent commercially valuable inventions.
Should government institutes be more sensitive about conflicts of interest than private universities? The answer is yes at the Food and Drug Administration, which has the sort of vast regulatory power that justifies a tough across-the-board rule. But the scientists in government service at the NIH do not look all that different from their counterparts in academic institutions—except, of course, for the political heat that leaders like NIH head Elias Zerhouni face in Congress.
Rather than caving, however, Zerhouni should have cleaned up the current mess, impressed on his research staff that disclosure obligations are serious business, and otherwise gone slow. As Zerhouni noted, only 369 out of a total of 6,000 of the institutes’ scientists had consulting arrangements between 1999 and 2004, and of these more than 80 percent had received less than $5,000 in compensation. The numbers suggest that barring NIH employees from accepting outside fees above some sensible amount, say, $25,000 per year, along with tightened disclosure requirements, would allow most of the useful collaborations to continue while reducing the problem of big-time abuse. The new rules, on the other hand, run the risk of causing hardship for hundreds of NIH employees who’ve done no wrong. It is hard to say whether the current mess will lead to a mass exodus or to demoralization. But the NIH should ensure that neither happens by regaining its senses and telling the ethics police to back off.
Correction, Feb. 16, 2005: The article originally misspelled the name of the drug company AstraZeneca as Astro-Zeneca.
Correction, Feb. 16, 2005: The original version of the author’s biographical note did not include the information about Epstein’s consulting work. The information was known to the editor who composed the note, who neglected to include it.