The CDC is the US public health agency primarily tasked with protecting the nation from epidemics. But the NIH matters too, because it develops vaccines to prevent the spread of epidemics and therapies to treat the infected. So, in light of Ebola, we should reflect on the value of NIH research and on what it costs.
Francis Collins, the Director of the NIH, made a controversial claim that
“…if we had not gone through our 10-year slide in research support, we probably would have had a vaccine in time for this that would’ve gone through clinical trials and would have been ready.”
Collins’ statement… irks scientists [because it conveys] certainty, the idea that if only more money had been spent, we’d likely have a vaccine by now. They know that’s not how vaccine development works. Scientists don’t get to name a price for the development of a vaccine — the science is just too uncertain.
Put another way: if it’s so easy, why is there still no effective HIV vaccine?
Eisen believes that Collins could have made a stronger case for funding the NIH: Basic research is essential for coping with novel pathogens.
Collins should be out there pointing out that the reason we’re even in a position to develop an Ebola vaccine is because of our long-standing investment in basic research, and… by having slashed the NIH budget…, we’re making it less and less likely that we’ll be equipped to handle… future challenges to public health. (emphasis added)
For example, when AIDS appeared we didn’t know that HIV existed, let alone how to treat it. Scientists were able to identify the pathogen and develop treatments because the cancer research community had achieved a deep understanding of retrovirus biology. Retroviruses were, at the time, a relatively recondite topic with no evident application to a highly lethal and communicable disease. My friend David States (@statesdj) recalls that:
Gallo and Montagnier [the co-discoverers of HIV] were both retrovirologists funded for work on cancer viruses. If their labs were not already highly skilled in identifying and culturing retroviruses, and the lentivirus family had not already been characterized, it would have taken at least a year to develop the necessary skills and technology to detect HIV, and diagnostic tests would have been similarly delayed.
Let’s accept David’s guess that funding of research on retroviruses sped up the development of a medical treatment for HIV by a year. What was the benefit of saving a year in drug development?
There is no easy answer, because the dynamics of the AIDS epidemic depend on many factors. However, I think we can find a plausible lower bound on the benefits of accelerating treatment research by a year, as follows.
The rate of new HIV infections in the US has stabilized at about 50,000 cases per year, so the principal benefit of getting treatment a year earlier is that effective drugs were available to an additional 50,000 HIV+ people. (Unfortunately, not all of them actually receive treatment, but that's a discussion for another time.)
Moreover, Goldman and his colleagues estimate that early treatment of HIV using highly active antiretroviral drugs has prevented 13,500 new infections per year since 1996. So, another benefit of getting an effective treatment a year earlier is an additional 13,500 HIV- people.
AIDS is expensive, so preventing HIV infections also saves a lot of money. The CDC estimates that the costs of the new infections that occur in the US in just one year, summed over the lifetimes of all the newly infected patients, will be $16.6 billion. If so, the lifetime health care cost of a single infection is about $330,000, and just one additional year of effective treatments would save the US about $4.5 billion by preventing 13,500 infections. That $4.5 billion is 15% of the $30 billion annual NIH budget.
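For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch using only the figures quoted above (the CDC lifetime-cost estimate, the annual new-infection rate, Goldman and colleagues' prevented-infections figure, and an approximate NIH budget):

```python
# Figures quoted in the text above (US, approximate):
lifetime_cost_total = 16.6e9      # CDC: lifetime cost of one year's new infections
new_infections_per_year = 50_000  # stabilized annual rate of new HIV infections
prevented_per_year = 13_500       # Goldman et al.: infections prevented annually by early treatment
nih_annual_budget = 30e9          # annual NIH budget, roughly

# Lifetime health care cost per infection (~$332,000, i.e. "about $330,000"):
cost_per_infection = lifetime_cost_total / new_infections_per_year

# Savings from getting effective treatment one year earlier (~$4.5 billion):
savings_one_year_earlier = prevented_per_year * cost_per_infection

# Expressed as a share of the NIH budget (~15%):
share_of_nih_budget = savings_one_year_earlier / nih_annual_budget

print(f"Cost per infection:  ${cost_per_infection:,.0f}")
print(f"Savings:             ${savings_one_year_earlier / 1e9:.1f} billion")
print(f"Share of NIH budget: {share_of_nih_budget:.0%}")
```

This is a lower bound in the spirit of the argument, not a model of the epidemic; it deliberately counts only the prevented infections, not the benefits to the 50,000 people treated a year earlier.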
Funding basic research at the NIH helps protect Americans and others against the risk of emerging global epidemics. Even obscure health research saves lives. The 10-year slide in the NIH budget has been penny wise but pound foolish.