My long cri de coeur last week continues to bring in a number of comments, which I appreciate. Matthew Holt of the Health Care Blog asks:
"How much money does the NIH spend on basic research and how much does the pharma business spend on it (and you can include development if you like)? I don't have these numbers but I suspect they are closer to each other than it would appear from a reader of your article who might think that it's about 90-10 on pharma's side."
Well, I hope that's not how I came across. I'm sure that more basic research goes on in academia, of course. That's what they're funded for, and what they're equipped for. Some basic work goes on in the drug industry, too, but most of our time and effort is spent on applied research. It's confusion about the differences between those two (or an assumption that the basic kind is the only kind that counts) that leads to the whole "NIH-ripoff" idea.
It's easy to get NIH's budget figures, but it's next to impossible to get the drug industry's. One good reason is that companies don't release the numbers, but there's a more fundamental problem. It would be hard to figure out even from inside a given company, with access to all the numbers, because you can easily slip back and forth between working on something that applies only to the drug candidate at hand and working on something that would be of broader use.
Some years ago, several companies (particularly some European ones) had "blue-sky" basic research arms that cranked away more or less independently of what went on in the drug development labs. I can think of Ciba-Geigy (pre-Novartis) and Bayer as examples, and I know that Roche funded a lot of this sort of thing, too. In the US, DuPont's old pharma division had a section doing this kind of thing as well. I'm not sure if anyone does this any more, though. In many cases, the research that went on tended to either be too far from something useful, or so close that it might as well be part of the rest of the company.
So without a separate budget item marked "basic research", what happens is that it gets done here and there, as necessary. I can give a fairly trivial example: at my previous company, I spent a lot of time making amine compounds through a reaction called reductive amination. I used a procedure that had been published in the Journal of Organic Chemistry, a general method to improve these reactions using titanium isopropoxide. It worked well for me, too, giving better yields in reactions that otherwise could be hard to force to completion.
The original paper on it came from a research group at Bristol-Myers Squibb. They had been looking for a way to get some of these recalcitrant aminations to go, and worked this one out. That is a small example of basic research - not on the most exalted scale, but still on a useful one. It's not like BMS had a group that did nothing but search for new chemical reactions, though. They were trying to make specific new compounds, applied research if ever there was any, but they had to invent a better way to do it.
Meanwhile, I needed some branched amines that this reaction wouldn't give me, and there wasn't a good way to make them. I thought about the proposed mechanism of the BMS reaction and realized that it could be modified as well. Adding an organometallic reagent at the end of the process might form a new carbon-carbon bond right where I needed it. I tried it out, and after a few tweaks and variations I got it to work. As far as I could see from searching the chemical literature, no one had ever done this in this way before, and we got a lot of use out of this variation, making a list of compounds that probably went into the low thousands.
When I was messing around with the conditions of my new reaction, trying to get it to work, I was doing it with intermediate compounds from our drug discovery program, and when the reactions produced compounds I submitted them for testing against the Alzheimer's disease target we were working on. Basic research or applied? Even though there are clear differences between the two, taken as classes, the border can be fuzzy. One's blue and one's yellow, but there's green in between.
Tomorrow I'll go over a more important example - it's pretty much basic research all the way, but untangling who figured out what isn't easy. My readers who work in science will be familiar with that problem. . .
One other thing, in response to another comment: I didn't go wild about the NIH argument because I'm trying to prove that drug companies are blameless servants of the public good or something. We're businesses, and we do all kinds of things for all kinds of reasons, which vary from the altruistic to the purely venal. You know, like they do in all other businesses. Nor is it, frankly, the largest or most pressing argument about the drug industry right now.
No, the reason I took off after it is that it's so clearly mistaken. Anyone who seriously holds this view is not, in my opinion, demonstrating any qualification for being taken seriously. (And that goes for former editors of the New England Journal of Medicine, too, a position that otherwise would argue for being taken quite seriously indeed.) The "all-they-do-is-rip-off-academia" argument is so mistaken, and in so many ways, that it calls into question all the other arguments that a person advocating it might make. They are talking about the pharmaceutical industry, seriously and perhaps with great passion, but they do not understand what it does or how it works at the most basic level. Isn't that a bit of a problem? What other defects of knowledge or reasoning are waiting to emerge, if that one has found a home?