About this site
Here we'll review recent developments in drug discovery and medicine and the IP issues and financial implications they have, along with general thoughts about research. Also likely to make an appearance: occasional digressions into useful topics like which lab reagents smell the worst.
About this author
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek email him directly.
My Reactions Do Better Without Me
I've put my unpurified reactions into the freezer at work, under nitrogen, and taken today off to add to the long weekend. They'll do fine in there. I even labeled them so I'll know what the heck they are - there's a real advanced technique for you. Of course, there are a couple of dozen flasks in there already, some of them works-in-progress from about three projects ago, but my labmates and I will deal with those eventually. (Like when we need some more flasks, for example.) The stuff in them is gradually turning darker, but at least more slowly. . .
The brain is a strange territory for drug discovery. I've written before about the difficulties that CNS-targeted medicinal chemists face, and those were heartfelt rants based on several years of experience. But it's not like the biologists have it any easier. One reason things are so hard is that enzymes and receptors that we think we're familiar with take on new and weird roles once they're behind the blood-brain barrier.
Take the insulin-glucagon system, which I freely admit is powerful and mysterious enough out in the rest of the body. But there are insulin receptors all over the brain, localized in interesting patterns that mean something important (we're only beginning to figure out what it might be.) And now there's a paper coming out in Nature Medicine about a related hormone, glucagon-like peptide 1 (GLP-1), another familiar player to diabetes researchers.
GLP-1 is important for blood sugar control, but it certainly takes on a new personality in the brain. It turns out that this peptide (and its receptor) are important for learning and memory, which is a new development. People had made GLP-1 knockout mice before - they had some mild high blood sugar and such, but appeared otherwise pretty normal. But on closer inspection, they turn out to be learning impaired. (This highlights something that doesn't get as much attention as it should, that knockout animals probably show many more subtle effects than we notice.)
The memory deficits in these mice could be reversed by a gene therapy approach, restoring GLP-1 function to the hippocampus. The research team went a step further with that, though, and added extra GLP-1 receptors to normal mice by the same technique. And these mice, most interestingly, "showed marked enhancement in maze learning," that is, they were better than normal.
Targeting gene therapy to the brain is a mighty risky mode of treatment, which is why you only see people using it to go after things like Parkinson's. Using it as a cognition booster would be getting ahead of things, to say the least. But there's potentially another approach: the same paper goes on to show that administration of a GLP-1 peptide analog can cause similar effects, reversing the deficit in the knockout mice and enhancing the function of normals. The peptide is given by an intranasal route, which is a known trick for taking large molecules into the brain by a sort of pharmacokinetic back door.
Making a small molecule that would do the same thing would be a real challenge, though. Receptors for large peptides like GLP-1 tend to be resistant to small-molecule approaches, because they're made for much larger ligands (which is why there is, as yet, no such thing as an oral pill that mimics the effects of insulin; you still have to inject the real protein.) But just finding this pathway and pointing out its importance is a discovery worth having. Clearly, there's a lot to cognition (and its enhancement) that we haven't even stumbled over yet. For example, see this, for the completely opposite sort of therapy. There's a lot to learn. . .
One Man's Nanotechnology. . .
Charles Murtaugh comments from the lab trenches at Harvard on my nanotech roundup of the other day. As he correctly points out, some of the papers I cited would be classified more as molecular biology than anything else. I left out a lot of even more pure molecular biology that appeared in the same issue of the journal I selected, but I take his point - up to a point.
From a chemist's perspective, molecular biology has always been a bit scary. It's clearly a molecular-level science, like chemistry, but the molecules that it deals with are so enormous (by our standards) that chemists have never felt comfortable with it. Proteins and polynucleotides are terrifyingly huge for those of us who try to keep molecular weights below about 500 (and medicinal chemists know that your chances for success are higher if you don't cross that line too often.) But one hundred times that size isn't at all unreasonable for a biomolecule. We just can't deal with it.
And as chemists, we're also used to doing all our reactions in more or less homogeneous solutions, relying on tweaking the reaction conditions to do things selectively. If it doesn't work the first time, we try a different catalyst, a more polar solvent, a less reactive Lewis acid, running the reaction in an ice bath or at a boil. Molecular biology, on the other hand, depends on the extraordinary tools that already exist in nature, all the restriction enzymes and the like, which it takes over and uses for other ends. They work so fast, so selectively, and under such mild conditions that they're the envy of organic chemistry.
There's no chemical equivalent, as I've pointed out before. Nothing ever evolved to use oxalyl chloride and DMSO as an oxidizing reagent (my fellow chemists will appreciatively wrinkle their noses at the mention of the Swern oxidation, which generates its pungent smell of success by reducing the DMSO to reeking dimethyl sulfide.) We just had to figure that one out on our own and use it as best we can. Our tools are blunt compared to those of the biologists, because they're all home-made.
A side effect of this, one that makes communication between the fields more difficult, is that many molecular biologists get in the habit of treating their systems at another level of abstraction. At its extreme, you have all these assay systems that are sold in kits: you take this stuff up in this buffer at this temperature for this amount of time, dilute the stuff from the second vial down and add it, stop the reaction with the stuff in the third vial and read it out by putting your plate in the counter. . .and so on. This kind of thing, while hugely convenient for molecular biologists, sometimes makes chemists a bit uneasy. We're still over here boning chickens and dicing up carrots, tasting the stew to see if it needs more pepper, while down the hall they're popping frozen entrees into the microwave. (Don't swamp me with nasty mail; I know that there's a lot of home cooking in molecular biology, too. It's just that there aren't any kits at all in organic synthesis.)
The thing that impresses me about current molecular biology, then, is that it's gradually been able to treat its molecules the way chemists would, as physical entities with distinctive shapes and sizes. This trend has been coming on for years now, and modern techniques make it both more feasible and more useful. The single-molecule spectroscopy papers I cited from PNAS are a good example of that.
Meanwhile, the chemists have been getting into the act from their end. There's a paper coming out in Science that is just the kind of thing I mean. A group of chemists at Purdue took a mixture of standard proteins and ran them through a mass spectrometer. Nothing remarkable there - for some years now, we've been able to get molecules of that size to "fly" down a mass spec (those general techniques were the subject of a recent Nobel prize.) But what this crew did was adjust the back end of the machine to land the protein molecules where they wanted, in ordered piles. The lysozyme piled up over here, the insulin over there, and so on.
That's a neat trick, and potentially a useful way to make a protein microarray. When it's working, you wouldn't have to start with pure proteins at all - just sort 'em out on the fly. This group demonstrated that the sorted proteins (at least, a couple of enzymes among them) retained their activity after this rather unnatural treatment (being ionized and flown down a tube in hard vacuum.) Next up will be more fragile proteins and tougher mixtures. They could be on to a real time-saver here.
By the way, I'm glad to see Charles back to more regular blogging. I just hope it doesn't mean that his lab work has hit a dry spell!
Though Some Have Called Thee So
It's really too early to say, but it's possible that the world just changed. This might be one of the days we look back on, saying "Yes, that's when it happened." It's more likely - it's always more likely - that it didn't, and it won't. But it might have, and here's why:
I've written several times about a life-extension candidate enzyme called Sir2, last August on my old Lagniappe site, and most recently here, back in May. That last post covered some work from David Sinclair's group at Harvard, and has a lot of background that newcomers might want to check out.
Now the same research team has an extremely interesting new paper in Nature - they're having quite a year up there, I have to say. They and the BIOMOL company have been running a screen for "sirtuin" activity. That's the name for the entire family of Sir2-like enzymes, which are found in everything from yeast up to humans. They're looking for molecules that might increase it - rather surprisingly, they've found some.
A group of plant-derived phenols showed up as actives, especially one called resveratrol. What really started the bells ringing was that the same compound had already been suspected of having good effects. It's found in some red wines, and is a candidate for the health benefits seen in their consumption. Having it turn up independently as a sirtuin activator is a very suggestive result indeed.
The potential world-changing result? Adding the compound to yeast cultures increased life span up to 70%. Well, fine, long live the yeast. But those homologs in other species. . . the paper mentions preliminary confirmation in roundworms (C. elegans) and fruit flies (Drosophila.) That's the thing: the jump from fruit flies to humans isn't any larger than the ones they've made so far.
On a molecular scale, what's going on is likely the stabilization of DNA and the delay of a fast apoptotic (cell death) response. Consistent with the work Sinclair's group has been reporting, the effect seems to be very similar to that caused by caloric restriction (which treatment, fittingly, doesn't add anything to the resveratrol-treated cells.) Perhaps we have a bit too much of a hair-trigger when it comes to programmed cell death, at least in the post-reproductive part of our lives. It's going to take some nerves to find out, though. If I can quote myself from that May posting:
"All the processes needed for cell division have the plastic explosive charge of apoptosis wired up to them, and if anything goes wrong, the ignition sequence is triggered and the cell dies. Or, at least, that's how it's supposed to work. The way most cancer cells start their career is by evading this fall-on-your-sword programming, and their descendants never look back. Attaining longevity through better DNA and protein repair, through minimizing oxidative damage and so on - that's presumably going to be benign. But attaining it through messing with the cell cycle could be a tricky business."
It still is, and every attempt at lifespan extension is going to have to get past a lot of doubts about increased risk for cancer. Resveratrol itself is not necessarily a benign compound. But its consumption is already being investigated for lowering cancer risk, which certainly raises one's hopes. (On a side note, I'm sure that it's already raised the hopes of people with a lot of grape skins, the best source of the compound.) It's not the most stable stuff, though. Resveratrol oxidizes in air pretty easily, and you'd have to wonder about the purity of a lot of the stuff that's already being sold - which is going to be but a trickle compared to what's about to be flogged, without waiting on any further proof.
That proof will take some time, but we're going to get it, one way or another. Here are a few things I'm certain of already: First, this is going to stimulate a lot more companies to look into running a sirtuin-activator screen. You have to wonder what lead structures might be lurking inside some of the large pharma compound repositories. The BIOMOL folks better order some extra shipping containers for mailing out their assay reagents.
Second, there is no reason to think that resveratrol itself is an optimized molecule. It's a great starting point, but as a medicinal chemist I can see several things I'd like to do to it immediately. Hey, fellow chemists, let's talk shop here: anyone want to fluorinate those aromatic rings a bit? Replace that trans alkene with a cyclopropane? Constrain one of them into an indene-like ring? Go heteroaromatic instead of phenyl? Hinder rotation with an ortho substituent or two? Look for an acidic substitute for those phenolic groups? Believe me, thousands of folks like me are looking at this structure today and having these exact thoughts, and some of them are going to act on them (if someone hasn't already.)
Third, this compound is surely being given to higher animals as we speak. I can see no reason not to start feeding it to mice in a long-term study. Mice live around two years - let's try for three! After all, it's already been given to rodents in other studies. (And those are just papers from the last year or two!) But as far as I can tell, none of these have allowed the mice to age to their full normal lifespan under resveratrol dosing. Time to find out!
Well, I might remember for the rest of my life - my unexpectedly more spacious life - waiting by the printer for this paper to come out so I could read it for the first time. Or this could be a false dawn. We have a lot of those. But maybe. . .
Items From the Playground
Tonight's posting is brought to you courtesy of a high, thin cloud cover moving in from the west. If it weren't for that, I'd be outside right now with the telescope, taking a crack at observing Pluto for the first time. Posting would have waited until tomorrow morning, but going outside now is like looking through a newspaper.
I was up until all hours last night (a fine clear one here, a rare event this summer), observing Mars and more. Any amateur astronomers out there can be the recipients of my surprise at being able to see Andromeda's G76 from my back yard - and other readers can scratch their heads at that one, which I think a reasonable number of visitors here do anyway on any given day. Sometimes I'm right there with 'em, too.
August is a fairly slow month in the drug industry, like it is everywhere else. Looking around, I note that Schering-Plough, my ownership of whose stock I complain about regularly, has revisited 1996 prices after doing what the financial pages always call "slashing" its dividend. Why the stock tanked so suddenly on that news escapes me, because they'd already announced that they didn't have enough cash to pay the dividend as it stood. Good luck to Fred Hassan, good luck to the people I know there, and, finally, good luck to my stock. Luck will be needed.
And speaking of that, I mentioned a while back that I was short some Imclone stock. I still am, having passed up a chance to make an easy 25% or so by waiting for it to go lower. 'Twas ever thus. It's now back to almost exactly my entry price, but I'm going to watch it a while. As everyone knows, BMS and Imclone resubmitted Erbitux to the FDA recently, and this will be just about the only thing moving the stock. If there's any tremor of uncertainty about the approval, I'll probably get out of the position for now. I assume, actually, that approval is coming (and that people will step in and buy the stock when it does.) I also believe that Imclone is priced too high at the moment; I don't think Erbitux is going to make the optimistic sales case that would be needed to justify it. But I learned long ago that you can lose serious money while waiting for people to come to their senses.
The other drug-stock story I've noted is last week's trading in Pfizer. It sank a bit, attributed to the US approval for Bayer and GSK's Levitra, a Viagra rival. But there was no equivalent move up for either of those companies, probably because approval was widely expected. "But," you say, "if it was widely expected, why did Pfizer's stock take it so poorly?" Well, one reason is that asking the market for minutely rational behavior is like asking a cat for the weather forecast. But my guess is that this is the beginning of a long, slow return to reality for Pfizer's growth forecasts, about which I've been sceptical. To put it mildly.
More of the Above
Thanks to Glenn InstaReynolds for linking to both the nanotech and clomipramine pieces. I strongly suspect it was the latter one that brought in more traffic, but the combination seems to have broken my one-day blogging record. Pity there's not a long list of drugs with side effects like that one, so I could trot one out whenever traffic slows down around here. Frankly, though, most such libidinous side effects work in the opposite direction, if you know what I mean. We in the industry have received the occasional complaint, as you'd imagine, but fixing the problem would require that we understood what the heck was going on.
As for the other post, I wanted to emphasize that the issue of the Proceedings of the National Academy of Sciences that I went over yesterday wasn't any sort of special-topic collection. You can find similar things in Science or Nature, almost every issue. In my field, there are regular nanotech publications in the Journal of the American Chemical Society and Angewandte Chemie, two of the more prestigious venues. While new journals are springing up that specialize in various aspects of nanotechnology, much of the best work is showing up in those high-profile general coverage journals. It's a hot field, so people are going for the good publication slots while they can. Rightly so.
You can tell when a field has matured when the editorial staffs at these places start rejecting a higher fraction of the papers and referring them to their sister publications. (JACS, for example, often rejects papers by saying that they would be better suited for the Journal of Organic Chemistry or another ACS publication.) But it's going to be quite a while before that starts happening to the nanotechnology papers, I can tell you. There's still so much to do. We're only getting started. Don't you wish that Richard Feynman were around, like Freeman Dyson is, to see it happening?
I'll see everyone on Monday, by which time I should be back up to speed here and at the paying job.
There's a lot of nanotechnology talk in the last couple of years: plenty of cheerleading and handwringing, in nearly equal quantities. You might get the impression that there's a clear boundary to the field, and that "Departments of Nanotechnology" are springing up. Actually, it's more something that's coming on fast in many fields at once. It's possible that we're about to enter the log-growth phase of the technology. If we're not, then hold on tight when we do - if this is still the lag phase, we're in for quite a ride.
You can feel it coming. One way is by picking up any recent issue of a good general scientific journal - it's startling. Here's a browse through the August 5th issue of the Proceedings of the National Academy of Sciences. This isn't a special-topic issue, in case you're wondering, and PNAS is just one journal of several I could have chosen. Take a look:
A group from UC-Santa Barbara reports a new way to detect extremely small quantities of specific DNA sequences. They have a way of attaching the desired complementary sequence to the surface of a gold electrode; when it combines with the matching DNA in a sample, the physical change of the DNA's alignment on the gold surface gives an electric signal. The distances involved are small enough so that detecting the conformational change depends on quantum-mechanical tunneling.
At MIT, they've found that attaching gold nanoparticles to one of the best current DNA delivery systems makes it at least ten times better at delivering foreign sequences into the nucleus of cells. Thanks to the gold, they can use an electron microscope to watch the DNA complexes trafficking into the nucleus. Applications of this technology are just beginning, and there's clearly a vast amount of room for it to be optimized and refined.
A group at Oregon has modified a standard lysozyme enzyme, changing the sequence of one three-dimensional loop. This causes a huge shift in the protein's entire structure, changing the conformation at its far end as if it had been levered. Now that they've found a good system, they're working on similar proteins where this could be accomplished just by changing the pH, to make an artificial protein-based mechanical switch.
A Harvard group has been able to observe influenza viruses infecting live cells in unprecedented detail. They managed to label the viruses with a fluorescent dye, allowing them to follow subtleties of the multiple stages of infection in a single cell. Here's a movie of it happening.
At Georgia Tech, they're studying how DNA condenses into nanometer-sized particles. A common shape is a doughnut-like toroid, and they're getting a handle on what causes their variations in size and thickness. This will be important for understanding gene delivery systems and the behavior of polynucleotides in vivo.
In that same general area, a Michigan/Harvard collaboration is looking at how RNA folds into functional structures. This study follows a single RNA molecule's folding behavior in real time, the sort of view that a few years back was unattainable. A few pages later, an Illinois/Dundee collaboration follows with a similar single-molecule RNA study, again watching a single molecule twist and refold with great clarity.
A multi-center collaboration between several Japanese groups and Cold Spring Harbor shows the turning of a natural protein-based rotary motor. It's done with a single molecule of each unit, since (as the authors point out) if you tried to do this study the old-fashioned way, with a bulk sample, all the motors would be in different phases of their rotation and you wouldn't be able to get much useful data.
And, finally, groups at Albany and the Scripps Research Institute have worked out what they call a "near-atomic description" of the key motions of the ribosome, a molecular machine if ever there was one, as it moves along an RNA chain synthesizing proteins.
Now, those are just the overtly nanotechnological papers. I've left out a slew of gene- and protein-profiling papers, showing ingenious methods to study the actions of drugs and biomolecules in living cells. Those assays have been around a few years, but are getting smaller and faster, too. And I've left out several X-ray crystallography papers, an old technique that's now revealing the three-dimensional structures of proteins more quickly (and in more detail) than ever. For example, this Max-Planck-Institute/Munich collaboration crystallized a key protein in the cells of multiple sclerosis patients, identified the amino acid chains that presented themselves on the protein's surface, and searched the protein and genomic databases for similarities. This let them find, out of the blue, that exposure to Chlamydia infection might be a cause of multiple sclerosis in humans. A Chlamydia protein of unknown function gave a close match, the only close match in all the databases.
As the citations above show, this research is taking place all over the world. And the pace is picking up. These results are going to look a bit quaint in two or three years; some of them are going to show their age in half that time. (And what the contents page of PNAS is going to look like in about 2020, I couldn't begin to tell you.) I'm just glad to be here while it happens.
Just a note to the hordes, uh, swarms, er, reasonably large groups of Pipeline readers that I'll be on vacation the first part of next week. The next posting will be ready for Thursday morning. I've told my lab not to invent anything startling while I'm gone - looks bad, y'know, when productivity picks up while you're gone.
Entire Professions Would Be Imperiled
The report that Richard talks about here caught my eye, too. Does a new drug for Parkinson's cause problem gambling, or not? He's right that it's a theoretically plausible side effect, although the media coverage of it has been slipshod.
The thing is, I can believe just about any side effect in a CNS drug. Just try me. Working in the field for several years made me willing to believe almost anything if it was sufficiently weird - the brain is full of neurons, but it's even more stocked with surprises. Some evidence of these unexpected side effects, did you say?
Meet clomipramine. It was developed in the 1970s as an antidepressant, and it features a classic tricyclic structure which will be familiar to any medicinal chemist. Everything from schizophrenia to hay fever has been targeted from this template. Clomipramine is still used for depression, especially in Europe, and it's also a useful drug for treating obsessive-compulsive disorder and panic attacks. As a drug of the 1970s, it's not surprising that no one had a good idea of how it worked. Not that I've any reason to feel superior, though; we're not all that sure now. (It could be that it's mediated by an unusual steroid effect.)
Its most memorable actions, though, have been found in a small subset of patients. No one's sure why only these people react the way they do, and no one's sure of the mechanism behind it. The only thing that's for certain is that when some patients take clomipramine, they have orgasms when they yawn. I'd like to see a market size estimate for that.
The link above takes you to the text of the original report from the early 1980s. It's worth a read for the detailed, doubt-removing descriptions. The phrase "yawning fits" makes an appearance, as well it should.
To the best of my knowledge, no drug research program has followed up on this finding. Perchance there are funding difficulties. When I first heard about this back in graduate school, my first thought was what a great thing it would be to add an optimized version to the coffeepot before a departmental seminar. Now that I'm in industry, I have grander visions. I find myself wondering what the brand name would be, and trying to imagine what the "ask your doctor" marketing campaign would look like. . .
Back on the Air
Apologies to readers who've been checking in expecting new content. Our connection problems here backstage left me unable to post, but things seem to be back at the moment.
In the meantime, I've added and fixed a few things on the blogroll(s) at right. Black Triangle is a new pharma-related site (new to me, anyway!) And the links to Tim Blair and DefenseTech now finally take you to Tim Blair and Defense Tech, which is convenient.
On the One Hand
Chad Orzel over at Uncertain Principles mentioned the chiral properties of drugs the other day, so a post on that is in the pipeline here at "In the Pipeline." It's a rather complex subject, though, because chirality can get pretty complicated all on its own. Before getting into the drug part, I'll give a quick bit of background for the non-chemists in the audience. Fellow chemists can tune out until the next time I post on the subject, unless you want to hang around and catch me in a mistake, of course.
Chirality refers to the "handedness" of objects, and there's no better example than your right and left hands (or your feet, if you find them more distinctive.) It's a property that means that mirror images of the object aren't equivalent, the clearest example of which are right and left shoes. Chemically, those are known as enantiomers of each other. If an object has some sort of plane of symmetry in it (its own internal mirror image), though, it isn't chiral, and can't be chiral. A flowerpot, a cutting board, or a bowling ball aren't chiral objects, because you can come up with an imaginary plane through them that divides them into two identical mirror-image halves.
So far, so pretty good. Things start to get complicated when you consider chemical bonds. Double bonds lie in a flat plane, so they can't impart chirality. Similarly, flat rings like benzene can't, either. You have to spread out in three dimensions to be chiral. A standard single-bonded carbon atom qualifies, though, because its bonds point out in a tetrahedral shape. If all four substituents are different, it's a chiral carbon. (If two or more of them are the same, though, then you've made it so that there can be an internal plane of symmetry, and you've lost the chirality.) Some nice illustrations of these points can be found here.
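For the programmers in the audience, the "four different substituents" rule can be boiled down to a toy check. This is just an illustration of the logic, not real cheminformatics (no geometry, no ring-walking, and the group labels are my own shorthand):

```python
def is_chiral_carbon(substituents):
    """Toy rule of thumb: a tetrahedral carbon is a stereocenter
    only when all four groups attached to it are distinct."""
    if len(substituents) != 4:
        raise ValueError("a tetrahedral carbon bears exactly four substituents")
    # A set collapses duplicates; four unique entries means four different groups.
    return len(set(substituents)) == 4

# Alanine's alpha carbon carries four different groups, so it's chiral:
print(is_chiral_carbon(["NH2", "COOH", "CH3", "H"]))   # True
# Glycine's alpha carbon has two hydrogens, so the chirality is lost:
print(is_chiral_carbon(["NH2", "COOH", "H", "H"]))     # False
```

Real software (the kind the computational chemists use) has to do far more work than this, of course, but the duplicate-substituent test is the heart of it.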
I'm leaving out some tricky details, but those are the main points. If you want a taste of how the subject can make your eyebrows draw together, here's a PDF of Vladimir Prelog's Nobel lecture on the subject. That was nearly 30 years ago; I can assure everyone that the subject has not spontaneously simplified since then.
There are a lot of drug molecules that turn out to be chiral. We try to avoid it, but there's nothing that can be done in many situations. If you don't take special care, you're going to make those compounds in an exact 50/50 mixture of right-handed and left-handed isomers, known as a racemic mixture. "Special care" means that you make it from something that was chiral already, or you use a chiral reagent along the way to make the chiral center, or you use some sort of chiral reagent to separate the 50/50 mix at some point along the way. Chirality, generally speaking, just doesn't show up out of nowhere. You have to work for it.
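The usual way chemists score how far a sample is from that 50/50 racemic mixture is enantiomeric excess, the difference between the two isomers as a fraction of the total. Here's a quick sketch of the standard formula (the function name and numbers are mine, purely for illustration):

```python
def enantiomeric_excess(major, minor):
    """Enantiomeric excess, in percent: 100 * |major - minor| / (major + minor).
    A racemic (50/50) mixture scores 0; a single pure enantiomer scores 100."""
    total = major + minor
    if total == 0:
        raise ValueError("no material to measure")
    return 100.0 * abs(major - minor) / total

print(enantiomeric_excess(50, 50))  # 0.0  -- racemic, what you get without special care
print(enantiomeric_excess(99, 1))   # 98.0 -- the sort of number a chiral synthesis aims for
```

A "chiral" drug process lives or dies by driving that number as close to 100 as the chemistry allows.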
But keep in mind, all your proteins are chiral, all the carbohydrates in your body are chiral, and some of your lipids are chiral, too. The subject of where that chirality came from, and how we ended up with the handedness that we did, is a notorious swamp full of decomposing theories. It's worth a visit all its own sometime. But the result of biochemical chirality is that any chiral drug you take is going to act differently, depending on whether you took the right-handed form or the left-handed form. I'll leave it there for now, and we'll get into some of the weird details next time around. It might not be tomorrow, but in the near future, anyway, you'll hear about piles of useless chiral junk, things that go in a person's mouth right-handed but come out in the urine left-handed, things that are more toxic when they're less pure, and similar head-scratchers.
I'll leave by quoting part of a poem that I wrote one day in grad school, having stuffed my brain with the chiral aspects of optical spectroscopy. What came out was a version of a Lewis Carroll poem which was itself a parody of Wordsworth:
. . .And if now I chance to put
My tongue in super glue
Or madly cram my chiral foot
In its enantiomeric shoe,
I weep, for it reminds me so
Of that old class I used to know,
Of ligand fields and planar nodes
And symmetries of normal modes.
Oh yeah. I should post the whole thing. Hang around here long enough and I'll eventually haul out my molecular-orbital-theory version of "The Raven." Wouldn't want to miss out on that, would you? Corante's traffic is just going to explode. . .
Apologies for the oddly timed postings recently. WebCrimson is having some problems with denial-of-service weasels, so I have to post my harangues when I can get through. In a brief window of availability, I managed to cut off Monday's snappy tagline, so if any of you read the version without the punctuation at the end, it's fixed now. More to come later tonight, assuming that the parties involved have taken a break to go and get a life.
Degrees! Get 'Em While They're Hot!
Reader Jim Tung found my post about getting out of graduate school of more than, well, academic interest. He's a grad student himself, so it's a subject that's getting more interesting all the time. He wrote with a follow-up question:
It's pretty clear that the work that you do as a graduate student and who your adviser is matters to you in terms of getting your first job as a synthetic organic chemist in the pharmaceutical industry. Ten years down the road when a Ph.D. decides to move from one company to the other, how much does their adviser matter?
Does it matter in terms of 'connections' or 'the foot in the door'(you worked for so-and-so, I know so-and-so is a good adviser, therefore I'll give you an interview) or is it actually a factor in hiring, just like it is in terms of getting your first job?
These are good questions, and he's already on the right track for the answers. First off, it's absolutely true that your grad school/postdoc advisor gets you in the door for your first job. It's not like you can't get looked at if you're from a smaller group, but it's a definite plus to be from a group that people know about. It's your most recent position that does the most good; it doesn't matter if you did a PhD in a relatively obscure group if you postdoc with someone who has more recognition.
Of course, it's widely recognized that it's possible to have mutant misfires coming from even the most high-profile research groups. (There are even a few of those big-time groups that folks worry about a bit come hiring time, because working in them is such a savage experience that too many people come out damaged. But that's a separate topic.) But playing the percentages, you're more likely to get someone good from a well-known group. Not ten times more likely, but there is a statistical edge. Next to that is a lesser-known group at a good school (say, from a younger faculty member who's still trying to make a name.) At the bottom would be, to pick the worst combination, a person who's the last grad student of an elderly faculty advisor that, even after all these years, no one's ever heard of because he's at an obscure school whose strongest degree program is in goat herding.
But the importance of all that tails off after a few years. It's hard to quantify, but I'd say that it's probably half as important after five years or so, and the influence is basically gone by the ten-year mark. A famous group on your resume will wear better, of course, because there is a network effect. As time goes on, other members of a large and well-known group will be distributed throughout the industry, working their way up to positions where they can affect hiring decisions. That won't cancel out a lackluster industrial resume, but it'll make it easier to at least get in the door - as long, of course, as you didn't have the reputation as the group's token bozo.
The thing is, no matter where you came from, after a few years you'll have to have some proof that your background actually led to something in your industrial career. You have to show that you've learned about real-world drug development, picked up the basics of medicinal chemistry and learned some biology. As the importance of your pedigree wanes, all this takes over as the key factor. The guy from Goatherd State is going to be in demand if he turns out to have great ideas for things that tend to make it into the clinic. Contrariwise, someone from Pedigree U will have some explaining to do if all he's done is go around in circles since then.
The real world tends to level things out reasonably well. It's fun to watch, as long as you're sensible enough not to become one of the levelees.
Viva In Vivo?
Yesterday's post about the possible end of target-based drug discovery brought in some interesting mail. The feedback was more about the possible return of in vivo screening, actually. As it turns out, there are already some small companies out there taking a crack at this idea. I'm going to look them over and report back in more detail.
Whatever form it takes, it won't be a rerun of the old days. We have a lot more compounds to test, for one thing, and the standard dose-a-bunch-of-mice protocol from the 1960s just isn't going to be able to handle them. And we have so many more readouts now. At the very least, if you get into serious in vivo screening, there's going to be a heavy use of gene-chip assays in an attempt to see what's going on at a molecular level. They're already being used on a lot of fishing expeditions, from what I can see.
One (more targeted, and perhaps more interpretable) use of the expression assays would be the "tox-chip" kind, where you look at a lot of enzymes that get upregulated in an animal when it has to deal with something really poisonous. To that end, I've heard a colleague say "Wouldn't it be nice if we could just dose every compound we have, in a few mice per compound, and get rid of all the stuff that's just too toxic to use as a lead?"
Predicting toxic effects from cell assays would be a lot less trouble, of course. In theory (of course!) But try reducing it to practice: which cells are you going to use as your surrogate? Just how many assays are we looking at, and how many have to hit to reject a compound? And what about the effects that need a whole animal to show up? I haven't heard of any company running their whole compound library through an Ames test, for example, just to pick one cell-level tox assay that measures (imperfectly) one kind of toxicity.
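To make that "how many assays have to hit" question concrete, here's a toy version of the kind of decision rule you'd have to commit to - the assay names and the threshold are made-up placeholders for illustration, not a real screening panel:

```python
def reject_compound(assay_hits: dict, threshold: int = 2) -> bool:
    """Flag a compound as too toxic to carry forward if it hits
    at least `threshold` of the cell-level tox assays.

    assay_hits maps assay name -> bool (True = toxic signal).
    The threshold is the hard part in practice: set it too low
    and you throw away good leads, too high and junk gets through.
    """
    n_hits = sum(assay_hits.values())
    return n_hits >= threshold

# Hypothetical panel: two of four assays flag the compound.
panel = {"ames": True, "herg": False, "cytotox": True, "mito": False}
print(reject_compound(panel))  # -> True, so this one gets tossed
```

Writing the rule down takes thirty seconds; defending any particular choice of assays and threshold is where the arguments start.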
At least everyone agrees on the undesirability of mouse toxicity. Well, most everyone. So we could just weed out all the compounds that show immediate trouble - if we had an insane number of mice, a huge corps of people to dose them, several extra buckets of money, and, oh yeah, if we had enough of all those compounds in our corporate files to do all that dosing. Which we certainly don't, even if we had all that other stuff. Thus the need for a new technique, if anyone's going to go seriously into whole animals. As I said, I'll take a look at some of these folks who are attacking the problem. And if anyone has some other candidates, drop me a line and I'll add them to the survey of the field.
The latest issue of Nature Reviews: Drug Discovery has a wonderful article by Hugo Kubinyi, titled "Drug research: myths, hype and reality." The text isn't free on Nature's site, but fortunately you can download a PDF of the article from the author's site.
He takes on several concepts that you can't walk through a drug company without tripping over. They each deserve discussion at length, and I'll try to give them their due over the next few days. One that I've heard people discuss - very quietly, because it's nearly heretical to challenge it - is the dominance of target-based drug discovery.
Kubinyi correctly mentions the shift to this model of research over the last 25 years or so. The dogma in most companies now is "one target, one drug." You work on clearly defined molecular mechanisms, specific receptors, specific enzymes. The search is always on for new targets that look to be relevant to a known disease. Phosphatase enzyme that seems to be important in insulin sensitivity? Got it. Kinase that helps control the cell cycle, could be disrupted for cancer therapy? Check. Serotonin receptor subtype that seems to be correlated with asthma severity? Put it in the hopper. And so on. There are diseases with more than one target that can work against them (try blood pressure,) and diseases with a long list of targets that look as if they should work (cancer.) And there are diseases that have no good targets at all, or targets that would be good if anyone could find a drug that affected them.
But the problem is, there are many very successful drugs - and not all of them old, by any means - that didn't have a target. They just did something useful, and no one was sure how. The "glitazone" drugs for diabetes are an example; their mechanistic target was only figured out while the first compounds were already in clinical trials. To pick another example from the diabetes field, Glucophage (metformin) is still the subject of strong debate about what its molecular target(s) might be. It isn't just diabetes, by any means: I don't think that Schering-Plough yet knows how their cholesterol absorption inhibitor, Zetia, really works, not down to the specific protein it's targeting.
Then there are the successful drugs that hit more than one thing. All drugs have side effects, of course, and especially as the dose goes up you're going to start interacting with targets you never thought about. But some drugs depend on hitting several targets as an essential part of how they work. Kubinyi mentions the antipsychotic olanzapine, and that's a good example. That one hits almost a dozen different receptors, and it hits them hard. If you were running a CNS drug development program in strict target-oriented style, when that one went through the broad selectivity assays you'd throw it out like a week-old mackerel. But it works. And it sells, hundreds of millions of dollars per year for Eli Lilly. A high price to pay for being picky, actually, if you're going to turn that one down. And it's not the only drug in its class that hits the brain receptors with both hands and a foot, either.
That search for new targets I mentioned? It's moving at a particularly frenzied pace these days. There's a sense that most, perhaps all, of the good easy ones have been found and exploited. The genomics gold rush hasn't turned up many, although there are always dozens more gene-based target proposals that might have something to do with something, if you could just put your finger on what it might be. It might just be time for something different.
So, if you're not going to do defined targets, most industry people will ask, then just how are you going to do drug discovery? Put compounds in your animal models, one after the other, and see what works? Well, we may just come to that. We just may. Oh, I know the complaints; I've made them myself: it's slow, it costs too much, it's inelegant, it takes too much of each compound, too many animals. But maybe some of us should start thinking about a way to optimize the in vivo discovery, rather than burning through more and more targets that we understand less and less.
The getting-out-of-grad-school posting from the other day has attracted some comment from Chad Orzel at Uncertain Principles, who's seen the same thing in the physics field, and from Chris Cotsapas, a grad student worried about falling into the same traps.
Wednesday is Double Coupon Day
Mentioning grad school brings to mind the single largest difference between academic chemistry and industrial chemistry. If you're expecting some sort of philosophical divergence, prepare to be disappointed: the biggest difference is money.
And no, I'm not talking about the money that you can earn in one place or the other. Industry pays better, of course, but that's a given. What I mean is the money available to spend in the lab. It's a rare academic lab indeed that comes near to spending what an industrial lab does. That's partly driven by the higher salaries, actually. Since industry isn't built on a foundation of cheap labor, unlike the slave economy of graduate study, it's actually cost-effective to buy fancier equipment to automate more tasks. Why pay someone to stand around the HPLC machine when you can have an autoinjector and fraction collectors?
There's more disposable equipment used in industry, for the same reason. I know that there are grad students out there cleaning and re-using glass pipets and such, and my heart goes out to them, because I've done some of that myself. But no more. Vials, pipets, plastic syringes - we go through them like tissues, because time is ticking away. And don't even think about one of the most foolish of the alleged money-saving plans of academic advisors, recycling waste solvents. I don't even think that works out correctly under graduate school conditions, much less in industry. There are whole chemistry departments at some schools that try to recycle wash acetone or hexane to run columns in, but nobody ever trusts the stuff. Rightly.
And as for fresh reagents, why spend time making something, when you can buy it? That goes for any kind of apparatus, too. It takes some getting used to. It's hard to pick up the phone at first and spend twelve hundred dollars on one reagent, but practice makes it easier, naturally. Academic chemistry labs come in all degrees of penury, but no matter how well off they are, they never give anyone a free hand in ordering supplies.
I'm sure there's a limit, even in industry, but I've never made a serious run at it. If you dropped twelve hundred dollars every afternoon for a week, I suppose your phone would start ringing. But the person on the other end would just want to make sure that you really knew what you were doing, or would offer to get you a better price for a bulk purchase or something. But if you try that trick in an academic lab, you'd better be holding the phone out at arm's length if you value your eardrum.
And I Sprayed Gravel in the Parking Lot
Yesterday's post reminded me of something from my graduate school days. It was that part about the fear of running a crucial experiment. Back in those days, though, it wasn't so much that I was worried about putting my ideas to the test. Instead of being afraid that my key experiments wouldn't work, I gradually realized that I was afraid that they would.
That's an effect of the graduate school environment. Note that all my comments on the experience of graduate school apply first to degree programs in the hard sciences. I only had one year of classes (and for the second half of that I was working in the lab.) All the rest was research at the bench. I have never experienced an English Literature PhD program, and if I continue on the path of righteousness, I never will.
Another thing I need to make clear is that, in many ways, I did not enjoy my graduate experience. I'm always suspicious of someone who claims to have had a good time in grad school, not that I've met many of those. The pressure of trying to finish a crucial career-gatekeeper degree was one factor, and the relentless focus on a single problem didn't (and doesn't) suit my personality very well. And on top of those factors, there's the usual complaint about the PhD process, that it places too much power in the hands of a student's advisor. I didn't have that many significant disagreements with mine, but when it came time to get out, I knew that my fate was completely in his hands. It wasn't a feeling that I enjoyed; I wouldn't have enjoyed it no matter who was on the other side of the relationship.
So, all that said, you'd think that I would have been chewing through my ankle to get away from the place, right? Yes and no. That's the other thing about grad school: it's designed to be a temporary part of your life, but there's no set timetable. The rest of your life begins when you get out, and no one knows when that's going to be. After you've been there a few years, you start to get perversely comfortable. You know all the tricks of the research group, all the ins and outs of the lab. You know more about your project than anyone else does. You're now an old hand, someone the younger grad students come and ask for advice. Things could be worse.
They sure could. And they'll get worse, too, if you get too complacent and start enjoying yourself too much. It takes (in chemistry terms) some activation energy to get out of grad school, to polish off a project, write it up into an acceptable dissertation, and (not least!) figure out where on earth to go next. And that, of course, usually means picking up and moving across the country, to a post-doctoral position (or a real job,) and starting all over as the lowest form of life again. It's a big disruptive decision, and wouldn't it be better if you could just put it off for a while?
And that's what some folks do. Grad schools around the country have these people haunting their hallways. They're people who've been working on their dissertation for three years, who are still trying to finish their total synthesis of a molecule that no one cares about any more, who have put down roots in a place that they shouldn't have. Telling someone like that to get a life isn't accurate: they have one. It consists of being a grad student.
It's not a good choice for the long haul. Watching people who started after you did finish their degrees and leave isn't very enjoyable, especially when it's clear that some of these people are off to bigger and better things. I knew a few of these perpetual students myself, and they always had a vague air of sadness hanging around them. You could tell that they knew that the train was leaving the station, yet again, and they weren't on it.
I caught myself very early in the process. I realized that I was avoiding some experiments just because they might be key steps to getting me out, and I realized that I was trying to avoid the whole process of finding a place to post-doc and getting on with my life. I actually finished my degree quite early, by the standards of my advisor at the time. I was his quickest PhD in years. It meant that I didn't finish my total synthesis, because I estimated it would take most of another year to do that, even if everything worked perfectly (and boy, would that have been a first.) And I've never regretted that decision. I don't think I've ever lost as much as five seconds of sleep about not getting to the end of my molecule, but I've lost count of the number of times I've been happy that I got out of graduate school when I did.
Per Fits and Starts, Ad Astra
Last summer I was working on an interesting chemistry idea. I posted about it on and off, in what was likely an irritating fashion - irritating because I could never quite go into just what the idea was. There were two reasons behind that: for one, my employer gets the rights to chemistry ideas that I work on in their labs, and rightly so. (The contracts that you sign when you join any research-based industrial organization are very, very clear on that point.) The second consideration is scientific priority, and scientific pride.
Now, what I'm doing isn't going to win me a Nobel prize, but it is a very nice idea, and one of the better ones I've ever had. So it would be more pleasing to me if I could get it to work with my own hands before letting everyone else take a crack at it. One problem is that I tend to work on things like this in jerky bursts of activity, and those don't come nearly as often as they should. Someone with more discipline would have made more progress, no doubt. A scientist who combined periods of free-association idea generation with stretches of well-structured lab work to follow them up would be the person to have around. I haven't met too many of those people, but they certainly exist. I'm not one of them.
I comfort myself by thinking that the folks with the most disciplined work schedules tend not to have ideas as off-the-rails as this one. It's a common complaint in the drug industry that the work is so ceaseless as to leave people with no time to think. And as I've written before, if you don't have some staring-out-the-window time, you don't have that many ideas. I know that when I've run a project myself, I don't have as many good ones. There's no mental overhead left for them; I'm too busy making sure that everything's going the way it should. It's exciting, being at the head of a drug project, but it does wear you out.
Even when you're not running a project, there's always enough work to keep you busy. Keeping busy isn't the problem. The problem is remembering that "busy" doesn't always mean "productive," although they can be mistaken for each other in dim light.
I bring all this up because, as I mentioned a few weeks ago, I'm taking another crack at this stuff. I've been messing around with the idea(s) on and off over the intervening months, but a very good opportunity now presents itself. It's the same core concept that I've worked on before, but (for once) it matches up very well with the project that my lab is officially working on. If I can keep at the tasks at hand, this coming week will see most of the groundwork laid, and the week after that should see the first runs of the real thing.
Here's hoping that I ignore all distractions, and have the nerve to put my favorite ideas on trial. That's the real problem with working on ideas of your own, ideas that you think have the potential to be really good. They don't all work. Most of them don't work. It can be more psychologically comforting to keep them in the "untried but promising" category, rather than find out if they're real.
DIY Drug Development?
I have to say, I feel better after venting in yesterday's post. Many thanks to Glenn Reynolds for linking to it and getting it out to a broader audience. It's brought in a larger than usual amount of mail: some of it from inside the industry, none of it defending PhRMA.
Another weblog that talks about the reimportation issue quite a bit is DB's MedRants. I noticed this item there the other day, advanced as a possible solution to drug pricing. It's about a group of universities that are trying to develop their own drugs:
The universities say they do not plan to become drug companies. But by doing more of the basic work on drugs themselves - like testing them for toxicity in animals - they say they can then entice pharmaceutical companies. Moreover, they say, they will get a better deal because some of the risk has been taken out.
Whether the consortium will work is open to question. Under the deal, SRI International, which does contract research, will offer the universities up to 30 hours of free consulting for each project to develop a plan for how to test for toxicity, make the drug for clinical trials and other necessary steps. But the universities would still have to come up with the money for the tests, manufacturing or other tasks.
Well, there's nothing there that 30 whole hours of free consulting can't fix, I'm sure. And all the universities have to do, it seems, is raise the money for the really expensive stuff. Quite a deal. I wish these folks the best, but I can't help but think that they're going to be climbing a steep learning curve with ropes and pitons. DB's hopeful comment is "If this works, the straw man argument about investing in research may move towards moot." Well, as someone who's been getting beaten up for years by said straw man, let me add some comments of my own: If you know some chemistry, some biochemistry, some molecular biology or medicine, then the business of drug development looks pretty hard. Then when you try it out, you find that it's a lot harder than it looks.
As I've mentioned before, I've been doing this since 1989 and have yet to work on any compound that a sick person has put in their mouth. I've worked on something that's made it into phase I (safety testing,) but that's done in normal volunteers, not patients with the disease. And mine is not a hard-luck story, by any means. There are people who've been doing this for much longer than I have who are in the same position.
During that time, either in my own efforts or projects going on around me, I've seen: things that looked like they were going to roll back a dozen kinds of cancer that never made it to market. Drugs that looked like the answer to obesity that, at best, made people slightly fatter. Treatments for chronic pain that worked in every species except humans. Promising compounds in several areas that turned out to have blood levels twenty times lower than anyone's worst estimate. A series of cardiovascular drug candidates that failed their final, crucial animal test each time, and each time for a completely different reason. A new kind of drug for schizophrenia, which passed every test with ease, but turned out to do as much for the suffering patients as the equivalent dose of guacamole.
That comes nowhere near exhausting the list of failures. Those are just some of the ones that ate up the most money and time. And over the years I've seen many presentations from academia, where they speak of promising compounds they've discovered, with interesting biological activities and potential as new drugs. Having seen hundreds of interesting compounds crash and burn, all I can do is wish them luck. I really do, I wish everyone in the industry luck. We don't have as much of it as we need.