Corante

About this Author
[Photo: college chemistry, 1983]

[Photo: Derek Lowe, the 2002 model]

[Photo: after 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis, and other diseases. To contact Derek, email him directly (derekb.lowe@gmail.com) or find him on Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigillata
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

October 30, 2014

Down With the Western Blot?

Posted by Derek

A reader sends along a thought that touches on a lot of the publication scandals that have happened in molecular biology over the years. A very common feature of these is manipulation of Western blots (see that graphic the other day on the Sarkar case, assembled by a reader of Retraction Watch who goes under the name of "Scrutineer", for an excellent example). In the hands of the unscrupulous, images get duplicated, swapped, stretched, flipped, contrast-adjusted and digitally whitewashed, all in the name of making a good story.

But gel images seem to be an essential part of publication in these fields. Papers in any kind of protein-level biology work are full of lane after lane illustrating experimental changes and controls - no one would believe the work without them, but it's getting to the point that people are wondering what work to believe with them. The question is, is there something better?

Because when you get right down to it, a picture of a blot is a pretty low-information object. Bands are there, or they aren't. If there are a lot of bands, it's not like anyone knows what they all are - the only thing you can say is that this sample isn't so clean, or this protein didn't express so well. Contrast this to the sorts of characterization that you can do on smaller organic molecules. You have distinctive fingerprints of proton and carbon NMR, high-res mass spec (with its isotopic distributions and fragmentation patterns) - even good old IR, which no one uses much any more, gives a lot of information in a compound-specific readout. But for molecular biology, it's "this lane has a band" versus "this lane doesn't have a band". It's as if the chemistry journals were full of TLC plate images, which they most certainly aren't.

It's for sure that characterization of proteins is a much harder business than the characterization of small molecules. I can see why the blot-image standard got started, and why it persists. But it's too easy to fake, too easy to manipulate. Is there anything to replace it, or to enhance it to the point where you can't just whip one up in ten minutes at the keyboard? I know that editorial staffs at the major journals have been thinking about this problem, but how much progress is being made? It's still a key part of work at the bench, but when it comes time to publish, is it time to say goodbye to the plain old Western?

Comments (2) + TrackBacks (0) | Category: Biological News | The Scientific Literature

October 29, 2014

Google's Nanoparticle Diagnostic Ideas

Posted by Derek

Google's "Google X" division, the part that works on odd high-risk high-reward projects, is apparently interested in diagnostic nanoparticles. That Wired article is pretty short on specifics, but the company's Andrew Conrad revealed a few details in a talk yesterday. The idea, apparently, is to use magnetic-cored nanoparticles to interrogate various body functions, then to reconcentrate them in some superficial vein for a readout. I had thought initially that there would be a blood draw at that point, which seems like less of a leap, but apparently the idea is for some sort of across-the-skin readout.

That's still not a crazy idea, although it has a ways to go (and an awful lot of work in animals). And I'm not sure how the noninvasive readout thing is supposed to work - I can imagine a lot more being done if you take some of these things back out. The nanoparticles could be tagged in various ways for sorting after removal, and (in theory) you could get quite a bit of information that way. The tricky part, either way, will be targeting the things that you can't get from just circulating - otherwise you'd just take a blood sample as usual and add your nanoparticle brew to it ex vivo (which is not such a bad idea, either, and has been worked on by many others). Perhaps that's why Google's team is going the extra step.

So they're presumably checking up on solid tissues somewhere, not the soluble blood factors, and that brings up a lot of pharmacokinetic issues. Many things that interact well enough to be diagnostic might well stick to the tissue instead of circulating back around, for example. And the ultimate fate of all these particles will be key - what effects will they have themselves, how well are they cleared, by what routes and at what rate, and so on. But I'll reserve judgment until we know more about this. Google is saying that they're not planning on developing this all the way themselves, but are trying to get other life sciences companies interested. How interested anyone gets might be a measure to watch.

Comments (34) + TrackBacks (0) | Category: Analytical Chemistry

Aileron Heads Toward the Clinic

Posted by Derek

Aileron, the stapled-peptide company, has had its ups and downs over the past few years. They went through the typical cut-back-hard phase not too long ago, but have been rounding up more money to try out their p53-targeted idea (blogged on here).

I'm glad to hear it. I would really like to see how some good stapled-peptide candidates perform in the clinic, and the p53 pathway is just the sort of hard-to-drug place you'd go with one. Aileron has one in Phase I in the growth-hormone pathway, but there's been very little news of it, from what I can see. I hope there's enough money to do a good job with these things.

Comments (7) + TrackBacks (0) | Category: Cancer | Clinical Trials

Viehbacher Out at Sanofi

Posted by Derek

The talk of Chris Viehbacher being in trouble with Sanofi's board of directors was accurate: he's been dismissed. It was a unanimous vote, and the chairman cited a lack of trust and a "solitary management style" as factors. (This after a statement yesterday by Viehbacher that CEO succession wasn't on the board's agenda - they really weren't communicating, it appears).

They must have really wanted him out of there before he made any more solitary moves, though, because no replacement has been lined up yet. But if the board brings up a more clubbable Frenchman from inside the company, that probably won't go down well with the investment community, who seem to have been OK with Viehbacher for the most part. Look for some top-level people at other companies to be taking some sudden vacation days over the next few weeks. The company's partners, Regeneron in particular, will be hoping that the search is short.

Update: Matthew Herper says that this is a bad idea.

Comments (23) + TrackBacks (0) | Category: Business and Markets

October 28, 2014

Rumblings About AstraZeneca and Antibiotics

Posted by Derek

Well, let's get the rest of today's bad news out there. There are persistent reports that AstraZeneca is going to be getting rid of the rest of its antibiotics research. David Shlaes' blog is the source of this one, and he may well have some information from people who know. The company's response has not been encouraging, either. Here's what they told Ed Silverman at Pharmalot:

". . .we have previously said on a number of occasions that we would take an opportunity-driven approach in our non-core therapeutic areas of infection and neuroscience,” she continues. “This means we would focus our resources on the core therapeutic areas and look for opportunities to maximize the value of our pipeline infection and neuroscience."

Oh, dear. When they start talking about opportunities and maximizing value, it's generally a very bad sign. Sauve qui peut is the usual reaction to this sort of statement, and I can't say that it would be inappropriate.

Comments (17) + TrackBacks (0) | Category: Business and Markets | Infectious Diseases

Amgen Cuts Harder

Posted by Derek

Not good news. Amgen has announced that it's not just cutting 12 to 15% of its employees - it's going to cut 20% of them. So that means over a thousand new layoffs, on top of the ones that have already been put in the hopper. No details yet, that I know of, about where these cuts will take place, but Amgen already has fewer sites than it used to, so that certainly narrows it down.

As with the previous announcement, this one has sent the company's stock straight up. And hey, they're buying back more shares, too. So there is that.

Comments (29) + TrackBacks (0) | Category: Business and Markets

An Open-Source Cancer Pitch, Deconstructed

Posted by Derek

I'm confused. Read this and see if you end up the same way. TechCrunch has the story of Isaac Yonemoto, who's crowdsourcing a project around a potential oncology compound. It's a derivative of sibiromycin, a compound I hadn't come across, but it seems that it was first studied in Russia, and then at Maryland. Yonemoto's own work on the compound is in this paper from 2012, which looks reasonable. (Here's more). And the crowdfunding pitch is also reasonable, in lay-audience terms:

The drug candidate 9DS was developed at the University of Maryland. The last work done on the drug showed that it had activity against cancer competitive with leading cancer drugs such as taxol. Moreover, 9DS is also likely to have lower side effects than most chemotherapies, since a related compound, SJG-136, seems to have low side effects in early clinical trials.

Project Marilyn involves: production of more 9DS, and submitting 9DS to a xenograft study ('curing cancer in mice'). This is the next step in drug development and an important one on the way to doing clinical (human) studies. The process we're seeking to fund should take approximately 6 months. If we recieve more funding, we will add stretch goals, such as further preclinical experiments on 9DS, development 9DS analogs, or other exciting anti-cancer ideas.

But here's where things begin to swerve off into different territory. Yonemoto isn't just talking about some preclinical spadework on yet another oncology compound (which is what the project actually is, as far as I can tell). He's pitching it in broader terms:

. . .Some drugs can cost upwards of $100,000 a year, bankrupting patients. This level of expense is simply unacceptable, especially since 1/3 of people will get cancer in their lifetime.

One solution to this problem is to develop unpatented drugs - pharmaceutical companies will have to sell them at a reasonable price. To those who believe that drugs cannot be made without patents we remind them:

When Salk and Sabin cured polio, they didn't patent the vaccine. It's time to develop a patent-free anticancer drug for the 21st century.

The software industry and the open-source movement have shown that patenting is not necessary for innovation. Releasing without a patent means the drugs will be cheaper and it will be easier to build on the work to make improved drugs or drug combinations. Releasing without a patent means expanded access to drugs in countries that can't afford extensive licensing and export agreements.

OK, let's take this one apart, piece by piece, in good old classic blogging style. Yes, some oncology drugs are indeed very expensive. This is more of a problem for insurance companies and governments, since they're paying nearly all of these costs, but the topic of drug prices in oncology has come up around here many times, and will do so again. It's especially worrisome for me that companies are already up close to the what-the-market-will-possibly-bear price with things that are not exactly transformative therapies (what pricing structure will those have?).

But are unpatented drugs the solution? It seems to me that pharmaceutical companies will not "have to sell them at a reasonable price". Rather, unpatented compounds will simply not become drugs. Yonemoto, like so many others who have not actually done drug development, is skipping over the longest, most difficult, and most expensive parts of the process. Readers of the crowdsourcing proposal might be forgiven if they don't pick up on this, but getting a compound to work in some mouse xenograft models does not turn it into a drug. Preparing a compound to go into human trials takes a lot more than that: a reliable scale-up route to the compound itself, toxicology studies, more detailed pharmacokinetic studies, formulation studies. This can't be done by a handful of people: a handful of people don't have the resources and expertise. And that's just setting the stage for the real thing: clinical trials in humans. That crowdsourcing proposal skates over it, big-time, but the truth is that the great majority of money in drug development is spent in the clinic. The amount of money Yonemoto is raising, which is appropriate for the studies he's planning, is a roundoff error in the calculations for a decent clinical campaign.

So who's going to do all that? A drug company. Are they going to take that on with an unpatented compound that they do not own? They are not. Another thing that a lay reader won't get from reading Yonemoto's proposal is that the failure rate for new oncology compounds in the clinic is at least 90%, and probably more like 95. If you are going to spend all that money developing compounds that don't make it, you will need to make some money when one of them finally does. If a compound has no chance of ever doing that, no one's even going to go down that road to start with.

Now we get to the Salk/Sabin patent example. There are plenty of persistent myths about the polio vaccine story (this book review at Technology Review is a good intro to the subject). Jonas Salk created one of the most enduring myths when he famously told Edward R. Murrow in an interview that "There is no patent. Would you patent the sun?" But the idea of patenting his injected, killed-virus vaccine had already been looked into, and lawyers had determined that any application would be invalidated by prior art. (Salk himself, in his later work on a possible HIV vaccine, did indeed file patent applications).

Sabin's oral attenuated-virus vaccine, on the other hand, was indeed deliberately never patented. But this does not shed much light on the patenting of drugs for cancer. The Sabin polio vaccine protected all comers after a single dose. The public health implications of a polio vaccine were obvious and immediate: polio was everywhere, and anyone could get it. But Yonemoto's 9DS is not in that category: cancer is not a single disease like polio, and is not open to a single cure. Even if a sibiromycin derivative makes it to market (and they've been the subject of research for quite a while now), it will do what almost every other cancer drug does: help some people, to a degree, for a while. The exceptions are rare: patients who have a tumor type that is completely dependent on a particular mechanism, and that doesn't mutate away from that phenotype quickly enough. Most cancer patients aren't that fortunate.

So here's the rough part of cancer drug discovery: cancer, broadly speaking, is indeed a big public health issue. But we're not going to wipe it out the way the polio and smallpox vaccines wiped out their homogeneous diseases. Cancer isn't caused by a human-specific infectious agent that we can eliminate from the world. It crops up over and over again as our cells divide, in thousands of forms, and fighting it is going to take tremendous diagnostic skill and an array of hundreds of different therapies, most of which we haven't discovered yet. And money. Lots of money.

So when Yonemoto says that "The software industry and the open-source movement have shown that patenting is not necessary for innovation", he's comparing apples and iguanas. Drug discovery is not like coding, unfortunately: you're not going to have one person from San Jose pop up and add a chlorine atom to the molecule while another guy pulls an all-nighter in St. Louis and figures out the i.v. formulation for the rat tox experiments. The pitch on Indysci.org, which is really about doing some preliminary experiments, makes it sound like the opening trumpet of a drug discovery revolution and that it's going to lead to "releasing" a drug. That's disingenuous, to say the least. I wish Yonemoto luck, actually, but I think he's going to be running into some very high-density reality pretty soon.

Update: Yonemoto has added this to the comments section, and I appreciate him coming by:

"Thanks Derek! You've basically crystallized all of my insecurities about the future of open-source drugs. But that's okay. I think there are business models wherein you can get this to work, even under the relatively onerous contemporary FDA burden. To answer a few questions. I think sibiromycin is not a bad candidate for several reasons: 1. (I'm not sure I buy this one but) it's a NP derived and NP derived tends to do well. 2. A molecule with a similar mechanism has made it into phase III and phase I/II show only mild hepatotoxicity and water retention, which are prophylactically treatable with common drugs. 3. There is reportedly no bone marrow suppression in these compounds, and importantly it appears to be immune-neutral, which would make PBDs excellent therapies to run alongside immune-recruitment drugs."

Comments (57) + TrackBacks (0) | Category: Cancer | Clinical Trials | Drug Development | Drug Industry History | Infectious Diseases | Patents and IP

October 27, 2014

Sanofi - Trouble At the Top?

Posted by Derek

There are reports that Sanofi's CEO, Chris Viehbacher, is in trouble with the company's board of directors. The reasons are various, ranging from his move to Boston, through the state of the company's oncology portfolio, all the way to his outspokenness, which is not very French. (That last quality has been noted here as well, several times.) All of this may just be rumor-mongering, or it could be someone else's boardroom politics. But you can bet that people inside Sanofi are watching and wondering.

Update: definitely not just rumors. Via John Carroll at FierceBiotech, here's a letter that Viehbacher sent to Sanofi's board in September, outlining reasons why they shouldn't replace him. Note that the only reason we're seeing this is, presumably, that it's been leaked by someone who wants him replaced. . .

Comments (22) + TrackBacks (0) | Category: Business and Markets

Sarepta's Duchenne Therapy Is A Lot Further Away

Posted by Derek

I wrote here about Sarepta, a small company having plenty of difficulty getting a therapy for Duchenne muscular dystrophy through the FDA. At that point, the problem was the accelerated-approval pathway, but things have now gotten a good deal worse.

Following a meeting with regulators in September, the biotech spelled out a new set of data that the FDA is looking for in the application, and the biotech says it will have to delay filing--another dramatic turning point for the company this year, which saw its shares plunge 30% on the news this morning.

Sarepta shares have been on a roller coaster ride over the past two years as the company was forced repeatedly to move the goal lines on a prospective approval for the drug, which so far has registered promising data from a tiny study involving only a dozen boys. The biotech today spelled out regulators' demands for imaging and longterm results as well as more safety data, all of which will stall the company until at least mid-2015--or longer.

To make matters worse, the CEO has been telling investors that the agency was much more favorable, and in fact was strongly considering an early approval for the drug even before any Phase III results came in. Those investors are deeply unhappy now, and the parents of the potential patients even more so. This whole situation is a terrible mess, and honestly, it looks as if it could have been avoided. From outside, Sarepta seems to have been trying to make far too much of a single small study (12 boys, and the data only looked good when you excluded two of them). You have to provide convincing data in this business, and that takes time and costs money. Trying to take shortcuts is a low-percentage move.

Comments (3) + TrackBacks (0) | Category: Business and Markets | Clinical Trials

Fazlul Sarkar Subpoenas PubPeer

Posted by Derek

Last month I mentioned that a professor at Wayne State, Fazlul Sarkar, was thinking of suing the PubPeer site or its commenters, after a host of negative comments on his papers disrupted his move to the University of Mississippi. Well, he's making good on that threat, according to Retraction Watch.

The court papers have all the details on just what a deal he was getting at Ole Miss:

First, we learn that in addition to a salary of $350,000, which has been previously reported, the University of Mississippi had offered Sarkar “Commitment to ‘help us realize the $2 million level on endowed professorship,’ “Relocation expenses up to $15,000,” “Laboratory and office space in two locations, Research Assistant Professors, up to two additional Research Associates, and administrative support,” “A start up package of $750,000,” and “Moving expenses for the laboratory and senior personnel.”

Sarkar had already signed the papers and submitted his resignation back in Detroit (and put his house on the market) when everything hit the skids. According to the court documents:

[I]n a letter dated June 19, 2014 – just eleven days before Dr. Sarkar was to begin his active employment – Dr. [Larry Walker, the Director of the National Center for Natural Products Research at the University of Mississippi Cancer Institute] rescinded that employment, as additionally confirmed by the Chancellor Jones on June 27, in effect terminating Dr. Sarkar before he’d even begun. Dr. Walker’s June 19, 2014 letter cited PubPeer as the reason, stating in relevant part that he had “received a series of emails forwarded anonymously from (sic?)PubPeer.com, containing several posts regarding papers from your lab. . .

Now the question is, who is Professor Sarkar suing, and what does he hope to accomplish? PubPeer itself is surely protected by the provisions of the Communications Decency Act that shield web site owners from comments made by users - this has been tested several times, to the best of my knowledge, and has held up every time. And Sarkar isn't suing PubPeer itself, apparently. The professor seems to mostly want the site's administrators to divulge the identities of the commenters who left so many damaging details (and who, presumably, had a lot to do with sending all those details to a list of people at Mississippi, just in case they'd missed them).
[Image: duplicated gel lanes from one of Prof. Sarkar's papers, via a Retraction Watch comment]
I'm not sure how far that's going to get - I mean, what could the PubPeer folks have? IP addresses? Which probably resolve to a block of Comcast or Verizon stuff that can't be narrowed down much further? The NSA might know who these people are, but I'll bet PubPeer doesn't. And what, exactly, is Professor Sarkar going to do even if his lawyers manage to track someone down? Let's be real - a university does not suddenly throw the brakes on a big tenured hire just because of a bunch of misty allegations and unsourced grumbling. No, one of the comments at Retraction Watch provides an example (at left) of the sort of thing that appears to be at the root of the problem, from this paper, which Google Scholar says has 183 citations. (A larger version can be found here, where you can see all the little artifacts of the gel lanes, and what appear to be the same artifacts as they rotate, flip, and duplicate across the figures). And this one hadn't even been posted to PubPeer yet, so my impression is that there's a lot of this stuff.

If this is the sort of thing that was mailed to everyone at Ole Miss, I can see how it might have caused some rethinking. (The university probably didn't have a Distinguished Chair of Photoshop position open). At the very least, someone in Prof. Sarkar's lab appears to have been putting together a portfolio for such a job. Can someone be defamed by having the figures from his own papers reproduced?

At any rate, I'm glad to help draw attention to his plight, and to bring details of it to people who might otherwise have missed out. Lawsuits tend to have that effect, especially lawsuits against internet sites. I will watch the progress through the legal system of Prof. Sarkar's case with great interest. Commenters to the previous post here noted that this will not be an easy one (nor a cheap one) to pursue. A comment at Retraction Watch mentions that Sarkar is on the editorial board of a host of journals, many of which are on the Beall list of predatory publishers. Perhaps he has that to fall back on?

Update: got Prof. Sarkar's name correct, and consistently spelled. Better copy-and-paste skills would have come in handy there, I have to admit.

Comments (30) + TrackBacks (0) | Category: The Scientific Literature

October 24, 2014

Francis Collins Knows Why We Don't Have An Ebola Vaccine

Posted by Derek

NIH Director Francis Collins has been saying that if only the agency's budget hadn't been cut, that we would already have an Ebola vaccine. He tried this line out during recent Congressional testimony, and apparently liked it enough that he expanded on it in an article for the Huffington Post. Collins' statements have been fodder for election-season attack ads, naturally. Personally I endorse the take on this from Michael Eisen at Berkeley (emphasis added):

. . .it’s time to call this for what it is: complete bullshit.

First, let’s deal with the most immediate assertion – that if there had been more funds there would be an Ebola vaccine today. Collins argues we’d be a few years ahead of where they are today, and that, instead of preparing to enter phase 1 trials today, they’d have done this two years ago. But last time I checked, there was a reason we do clinical trials, which is to determine if therapies are safe and effective. And, crucially, many of these fail (how many times have we heard about HIV vaccines that were effective in animals). Thus, even if you believe the only thing holding up development of the Ebola vaccine was funds, it’s still false to argue that with more money we’d have an Ebola vaccine. Vaccine and drug development just simply doesn’t work this way. There are long lists of projects, in both the public and private sector that have been very well-funded, and still failed.

It is a gross overtrivialization of even the directed scientific process involved in developing vaccines to suggest that simply by spending more money on something you are guaranteed a product. And, if I were in Congress, frankly I’d be sick of hearing this kind of baloney, and would respond with a long list of things I’d been promised by previous NIH Directors if only we’d spend more money on them.

Eisen would rather have the case made for basic research, emphasizing that without decades of funding for it we wouldn't even be in a position to try making an Ebola vaccine at all. But that doesn't grab enough headlines. Better to pander to the Disease of the Week, and tell everyone that if only you had more cash, you most certainly would have done something about it by now.
[Chart: NIH budget over the last twenty years, inflation-adjusted]
I'll go one further. Here's a graph of the NIH budget over the last twenty years (inflation adjusted). You will note the large rise during the last part of the Clinton years and during G. W. Bush's first term; the fact that this trajectory did not continue has been the source of a good deal of turmoil over the years. There was also a blast of funding in the stimulus-spending package in 2009, which has not been repeated, and I believe that this has caused similar disruption. Neglecting that, the NIH budget has been in the $30-$35 billion range (inflation-adjusted) for many years, although it is down around 10% from its peak in 2004. (Note that 2013 is estimated in this chart - the real number dips just below the $30 billion line).

This, then, is what a "slashed budget" looks like, from above, anyway. The classic Washington way of referring to budget cuts, though, comes from smaller increases than were planned. So if a program was originally budgeted to get 10% more funding, but it only ends up getting 5% more, then everyone who advocates for it goes out and says that spending on it was cut in half. And if it was supposed to stay even and instead shrinks by 2%, well, you can imagine. That's when the so-called "Washington Monument" strategy kicks in - if the Park Service budget gets frozen or cut, the first thing you do is close the Washington Monument, so as to make a highly visible and annoying case for your agency to have the money, anyway.

Now, there is room to complain about the allocations inside the NIH itself - there are programs where less money really is being spent. But while the overall budget for the agency has declined, it hasn't done so in the massive clear-cutting fashion that you might imagine if you read, for example, editorials by Francis Collins. But he's just doing what every other agency head does - bang the drum for their vital, essential funding. It would be a better world if Michael Eisen's recommendation of pitching basic research funding were to be effective, but I'm not sure that we live there. Where we live, we get. . .well, bullshit instead.

Update: man, the comments are rolling in on this one. So to clarify things, let me say that I think that the NIH is a good thing, and should be well funded. But I also think that trying to get it funded by saying that we surely would have had an Ebola vaccine by now is not a good thing to do - it's not very effective, for one thing, because this is the same sort of story that always gets used, and it's also not very accurate. I realize that good things are often accomplished by less exalted means, but you'd hope that there would be better means than this.

Comments (96) + TrackBacks (0) | Category: Academia (vs. Industry) | Infectious Diseases

October 23, 2014

Atmospheric Conditions

Posted by Derek

Well, I've been busy sciencing away all morning, and we're having the kind of weather outside that makes a person want to stay indoors and do chemistry: rainy, chilly, and windy. You wonder why more big chemical discoveries don't come out of the places that have these sorts of conditions all the time!

Air handling and climate control notwithstanding, this sort of weather naturally raises the humidity, and if I were having to worry about tiny moisture-sensitive reactions, these are not the conditions I would pick. But neither were the conditions back in the Southern US in the summertime - some afternoons down there, the humidity is like sticking your head into a dishwasher. How anyone managed to get chemistry done under those conditions in the days before air conditioning is a mystery to me.

My German lab, on my post-doc, was not air-conditioned (in keeping with much of the rest of the country) and had windows that opened, as very few other labs I've worked in ever have. But a German summer is not like an Arkansas one - in fact, every so often, a German summer can resemble an Arkansas winter. Even so, we did have to watch the air-sensitive reagents, because conditions certainly varied. My summer research in Arkansas, though, back when I was an undergrad, was conducted on the fourth floor of a building with no windows, so when the air conditioning went out there, we basically had to flee after a while. The plastic caps used on the old ether cans would come popping off in the heat, and that was a pretty good sign that it was time to pack it in for the day.

And my grad-school lab was also a windowless cave, thanks to the design of the building, but I really didn't get to experience un-air-conditioned chemistry in there. If the AC was down, it meant that the whole air handling system was messed up, which meant that the lab itself rapidly became uninhabitable. Decades of grad student-led contamination led to a pestilential funk that you could breast-stroke through; there was no way that I was going to hang around and experience it for any longer than I had to.

But all this is first-world complaining - I've had Indian colleagues, among others, describe really severe climate-influenced lab work. So feel free to add your worst examples in the comments, but expect the folks with experience in the tropics to win the competition!

Comments (59) + TrackBacks (0) | Category: Life in the Drug Labs

October 22, 2014

Green Coffee Beans Will Mostly Slim Your Wallet

Posted by Derek

Very few readers of this site are likely to have a good opinion of Dr. Oz (I certainly don't). And very few readers will be surprised to hear that one of his highly-touted miracle weight loss regimens - green coffee bean extract (GCA) - has turned out to be a load of faked-up nonsense. Retraction Watch has the details, and let's just say that the clinical trial results were. . .a little bit below the desired standard:

The FTC charges that the study’s lead investigator repeatedly altered the weights and other key measurements of the subjects, changed the length of the trial, and misstated which subjects were taking the placebo or GCA during the trial. When the lead investigator was unable to get the study published, the FTC says that AFS hired researchers Joe Vinson and Bryan Burnham at the University of Scranton to rewrite it. Despite receiving conflicting data, Vinson, Burnham, and AFS never verified the authenticity of the information used in the study, according to the complaint.

Other than that, the study was just fine, I guess. Sheesh. I have to admit, that's even worse than I had pictured, and that's saying a lot. Dr. Oz himself, though, will probably not even note this in passing. Too many other miracle cures to peddle, too many TV slots to fill. He's a busy man, you know.

Update: the show has released a rather bland statement about this whole affair, but has also apparently scrubbed the web site of any mention of green coffee beans, had videos taken down at YouTube, and so on. So that's all right, then!

Comments (21) + TrackBacks (0) | Category: Snake Oil

Roche Rebuilds

Posted by Derek

Roche has announced ambitious plans for new buildings at its home base in Basel:

They're building a new home for John Reed and Roche's pRED research group in Basel – and the pharma giant is thinking big. Roche said today that it is committing $1.8 billion to build a new research center in Switzerland which will encompass 4 new office/lab buildings that will house 1,900 R&D staffers.

The first step of the process will involve construction of an in vivo center for animal research, slated for completion in 2018. And Roche plans to clear away older buildings to make way for a new office tower as part of a wider building plan that will cost $3.2 billion.

I saw the current new Roche office tower when I was in Basel recently - it's hard to miss - and I can't say that it's much of an ornament to the skyline. (People told me that it had originally been planned as more of an architectural statement, but that all that had been scaled back because of the cost, which seemed quite believably Swiss).

Comments (31) + TrackBacks (0) | Category: Business and Markets

Improving the Old-Fashioned Reaction Workup

Posted by Derek

Here's something new: working up a reaction. The authors say that they have a porous polymer that adsorbs organic compounds from aqueous reaction mixtures, allowing you to just stir and filter rather than doing a liquid/liquid extraction. The adsorbed material can then be taken right to chromatography, as if you'd adsorbed your compound onto any other solid support, or just washed with solvent to liberate the crude product.

I have colleagues who will be trying this out soon, and I'll report on their experience with the stuff. If it really is widely applicable, it could be a nice addition to the parallel synthesis and flow chemistry worlds (pumping a crude reaction through a cartridge of absorbing polymer could be a fast way to do workups and solvent switches).

Comments (14) + TrackBacks (0) | Category: Chemical News

Phenylalanine Crystals

Posted by Derek

No matter how long you've been doing chemistry, there are still things that you come across that surprise you. Did you know that plain old L-phenylalanine has been one of the most difficult subjects ever for small-molecule crystallography? I sure didn't. But people have tried for decades to grow good enough crystals of it to decide what space group it's in. One big problem has been the presence of several polymorphs (see blog posts here and here), but it looks like the paper linked above has finally straightened things out.

Comments (2) + TrackBacks (0) | Category: Analytical Chemistry | Chemical News

October 21, 2014

Oxygenated Nanobubbles. For Real?

Posted by Derek

A longtime reader sent along this article, just based on the headline. "This headline triggers instant skepticism in me", he said, and I agree. "Potential to treat Alzheimer's" is both a bold and a weaselly statement to make. The weasel part is that sure, anything has the "potential" to do that, but the boldness lies in the fact that so far, nothing ever has. There are a couple of very weak symptomatic treatments out there, but as far as actually addressing the disease, the clinical success rate is a flat zero. But that's not stopping these folks:

“The impact of RNS60 on Alzheimer’s disease as outlined in our studies presents new opportunities for hope and deeper research in treating a disease that currently cannot be prevented, cured or even successfully managed,” said Dr. Kalipada Pahan, professor of neurological sciences, biochemistry and pharmacology and the Floyd A. Davis, M.D., endowed chair of neurology at the Rush University Medical Center. “Our findings sparked tremendous excitement for RNS60, identifying an opportunity for advanced research to develop a novel treatment to help the rapidly increasing number of Alzheimer’s disease and dementia patients.”

Well, good luck to everyone. But what, exactly, is RNS60, and who is Revalesio, the company developing it? I started reading up on that, and got more puzzled the further I went. That press release described RNS60 as "a therapeutic saline containing highly potent charge-stabilized nanostructures (CSNs) that decrease inflammation and cell death." That didn't help much. Going to the company's web site, I found this:

Revalesio is developing a novel category of therapeutics for the treatment of inflammatory diseases using its proprietary charge-stabilized nanostructure (CSN) technology. Revalesio’s products are created using a patented device that generates rotational forces, cavitation and high-energy fluid dynamics to create unique, stable nanostructures in liquids. CSNs are less than 100 nanometers in size (for reference, the width of a single strand of hair is 100,000 nanometers) and are established through the combination of an ionic scaffold and a nano-sized oxygen bubble core.

RNS60 is Revalesio’s lead product candidate based upon CSN technology. RNS60 is normal, medical-grade, isotonic saline processed with Revalesio’s technology. RNS60 does not contain a traditional active pharmaceutical ingredient and offers a unique and groundbreaking approach to treating diseases . . .

OK, then. If I'm getting this right, this is saline solution with extremely small bubbles of oxygen in it. I'm not familiar with the "nanobubble" literature, so I can't say if these things exist or not. I'm unwilling to say that they don't, because a lot of odd things happen down at that small scale, and water is a notoriously weird substance. The size of the bubbles they're talking about would be what, a few hundred oxygen molecules across? Even proving that these structures exist and characterizing them would presumably be a major challenge, analytically, but I have some more reading to do on all that.
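
(A quick back-of-the-envelope check on that guess: an O2 molecule is roughly 0.3 nm across, so at the company's stated upper bound of 100 nm you get

\frac{100\ \text{nm}}{\sim 0.3\ \text{nm per O}_2\ \text{molecule}} \approx 300\ \text{molecules across}

so "a few hundred" is about right.)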

My problem is that there have been many, many odd water products reported over the years that involve some sort of nanostructure in the solution phase. And by "odd", I mean fraudulent. Just do a quick Google search for any combination of phrases in that area, and the stuff will come gushing out - all sorts of claims about how the water being sold is so, so different, because it has different clusters and layers and what have you. My second problem is that there have been many, many odd products reported over the years that claim to be some sort of "oxygenated" water. Do a Google search for that, but stand back, because you're about to be assaulted by page after page of wild-eyed scam artists. Super-oxygen miracle water has been a staple of the health scam business for decades now.

So the Revalesio people have a real challenge on their hands to distinguish themselves from an absolute horde of crackpots and charlatans. The web site says that these oxygen nanobubbles "have a stabilizing effect on the cell membrane", which modulates signaling of the PI3K pathway. The thing is, there are a number of publications on this stuff, in real journals, which is not the sort of thing you find for your typical Internet Wonder Water. The president of the company is a longtime Eli Lilly executive as well, which is also rather atypical for the fringe. Here's one from J. Biol. Chem., and here's one on nanoparticle handling from the Journal of Physical Chemistry. The current neuronal protection work is in two papers in PLOS ONE, here and here.

I'm baffled. These papers talk about various cellular pathways being affected (PI3K, ATP production, phosphorylation of tau, NF-kb activation, and so on), which is a pretty broad range of effects. It's a bit hard to see how something with such effects could always be positive, but paper after paper talks about benefits for models of Parkinson's, multiple sclerosis, exercise physiology, and now Alzheimer's. A common thread could indeed be inflammation pathways, though, so I can't dismiss these mechanisms out of hand. But then there's this paper, which says that drinking this water after exercise improves muscle recovery, and I'm just having all kinds of trouble picturing how these nanostructured bubbles make it intact out of the gut and into the circulation. If they're sticking all over cell membranes, don't they do that to every cell they come in contact with? Are there noticeable effects in the gut wall or the vascular endothelium? What are the pharmacokinetics of nanobubbles of oxygen, and how the heck do you tell (other than maybe with a radiolabel)? I'm writing this blog entry on the train, where I don't have access to all these journal articles, but it'll be interesting to see how these things are addressed. (If I were running a program like this one, and assuming that my head didn't explode from all the cognitive dissonance, I'd be trying it out in Crohn's and IBD, I think - or do all the nanobubbles get absorbed before they make it to the colon?)

So I'm putting this out there to see if everyone else gets the same expressions on their faces as I do when I look this over. Anyone have any more details on this stuff?

Comments (94) + TrackBacks (0) | Category: Biological News

October 20, 2014

Compound Properties: Starting a Renunciation

Posted by Derek

I've been thinking a lot recently about compound properties, and what we use them for. My own opinions on this subject have been changing over the years, and I'm interested to see if I have any company on this.

First off, why do we measure things like cLogP, polar surface area, aromatic ring count, and all the others? A quick (and not totally inaccurate) answer is "because we can", but what are we trying to accomplish? Well, we're trying to read the future a bit and decrease the horrendous failure rates for drug candidates, of course. And the two aspects that compound properties are supposed to help with are PK and tox.

Of the two, pharmacokinetics is the one with the better shot at relevance. But how fine-grained can we be with our measurements? I don't think it's controversial to say that compounds with really high cLogP values are going to have, on average, more difficult PK, for various reasons. Compounds with lots of aromatic rings in them are, on average, going to have more difficult PK, too. But how much is "lots" or "really high"? That's the problem, because I don't think that you can draw a useful line and say that things on one side of it are mostly fine, and things on the other are mostly not. There's too much overlap, and too many exceptions. The best you can hope for, if you're into line-drawing, is to draw one up pretty far into the possible range and say that things below it may or may not be OK, but things above it have a greater chance of being bad. (This, to my mind, is all that we mean by all the "Rule of 5" stuff). But what good does that do? Everyone doing drug discovery already knows that much, or should. Where we get into trouble is when we treat these lines as if they were made of electrified barbed wire.
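
To make this concrete, here's the kind of property-counting exercise I'm talking about - a minimal sketch using the open-source RDKit toolkit, with an arbitrary example structure and the classic Rule-of-5 cutoffs. The molecule and the thresholds are purely illustrative; this shows the calculation, not an endorsement of the lines it draws.

from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

# Illustrative molecule only (celecoxib); swap in your own SMILES
mol = Chem.MolFromSmiles("Cc1ccc(cc1)-c1cc(C(F)(F)F)nn1-c1ccc(S(N)(=O)=O)cc1")

props = {
    "MW":      Descriptors.MolWt(mol),
    "cLogP":   Descriptors.MolLogP(mol),   # Wildman-Crippen estimate
    "TPSA":    Descriptors.TPSA(mol),      # topological polar surface area
    "HBD":     Lipinski.NumHDonors(mol),
    "HBA":     Lipinski.NumHAcceptors(mol),
    "ArRings": Lipinski.NumAromaticRings(mol),
}

# The line-drawing exercise itself: count Rule-of-5 violations
violations = sum([props["MW"] > 500, props["cLogP"] > 5,
                  props["HBD"] > 5, props["HBA"] > 10])
print(props, "| Ro5 violations:", violations)

Run across a project's compound collection, numbers like these are cheap to generate by the thousand - which, as discussed below, is exactly what makes them so tempting as management metrics.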

That's because of a larger problem with metrics aimed at PK: PK is relatively easy data to get. When in doubt, you should just dose the compound and find out. This makes predicting PK problems a lower-value proposition - the real killer application would be predicting toxicology problems. I fear that over the years many rule-of-five zealots have confused these two fields, out of a natural hope that something can be done about the latter (or perhaps out of thinking that the two are more related than they really are). That's unfortunate, because to my mind, this is where compound property metrics get even less useful. That recent AstraZeneca paper has had me thinking, the one where they state that they can't reproduce the trends reported by Pfizer's group on the influences of compound properties. If you really can take two reasonably-sized sets of drug discovery data and come to opposite conclusions about this issue, what hope does this approach have?

Toxicology is just too complicated, I think, for us to expect that any simple property metrics can tell us enough to be useful. That's really annoying, because we could all really use something like that. But increasingly, I think we're still on our own, where we've always been, and that we're just trying to make ourselves feel better when we think otherwise. That problem is particularly acute as you go up the management ladder. Avoiding painful tox-driven failures is such a desirable goal that people are tempted to reach for just about anything reasonable-sounding that holds out hope for it. And this one (compound property space policing) has many other tempting advantages - it's cheap to implement, easy to measure, and produces piles of numbers that make for data-rich presentations. Even the managers who don't really know much chemistry can grasp the ideas behind it. How can it not be a good thing?

Especially when the alternative is so, so. . .empirical. So case-by-case. So disappointingly back-to-where-we-started. I mean, getting up in front of the higher-ups and telling them that no, we're not doing ourselves much good by whacking people about aromatic ring counts and nitrogen atom counts and PSA counts, etc., that we're just going to have to take the compounds forward and wait and see like we always have. . .that doesn't sound like much fun, does it? This isn't what anyone is wanting to hear. You're going to do a lot better if you can tell people that you've Identified The Problem, and How to Address It, and that this strategy is being implemented right now, and here are the numbers to prove it. Saying, in effect, that we can't do anything about it runs the risk of finding yourself replaced by someone who will say that we can.

But all that said, I really am losing faith in property-space metrics as a way to address toxicology. The only thing I'm holding on to are some of the structure-based criteria. I really do, for example, think that quinones are bad news. I think if you advance a hundred quinones into the clinic, that a far higher percentage of them will fail due to tox and side effects than a hundred broadly similar non-quinones. Same goes for rhodanines, and a few other classes, those "aces in the PAINS deck" I referred to the other day. I'm still less doctrinaire about functional groups than I used to be, but I still have a few that I balk at.

And yes, I know that there are drugs with all these groups in them. But if you look at the quinones, for example, you find mostly cytotoxics and anti-infectives which are cytotoxins with some selectivity for non-mammalian cells. If you're aiming at a particularly nasty target (resistant malaria, pancreatic cancer), go ahead and pull out all the stops. But I don't think anyone should cheerfully plow ahead with such structures unless there are such mitigating circumstances, or at least not without realizing the risks that they're taking on.

But this doesn't do us much good, either - most medicinal chemists don't want to advance such compounds anyway. In fact, rather than being too permissive about things like quinones, most of us are probably too conservative about the sorts of structures we're willing to deal with. There are a lot of funny-looking drugs out there, as it never hurts to remind oneself. Peeling off the outer fringe of these (and quinones are indeed the outer fringe) isn't going to increase anyone's success rate much. So what to do?

I don't have a good answer for that one. I wish I did. It's a rare case when we can say, just by looking at its structure, that a particular compound just won't work. I've been hoping that the percentages would allow us to say more than that about more compounds. But I'm really not sure that they do, at least not to the extent that we need them to, and I worry that we're kidding ourselves when we pretend otherwise.

Comments (32) + TrackBacks (0) | Category: Drug Assays | Drug Development | In Silico

October 17, 2014

More on "Metabolite Likeness" as a Predictor

Posted by Derek

A recent computational paper that suggested that similarity to known metabolites could help predict successful drug candidates brought in a lot of comments around here. Now the folks at Cambridge MedChem Consulting have another look at it here.

The big concern (as was expressed by some commenters here as well) is the Tanimoto similarity cutoff of 0.5. Does that make everything look too similar, or not? CMC has some numbers across different data sets, and suggests that this cutoff is, in fact, too permissive to allow for much discrimination. People with access to good comparison sets of compounds that made it and compounds that didn't - basically, computational chemists inside large industrial drug discovery organizations - will have a better chance to see how all this holds up.
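
For anyone who hasn't worked with these numbers: a Tanimoto coefficient between two fingerprints is just (bits in common) / (bits set in either), so 1.0 is identical and 0.0 is no overlap at all. Here's a minimal RDKit sketch - the molecules and the Morgan-fingerprint settings are my own illustrative choices, not necessarily the paper's, and the value you get depends heavily on which fingerprint you use, which is part of the problem with any single cutoff:

from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Two arbitrary example structures
m1 = Chem.MolFromSmiles("OC(=O)c1ccccc1O")       # salicylic acid
m2 = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(O)=O") # aspirin

fp1 = AllChem.GetMorganFingerprintAsBitVect(m1, 2, nBits=2048)
fp2 = AllChem.GetMorganFingerprintAsBitVect(m2, 2, nBits=2048)

# Tanimoto = shared on-bits / total on-bits
sim = DataStructs.TanimotoSimilarity(fp1, fp2)
print(round(sim, 2), "| passes the 0.5 cutoff?", sim >= 0.5)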

Comments (6) + TrackBacks (0) | Category: Drug Development | In Silico

Different Screening, Different Thermodynamics?

Posted by Derek

Chris Lipinski and the folks at Collaborative Drug Discovery send word of an interesting webinar that will take place this coming Wednesday (October 22nd) at 2 PM EST. It's on enthalpic and entropic trends in ligand binding, and how various screening and discovery techniques might bias these significantly.

Here's the registration page if you're interested. I'm curious about what they've turned up - my understanding is that it will explore, among other things, the differences in molecules selected by industry-trained medicinal chemists versus the sorts that are reported by more academic chemical biologists. As has come up here several times in the past, there certainly do seem to be some splits there, and the CDD people seem to have some numbers to back up those impressions.
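
As a quick refresher on the quantities involved (my notation, not necessarily the presenters'): binding free energy splits into an enthalpic and an entropic term,

\Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ} = RT \ln K_d

so two ligands with identical affinities (the same Kd, the same \Delta G^{\circ}) can get there by very different enthalpy/entropy splits. If different screening techniques really do fish out different splits, that won't show up in the potency numbers at all - you need calorimetry or van't Hoff analysis to see it.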

Comments (7) + TrackBacks (0) | Category: Academia (vs. Industry) | Chemical Biology | Drug Assays

October 16, 2014

The Electromagnetic Field Stem Cell Authors Respond

Posted by Derek

The authors of the ACS Nano paper on using electromagnetic fields to produce stem cells have responded on PubPeer. They have a good deal to say on the issues around the images in their paper (see the link), and I don't think that argument is over yet. But here's what they have on criticisms of their paper in general:

Nowhere in our manuscript do we claim “iPSCs can be made using magnetic fields”. This would be highly suspect indeed. Rather, we demonstrate that in the context of highly reproducible and well-established reprogramming to pluripotency with the Yamanaka factors (Oct4, Sox2, Klf4, and cMyc/or Oct4 alone), EMF influences the efficiency of this process. Such a result is, to us, not surprising given that EMF has long been noted to have effects on biological system(Adey 1993, Del Vecchio et al. 2009, Juutilainen 2005)(There are a thousand of papers for biological effects of EMF on Pubmed) and given that numerous other environmental parameters are well-known to influence reprogramming by the Yamanaka factors, including Oxygen tension (Yoshida et al. 2009), the presence of Vitamin C (Esteban et al. 2010), among countless other examples.

For individuals such as Brookes and Lowe to immediately discount the validity of the findings without actually attempting to reproduce the central experimental finding is not only non-scientific, but borders on slanderous. We suggest that these individuals take their skepticism to the laboratory bench so that something productive can result from the time they invest prior to their criticizing the work of others.

That "borders on slanderous" part does not do the authors any favors, because it's a rather silly position to take. When you publish a paper, you have opened the floor to critical responses. I'm a medicinal chemist - no one is going to want to let me into their stem cell lab, and I don't blame them. But I'm also familiar with the scientific literature enough to wonder what a paper on this subject is doing in ACS Nano and whether its results are valid. I note that the paper itself states that ". . .this physical energy can affect cell fate changes and is essential for reprogramming to pluripotency."

If it makes the authors feel better, I'll rephrase: their paper claims that iPSCs can be made more efficiently by adding electromagnetic fields to the standard transforming-factor mixture. (And they also claim that canceling out the Earth's magnetic field greatly slows this process down). These are very interesting and surprising results, and my first impulse is to wonder if they're valid. That's my first impulse every time I read something interesting and surprising, by the way, so the authors shouldn't take this personally.

There are indeed many papers in PubMed on the effects of electromagnetic fields on cellular processes. But this area has also been very controversial, and (as an outside observer) my strong impression is that there have been many problems with irreproducibility. I have no doubt that people with expertise in stem cell biology will be taking a look at this report and trying to reproduce it as well, and I am eager to see what happens next.

Comments (27) + TrackBacks (0) | Category: Biological News | The Scientific Literature

What's The Going Rate These Days?

Email This Entry

Posted by Derek

Time to break out the pseudonyms for the comments section. I've had a couple of people asking (on both sides of the process) what the starting salaries for medicinal chemists are running in the Boston/Cambridge area. It's been a while since this was much of a topic, sad to say, but there is some hiring going on these days, and people are trying to get a feel for what the going rates are. Companies want to make sure that they're making competitive-but-not-too-generous offers, and applicants want to make sure that they're getting a reasonable one, too, naturally.

So anyone with actual data is invited to leave it in the comments section, under whatever name you like. Reports from outside the Boston/Cambridge area (and at other experience levels) are certainly welcome, too, because the same issues apply in other places as well.

Comments (127) + TrackBacks (0) | Category: Business and Markets | How To Get a Pharma Job

No More Varian

Email This Entry

Posted by Derek

This week has brought news that Agilent is getting out of the NMR business, which brings an end to the Varian line of machines, one of the oldest in the business. (Agilent bought Varian in 2010). The first NMR I ever used was a Varian EM-360, which was the workhorse teaching instrument back then. A full 60 MHz of continuous wave for your resolving pleasure - Fourier transform? Superconducting magnets? Luxury! Why, we used to dream of. . .

I used many others in the years to come. But over time, the number of players in the NMR hardware market has contracted. You used to be able to walk into a good-sized NMR room and see machines from Varian, Bruker, JEOL, Oxford, GE (edit - added them) and once in a while an oddity like the 80-MHz IBM-brand machine that I used to use at Duke thirty years ago. No more - Bruker is now the major player. Their machines are good ones (and they've been in the business a while, too), but I do wish that they had some competition to keep them on their toes.

How come there isn't any? It's not that NMR spectroscopy is a dying art. It's as useful as ever, if not even more so. But I think that the market for equipment is pretty saturated. Every big company and university has plenty of capacity, and will buy a new machine only once in a while. The smaller companies are usually fixed pretty well, too, thanks to the used equipment market. And most of those colleges that used to have something less than a standard 300 MHz magnet have worked their way up to one.

There's not much room for a new company to come in and say that their high-field magnets are so much better than the existing ones, either, because the hardware has also reached something of a plateau. You can go out and buy a 700 MHz instrument (and Bruker no doubt wishes that you would), and that's enough to do pretty much any NMR experiment that you can think of. 1000 MHz instruments exist, though I'm not sure how often you run into a situation where one of those would do the job for you and a 700 wouldn't. I'm pretty sure that no one even knows how to build a 2000 MHz NMR, but if they did, the number sold would probably be countable on the fingers of one hand. Someone would have to invent a great reason for such a machine to exist - this isn't supercomputing, where the known applications can soak up all the power you can throw at them.
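For a sense of what those frequency figures mean in hardware terms, the proton resonance frequency scales linearly with the magnet's field, at about 42.58 MHz per tesla:

```latex
\nu_0 = \frac{\gamma}{2\pi}\, B_0 \approx (42.58~\mathrm{MHz/T}) \times B_0
```

So a "700 MHz" instrument is a 16.4 T magnet, a 1000 MHz one is about 23.5 T, and that hypothetical 2000 MHz machine would need roughly 47 T - well beyond anything that current superconducting magnet technology can hold in a persistent field.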

So farewell to the line of Varian NMR machines. Generations of chemists have used their equipment, but Bruker is the one left standing.

Comments (43) + TrackBacks (0) | Category: Analytical Chemistry

October 15, 2014

Not The Sort Of Thing You'd Work With, Given a Choice

Email This Entry

Posted by Derek

Here's a paper that illustrates a different way of looking at the world than many medicinal chemists would have. It discusses inhibitors of SETD8, an unusual epigenetic enzyme that is the only known methyltransferase to target lysine 20 of histone H4. Inhibitors of it would help to unravel just what functions that methylation mark has - presumably several that no other pathway is quite handling. But finding decent methyltransferase inhibitors has not been easy.
Quinones.jpg
When you search for them, actually, you find compounds like the ones in this paper. Most medicinal chemists will look at these, say the word "quinone", perhaps take a moment to spit on the floor or add a rude adjective, and move on to see if there's anything better to look at. Quinones are that unpopular, and with good reason. They're redox-active, can pick up nucleophiles as Michael acceptors, react with amines - they have a whole suite of unattractive behaviors. And that explains their profile in cells and whole animals, with a range of toxic, carcinogenic, and immunologic liabilities. A lot of very active natural products have a quinone in them - it's a real warhead. No medicinal chemist with any experience would feel good about trying to advance one as a lead compound, and (for the same reasons) they tend to make poor tool compounds as well. You just don't know what else they're hitting, and the chances of them hitting something else are too high.

The authors of this paper, though, have a higher tolerance:

In the present work, we characterize these compounds and demonstrate that NSC663284, BVT948, and ryuvidine (3 out of the 4 HTS hits) inhibit SETD8 via different modes. NSC663284 (SPS8I1), ryuvidine (SPS8I2), and BVT948 (SPS8I3) efficiently and selectively suppress cellular H4K20me1 at doses lower than 5 μM within 24 h. . . The cells treated with SPS8I1−3 (Small-molecule Pool of SETD8 Inhibitor) recapitulate cell-cycle-arrest phenotypes similar to what were reported for knocking down SETD8 by RNAi. Given that the three compounds have distinct structures and inhibit SETD8 in different manners, they can be employed collectively as chemical genetic tools to interrogate SETD8-involved methylation.

I would be very careful about doing that, myself. I don't find those structures as distinct as all that (quinone, quinone, quinone), and I'm not surprised to find that they arrest the cell cycle. But do they do it via SETD8? To be fair, they do show selectivity over the other enzymes used in the screening panel (SETD7, SETD2, and GLP). They went on to profile them against several lysine methyltransferases and several arginine methyltransferases. The most selective of the bunch was 2.5x more active against SETD8 compared to the next most active target, which is honestly not a whole lot. (And I note that the authors spent some time, a few paragraphs earlier, talking about how their activity measurements are necessarily uncertain).

They do address the quinone problem, but in a somewhat otherworldly manner:

Given that SPS8I1−3 are structurally distinct except for their quinonic moiety (Figure 1a, highlighted in red), we reasoned that they may act on SETD8 differently (e.g., dependence on cofactor or substrate). . .

This, to many medicinal chemists, is a bit like saying that several species of poisonous snake are distinct except for their venom-filled fangs. The paper does seem to find differences in how the three inhibitors respond to varying substrate concentrations, but they also find (unsurprisingly) that all three work by covalent inhibition. Studies on mutant forms of the enzyme suggest strongly that two of the compounds are hitting a particular Cys residue (270), while the third "may target Cys residues in a more general manner". To their credit, they did try three quinone-containing compounds from commercial sources and found them inactive, but that just shows that not every quinone inhibits their enzyme.

This, too, is what you'd expect: if you did a full proteome analysis of what a given quinone compound hits, I'm sure that you'd find varying fingerprints for each one. But even though I have no objection to covalent inhibitors per se, I'm nervous about ones that have so many potential mechanisms. The size and shape of the three compounds shown will surely keep them from doing all the damage that a smaller quinone is capable of doing, but I fear that there's still plenty of damage in them.

Indeed, when they do cell assays, they find that each of the compounds has a somewhat different profile of cell cycle arrest, and say that this is probably due to their off-target effects. But they go on to wind things up like this:

Structurally distinct SPS8I1−3 also display different modes of SETD8 inhibition. Such differences also make SPS8I1−3 less likely to act on other common cellular targets besides SETD8. As a result, the shared phenotypes of the 3 compounds are expected to be associated with SETD8 inhibition. . .Such robust inhibition of SETD8 by SPS8I1−3, together with their different off-target effects, argues that these compounds can be used collectively as SETD8 inhibitors to offset off-target effects of individual reagents. At this stage, we envision using all three compounds to examine SETD8 inhibition and then focusing on the phenotypes shared by all of them.

I have to disagree there. I would be quite worried about how many other cellular processes are being disrupted by these compounds. In fact, the authors already point to some of these. Their SPS8I1, they note, has already been reported as a CDC25 inhibitor. SPS8I2 has been shown to be a CDK2/4 inhibitor, and SPS8I3 has been reported as an inhibitor of a whole list of protein tyrosine phosphatases. None of these enzymes, I would guess, has any particularly great structural homology with SETD8, and those activities are surely only the beginning. How is all this to be untangled? Using all three of them to study the same system is likely to just confuse things more rather than throwing light on common mechanisms. Consider the background: even a wonderful, perfectly selective SETD8 inhibitor would be expected to induce a complex set of phenotypes, varying with the cell type and the conditions.

And these are not wonderful inhibitors. They are quinones, aces in the deck of PAINS. No matter what, they need a great deal more characterization before any conclusions can be drawn from their activity. A charitable view of them would be that such characterization, along with a good deal of chemistry effort, might result in derivatives that have a decent chance of hitting SETD8 in a useful manner. An uncharitable view would be that they should be poured into the red waste can before they use up any more time and money.

Comments (42) + TrackBacks (0) | Category: Drug Assays

October 14, 2014

Combichem Into Drugs: How Many?

Email This Entry

Posted by Derek

So here's a question I got from a reader the other day, that I thought I'd put up on the site. How many drugs have there been whose origins were in combichem? I realize that this could be tricky to answer, because compound origins are sometimes forgotten or mysterious. But did the combichem boom of the 1990s produce any individual compound success stories?

Comments (49) + TrackBacks (0) | Category: Drug Assays | Drug Industry History

Electromagnetic Production of Stem Cells? Really?

Email This Entry

Posted by Derek

Now this is an odd paper: its subject matter is unusual, where it's published is unusual, and it's also unusual that no one seems to have noticed it. I hadn't, either. A reader sent it along to me: "Electromagnetic Fields Mediate Efficient Cell Reprogramming into a Pluripotent State".

Yep, this paper says that stem cells can be produced from ordinary somatic cells by exposure to electromagnetic fields. Everyone will recall the furor that attended the reports that cells could be reprogrammed by exposure to weak acid baths (and the eventual tragic collapse of the whole business). So why isn't there more noise around this publication?

One answer might be that not many people who care about stem cell biology read ACS Nano, and there's probably something to that. But that immediately makes you wonder why the paper is appearing there to start with, because it's also hard to see how it relates to nanotechnology per se. An uncharitable guess would be that the manuscript made the rounds of several higher profile and/or more appropriate journals, and finally ended up where it is (I have no evidence for this, naturally, but I wouldn't be surprised to hear that this was the case).

So what does the paper itself have to say? It claims that "extremely low frequency electromagnetic fields" can cause somatic cells to transform into pluripotent cells, and that this process is mediated by EMF effects on a particular enzyme, the histone methyltransferase Mll2. That's an H3K4 methyltransferase, and it has been found to be potentially important in germline stem cells and spermatogenesis. Otherwise, I haven't seen anyone suggesting it as a master regulator of stem cell generation, but then, there's a lot that we don't know about epigenetics and stem cells.

There is, however, a lot that we do know about electromagnetism. Over the years, there have been uncountable reports of biological activity for electromagnetic fields. You can go back to the controversy over the effects of power lines in residential areas and the later disputes about the effects of cell phones, just to pick two that have had vast amounts of coverage. The problem is, no one seems to have been able to demonstrate anything definite in any of these cases. As far as I know, studies have either shown no real effects, or (when something has turned up), no one's been able to reproduce it. That goes both for laboratory studies and for attempts at observational or epidemiological studies, too: nothing definite, over and over.

There's probably a reason for that. What I have trouble with is the mechanism by which an enzyme gets induced by low-frequency electromagnetic fields, and that's always been the basic argument against such things. You almost have to assume new physics to make a strong connection, because nothing seems to fit: the energies involved are too weak, the absorptions don't match up, and so on. Or at least that's what I thought, but this paper has a whole string of references about how extremely low-frequency electromagnetic fields do all sorts of things to all sorts of cell types. But it's worth noting that the authors also reference papers showing that they're linked to cancer epidemiology, too. It's true, though, that if you do a Pubmed search for "low frequency electromagnetic field" you get a vast pile of references, although I'm really not sure about some of them.

The authors say that the maximum effect in their study was seen at 50 Hz, 1 mT. That is indeed really, really low frequency - the wavelength for a radio signal down there is about 6000 kilometers. Just getting antennas to work in that range is a major challenge, and it's hard for me to picture how subcellular structures could respond to these wavelengths at all. There seem to be all sorts of theories in the literature about how enzyme-level and transcription-level effects might be achieved, but no consensus (from what I can see). Most of the mechanistic discussions I've seen avoid the question entirely - they talk about what enzyme system or signaling pathway might be the "mechanism" for the reported effects, but skip over the big question of how these effects might arise in the first place.
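For the curious, that wavelength figure is just λ = c/f, and the photon energy down at 50 Hz is why the "energies involved are too weak" argument keeps coming up - back-of-the-envelope:

```latex
\lambda = \frac{c}{f} = \frac{3\times10^{8}~\mathrm{m/s}}{50~\mathrm{Hz}} = 6\times10^{6}~\mathrm{m} = 6000~\mathrm{km},
\qquad
E = hf \approx 2\times10^{-13}~\mathrm{eV}
```

That photon energy is about eleven orders of magnitude below plain thermal energy at body temperature (kT ≈ 0.027 eV), which is why it's so hard to see how a 50 Hz field could flip any molecular switches at all.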

An even odder effect reported in this paper is that the authors also ran these experiments in a setup (a Helmholtz coil) that canceled out the usual background of the Earth's magnetic field. They found that this worked much less efficiently, and suggest that the natural magnetic field must have epigenetic effects. I don't know what to make of that one, either. Normal cells grown under these conditions showed no effects, so the paper hypothesizes that some part of the pluripotency reprogramming process is exceptionally sensitive. Here, I'll let the authors summarize:

As one of the fundamental forces of nature, the EMF is a physical energy produced by electrically charged objects that can affect the movement of other charged objects in the field. Here we show that this physical energy can affect cell fate changes and is essential for reprogramming to pluripotency. Exposure of cell cultures to EMFs significantly improves reprogramming efficiency in somatic cells. Interestingly, EL-EMF exposure combined with only one Yamanaka factor, Oct4, can generate iPSCs, demonstrating that EL-EMF exposure can replace Sox2, Klf4, and c-Myc during reprogramming. These results open a new possibility for a novel method for efficient generation of iPSCs. Although many chemical factors or additional genes have been reported for the generation of iPSCs, limitations such as integration of foreign genetic elements or efficiency remain a challenge. Thus, EMF-induced cell fate changes may eventually provide a solution for efficient, noninvasive cell reprogramming strategies in regenerative medicine.

Interestingly, our results show that ES cells and fibroblasts themselves are not significantly affected by EMF exposure; rather, cells undergoing dramatic epigenetic changes such as reprogramming seem to be uniquely susceptible to the effects of EMFs. . .

I don't know what to make of this paper, or the whole field of research. Does anyone?

Update: PubPeer is now reporting some problems with images in the paper. Stay, uh, tuned. . .

Comments (36) + TrackBacks (0) | Category: Biological News

October 13, 2014

Alzheimer's in Cell Culture?

Email This Entry

Posted by Derek

While we're talking about cell culture, there's some potentially significant news in Alzheimer's. The Tanzi lab at Mass General is reporting in Nature that they've been able to grow 3D neuronal cultures that actually reproduce the plaque-and-tangle symptoms of Alzheimer's. That's quite a surprise - neurons are notoriously badly behaved in vitro, and Alzheimer's has been a beast to model in any system at all. You can't even get neurons from human Alzheimer's patients to behave like that when you culture them (at least, I've never heard of it being done).

These new cultures apparently respond to secretase inhibitors, which on one level is good news, since you'd expect those compounds to have an effect on them. On the other hand, such compounds have been quite ineffective in human trials, so there's a disconnect here. Is there more to Alzheimer's that these cell cultures don't pick up, or do the compounds behave much less well in vivo (or both)?

This new system, if validated, would seem to open up a whole new avenue for phenotypic screening, which until now has been a lost cause where Alzheimer's is concerned. It's going to be quite interesting to see how this develops, and to see what it can teach us about the real disease. Nothing in this area has come easy, and a break would be welcome. The tricky part will be whether compounds that come out of such a screen are telling us something about Alzheimer's, or just something about the model. That's always the tricky part.

Update: FierceBiotech notes that Tanzi's "previous insights about Alzheimer's have run into some serious setbacks."

Comments (29) + TrackBacks (0) | Category: Alzheimer's Disease

Diabetes Progress

Email This Entry

Posted by Derek

There have recently been some welcome developments in diabetes therapy, both Type I and Type II. For the latter, there's an interesting report of a metabolic uncoupling therapy in Nature Medicine. Weirdly, it uses a known tapeworm medication, niclosamide (specifically, the ethanolamine salt) - the drug is toxic to worms by that same uncoupling mechanism. If you uncouple oxidative phosphorylation and the electron-transport system in the mitochondria, you end up just chewing up lipids through respiration while not generating any ATP. That's what happens in brown fat (through the action of uncoupling proteins), and that's what's used in mammals for generating extra body heat. Many schemes for cranking this up in humans have been looked at over the years, but a full-scale mitochondrial uncoupling drug would be a nasty proposition (see, for example, dinitrophenol). DNP will indeed make you lose weight, while you eat ravenously to try to keep up with your vanishing ATP supply, but all this comes at a significant risk of sudden death. (And anything that does a better job than DNP will just skip straight to the "sudden death" part). But niclosamide seems to be a less efficacious uncoupler, which in this case is a good thing.

This mechanism diminishes the fat content in liver and muscle tissue, which should improve insulin sensitivity and glucose uptake, and seems to do so very well in mouse models. The authors (Shengkan Jin and colleagues at Rutgers) have formed a company to try to take something in this area into humans. I wish them luck with that - this really could be a good thing for type II and metabolic-syndrome patients, but the idea has proven very difficult over the years. The tox profile is going to be key, naturally, and taking it into the clinic is really the only way to find out if it'll be acceptable.

The Type I news is even more dramatic: a group at Harvard (led by Doug Melton) reports in Cell that they've been able to produce large quantities of glucose-sensitive beta-cells from stem cell precursors. People have been working towards this goal for years, and it hasn't been easy (you can get cells that secrete insulin but don't sense glucose, for example, and you really don't want those in your body). Transplantation of these new cells into diabetic mice seems to roll back the disease state, so this is another one to try in humans. The tricky part is to keep the immune system from rejecting them (the problem with cell transplants for diabetes in general), but they've managed to protect them in the mouse models, and there's a lot of work going into this part of the idea as well for human trials. This could be very promising indeed, and could, if things go right, be a flat-out cure for many Type I patients. Now that would be an advance.

Comments (12) + TrackBacks (0) | Category: Diabetes and Obesity

October 10, 2014

More on Fluorescent Microscopy Chemistry Prizes

Email This Entry

Posted by Derek

I wanted to note (with surprise!) that one of this year's Nobel laureates actually showed up in the comments section of the post I wrote about him. You'd think his schedule would be busier at the moment (!), but here's what he had to say:

A friend pointed this site/thread out to me. I apologize if I was unclear in the interview. #3 and #32 have it right -- I have too much respect for you guys, and don't deserve to be considered a chemist. My field is entirely dependent upon your good works, and I suspect I'll be personally more dependent upon your work as I age.

Cheers, Eric Betzig

And it's for sure that most of the readers around here are not physicists or optical engineers, either! I think science is too important for food fights about whose part of it is where - we're all working on Francis Bacon's program of "the effecting of all things possible", and there's plenty for everyone to do. Thanks very much to Betzig for taking the time to leave the clarification.

rhodamine.jpg
Bacterial%20probe.jpg
With that in mind, I was looking this morning at the various tabs I have open on my browser for blogging subjects, and noticed that one of them (from a week or so back) was a paper on super-resolution fluorescent probes. And it's from one of the other chemistry Nobel winners this year, William Moerner at Stanford! Shown is the rhodamine structure that they're using, which can switch from a nonfluorescent state to a highly fluorescent one. Moerner and his collaborators at Kent State investigated a series of substituted variants of this scaffold, and found one that seems to be nontoxic, very capable of surface labeling of bacterial cells, and is photoswitchable at a convenient wavelength. (Many other photoswitchable probes need UV wavelengths to work, which bacteria understandably don't care for very much).

Shown below the structure drawing is an example of the resolution this probe can provide, using Moerner's double-helix point-spread-function, which despite its name is not an elaborate football betting scheme. That's a single cell of Caulobacter crescentus, and you can see that the dye is almost entirely localized on the cell surface, and that ridiculously high resolutions can be obtained. Being able to resolve features inside and around bacterial cells is going to be very interesting in antibiotic development, and this is the kind of work that's making it possible.

Oh, and just a note: this is a JACS paper. A chemistry Nobel laureate's most recent paper shows up in a chemistry journal - that should make people happy!

Comments (8) + TrackBacks (0) | Category: General Scientific News

You'd Think That This Can't Be Correct

Email This Entry

Posted by Derek

Well, here's something to think about over the weekend. I last wrote here in 2011 about the "E-cat", a supposed alternative energy source being touted/developed by Italian inventor Andrea Rossi. Odd and not all that plausible claims of low-energy fusion reactions of nickel isotopes have been made for the device (see the comments section to that post above for more on this), and the whole thing definitely has been staying in my "Probably not real" file. Just to add one complication, Rossi's own past does not appear to be above reproach. And his conduct (and that of his coworker Sergio Focardi) would seem to be a bit strange during this whole affair.

But today there is a preprint (PDF) of another outside-opinion test of the device (thanks to Alex Tabarrok of Marginal Revolution on Twitter for the heads-up). It has several Swedish co-authors (three from Uppsala and one from the Royal Institute of Technology in Stockholm), and the language is mostly pretty measured. But what it has to say is quite unusual - if it's true.

The device itself is no longer surrounded by lead shielding, for one thing. No radiation of any kind appears to be emitted. The test went on for 32 days of continuous operation, and here's the take-home:

The quantity of heat emitted constantly by the reactor and the length of time during which the reactor was operating rule out, beyond any reasonable doubt, a chemical reaction as underlying its operation. This is emphasized by the fact that we stand considerably more than two order of magnitudes from the region of the Ragone plot occupied by conventional energy sources.

The fuel generating the excessive heat was analyzed with several methods before and after the experimental run. It was found that the Lithium and Nickel content in the fuel had the natural isotopic composition before the run, but after the 32 days run the isotopic composition has changed dramatically both for Lithium and Nickel. Such a change can only take place via nuclear reactions. It is thus clear that nuclear reactions have taken place in the burning process. This is also what can be suspected from the excessive heat being generated in the process.

Although we have good knowledge of the composition of the fuel we presently lack detailed information on the internal components of the reactor, and of the methods by which the reaction is primed. Since we are presently not in possession of this information, we think that any attempt to explain the E-Cat heating process would be too much hampered by the lack of this information, and thus we refrain from such discussions.

In summary, the performance of the E-Cat reactor is remarkable. We have a device giving heat energy compatible with nuclear transformations, but it operates at low energy and gives neither nuclear radioactive waste nor emits radiation. From basic general knowledge in nuclear physics this should not be possible. . .

Told you it was interesting. But I'm waiting for more independent verification. As long as Rossi et al. are so secretive about this device, the smell of fraud will continue to cling to it. I truly am wondering just what's going on here, though.

Update: Elforsk, the R&D arm of Sweden's power utility, has said that they want to investigate this further. Several professors from Uppsala reply that the whole thing is likely a scam, and that Elforsk shouldn't be taken in. Thanks to reader HL in the comments section, who notes that Google Translate does pretty well with Swedish-English.

Comments (39) + TrackBacks (0) | Category: General Scientific News

Things I Won't Work With: Peroxide Peroxides

Email This Entry

Posted by Derek

Everyone knows hydrogen peroxide, HOOH. And if you know it, you also know that it's well-behaved in dilute solution, and progressively less so as it gets concentrated. The 30% solution will go to work immediately bleaching you out if you are so careless as to spill some on you, and the 70% solution, which I haven't seen in years, provides an occasion to break out the chain-mail gloves.

Chemists who've been around that one know that I'm not using a figure of speech - the lab down the hall from me that used to use the stuff had a pair of spiffy woven-metal gloves for just that purpose. Part of the purpose, I believe, was to make you think very carefully about what you were doing as you put them on. Concentrated peroxide has a long history in rocketry, going back to the deeply alarming Me-163 fighter of World War II. (Being a test pilot for that must have taken some mighty nerves). Me, I have limits. I've used 30% peroxide many times, and would pick up a container of 70%, if I were properly garbed (think Tony Stark). But I'm not working with the higher grades under any circumstances whatsoever.

The reason for this trickiness is the weakness of the oxygen-oxygen bond. Oxygen already has a lot of electron density on it; it's quite electronegative. So it would much rather be involved with something from the other end of the scale, or at least the middle, rather than make a single bond to another pile of electrons like itself. Even double-bonded oxygen, the form that we breathe, is pretty reactive. And when those peroxides decompose, they turn into oxygen gas and fly off into entropic heaven, which is one of the same problems involved in having too many nitrogens in your molecule. There are a lot of things, unfortunately, that can lead to peroxide decomposition - all sorts of metal contaminants, light, spitting at them (most likely), and it doesn't take much. There are apparently hobbyists, though, who have taken the most concentrated peroxide available to them and distilled it to higher strengths. Given the impurities that might be present, and the friskiness of the stuff even when it's clean, this sounds like an extremely poor way to spend an afternoon, but there's no stopping some folks.

Any peroxide (O-O) bond is suspect, if you know what's good for you. Now, if it's part of a much larger molecule, then it's much less likely to go all ka-pow on you (thus the antimalarial drugs artemisinin and arterolane), but honestly, I would still politely turn down an offer to bang on a bunch of pure artemisinin with a hammer. It just seems wrong.

But I have to admit, I'd never thought much about the next analog of hydrogen peroxide. Instead of having two oxygens in there, why not three: HOOOH? Indeed, why not? This is a general principle that can be extended to many other similar situations. Instead of being locked in a self-storage unit with two rabid wolverines, why not three? Instead of having two liters of pyridine poured down your trousers, why not three? And so on - it's a liberating thought. It's true that adding more oxygen-oxygen bonds to a compound will eventually liberate the tiles from your floor and your windows from their frames, but that comes with the territory.

These thoughts were prompted by a recent paper in JACS that describes a new route to "dihydrogen trioxide", which I suppose is a more systematic name than "hydrogen perperoxide", my own choice. Colloquially, I would imagine that the compound is known as "Oh, @#&!", substituted with the most heartfelt word available when you realize that you've actually made the stuff. The current paper has a nice elimination route to it via a platinum complex, one that might be used to make a number of other unlikely molecules (if it can make HOOOH in 20% yield, it'll make a lot of other things, too, you'd figure). It's instantly recognizable in the NMR, with a chemical shift of 13.4 ppm for those barely-attached-to-earth hydrogens.

But this route is actually pretty sane: it can be done on a small scale, in the cold, and the authors report no safety problems at all. And in general, most people working with these intermediates have been careful to keep things cold and dilute. Dihydrogen trioxide was first characterized in 1993 (rather late for such a simple molecule), but there had been some evidence for it in the 1960s (and it had been proposed in some reactions as far back as the 1880s). Here's a recent review of work on it. Needless to say, no one has ever been so foolhardy as to try to purify it to any sort of high concentration. I'm not sure how you'd do that, but I'm very sure that it's a bad, bad idea. This stuff is going to be much jumpier than plain old hydrogen peroxide (that oxygen in the middle of the molecule probably doesn't know what to do with itself), and I don't know how far you could get before everything goes through the ceiling.

But there are wilder poly-peroxides out there. If you want to really oxidize the crap out of things with this compound, you will turn to the "peroxone process". This is a combination of ozone and hydrogen peroxide, for those times when a single explosive oxidizing agent just won't do. I'm already on record as not wanting to isolate any ozone products, so as you can imagine, I really don't want to mess around with that and hydrogen peroxide at the same time. This brew generates substantial amounts of HOOOH, ozonide radicals, hydroxyl radicals and all kinds of other hideous thingies, and the current thinking is that one of the intermediates is the HOOOOO- anion. Yep, five oxygens in a row - I did not type that with my elbows. You'll want the peroxone process if you're treating highly contaminated wastewater or the like: here's a look at using it for industrial remediation. One of the problems they had was that as they pumped ozone and peroxide into the contaminated site, the ozone kept seeping back up into the equipment trailer and setting off alarms as if the system were suddenly leaking, which must have been a lot of fun.

What I haven't seen anyone try is using this brew in organic synthesis. It's probably going to be a bit. . .uncontrolled, and lead to some peroxide products that will also have strong ideas of their own. But if you keep things dilute, you should be able to make it through. Anyone ever seen it used for a transformation?

Comments (54) + TrackBacks (0) | Category: Things I Won't Work With

October 9, 2014

The Most Common Heterocycles in Drugs

Email This Entry

Posted by Derek

What sorts of heterocycles show up the most in approved drugs? This question has been asked several times before in the literature, but it's always nice to see an update. This one is from the Njardarson group at Arizona, producers of the "Top 200 Drugs" posters.

84% of all unique small-molecule drugs approved by the FDA have at least one nitrogen atom in them, and 59% have some sort of nitrogen heterocycle. Leaving out the cephems and penems, which are sort of a special case and not really general-purpose structures, the most popular ones are piperidine, pyridine, pyrrolidine, thiazole, imidazole, indole, and tetrazole, in that order. Some other interesting bits:

All the four-membered nitrogen heterocycles are beta-lactams; no azetidine-containing structure has yet made it to approval.

The thiazoles rank so highly because so many of them are in the beta-lactam antibiotics as well. Every single approved thiazole is substituted in the 2 position, and no monosubstituted thiazole has ever made it into the pharmacopeia, either.

Almost all the indole-containing drugs are substituted at C3 and/or C5 - pindolol is an outlier.

The tetrazoles are all either antibiotics or cardiovascular drugs (the sartans).

92% of all pyrrolidine-substructure compounds have a substituent on the nitrogen.

Morpholine looks more appealing as a heterocycle than it really is - piperidine and piperazine both are found far more frequently. And I'll bet that many of those morpholines are just there for solubility, and that otherwise a piperidine would have served for SAR purposes. Ethers don't always seem to do that much for you.

Piperidines rule. There's a huge variety of them out there, the great majority substituted on the nitrogen. Azepanes, though, one methylene larger, have only three representatives.

83% of piperazine-containing drugs are substituted at both nitrogens.
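If you'd like to run this sort of census yourself, substructure counting is easy to set up. Here's a minimal sketch with RDKit - to be clear, this is not the paper's actual workflow, the SMARTS patterns are simple ring queries rather than a careful ring-system analysis, and the two demo molecules (nicotine and caffeine) are just stand-ins for a real list of drug structures:

```python
# Count how many molecules in a set contain each of several N-heterocycles.
# Illustrative sketch only - a real analysis would need curated drug SMILES
# and more careful ring-perception rules.
from rdkit import Chem

ring_smarts = {
    "piperidine":  "C1CCNCC1",
    "pyridine":    "c1ccncc1",
    "pyrrolidine": "C1CCNC1",
    "thiazole":    "c1cscn1",
    "imidazole":   "c1cncn1",
    "tetrazole":   "c1nnnn1",
}
patterns = {name: Chem.MolFromSmarts(s) for name, s in ring_smarts.items()}

# Stand-in "drug list": nicotine and caffeine
drug_smiles = ["CN1CCCC1c1cccnc1", "Cn1cnc2c1c(=O)n(C)c(=O)n2C"]

counts = {name: 0 for name in ring_smarts}
for smi in drug_smiles:
    mol = Chem.MolFromSmiles(smi)
    if mol is None:
        continue  # skip anything that doesn't parse
    for name, patt in patterns.items():
        if mol.HasSubstructMatch(patt):
            counts[name] += 1

print(counts)  # nicotine hits pyrrolidine and pyridine; caffeine hits imidazole
```

One caveat: plain substructure matches will also count rings buried inside fused systems (an indole would register against a pyrrole query, for instance), which is part of why the real analysis takes more care than this.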

There are a lot of other interesting bits in the paper, which goes on to examine fused and bicyclic heterocycles. But I think this afternoon I'll go make some piperidines and increase my chances.

Comments (25) + TrackBacks (0) | Category: Chemical News | Drug Industry History

Eric Betzig Is Not a Chemist, And I Don't Much Care

Email This Entry

Posted by Derek

Update: Betzig himself has shown up in the comments to this post, which just makes my day.

Yesterday's Nobel in chemistry set off the traditional "But it's not chemistry!" arguments, which I largely try to stay out of. For one thing, I don't think that the borders between the sciences are too clear - you can certainly distinguish the home territories of each, but not the stuff out on the edge. And I'm also not that worked up about it, partly because it's nowhere near a new phenomenon. Ernest Rutherford got his Nobel in chemistry, and he was an experimental physicist's experimental physicist. I'm just glad that a lot of cutting-edge work in a lot of important fields (nanotechnology, energy, medicine, materials science) has to have a lot of chemistry in it.

With this in mind, I thought this telephone interview with Eric Betzig, one of the three laureates in yesterday's award, was quite interesting:

This is a chemistry prize, do you consider yourself a chemist, a physicist, what?

[EB] Ha! I already said to my son, you know, chemistry, I know no chemistry. [Laughs] Chemistry was always my weakest subject in high school and college. I mean, you know, it's ironic in a way because, you know, trained as a physicist, when I was a young man I would look down on chemists. And then as I started to get into super-resolution and, which is really all about the probes, I came to realise that it was my karma because instead I was on my knees begging the chemists to come up with better probes for me all the time. So, it's just poetic justice but I'm happy to get it wherever it is. But I would be embarrassed to call myself a chemist.

Some people are going to be upset by that, but you know, if you do good enough work to be recognized with a Nobel, it doesn't really matter much what it says on the top of the page. "OK, that's fine for the recipients", comes one answer, "but what about the committee? Shouldn't the chemistry prize recognize people who call themselves chemists?" One way to think about that is that it's not the Nobel Chemist prize, earmarked for whatever chemists have done the best work that can be recognized. (The baseball Hall of Fame, similarly, has no requirement that one-ninth of its members be shortstops). It's for chemistry, the subject, and chemistry can be pretty broadly defined. "But not that broadly!" is the usual cry.

That always worries me. It seems dangerous, in a way - "Oh no, we're not such a broad science as that. We're much smaller - none of those big discoveries have anything to do with us. Won't the Nobel committee come over to our little slice of science and recognize someone who's right in the middle of it, for once?" The usual reply to that is that there are, too, worthy discoveries that are pure chemistry, and they're getting crowded out by all this biology and physics. But the pattern of awards suggests that a crowd of intelligent, knowledgeable, careful observers can disagree with that. I think that the science Nobels should be taken as a whole, and that there's almost always going to be some blending and crossover. It's true that this year's physics and chemistry awards could have been reversed, and no one would have complained (or at least, not any more than people are complaining now). But that's a feature, not a bug.

Comments (38) + TrackBacks (0) | Category: Chemical News | General Scientific News

October 8, 2014

XKCD on Protein Folding

Email This Entry

Posted by Derek

I've been meaning to mention this recent XKCD comic, which is right on target:
"Someone may someday find a harder one", indeed. . .

Protein folding

Comments (25) + TrackBacks (0) | Category: Biological News

The 2014 Chemistry Nobel: Beating the Diffraction Limit

Email This Entry

Posted by Derek

This year's Nobel prize in Chemistry goes to Eric Betzig, Stefan Hell, and William Moerner for super-resolution fluorescence microscopy. This was on the list of possible prizes, and has been for several years now (see this comment, which got 2 out of the 3 winners, to my 2009 Nobel predictions post). And it's a worthy prize, since it provides a technique that (1) is useful across a wide variety of fields, from cell biology on through chemistry and into physics, and (2) does so by doing something that many people would, at one time, have said was impossible.

The impossible part is beating the diffraction limit. That was first worked out by Abbe in 1873, and it set what looked like a physically impassable limit to the resolution of optical microscopy. Half the wavelength of the light you're using is as far as you can go, and (unfortunately) that means that you can't optically resolve viruses, many structures inside the cell, or anything as small as a protein molecule. (As an amateur astronomer, I can tell you that the same limits naturally apply to telescope optics, too: even under perfect conditions, there's a limit to how much you can resolve at a given wavelength, which is why even the Hubble telescope can't show you Neil Armstrong's footprint on the moon). In any optical system, you're doing very well if the diffraction limit is the last thing holding you back, but hold you back it will.
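For reference, Abbe's limit for a microscope is set by the wavelength λ and the numerical aperture NA of the objective (which tops out around 1.4 or so for oil-immersion lenses):

```latex
d = \frac{\lambda}{2\,\mathrm{NA}} \approx \frac{550~\mathrm{nm}}{2 \times 1.4} \approx 200~\mathrm{nm}
```

So with green light, features much closer together than about 200 nm blur into one - and a typical protein, at a few nanometers across, sits far below that.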
STED.jpg
There are several ways to try to sneak around this problem, but the techniques that won this morning are particularly good ones. Stefan Hell worked out an ingenious method called stimulated emission depletion (STED) microscopy. If you have some sort of fluorescent label on a small region of a sample, you get it to glow, as usual, by shining a particular wavelength of light on it. The key for STED is that if another particular wavelength of light is used at the same time, you can cause the resulting fluorescence to shift. Physically, fluorescence results when electrons get excited by light, and then relax back to where they were by emitting a different (longer) wavelength. If you stimulate those electrons by catching them once they're already excited by the first light, they fall back into a higher vibrational state than they would otherwise, which means less of an energy gap, which means less energetic light is emitted - it's red-shifted compared to the usual fluorescence. Pour enough of that second stimulating light into the system after the first excitation, and you can totally wipe out the normal fluorescence.

And that's what STED does. It uses the narrowest possible dot of "normal" excitation in the middle, and surrounds it with a doughnut shape of the second, suppressing light. Scanning this bulls-eye across the sample gives you better-than-diffraction-limit imaging for your fluorescent label. Hell's initial work took several years just to realize the first images, but the microscopists have jumped on the idea over the last fifteen years or so, and it's widely used, with many variations (multiple wavelength systems at the same time, high frames-per-second rigs for recording video, and so on). Shown below is a STED image of a labeled neurofilament compared to the previous state of the art. You'd think that this would be an obvious and stunning breakthrough that would speak for itself, but Hell himself is glad to point out that his original paper was rejected by both Nature and Science.
STED%20image.jpg
You can, in principle, make the excitation spot as small as you wish (more on this in the Nobel Foundation's scientific background on the prize here). In practice, the intensity of the light needed as you push to higher and higher resolution tends to lead to photobleaching of the fluorescent tags and to damage in the sample itself, but getting around these limits is also an active field of research. As it stands, STED already provides excellent and extremely useful images of all sorts of samples - many of those impressive fluorescence microscopy shots of glowing cells are produced this way.
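There's a commonly quoted scaling law for STED resolution that makes this tradeoff explicit: the effective spot shrinks with the ratio of the depletion intensity I to the dye's saturation intensity I_s, so in principle you just turn up the doughnut to go finer - and in practice the dye and the sample can only take so much:

```latex
d \approx \frac{\lambda}{2\,\mathrm{NA}\,\sqrt{1 + I/I_{\mathrm{s}}}}
```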

The other two winners of the prize worked on a different, but related technique: single-molecule microscopy. Back in 1989, Moerner's lab was the first to be able to spectroscopically distinguish single molecules outside the gas phase - pentacene, embedded in crystals of another aromatic hydrocarbon (terphenyl), down around liquid helium temperatures. Over the next few years, a variety of other groups reported single-molecule studies in all sorts of media, which meant that something that would have been thought crazy or impossible when someone like me was in college was now popping up all over the literature.

But as the Nobel background material rightly states, there are some real difficulties with doing single-molecule spectroscopy and trying to get imaging resolution out of it. The data you get from a single fluorescent molecule is smeared out in a Gaussian (or pretty much Gaussian) blob, but you can (in theory) work back from that to where the single point must have been to give you that data. But to do that, the fluorescent molecules have to be scattered further apart than that diffraction limit. Fine, you can do that - but that's too far apart to reconstruct a useful image (Shannon and Nyquist's sampling theorem in information theory sets that limit).
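The reason you can work back to the point source at all is photon statistics. In the simplest form (ignoring pixelation and background noise, which matter a great deal in practice), the localization precision goes as:

```latex
\sigma_{\mathrm{loc}} \approx \frac{\sigma_{\mathrm{PSF}}}{\sqrt{N}}
```

where σ_PSF is the width of that diffraction-limited blob (roughly 100 nm for visible light) and N is the number of photons you collect. Pull in ten thousand photons and you can, in principle, pin the molecule down to a nanometer or so - each photon is another vote on where the emitter really is.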

Betzig himself took a pretty unusual route to his discovery that gets around this problem. He'd been a pioneer in another high-resolution imaging technique, near-field microscopy, but that one was such an impractical beast to realize that it drove him out of the field for a while. (Plenty of work continues in that area, though, and perhaps it'll eventually spin out a Nobel of its own). As this C&E News article from 2006 mentions, he. . .took some time off:

After a several-year stint in Michigan working for his father's machine tool business, Betzig started getting itchy again a few years ago to make a mark in super-resolution microscopy. The trick, he says, was to find a way to get only those molecules of interest within a minuscule field of view to send out enough photons in such a way that would enable an observer to precisely locate the molecules. He also hoped to figure out how to watch those molecules behave and interact with other proteins. After all, says Betzig, "protein interactions are what make life."

Betzig, who at the time was a scientist without a research home, knew also that interactions with other researchers almost always are what it takes these days to make significant scientific or technological contributions. Yet he was a scientist-at-large spending lots of time on a lakefront property in Michigan, often in a bathing suit. Through a series of both deliberate and accidental interactions in the past two years with scientists at Columbia University, Florida State University, and the National Institutes of Health, Betzig was able to assemble a collaborative team and identify the technological pieces that he and Hess needed to realize what would become known as PALM.

He and Hess actually built the first instrument in Hess's living room, according to the article. The key was to have a relatively dense field of fluorescent molecules, but to only have a sparse array of them emitting at any one time. That way you can build up enough information for a detailed picture through multiple rounds of detection, and satisfy both limits at the same time. Even someone totally outside the field can realize that this was a really, really good plan. Betzig describes very accurately the feeling that a scientist gets when an idea like this hits: it seems so simple, and so obvious, that you're sure that everyone else in the field must have been hit by it at the same time, or will be in the next five minutes or so. In this case, he wasn't far off: several other groups were working on similar schemes while he and Hess were commandeering space in that living room. (Here's a video of Hess and Betzig talking about their collaboration).
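To make the sparse-activation trick concrete, here's a toy simulation of the general PALM idea - emphatically not Betzig and Hess's actual code or algorithm, just a sketch, with all the numbers (emitter count, activation probability, photon yield) invented for illustration:

```python
# Toy sketch of photoactivated localization: turn on a sparse random subset
# of emitters each frame, localize each one from its photon cloud, repeat.
import numpy as np

rng = np.random.default_rng(0)
sigma_psf = 100.0                              # diffraction-limited blob width, nm (assumed)
emitters = rng.uniform(0, 1000, size=(50, 2))  # 50 "true" positions in a 1-micron field

localizations = []
for frame in range(2000):
    # Sparse photoactivation: ~1% of emitters switch on per frame, so the
    # active ones are almost always separated by more than the diffraction limit
    active = emitters[rng.random(len(emitters)) < 0.01]
    for pos in active:
        n_photons = rng.poisson(1000)          # photons collected before blinking off
        if n_photons < 50:
            continue                           # too dim to localize reliably
        # Each detected photon lands somewhere in the diffraction-limited blob
        photons = pos + rng.normal(0.0, sigma_psf, size=(n_photons, 2))
        # The centroid is the maximum-likelihood position for a Gaussian PSF,
        # good here to about sigma_psf / sqrt(n_photons), a few nanometers
        localizations.append(photons.mean(axis=0))

localizations = np.asarray(localizations)
print(f"{len(localizations)} localizations of 50 emitters")
```

Accumulate enough frames and you can plot the localizations as a pointillist image whose resolution is set by the few-nanometer localization error, not by the 100 nm blob - which is the whole game.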
PALM.jpg
Shown here is what the technique can accomplish - this is from the 2006 paper in Science that introduced it to the world. Panel A is a section of a lysosome, with a labeled lysosomal protein. You can say that yep, the protein is in the outer walls of that structure (and not so many years ago, that was a lot to be able to say right there). But panel B is the same image done through Betzig's technique, and holy cow. Take a look at that small box near the bottom of the panel - that's shown at higher magnification in panel D, and the classic diffraction limit isn't much smaller than that scale bar. As I said earlier, if you'd tried to sell people on an image like this back in the early 1990s, they'd probably have called you a fraud. It wasn't thought possible.

The Betzig technique is called PALM, and the others that came along at nearly the same time are STORM, fPALM, and PAINT. These are still being modified all over the place, and other techniques like total internal reflection fluorescence (TIRF) are providing high resolution as well. As was widely mentioned when green fluorescent protein was the subject of the 2008 Nobel, we are currently in a golden (and green, and red, and blue) age of cellular and molecular imaging. (Here's some of Betzig's recent work, for illustration). It's wildly useful, and today's prize was well deserved.

Comments (43) + TrackBacks (0) | Category: Biological News | Chemical Biology | Chemical News

October 7, 2014

German Pharma, Or What's Left of It

Email This Entry

Posted by Derek

Busy day around here on the frontiers of science, so I haven't had a chance to get a post up. A reader did send along this article from the Frankfurter Allgemeine Zeitung, the heavyweight German newspaper known as the "Fahts" (FAZ). (The Chrome browser will run Google's auto-translate past it if you ask, and it comes out sort of coherent).

What they're asking is: how and why did the German pharmaceutical industry decline so much? Parts have been sold off (as with Hoechst and BASF), and some remaining players have merged (as with Bayer and Schering AG). There's still Boehringer and Merck (Darmstadt), but they're fairly far down the rankings in size and drug R&D expenditure. And you don't have to compare things just to the US: all this has taken place while the folks just up the river (Novartis and Roche) have looked much stronger. The article is blaming "Wankelmut" (vacillation, fickleness) at the strategic level for much of this, especially regarding the role of an industrial chemicals division versus a pharma one.

There's something to that. Bayer was urged for years and years by analysts to break up the company, and resisted. Until recently - but now they're going to do it. Meanwhile, the other big German chemistry conglomerates did just that, but divested their pharma arms to other companies (and countries) rather than spinning them out on their own. And there's not much of a German startup/biotech sector backstopping any of this, either. The successes of Amgen, Biogen, Genentech et al. have not happened in Germany - for the most part, players there stay where they are. The big firms stay the big firms, and no one joins their ranks.

And that's what strikes me about many economies in general, as compared to the US. We have more turmoil. It's not always a good thing, but we've also had a lot of science and technology-based companies come out of nowhere to become world leaders. And you can't do that without shaking things around. Is it partly an aversion to that sort of disruption that's led to the current state of affairs, or is this mistaking symptoms for causes? (I mean, the Swiss are hardly known for wild swings in their business sectors, but Swiss pharma has done fine). Thoughts?

Comments (30) + TrackBacks (0) | Category: Business and Markets | Drug Industry History