The Pharmaceutical R&D Model is Broken. Here’s How to Fix It
Research is the lifeblood of the biotech and pharmaceuticals business. The pharma and biotech industry spent some $65 billion on R&D in 2008, according to the Pharmaceutical Research and Manufacturers of America (PhRMA). That’s a tremendous amount of money considering that the FDA approved only 24 new drugs (21 new molecular entities and 3 biologics) that same year. If the PhRMA numbers are accurate, this would imply that it cost about $2.7 billion per approved drug, a very poor return on investment, since few drugs would ever be able to recoup that expense.
These numbers suggest that drug makers need to find a more efficient way of developing medicines. A recent report from financial analysts at Morgan Stanley recommended that large pharma companies abandon their own early-stage drug development programs and switch to a less costly licensing model. Rather than try to discover drugs themselves, Big Pharma should simply buy them from smaller, more nimble, and more innovative biotech companies. According to the Financial Times, the report claimed that such an approach “would boost success rates, lower costs, and triple returns”. This scheme would certainly fit into the plans of most venture capital (VC) firms, which could cash out large profits if Big Pharma acquires the biotech startups they have invested in.
Even before the Morgan Stanley report came out, Big Pharma had embarked on a major job-shedding binge, eliminating positions made redundant by mergers and shutting down a number of research programs. Merck is eliminating 16,000 jobs after buying Schering-Plough, Pfizer around 19,500 positions after acquiring Wyeth, and Roche about 1,500 jobs after purchasing the remainder of Genentech. Layoffs, however, have not been confined to companies making acquisitions. To stay competitive, Johnson & Johnson is cutting 8,000 jobs this year, Eli Lilly is axing 5,500, and GlaxoSmithKline around 6,000 positions.
In seeking additional resources to fill holes in their drug development programs, many Big Pharma companies have partnered with academic institutions. Pfizer and Genentech have both partnered with UCSF, GlaxoSmithKline with the Immune Disease Institute, Solvay Pharmaceuticals with Emory University, and Janssen Pharmaceutica with Vanderbilt University. These alliances provide money to academic investigators, usually in exchange for licensing rights to any discoveries that result. Grant pressure on academic investigators (80 to 90 percent of applications to the NIH do not currently get funded) makes them willing partners in helping solve Big Pharma’s empty-pipeline problem.
These academic collaborations, though quite helpful to Big Pharma, are not a substitute for real drug discovery and development work. If Big Pharma ramps down its research efforts, can smaller companies ramp up their research programs to compensate? I have no doubt that some of these organizations could provide a true fountain of research innovation that the larger companies can drink from. But are these companies sufficiently productive to shoulder a much larger share of the responsibility for the industry?
I believe the answer is no. Many of the smaller companies will not be up to the challenge. Tight finances have caused numerous companies to reduce or even eliminate their research staffs. Here in Seattle, two of the oldest and most established biotech companies, ZymoGenetics and Cell Therapeutics, have both eliminated their new drug discovery programs while shifting efforts to develop candidates already in the late stages of development. Other local companies have laid off hundreds of researchers in recent years. Yes, if the good times return these companies will likely start hiring researchers again (provided they are not bought out first, a more likely outcome of success). However, the sad reality is that building a world-class research staff at a company that repeatedly cycles through layoffs and hirings is going to be quite difficult, if not impossible.
What did Sciele Pharma, Kowa Research, Gloucester Pharma, Dyax, Allos Therapeutics, and Vanda Pharma all have in common in 2009? Not sure? How about Merck, Genentech, Amgen, Abbott Laboratories, Bayer, and Pfizer? In the first group, all of these companies had a new drug approved by the US FDA in 2009. In the second group, none of the named powerhouse companies got a new drug over that goal line. What’s the lesson here? Small companies can succeed in getting their drugs approved by the FDA, and for a lot less than $2.7 billion. In this sense, size doesn’t matter.
These examples, while illustrative, are a bit misleading as they are only sampling a single year. Over a multi-year period, the drug approval numbers tilt much more heavily towards Big Pharma. However, the numbers don’t scale per employee. If a company with 500 employees can get one drug approved, this doesn’t mean that having 5,000 employees will net you 10 new drugs. Therefore, for their size, Big Pharma is relatively inefficient in developing new drugs. It’s also worth noting that some of the small companies that win drug approvals have Big Pharma partners helping them through the process.
To be fair, comparing drugs put into trials by Big Pharma vs. small biotechs is really an apples-and-oranges comparison. Why? Because the two groups have different expectations for the returns generated by the drugs they are developing. All companies want billion-dollar drugs, but these don’t come along very often. These businesses all have an internal benchmark number that they use for vetting drug candidates. Big Pharma might not send a drug to the clinic unless they are convinced it will produce hundreds of millions of dollars a year in revenue. In contrast, a $75 million/year drug can be a big winner for a company with only a couple of hundred employees. However, estimates of potential sales are just that, estimates, and these numbers often turn out to be wrong in both directions. Large companies are just as vulnerable to making bad decisions here as small ones. For example, in 2008 Amgen unloaded three drugs whose sales were so poor (combined sales of only $70 million in 2007) that they weren’t worth the sales and marketing expense.
Drug and biotech companies get started when entrepreneurs (usually a mix of business people and university profs) band together to license some discovery (and its concomitant intellectual property) from a university. This discovery then becomes the backbone upon which a company is incorporated and a drug is developed. How does this get paid for? It’s usually done with a combination of angel investors, small business grants, and the bigger dollars provided by VC firms.
VC money, however, comes with serious strings attached. VC firms are planning their divorces from the companies they fund before they even consummate their marriage. The investors will only make money if one of three things happens: the company is acquired, the company goes public, or the company successfully develops and profitably markets its drug. Estimates for the average time to fully develop a drug range from 8 to 12 years. Rightly or wrongly, most biotech investors are simply not willing to wait that long for a return on their investment. If they had to wait until the drug was available for sale, they would simply invest in other industries (e.g. software) with a much shorter time frame to product launch as well as a significantly lower cost of entry. Because of this, the investors look instead for an acquisition or an IPO. These events generally will not occur before a company has its drug in clinical trials and has generated some really solid evidence that it is safe and effective. As a result, many investors (i.e. VC firms) will only provide money if the entrepreneurs can get their drug into clinical trials within 18 to 24 months.
So is this time frame possible? Absolutely. It happens all the time. With many molecular targets, a drug can be found, manufactured at small scale under exacting conditions, and declared ready to enter the clinic. But this early stage is where the critical choices get made, often with little or no data. There are so many decisions to make: defining the basic biology experiments, choosing the ultimate composition of the molecule, establishing a manufacturing process, biological testing in multiple species, stability testing, analytics to define uniformity, toxicology. The success rate for a drug making it through clinical trials is only about 7 percent. I’ve heard many people blame the FDA for the current low rate of drug approvals. They say the agency is too conservative, too focused on safety.
I’d like to propose an alternate interpretation of the data. Drugs are currently failing in clinical trials at a high rate because they enter clinical testing before they are truly ready. And drugs, like people, usually get but a single chance to make a good first impression with both the FDA and investors. Fail in the clinic once as a new drug, and you are done for.
The current approach is simply a formula for failure for biotech startups. The recipe mixes equal parts hubris, financial pressure, and unresolved scientific and medical questions, whipped together to form half-baked drugs that are poorly understood, designed, and vetted. Cutting corners to save money often backfires. Those of you who have tried to do your own plumbing or electrical work know the meaning of the phrase “the cheap comes out expensive” when you have to hire a pro to fix your mistakes. The drug development process is no different, but it is vastly more expensive to repair.
There’s no getting around it: biomedical research is very expensive. So how do we pay for drug discovery research going forward, if Big Pharma doesn’t want to pay for it, and smaller companies can’t always afford to as a result of their limited funding? New approaches are required. Getting the federal government to pay for this doesn’t seem like a viable path; it’s not their job, and they’re already financially constrained (i.e. broke). Industry has been testing out two other approaches in the past few years: shipping research offshore, and establishing agreements with contract research organizations. Can these approaches get the job done?
I’m afraid I have reservations about both of these efforts to boost research productivity. Offshore jobs will certainly be cheaper, but the countries that are getting most of these jobs (e.g. China, India) don’t have a strong record of novel drug development, and quality control is a serious concern. Contract research organizations that perform critical tasks for a fee, but don’t take equity in a drug, can free a biotech company from hiring/firing cycles, as well as provide expertise not available in a small company. There would seem to be some merit to that approach. My concern here, and where I admit that I have little direct experience, has to do with a perceived lack of passion and control. I’ve worked for a biotech where the scientists (and many other employees as well) worked evenings, weekends, and holidays to move their projects forward. They had the drive, the fire in the belly. Are contract employees similarly invested? Will they do your studies the right way? Are they motivated to go all out and spend long days solving all of the problems that keep you up during long nights?
New draft legislation, the Therapeutic Tax Credit, has been proposed as part of the nation’s health care reform package; it would provide a large tax benefit to biotech companies with 250 or fewer employees. The grant-like program would require companies to vie for funds based on how their drugs would lower health care costs or meet unmet medical needs. Interestingly, this competition is to be vetted by the Treasury Department, not previously known for its wisdom in reviewing drugs. Winning companies would be awarded tax credits (and possibly cash payments) covering as much as 50 percent of research and development expenses. The entire program is envisioned as lasting two years, and is capped at a total expenditure of $1 billion.
While I would love to see a tax credit to boost biotech research, this approach seems fraught with problems. One can imagine companies telling the Feds that their drug will lower health care costs because it will only cost $500/month when launched. Then, when it really does get on the market, the actual price turns out to be $10,000/month. Who could have seen that coming? And science can upend even the best of intentions: get a tax credit for developing a drug for heart disease, and then sell it for erectile dysfunction (this is the story behind Pfizer’s Viagra). Serious safeguards would need to be put in place to handle either of these scenarios before I (and, I imagine, many others) could support something like this.
New approaches are needed that remove the pressure on startup companies to race into the clinic with drugs that are not ready for prime time. An obvious path, and one that I favor, is to incentivize investment in early-stage biotech and pharma. The long timeline required to develop a new drug is perhaps unique to this industry. Investors in biotech and pharma need to be compensated, in the form of a greatly reduced tax burden, to balance both the nature of the risk and the lengthy development times. My suggestion: lower or eliminate taxes on investments in pharma and biotech that are held for more than five years. This would hopefully entice investors into accepting a longer waiting period before financial returns are realized. This, in turn, would allow companies more time to develop novel drugs that are likelier to succeed in the clinic. Besides tax reductions, other innovative incentives should be considered to reward the investors who provide the resources that fuel innovation in America’s healthcare.