
A Smarter War on Cancer

Xconomy National — 

[Editor’s Note: this editorial was co-authored by Jay (Marty) Tenenbaum of CollabRx and Leroy Hood of the Institute for Systems Biology.]

One of society’s great challenges is to advance health care while containing its rising cost. Achieving this objective requires that we capitalize upon the immense potential of personalized medicine. Cancer is a case in point. Emerging genomic and computational technologies are transforming cancer research. We now recognize that cancers are far more variable at the molecular level than they appear under the microscope. Yet we continue to treat most cancers based on their appearance, analogous to giving everyone with fatigue a blood transfusion.

We rely on 20th century randomized clinical trial designs, which test drug effects in populations, to evaluate a burgeoning number of 21st century cancer drugs that work only in specific subpopulations. Randomized trials made sense when we thought that microscopically similar cancers were essentially identical. However, many new cancer drugs are designed to inhibit specific target proteins that are more important to cancer cells than normal cells. Testing these drugs in randomized trials can be problematic. For many patients entering clinical trials we do not even know whether their tumors possess the target protein that the investigational drug was designed to hit.

To compensate for those patients whose tumors lack the target and are thus destined to fail, randomized trials commonly enroll thousands of patients so that the extension of survival in those who benefit can be discerned from a background of non-responders. This model has led to FDA approval of drugs that extend the average lifespan only by weeks to months while costing more than the annual median U.S. household income (around $50,000). Meanwhile, therapies that are highly effective but help only a few cancer patients with specific molecular profiles are rejected.

One alternative to this conventional approach would be to treat a small number of highly motivated cancer patients as individual experiments, in scientific parlance an “N of 1.” Vast amounts of data could be analyzed from each patient’s tumor to predict which proteins are the most effective targets to destroy the cancer. Each patient would then receive a drug regimen specifically tailored for their tumor. The lack of “control patients” would require that each patient serve as his or her own control, using single subject research designs to track the tumor’s molecular response to treatment through repeated biopsies, a requirement that may eventually be replaced by sampling blood. After the patient receives a drug to block a particular target protein, a biopsy would confirm whether the target had indeed been blocked. One of the most important advantages of serial molecular monitoring is that it would unveil strategies that tumors adopt to evade therapy, possibly uncovering new targets of opportunity.

This patient-centric approach would represent a dramatic departure from traditional oncology. Novel patient-specific combinations of drugs could have unforeseen side effects, and the methodological, regulatory, and ethical framework for cancer research would need to be reconsidered from the ground up. Therapies that appear effective could be validated in small trials of other patients with similar molecular profiles, while unsuccessful therapies could be analyzed to refine our understanding of tumor biology and drug mechanisms of action. While the “N of 1” approach may not hit immediate home runs, the extensive body of knowledge generated from each patient could, upon aggregation with data from other patients, lead to blood diagnostics that classify responders for particular drugs. Eventually we should be able to head tumors off at the pass, anticipating the escape routes they are likely to take.

Although this approach will be very expensive for early adopters, technology costs are falling at exponential rates similar to Moore’s law for integrated circuits, eventually leading to dramatic reductions in healthcare costs. There are obvious economic benefits to administering expensive drugs only to patients for whom they are most likely to work, and technological advances that allow serial blood draws to replace tumor biopsies will reduce costs and increase scalability. Although there are still many significant challenges, the primary roadblock to implementation is financial. Will insurance companies or federal agencies pay to test this new paradigm for cancer treatment? Would patients with difficult-to-treat cancers and the means to fund the entirety of their own course of therapy be willing to pay for the opportunity to serve as individual Manhattan Projects for the potential benefit of themselves and, possibly, mankind? It’s time to find out.


  • http://www.eutropics.com Andrew Kolodziej

    (1) Tailored cancer drugs are where the future of treatment lies; Gleevec and Herceptin provide this model. For broad-spectrum drugs (e.g., bortezomib, with a 40-60% response rate), there is a need to predict responders; functional or genetic profiles will be required and are under development. Financial issues are also involved: the UK will not reimburse if the patient fails to respond.
    (2) Engineering N=1 is going to require more cooperation and shared standards than is current in the industry, and it is doubtful that we are at the stage where that will be useful (have we even identified the right targets?).
    (3) Cancer patient participation in clinical trials is very low (<5%?), the indications and disease etiology are fragmented, and there is intense competition for enrolling them.
    (4) The health care system should put the brakes on treatments with marginal (<2-3 month) survival benefit, exorbitant cost, and poor side effect profiles.
