The Challenge of Understanding Biotech: Sifting Through the Fog and Jargon

10/20/10

I recently enjoyed reading two books on the mortgage meltdown, “The Big Short” by Michael Lewis and Gregory Zuckerman’s “The Greatest Trade Ever.” They each provided a detailed post-mortem on the implosion of the housing bubble. What fascinated me most about their accounts was how virtually everyone on Wall Street, with a few notable exceptions, was completely wrong about the stability and direction of the housing market. This, of course, fits the classic definition of Groupthink, a term coined by social psychologist Irving Janis to describe faulty decision making that results from a deterioration in “mental efficiency, reality testing, and moral judgment”. The lesson I took away from this is crystal clear: you must think on your own, forming your own opinions from the data in front of you. Remember that the “experts” are not always right, and indeed, are often very wrong.

Is Groupthink a problem for the Pharma/Biotech industry? A recent summary of “Pharma’s Biggest Flops” readily illustrates that both companies and the analysts who cover them can, from time to time, be dead wrong in their pronouncements and expectations. However, I see only weak evidence of Groupthink in BioPharma, nothing anywhere near the scale of what was seen in the mortgage meltdown debacle. The pharma and biotech industry is far more fragmented and diversified than the mortgage business, and as a result it is much more resistant to having everyone think the same way. It is also harder to place specific financial wagers either for or against the industry in general, especially since many biotechs are not publicly traded.

So how does one avoid getting sucked in by industry Groupthink? Critical reading and thinking are the best defense against this problem, but this requires a serious brainpower commitment. I spend a lot of time reading and, more importantly, thinking about where the industry has been and where it is going. Science journals, newswire reports, and industry missives fill my computer screen on any given day. I know that many of you share my pain: so much information to process, so little time.

We all deal with information overload. More and more, however, I find myself spending inordinate amounts of time trying to understand a single number or the ramifications of a particular finding that are either poorly explained or don’t make sense. A lack of clarity can simply indicate poor writing, but it often reveals sloppy thinking. Herman Melville said it very well: “A man of true science uses but few hard words, and those only when none other will answer his purposes; whereas the smatterer in science thinks that by mouthing hard words he proves that he understands hard things.”

Consider the simple phrase “… is the best-selling drug.” What does this mean? Best-selling in the U.S., or in the world? The most prescriptions written? Used by the greatest number of patients? Highest dollar sales? Total number of pills consumed? Do biologics count as a “drug,” or are those scored separately as a “best-selling biologic”? Sadly, phrases such as these are seldom defined clearly enough to tell what the author truly meant to say.

I often come across numbers that are so unbelievable that they throw a spanner wrench into my mental gears. These figures may crop up in discussions of topics I am unfamiliar with, but just as often they show up in a commentary that covers my particular areas of expertise. My first response is to look for a footnote, something to tell me where this number came from. The footnote, if you can find one, is often rather vague, attributing the statistic to something like the Government Accountability Office, the Brookings Institution, the Biotechnology Industry Organization, Public Citizen, or the National Academy of Sciences. I must confess that I (regretfully) quote these numbers from time to time, since they are often the only ones available on a particular topic.

Having identified the source of these numbers, I often attempt at least a shallow dive into the quoted material to find out how the number was obtained. These efforts are nearly always unsuccessful, in that the exact meaning of the number, or the methodology used to derive it, is simply unavailable. One example: a Chicago Tribune article cited a figure for the number of biotech companies that had declared bankruptcy, and it did not jibe with other numbers I had seen published. Numerous phone calls and emails later, I was told that the number provided by an industry organization had been misquoted by the reporter who authored the piece and was, therefore, untrue. I never saw a correction published. To put it bluntly, if I can’t understand or confirm a phrase or number, I have a difficult time believing it is true.

Also irritating: reading some number or graph that’s been extracted from a white paper, deciding you would like more info, and then finding out that your only available option is to purchase the entire article for a mere $7,695 from Expensive Reports R Us. Equally vexing: reports produced by financial firms for the exclusive use of their clients, and therefore not available to the general public. Earlier this year Morgan Stanley published a recommendation that Big Pharma companies largely abandon their internal research programs and obtain their new drugs primarily via acquisitions. Sounds fascinating (and, to my mind, unsustainable), and I would love to see their analysis, but my request for a copy of this report has gone unfulfilled. These articles may be well researched and written, but price and/or exclusivity have prevented me from determining this for myself.

Poorly explained phrases gum up a clear understanding of relevant issues in the industry. Meaningless jargon too often replaces clear insights. In one recent report by Ernst & Young, titled “Beyond Borders: Global Biotechnology Report 2010,” I waded through the muck created by the following phrases (not all written by E&Y): “the new normal,” “Pharma 3.0,” “precompetitive collaboration,” “open innovation approaches,” “asset-centric financing,” “constituents of the biotech ecosystem,” “proof-of-concept value inflection point,” “fail fast,” “force multipliers,” and “lean proof of concept.”

Here is an “insight” from the same report, discussing Eli Lilly’s Chorus program: “We did not follow the typical path to proof of concept—using very small studies that are poorly controlled and rely on unvalidated endpoints to give some suggestion of efficacy. Instead, we designed well-controlled studies to derive truly meaningful data using validated surrogates or bona fide clinical endpoints.” Are there industry people out there who deliberately set out to do poorly controlled studies on unvalidated endpoints for their personal amusement? Is this why Big Pharma has gotten itself into such Big Trouble? What does it say about an industry if doing well-designed studies with clear endpoints is supposed to be a revolutionary advance?

According to several contributors to this 2010 E&Y report, “our industry is at a point in its evolution when we have limited resources and large unmet needs.” Did I miss the point when we had unlimited resources? Was that the day I overslept? And when was there a small number of unmet needs?

Years ago, a thoughtful co-worker passed along to me a copy of a game called “BS Bingo” that was designed to keep you awake during boring meetings and conference calls. Many of you will have played this game over the years. It consists of a 5 x 5 grid of current industry buzzwords and is designed to help focus your attention during those meetings (and we’ve all been there) where the speakers trot out every trite phrase and trendy jargon term currently circulating in the industry. The version I was given in the early ’90s included “synergy,” “win-win,” “think outside of the box,” “value added,” and “proactive.” Every time the speaker used one of these terms, you checked it off on your grid. If you successfully checked off a full row of terms, you were supposed to yell out “Bullsh*t” and bring the meeting to a thankfully shortened conclusion. It’s clearly time to update my copy.

People who present themselves as industry leaders have a responsibility to explain plainly how their conclusions are derived, and to communicate clearly with those whom they seek to align with their viewpoints. We should refuse to accept ill-defined or inaccessible numbers and meaningless statistics, and we should challenge those who retreat to the illusory safety of jargon to clarify what they really mean.

Stewart Lyman is Owner and Manager of Lyman BioPharma Consulting LLC in Seattle. He provides strategic advice to clients on their research programs and collaboration management issues, as well as preclinical data reviews.

  • natalie

    Well-written

  • Eric Austin (http://www.austinpreclinical.com)

    Hilarious and unfortunately all too true. I have had similar thoughts to yours, Stewart, but you have done a great job of getting it down on “virtual” paper for all to enjoy.

  • JC

    I’m amazed by the continuous “group think” inside the industry and within companies vying for approval. They are so sure they will be approved that they bet the farm, build out expensive facilities, and hire loads of people who all too soon will be laid off.

    Always makes me wonder what they know that no one else does, but alas, the FDA continuously surprises us with delays or outright rejections (Amylin being the most recent notable example).

    With all of the sophistication, investment and science of our modern era, I find the process, expectations and execution of getting products approved still archaic.

  • Peter

    Nice job! I always try to read the Numbers Guy, Carl Bialik, in the Wall Street Journal. As a mathematician he tries to rationalize some of the numbers that frequently are quoted. For instance, he recently wrote about and largely debunked the oft-cited “statistic” that 10% of the drugs sold are counterfeit: http://blogs.wsj.com/numbersguy/dubious-origins-for-drugs-and-stats-about-them-990/. We need more fact checkers, but as you point out, it is not easy to track down the origins of many of these numbers.

    The best example of “group think” is what happens in big pharma when senior management changes its philosophy on the “right” way to discover and develop drugs. They often end up criticizing their own ideas of a few years earlier and then expect the entire organization to adopt their “new” ideas. Of course anyone who disagrees is criticized for not being a “team” player and is usually swept out in the next round of layoffs.

  • John

    Great stuff, well presented. But it isn’t only the life science industry that plays fast and loose with facts and language. Yesterday I read of a highly successful aid program that distributed $200M to 25M people. That works out to $8 per person of life-changing aid.

    Is part of our “evolving” journalism industry dropping its fact-checking role?

  • Anthony Rodriguez

    In every sector from academia to politics, people are irresponsible with numbers. The lack of transparency with the information that is pushed in today’s web-based society is sad. Like Stewart, I will all too often stop my due diligence as soon as the source asks for a credit card number. It’s just easier and cheaper to take these statements at face value. I applaud journals like PLoS that allow researchers to post their raw data online, with their articles free for anyone to access. Will we ever see something like this become a standard across all sectors? I hope so, but I am not holding my breath.