Precision Medicine (PRx) was a $40B market in 2016 and is forecast to reach $140-230B by 2026, with most of that growth coming from new drugs, according to Frank Ingari at last week's Precision Medicine World Conference, appropriately held right here in Silicon Valley at the somewhat dated Computer History Museum.
Precision is a large and increasingly important segment of the market, with a rapidly growing number of products and services in the pipeline. It offers a lot of promise to cut into the estimated one trillion dollars wasted in our $3.2T healthcare system (roughly one third, by some estimates, goes to the wrong therapies, unnecessary tests and procedures, etc.). While definitions vary, most seem to agree that by getting patients the right diagnoses and treatments, and avoiding the unnecessary and ineffective ones, we should be able to eliminate a lot of that waste.
What’s not immediately intuitive is how imprecise precision medicine still is. Many of the immuno-oncology presentations focused on efforts to reduce the high percentage (anywhere from 60 to 95%) of non-responders to treatments. Any lack of response is clearly bad for patients, and for our healthcare costs, given how expensive these medications are. There are a host of reasons: the absence of reliable biomarkers; the fact that genetics may account for only about a third of the factors contributing to developing and overcoming a malignant cancer (others include the microbiome, treatment history, lifestyle and environment); and the fact that cancer is, to use Ira Mellman’s word, “crafty.” Another key effort is finding the right antigen to target. A number of cancer “vaccines” are in development with the aim of customizing this to the individual.
Most of these efforts require a mind-boggling amount of computing power and data storage for diagnosis and drug discovery. Novel techniques for accessing petabyte-scale data stores are required (you can’t just copy and paste data at that scale), and health systems have to consider metering the analysis they run to avoid receiving million-dollar bills from Amazon Web Services. Artificial intelligence, machine learning and deep learning techniques are naturally heavily leveraged to deal with all of this “Big Data.”
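To make the metering idea concrete, here is a minimal sketch of what a budget gate for analysis jobs might look like. All of the names and figures (the per-terabyte price, the monthly budget, the `AnalysisMeter` class) are hypothetical assumptions for illustration, not actual AWS pricing or any real health system's policy.

```python
# Illustrative sketch only: a simple budget gate for metering analysis jobs.
# The per-TB price and budget below are hypothetical, not actual AWS pricing.

PRICE_PER_TB_SCANNED = 5.00   # assumed $/TB scanned (hypothetical)
MONTHLY_BUDGET = 50_000.00    # assumed monthly analysis budget in dollars

def estimated_cost(bytes_to_scan: int) -> float:
    """Estimate the dollar cost of a job that scans `bytes_to_scan` bytes."""
    terabytes = bytes_to_scan / 1e12
    return terabytes * PRICE_PER_TB_SCANNED

class AnalysisMeter:
    """Tracks cumulative spend and refuses jobs that would exceed the budget."""

    def __init__(self, budget: float = MONTHLY_BUDGET):
        self.budget = budget
        self.spent = 0.0

    def approve(self, bytes_to_scan: int) -> bool:
        cost = estimated_cost(bytes_to_scan)
        if self.spent + cost > self.budget:
            return False  # job would blow the budget; queue or rescope it
        self.spent += cost
        return True

meter = AnalysisMeter()
print(meter.approve(2_000 * 10**12))   # a 2 PB scan costs $10,000 -> approved
print(round(meter.spent, 2))           # 10000.0
```

The point is not the specific numbers but the pattern: estimate cost before running a query against a petabyte store, and gate execution on a budget, rather than discovering the bill after the fact.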
What really surprised me was the absence of substantive discussion of commercial strategy. These high-priced innovative therapies demand new market access, distribution, dispensing and support service strategies to succeed in the market. As we’ve seen with the first CAR-T products, a well-conditioned payer market will cover a treatment that, including medical costs, could run into several millions of dollars. If the market is poorly conditioned, approvals may be slow or absent, and patients won’t get access to therapy. Innovative approaches such as guarantees (as offered by Novartis for Kymriah) or payment plans (as proposed in Europe) are necessary when such high costs and risks are in play.
With these CAR-T therapies there is no standard distribution model for a product that starts with apheresis, moves through cryogenic storage, handling and manufacturing, and is returned to the patient’s body weeks later. Understanding wastage and breakage takes on new stakes with vials worth hundreds of thousands of dollars. Likewise, ownership and the timing of title transfer are critical for worst-case scenarios such as a patient dying after initiating but before completing therapy.
Even if these brilliant scientists, incredible computers and leading physicians and healthcare systems are able to improve diagnoses, response rates and progression-free survival, it doesn’t mean the right patients will get access to these amazing products. You need a commercial strategy to make that happen.