Advancing the use of real-world data to help speed treatments to patients

February 12, 2018

By William Crown, PhD, OptumLabs

For more than 20 years, I used medical claims data to compare the effects of medications on patient outcomes and cost. In the back of my mind (and sometimes the front) I was always worried about what I might be missing since claims data contained limited clinical information. For example, I knew that it would be helpful to know the body mass index, cholesterol levels and blood pressure of patients when studying heart failure drugs. And, when studying breast cancer treatment, it would be critical to control for cancer stage. But none of this information was available in claims data alone.

This changed in 2009 with the HITECH Act, which mandated meaningful use of electronic medical record (EMR) data by health care providers. The HITECH Act transformed the data landscape almost overnight, making it possible to broaden our view into patient care and outcomes by linking patient-level EMR and claims data.

My previous concerns about possibly missing something important turned to curiosity: Could using real-world data such as linked patient claims and EMR records simulate a randomized controlled trial (RCT)? Could important evidence be produced in months, rather than years, to dramatically cut development costs and accelerate patient access to effective therapies?

OptumLabs is working to answer these questions.

From the discovery phase to FDA approval, it can take up to 15 years to bring a new drug to market.

On the path to drug development, clinical trials alone typically take 6-7 years. Source: Pharmaceutical Research and Manufacturers of America

What’s so great about randomized controlled trials?

Randomized controlled trials (RCTs) are used to obtain FDA approval to manufacture and sell a new drug. By randomly assigning patients to treatment and comparator groups, RCTs minimize bias from missing control variables. Randomization also reduces bias introduced, intentionally or unintentionally, by the investigator or sponsor of the study.
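To see why randomization matters, consider a small simulation (a minimal sketch, not part of the original article): an unmeasured severity variable drives both treatment choice and the outcome, so a naive observational comparison is biased, while random assignment recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unmeasured confounder: disease severity affects both treatment choice and outcome.
severity = rng.normal(size=n)

# Observational world: sicker patients are more likely to receive the drug.
p_treat_obs = 1 / (1 + np.exp(-severity))
treated_obs = rng.random(n) < p_treat_obs

# Randomized world: a coin flip decides treatment, independent of severity.
treated_rct = rng.random(n) < 0.5

true_effect = -1.0  # the drug lowers the (bad) outcome by 1 unit

def outcome(treated):
    # Outcome worsens with severity and improves with treatment.
    return 2.0 * severity + true_effect * treated + rng.normal(size=n)

y_obs = outcome(treated_obs)
y_rct = outcome(treated_rct)

# A simple difference in means confounds treatment with severity in the
# observational data, but matches the true effect under randomization.
print("observational estimate:", y_obs[treated_obs].mean() - y_obs[~treated_obs].mean())
print("randomized estimate:   ", y_rct[treated_rct].mean() - y_rct[~treated_rct].mean())
print("true effect:           ", true_effect)
```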

While these trials use strict research protocols to help ensure the collection of high quality data, their idealized environment often doesn’t reflect reality. For example, people do not always take their medication as prescribed for a variety of reasons. Data shows that people with chronic conditions like diabetes, hyperlipidemia, or hypertension don't take their medication 25-50 percent of the time.1 From the perspective of medication adherence alone, the very features of the research protocol that help to strengthen the internal validity of RCT results also may limit their generalizability to the real world.
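Claims data make this kind of non-adherence visible because every pharmacy fill is recorded. One widely used claims-based adherence measure is the proportion of days covered (PDC); the sketch below, with made-up fill records and column names, shows the basic calculation.

```python
import pandas as pd

# Hypothetical pharmacy claims: one row per fill for a single patient and drug.
fills = pd.DataFrame({
    "fill_date": pd.to_datetime(["2017-01-01", "2017-02-05", "2017-04-20", "2017-06-01"]),
    "days_supply": [30, 30, 30, 30],
})

start = pd.Timestamp("2017-01-01")
end = pd.Timestamp("2017-06-30")
period = pd.date_range(start, end, freq="D")

# Mark each calendar day as covered if it falls within any fill's supply window.
covered = pd.Series(False, index=period)
for _, row in fills.iterrows():
    supply_days = pd.date_range(row["fill_date"], periods=int(row["days_supply"]), freq="D")
    covered.loc[covered.index.intersection(supply_days)] = True

pdc = covered.mean()  # proportion of days covered over the measurement period
print(f"PDC = {pdc:.2f}")  # values below ~0.80 are commonly treated as non-adherent
```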

The times, they are changing

Recent regulatory mandates on the FDA acknowledge this limitation of RCTs and reflect growing interest in using real-world evidence (RWE), the output of research conducted on real-world data, both for approving new uses of an already approved drug and for monitoring safety data to protect patients. Reauthorization of the Prescription Drug User Fee Act (PDUFA VI)2 and the 21st Century Cures Act3 now require that the FDA:

  • Conduct a public workshop on regulatory uses of RWE by the end of FY 2018
  • Fund a pilot and set methodology for RWE by the end of FY 2019
  • Publish draft guidance by the end of FY 2021

Researchers have already been generating RWE via claims analyses to gather safety data on approved products (e.g., FDA's Sentinel Initiative).4 This has helped establish research designs and methodologies for balancing comparator treatments across large numbers of observed attributes in the data. As a result, well-designed studies utilizing RWE are beginning to look more and more like trusted RCTs.
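One common way to balance treatment groups across many observed attributes is propensity score weighting. The sketch below is a minimal illustration with simulated data (not any specific Sentinel method): it estimates each patient's probability of treatment from observed covariates and uses inverse-probability weights to compare outcomes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 50_000

# Simulated observed covariates (e.g., age, comorbidity score) that influence treatment choice.
X = rng.normal(size=(n, 2))
p_treat = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
treated = rng.random(n) < p_treat

# Outcome depends on the covariates plus a true treatment effect of -1.
y = 1.5 * X[:, 0] + 1.0 * X[:, 1] - 1.0 * treated + rng.normal(size=n)

# Step 1: estimate propensity scores from the observed covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: inverse-probability-of-treatment weights.
w = np.where(treated, 1 / ps, 1 / (1 - ps))

# Step 3: the weighted difference in mean outcomes approximates the randomized comparison.
est = (np.average(y[treated], weights=w[treated])
       - np.average(y[~treated], weights=w[~treated]))
print("naive estimate:   ", y[treated].mean() - y[~treated].mean())
print("weighted estimate:", est)  # should be close to the true effect of -1
```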

Our ability today to link multiple sources of real-world data can help strengthen RWE further. For example, at OptumLabs, we’re combining claims and clinical data. The EMR will tell us when the physician writes a prescription for a medication. But how do we know if the prescription was actually filled? That information is in claims data, not the clinical data. Details about how the patient responded to that medication, however, may only be found in clinical notes. It takes both pieces to see the whole picture, and linking these two sources helps remove biases that would arise from using either type of data alone.
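As a concrete, simplified illustration of this kind of linkage, the sketch below joins a hypothetical EMR prescription table to a pharmacy claims table on a shared patient identifier to flag prescriptions that were never filled. All table and column names here are made up for illustration.

```python
import pandas as pd

# Hypothetical EMR data: prescriptions written by the physician.
emr_rx = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "drug": ["metformin", "lisinopril", "atorvastatin"],
    "prescribed_date": pd.to_datetime(["2017-03-01", "2017-03-02", "2017-03-05"]),
})

# Hypothetical pharmacy claims: fills actually dispensed and billed.
claims_fills = pd.DataFrame({
    "patient_id": [101, 103],
    "drug": ["metformin", "atorvastatin"],
    "fill_date": pd.to_datetime(["2017-03-03", "2017-03-06"]),
})

# Link the two sources on patient and drug; unmatched rows are prescriptions with no fill.
linked = emr_rx.merge(claims_fills, on=["patient_id", "drug"], how="left")
linked["filled"] = linked["fill_date"].notna()
print(linked[["patient_id", "drug", "prescribed_date", "filled"]])
```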

Healthcare databases frequently used for real-world evidence generation. Source: Franklin J, Schneeweiss S. When and How Can Real World Data Analyses Substitute for Randomized Controlled Trials? Clinical Pharmacology and Therapeutics 2017.

It’s an observational study — what could possibly go wrong?

There are contradictory findings when it comes to the reliability of RWE. A recent Cochrane Review concluded that observational studies generally reach findings similar to those of RCTs in the same disease states.5 But a substantial literature also shows wide variation in results from database studies within the same therapeutic areas.6

There are cases in which an observational study and an RCT initially reach very different results but converge on similar conclusions once the observational study design is adjusted to match the RCT. For example, the observational Nurses' Health Study showed that hormone replacement therapy (HRT) reduced the risk of heart disease among post-menopausal women.7 A decade later, results from a large RCT, the Women's Health Initiative, found the exact opposite.8

These seemingly contradictory conclusions created huge anxiety among post-menopausal women being treated with HRT, as well as their physicians. However, it turns out that the difference was not due to the lack of randomization at all. Rather, the risk of heart disease was highest in the two-year window after starting HRT and then declined over time. When the observational Nurses' Health Study data were re-analyzed to control for this, the results matched those of the Women's Health Initiative RCT.
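The re-analysis described above hinges on splitting follow-up by time since starting therapy. The sketch below uses hypothetical person-level data (not the actual Nurses' Health Study analysis) to illustrate the idea: event rates are computed separately for the first two years after initiation and for later follow-up, rather than pooled over all person-time.

```python
import pandas as pd

# Hypothetical follow-up records: one row per patient, with person-years and events
# split into an early window (years 0-2 after initiation) and a late window (year 2+).
followup = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5],
    "years_early": [2.0, 1.5, 2.0, 2.0, 0.8],  # person-years in years 0-2
    "event_early": [1, 0, 0, 1, 0],            # cardiac events during years 0-2
    "years_late": [3.0, 0.0, 4.5, 2.0, 0.0],   # person-years beyond year 2
    "event_late": [0, 0, 0, 0, 0],             # cardiac events beyond year 2
})

# Crude incidence rates per 100 person-years, computed separately by window.
rate_early = 100 * followup["event_early"].sum() / followup["years_early"].sum()
rate_late = 100 * followup["event_late"].sum() / followup["years_late"].sum()

print(f"events per 100 person-years, years 0-2: {rate_early:.1f}")
print(f"events per 100 person-years, year 2+:   {rate_late:.1f}")
# Pooling all person-time would mask the elevated early risk that the RCT detected.
```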

The bottom line is that sometimes observational studies have been able to replicate the results of RCTs and sometimes not. To increase our confidence in the findings from observational studies, we need to understand why this is the case. 

How can we raise confidence in appropriate use of observational studies?

OptumLabs is partnering with the Multiregional Clinical Trials Center at Brigham and Women's Hospital to explore this question. Observational Patient Evidence for Regulatory Approval and uNderstanding Disease (OPERAND) is a collaboration of stakeholders from industry, academia, and regulatory agencies that considers the principles behind, the methodologies for, and the appropriate uses of observational data in regulatory review and approval. We can then use the design elements identified through this work to examine the circumstances under which real-world data (specifically, linked claims and EMR data) can be used to confirm previously published RCTs.

While individual RCTs have been replicated using observational database analyses before, we plan to replicate a large number of trials across several different therapeutic areas simultaneously (a novel feat). Our findings will complement the efforts of groups like the Duke-Margolis Center for Health Policy working on FDA regulatory policy guidelines for RWE. Moreover, the ability to trust policy evaluations of benefit design, care organizational models, disease management programs, or non-drug medical interventions rests on the same principles being explored in OPERAND.

There will always be a place for RCTs in areas such as initial drug approvals. But with the vast amounts of high-quality data available to us today, we have a unique opportunity to explore how we can use this data to generate reliable RWE more quickly and at less cost — ultimately benefiting patients by accelerating access to effective treatments.

 

About the Author:

William Crown, PhD, is chief scientific officer at OptumLabs.


References

  1. Rowan C, Flory J, Gerhard T, et al. Agreement and Validity of Electronic Health Record Prescribing Data Relative to Pharmacy Claims Data: A Validation Study From a US Electronic Health Record Database. Pharmacoepidemiol Drug Saf. 2017. https://doi.org/10.1002/pds.4234
  2. https://www.fda.gov/forindustry/userfees/prescriptiondruguserfee/ucm446608.htm
  3. https://www.fda.gov/RegulatoryInformation/LawsEnforcedbyFDA/SignificantAmendmentstotheFDCAct/21stCenturyCuresAct/default.htm
  4. https://www.fda.gov/Safety/FDAsSentinelInitiative/ucm2007250.htm
  5. Anglemyer A, Horvath HT, Bero L. Healthcare outcomes assessed with observational study designs compared with those assessed in randomized trials (Review). The Cochrane Collaboration: John Wiley & Sons, Ltd., 2014.
  6. Hernandez-Diaz S, Varas-Lorenzo C, Garcia Rodriguez LA. Non-steroidal anti-inflammatory drugs and the risk of acute myocardial infarction. Basic Clin Pharmacol Toxicol. 2006;98(3):266-274. Kwok CS, Loke YK. Meta-analysis: the effects of proton pump inhibitors on cardiovascular events and mortality in patients receiving clopidogrel. Aliment Pharmacol Ther. 2010;31(8):810-823.
  7. Stampfer MJ, et al. Postmenopausal Estrogen Therapy and Cardiovascular Disease: Ten-Year Follow-Up from the Nurses' Health Study. N Engl J Med. 1991;325:756-762.
  8. Rossouw JE, et al. Risks and Benefits of Estrogen Plus Progestin in Healthy Postmenopausal Women: Principal Results from the Women's Health Initiative Randomized Controlled Trial. JAMA. 2002;288:321-333.