The use of real-world evidence (RWE) by clinicians and drug manufacturers is rapidly unfolding in oncology. RWE has the potential to provide oncologists with fresh insights into treatment decisions, given the wide variation in patients’ responses to medications. Drug manufacturers are also expanding their use of RWE in oncology through risk contracting with payers, in drug discovery, and to identify unmet therapy needs. RWE is also coming into play in pharma’s health economics and outcomes research (HEOR) programs, as companies seek to demonstrate the value and comparative effectiveness of their products.
But RWE is a relatively new kid on the block, and it has its own real-world challenges. How exactly it will fit into the complicated world of cancer drug testing, approval, regulation, and marketing is uncertain. The randomized clinical trial has been the gold standard in oncology research for decades and will remain so for the foreseeable future. The American Society of Clinical Oncology (ASCO) and National Comprehensive Cancer Network (NCCN) say that RWE does not have a formal role—yet—in their clinical guideline development processes. And, to some extent, the very essence of RWE is also its nemesis: the real world of clinical practice is diverse and, when it comes to data, messy. Significant biases can creep into the raw data, making its rendering into evidence difficult. Some of the reservations about RWE are ebbing as study design and analytics improve. And there will be some government standards to go by relatively soon, because the 21st Century Cures Act requires the FDA to develop a program for incorporating RWE into its activities.
In many ways, ASCO is setting the standard for data collection, curation, analysis, and reporting of RWE with its CancerLinQ program. Launched in November 2014, CancerLinQ is set up as a wholly owned, not-for-profit subsidiary of ASCO to allow data-sharing partnerships with health care providers and business relationships with drug companies. So far, about 80 health systems and 2,000 oncologists are using the network, and it now includes a database of about 2 million patient records. More than 100 additional practices have expressed interest in joining, says CEO Kevin Fitzpatrick, a former executive vice president of the American College of Cardiology.
The LinQ in CancerLinQ stands for “learning intelligence network for quality.” Fitzpatrick describes it as a rapid learning system for medical oncologists. Each practice’s EHR is connected to the network, allowing it to capture both clinical and business data. The network’s platform allows oncology practices to analyze detailed information about their own patients while giving them access to de-identified data for every other patient in the network. The hope is that the analytics will produce insights into practice patterns, the efficacy and cost effectiveness of therapy regimens, and safety issues. With CancerLinQ, ASCO has created a tightly integrated data collection, filtering, curation, and reporting platform. It captures text from unstructured notes in EHRs and uses natural language processing tools to extract that information. But to improve the quality of the data, it goes a step further: trained clinical staff review and curate questionable text.
CancerLinQ is expanding beyond medical oncology. “We are looking to establish a database and analytic engine that provides a 360-degree view of the patient journey,” says Fitzpatrick. CancerLinQ is collaborating with other specialty societies, including the ones for radiation oncology, pathology, surgery, and pharmacy. “We see ourselves as a utility for the entire cancer community.”
Real-world data will be made available to other specialties and organizations, including drug companies, through CancerLinQ Discovery, a data repository for researchers, clinicians, and life science companies, Fitzpatrick says. When CancerLinQ announced Discovery, it also announced a nonexclusive strategic relationship with AstraZeneca, which is supporting the effort and will have access to the database for its own business purposes.
Drug companies have a growing appetite for RWE because it gives them an understanding of their products beyond the womb of the randomized clinical trial and across their entire life cycles. Cardinal Health Specialty Solutions, the specialty drug consulting division of Cardinal Health, partners with drug companies to conduct HEOR studies using real-world data. Bruce Feinberg, DO, the chief medical officer of Cardinal’s specialty drug division, says oncology RWE helps drug companies and other stakeholders gain insights into the cost of care and resource utilization of a typical patient. RWE may also reveal aspects of care and a drug’s use that were rare or didn’t occur in the more limited circumstance of a clinical trial, he says.
In one study for a drug manufacturer, Cardinal looked at the comparative effectiveness of several kidney cancer therapies using claims data to identify patterns of care and resource utilization, including emergency room visits and hospitalizations. It found a statistically significant difference between leading kidney cancer therapies in the number of patients who had an emergency department visit during their first-line treatment. That type of real-world data is important as a potential safety signal for further research and also in demonstrating value to payers, Feinberg notes.
RWE is definitely trendy. Kevin Carr, MD, with PricewaterhouseCoopers, warns that it’s easy to get in over your head. “The hard part of RWE is understanding the data set that is truly required to answer a question,” says Carr. Information technology makes it relatively easy these days to mine huge amounts of data from EHRs, claims data, and other sources. The problem comes with building a database that is reliable and complete. Carr has seen projects shut down because the underlying information was inaccurate, incomplete, or otherwise unreliable. There’s simply no escaping the ironclad rule of garbage in, garbage out. Moreover, real-world studies are complex. They require complete records with multiple data fields that can be cross-tabulated, such as patient age, diagnosis, and medication dose. When complete records are not available, databases on which RWE is built get shaky.
FDA staffers, including then-commissioner Robert Califf, sounded a warning about shoddy RWE in an opinion piece in the Dec. 8, 2016, issue of the New England Journal of Medicine. “The confluence of large data sets of uncertain quality and provenance, the facile analytic tools that can be used by nonexperts, and a shortage of researchers with adequate methodologic savvy could result in poorly conceived study and analytic designs that generate incorrect or unreliable conclusions,” they wrote.
Partly because of the 21st Century Cures Act, RWE seems poised to become an increasingly important part of the regulatory oversight of drugs. But so far, neither ASCO nor NCCN is using RWE in the development of clinical guidelines. NCCN Senior Vice President Joan McClure says two categories of real-world data may become relevant to guideline development, while the use of a third remains more questionable. One of the two likely categories is patient-reported outcomes related to quality of life and the tolerability of regimens. Another is pharmacovigilance and the tracking of rare adverse events or toxicities that aren’t fully understood. For example, McClure says diarrhea from immunotherapy may be different from diarrhea from traditional chemotherapy, and RWE might help physicians make decisions about how to manage the two problems differently.
The RWE trouble spot might be in the crucial area of efficacy. McClure explains that efficacy data from real-world studies are problematic because of the likelihood of built-in biases and limitations. Many factors go into a physician’s selection of a therapy, and they are hard to capture in EHRs. As a result, efficacy data culled from EHRs may omit important information that shaped treatment choices, and that blind spot can skew the findings.