Introduction: The Moffitt Oncology Network (MON) Initiative demonstrates one way to build a value-based network on clinical pathways across a broad geographic area.
Methods: Moffitt Cancer Center (MCC) has developed cancer-specific clinical pathways that translate evidence-based guidelines into personalized cancer treatment and set a standard of care against which practice can be evaluated. MCC uses these pathways with other hospital systems and physician groups throughout the MON. Clinical Performance and Value vignettes, virtual patient cases keyed to the specific clinical pathways, are used to improve uptake of the pathways across the MON. We report here on baseline data from 66 breast cancer care providers who completed 132 breast cancer vignettes. Using the vignettes, we examine variation in care practice, with special attention to use of the clinical breast cancer pathways.
Results: Pathway-based clinical care was measured at baseline across MON sites. Mean score distributions varied across sites but were not statistically significantly different (P>.05). Scores also varied by domain across sites, with history and physical scores tending to be higher than work-up, diagnosis, and treatment scores. Pathway adherence varied for specific diagnostic evaluations and treatments (surgery; sentinel/axillary lymph node dissection; radiation therapy; chemotherapy; and hormonal therapy), as did the prevalence of unnecessary testing.
Conclusion: Our study suggests that fostering the adoption of breast cancer clinical pathways into an oncology network is feasible; however, adherence to breast cancer pathways varies, and reducing that variation is a priority as oncology networks continue to grow in popularity.
As health care delivery and financing shift from volume-based to value-based, the aspiration is that financial returns will be achieved by offering services with the best possible quality and outcomes for the lowest possible cost across the continuum of patient care. It is increasingly recognized that when value (defined as the patient health outcomes achieved per dollar spent) improves, patients, providers, and payers alike derive benefit from the improvement (Porter 2010).
In accountable care organizations (ACOs), a cornerstone of value-based care, the physician incentive to provide a certain level of quality is ideally based on patient benefit and alignment with economic efficiencies. Provider payment is thus being tied to a defined set of results for quality, access, and efficiency agreed upon by all providers that form the network. When threshold performance levels of quality and cost are met, providers may benefit from shared savings negotiated with payers. Conversely, when threshold performance levels are not met, hospitals and physicians are potentially at risk for reduced payment, no payment, or exclusion from a network. With the promise of higher quality and lower costs, accountable, unified networks and partnerships increasingly have replaced adversarial relationships in delivering care (McDonald 2007, Porter 2007). Large centers have shown that integration in care can be done successfully with care coordinated from diagnosis through therapy and follow-up. For specialty care, this shift has driven the importance of building networks of physicians and other health care practitioners who are jointly responsible for care (Peabody 2013).
A focus on networks is not a novel idea; the managed care movement of the 1990s brought it to the forefront (Burns 2012). What was largely missing in the 1990s, however, was the ability to align providers around quality, a gap exacerbated by a lack of technological means to measure quality, give providers meaningful feedback, and link quality to reimbursement. We now know that a focus on quality can not only integrate care effectively but also bring substantial cost savings to the larger health system, as in the experiences of Intermountain Healthcare and the Cleveland Clinic (Bohmer 2011, James 2011).
To focus on quality, however, newer provider-oriented strategies are needed that engage physicians in system-wide initiatives of routine measurement, feedback, and learning (Mountford 2012, Meyer 2012, Biller-Andorno 2013). Simulations of the patient-provider interaction, such as Clinical Performance and Value (CPV) vignettes (hereafter “vignettes”), have been shown to capture real practice data and offer promise in this area: rapid feedback on simulated patients, done serially, promotes continuous learning and resonates with professional purpose (Peabody 2000, Dresselhaus 2000, Peabody 2004, Peabody 2011, Peabody 2014).
Moffitt Cancer Center (MCC), located in Tampa, Fla., is a National Cancer Institute Comprehensive Cancer Center that has developed a collaborative network consisting of hospitals, health care systems, and community-based practices that agree to follow Moffitt’s Clinical Pathways, collaborate on quality initiatives, and participate in clinical research. Each site emulates Moffitt’s model of care through interdisciplinary, disease-focused clinics and prospective peer review. To further standardize clinical care and quality, vignettes were developed to measure provider performance against explicit, evidence-based standards in Moffitt’s Clinical Pathways. The tool outlines a patient’s optimal clinical care based on best practices specific to each patient’s situation and treatment aims.
As part of the Moffitt Oncology Network (MON) initiative, vignette measurement and feedback of clinical pathway compliance was launched at MCC and then at 3 partner-network hospitals/facilities: Norton Cancer Institute (Louisville, Ky.), Lehigh Valley Health Network (Allentown, Pa.), and Space Coast Cancer Center (Titusville, Fla.). At these institutions, measurement and feedback to the multidisciplinary team of medical oncologists, radiation oncologists, surgeons, and advanced practice providers occurs every 4 months, starting with breast cancer vignettes.
We report here on our baseline findings in breast cancer at 4 Moffitt Oncology Network sites and discuss how the measurement and feedback system could improve clinical quality and decrease costs in the care of patients with breast cancer.
Setting: MCC collaborates with other hospital systems and physician groups through the Moffitt Oncology Network. The network uses Moffitt’s Clinical Pathways, a decision-making tool delivered as an interactive Web-based application that translates evidence-based guidelines into specific clinical pathways based on a patient’s unique clinical profile. The pathways draw on current clinical standards and the associated medical evidence, integrate Moffitt’s clinical research priorities, and include cost information for chemotherapy regimens and radiation therapy.
To overcome the challenges of adherence to clinical pathways and link network providers with a common quality metric, MCC joined with QURE Healthcare to develop and administer vignettes related to specific clinical pathways in breast cancer. The vignettes were given to participating providers in a serial measurement and feedback system developed and validated by QURE Healthcare (Peabody 2000, Dresselhaus 2000, Peabody 2004). Vignettes are administered every 4 months. First launched in breast cancer in March 2013 at MCC, breast cancer vignettes were then introduced to interested partner institutions in July 2013 (Lehigh Valley Health Network, Norton Cancer Institute, Space Coast Cancer Center).
Study population: As part of this initiative, all breast cancer providers (69 medical oncologists, radiation oncologists, surgeons, and advanced practice providers) at the participating sites were asked to complete 2 vignettes every 4 months for 6 rounds over a period of 20 months. Sixty-six providers at 4 sites agreed to participate and completed the vignettes (96% participation).
Measurement and feedback system: Using the MCC clinical pathways, 12 vignettes were written to address diagnostic, therapeutic, and cost challenges in cancer care.
Vignettes have been previously validated as a measure of the provider’s ability to evaluate, diagnose, and treat specific diseases and conditions (Peabody 2000, Dresselhaus 2000, Peabody 2004). Each vignette takes approximately 25 minutes to complete and asks the provider to respond online to open-ended questions as they proceed through a patient visit. Participation was not compensated. We report data from the first round of administration within the network. Trained physician abstractors, blinded to the vignette-taker’s identity, scored each vignette using MCC pathways to determine the prevalence of on- and off-pathway care and domain measures of overall care in history taking, physical exam, lab and imaging studies ordered, diagnostic accuracy, and treatment plan (domain score). Scores were reported as the number of criteria met divided by the total number of criteria for each case, expressed as a percentage. The provider received confidential feedback on each vignette, which included an overall score and domain scores, as well as recommendations for improvement and links to relevant clinical guidelines and medical literature.
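The scoring arithmetic described above can be made concrete with a minimal sketch. This is illustrative only, not QURE Healthcare's actual scoring code; the domain names and the criteria counts in the example are invented for demonstration.

```python
# Illustrative sketch of CPV-style scoring (hypothetical, not QURE's code).
# A score is the number of explicit criteria met divided by the total
# number of criteria, expressed as a percentage.

def domain_score(criteria_met: int, criteria_total: int) -> float:
    """Percentage of explicit criteria met within one care domain."""
    if criteria_total <= 0:
        raise ValueError("a domain must have at least one criterion")
    return 100.0 * criteria_met / criteria_total

def overall_score(results: dict) -> float:
    """Overall case score: criteria met across all domains / all criteria."""
    met = sum(m for m, _ in results.values())
    total = sum(t for _, t in results.values())
    return 100.0 * met / total

# Hypothetical scored vignette: {domain: (criteria met, criteria total)}
case = {"history": (8, 10), "physical": (6, 8),
        "workup": (5, 9), "diagnosis": (1, 2), "treatment": (7, 12)}
print(round(domain_score(8, 10), 1))   # 80.0
print(round(overall_score(case), 1))   # 65.9
```

Note that the overall score weights each criterion equally, so domains with more criteria (here, treatment) contribute more to the total; a simple average of domain percentages would weight them differently.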
Objectives: We examine the baseline variation in breast cancer care as captured by the vignettes, paying special attention to the use of unnecessary tests and the provision of care that deviates from MCC’s clinical pathways. We discuss the implications of this initiative for improving adherence to pathways as MCC begins to use this platform to align providers and build a network based on quality.
Analysis: We examined overall distributions of the vignette scores and then evaluated distributions by domain score. The small sample size limited our ability to disaggregate data by provider type.
Using the vignettes, we also examined adherence to MCC breast cancer clinical pathways in 5 areas: surgery; sentinel/axillary lymph node management; radiation therapy; chemotherapy; and hormonal therapy.
We also examine the prevalence of unnecessary testing, defined as ordering a test whose result would not change clinical management as identified in the breast cancer clinical pathways. From our vignettes, we are able to estimate the average number of unnecessary diagnostic tests ordered.
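The two unnecessary-testing statistics reported in Table 2 can be computed as sketched below. The per-provider counts here are invented for illustration; only the definitions (percent of providers ordering any unnecessary test, and the average count among those who did) follow the text.

```python
# Hypothetical per-provider counts of unnecessary tests/procedures ordered
# across a provider's vignettes (invented data; see Table 2 for the
# actual site-level figures).
unnecessary = [0, 2, 1, 0, 3, 2, 0, 1]

ordering = [n for n in unnecessary if n > 0]
pct_ordering = 100.0 * len(ordering) / len(unnecessary)
avg_among_ordering = sum(ordering) / len(ordering)
print(pct_ordering, round(avg_among_ordering, 1))  # 62.5 1.8
```

The second statistic is conditional on ordering at least one unnecessary test, which is why Table 2 reports it only "among those who ordered unnecessary tests."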
At baseline, 66 breast cancer care providers took 132 breast cancer vignettes. The provider characteristics (Table 1) demonstrate a broad range of training and experience, with some providers exclusively treating breast cancer patients and others having a mixed oncology practice.
Table 1 Participating provider characteristics

| | Site #1 | Site #2 | Site #3 | Site #4 |
| --- | --- | --- | --- | --- |
| Number of providers who participated | 17 | 17 | 22 | 10 |
| Advanced practice providers | 6 | 0 | 3 | 0 |
| Average number of years of experience treating breast cancer | 12.6 | 21.1 | 11.7 | 13.7 |
| Average number of breast cancer cases seen per week | 34.1 | 17.6 | 12.9 | 10.8 |
The CPV vignettes consisted of 12 different online cases that required the provider to evaluate a simulated female patient for breast cancer or suspected breast cancer. Providers took a history and performed a physical examination; they were then asked to order whatever tests they felt were indicated, make a diagnosis, and specify treatment, any follow-up testing, and follow-up care. Upon completion of the history, physical, and testing, clinical responses (eg, the result of a biopsy) were provided in real time. Providers completed their cases, on average, in less than 25 minutes across all sites. Once all the vignettes at a given site were completed, the free-text responses were scored by a team of trained physicians using explicit, predetermined criteria. Each vignette was scored by 2 physicians, blinded to the caregiver and to each other, as an accuracy check. Individual feedback forms, always provided confidentially, were prepared for each provider for each case completed. The forms benchmarked where individual providers scored relative to other providers at their sites and offered customized suggestions on where the provider might improve the care of the simulated patient. Unnecessary testing was also flagged and fed back to the provider (see summaries of the 12 CPV vignette cases in Box 1).
We examined overall score distributions of the vignettes by site (Figure 1) and then by domain score (Figure 2). The mean distributions varied across all sites but were not statistically significantly different (P>.05). Scores varied by domain across sites, although history and physical scores tended to be higher than work-up, diagnosis, and treatment scores. As seen in Figure 2, there was wide variation within each domain. Table 2 shows adherence to MCC clinical pathways for specific diagnostic evaluations or treatments in 5 areas (surgery; sentinel/axillary lymph node dissection; radiation therapy; chemotherapy; and hormonal therapy), along with the prevalence of unnecessary testing.
Figure 1 Distributions of breast cancer CPV vignette scores, 2013
Note: CPVs are scored based on whether providers followed MCC breast clinical pathways. Scores are reported as the number of criteria met divided by the total number of criteria for each case, expressed as a percentage. Scoring of the vignettes was completed by a team of trained physicians who were blinded to the identities of the providers taking the vignettes.
Figure 2 Distributions of breast cancer CPV vignette scores by domains
Note: Boxplots show the variation in scores. Each box spans the first quartile (Q1) to the third quartile (Q3); the line within the box marks the median. The ends of the whiskers denote the minimum and maximum values of the data.
Table 2 Unnecessary testing and adherence to MCC clinical pathways

| | Site #1 (n=17) | Site #2 (n=17) | Site #3 (n=22) | Site #4 (n=10) |
| --- | --- | --- | --- | --- |
| Percent of providers ordering unnecessary tests/procedures | 50% | 55% | 81% | 62% |
| Average number of unnecessary tests/procedures (among those who ordered unnecessary tests) | 1.6 | 1.7 | 1.7 | 1.8 |
| Adherence to MCC pathways: axillary/sentinel lymph node dissection | 31% | 42% | 39% | 50% |
In its seminal 2013 report Delivering High Quality Cancer Care: Charting a New Course for a System in Crisis, the Institute of Medicine (IOM) called for linking payment to quality measures. Through its recommendations, the IOM envisioned a system that rewards cancer care networks for providing high-quality, patient-centered care and discourages ineffective interventions that add cost in the fee-for-service environment. Payers and providers across the country are experimenting with a variety of ways to measure quality as they shift to new delivery models, such as specialty ACOs, shared-risk payments, and oncology patient-centered medical homes (Porter 2007).
In 2009, Moffitt Cancer Center (MCC) launched an intensive effort to introduce evidence-based clinical pathways among its own providers and among providers in its oncology network. The key goals were to advance internal and cross-system discussions of quality and value, which included assuring that patients receive the most appropriate personalized care to optimize their desired outcomes, identifying and reducing practice variation, streamlining patient-care processes, increasing the use of generic drugs when available for chemotherapy, and advancing supportive care.
The challenge for MCC and other health care systems has been to engage providers to use their pathways. Research suggests oncologists tend to find pathways influential and favor their use (Farina 2012). However, actual adoption of pathways into practice is variable, often quite low, and the subject of active investigation (Casey 2013).
To address this challenge, CPV vignettes were introduced to encourage adherence to pathways, determine their feasibility, and promote a culture of measurement and learning improvement throughout MCC and its network sites. At baseline, CPV scores revealed similar distributions across all of the institutions, with large variation in work-up, diagnosis, and treatment. The large variation in treatment scores is notable, as is the variation in adherence to MCC pathways across the 4 sites. Future analyses will examine data from rounds 2 to 6 in breast cancer and other clinical areas, such as lung and gastrointestinal cancers. We did not make statistical comparisons across the sites, as the goal of this initiative was to examine and improve pathway adherence collectively at all sites.
Other studies, such as a recently published report among cardiologists in a large Northwestern U.S. health care system, have shown that using CPVs to measure variability among providers is similarly feasible and successful in demonstrating variation in care practice. As in this study, provider engagement was high, with comments on the feedback, the quality criteria, and differences among colleagues dominating discussions (Peabody 2015). Other large cancer care centers are looking for ways to transform the culture of measuring and improving evidence-based care. City of Hope, for example, is implementing a continuing medical education (CME)-based initiative that aims to link education with improving performance to meet current standards of care (Uemura 2013). Much like CME, MCC pathways are a learning tool, offering the best evidence for therapeutic equivalence or superiority by highlighting benefits and costs. They are intended to manage patient care in real time by documenting local practice patterns and by prospectively defining specific treatment strategies.
With their use in the MON, the vignette initiative went a step further, requiring buy-in from other institutions. With the measurement system in place, we hope to lay the groundwork for quality as the basis for forming partnerships with other hospitals to provide high-quality care to ever-larger numbers of patients. Vignettes are well suited to large-scale, cross-system measurement in networks: they do not require uniform IT systems, as they operate on their own data platform, and they can be written to address network-related issues such as referrals and adoption of other quality initiatives. With the successful implementation of the CPVs, the MON is providing feedback to providers in 2 ways: individually and confidentially to each provider on pathway adherence for each case completed, and to the group as a whole in a conference setting to discuss the areas (e.g., work-up of the axilla) with the most disparity. Because feedback is given repeatedly, providers have a chance to absorb the individual and group feedback and improve their performance.
Evidence suggests that improving pathway adherence reduces costs and improves outcomes (Neubauer 2010, Smith 2011, Butcher 2010, Hoverman 2011). The Highmark Blue Cross Blue Shield program, for example, showed that the total cost of care for patients under pathways practices grew at a slower rate than the cost for patients who were not in the pathways program (Butcher 2010). Outpatient costs were 35% lower for patients with non–small-cell lung cancer treated on a clinical pathway compared with those who received nonpathway treatment. CareFirst reduced its costs by 15% using a clinical pathways program for breast, lung, and colon cancers, resulting primarily from a 7% decline in emergency room visits, shorter hospital stays, increased use of generic medications, and more appropriate use of chemotherapy (Hoverman 2011). Newer reports from a large NIH-funded experimental study indicate that when CPV vignettes are used serially, they change practice and result in better patient outcomes (Peabody 2011, Peabody 2014). Further research is needed to determine whether this holds in the MON with the MCC pathways.
Implementation challenges exist. While clinical pathways in oncology offer a method to reduce unnecessary and costly treatment variation, their success relies on physician participation. In this study, strong leadership and institutional commitment to pathway adherence helped make voluntary participation very high. The perception among some physicians that pathways represent “cookbook medicine” is a barrier to acceptance, as are constraints on physicians’ time and discomfort with changing practice patterns. The counterpoint is that without accountability, providers may not comply with pathway programs (and in other studies typically have not), suggesting a need for a vignette-based or similar approach and for further research on how to introduce collaborative efforts that engage physicians.
We present an initiative that uses evidence-based pathway care to build a network based on quality and a culture of measurement and feedback. Contemporary quality improvement initiatives have begun calling for innovative ways to approach health care delivery and teaching. At the core of this shift is increased attention to how quality-improvement efforts can engage physicians in examining their practice performance systematically. This movement has also reached academic continuing medical education (CME), linking quality improvement to provider education so that the two are not pursued as separate activities.
In this study, we used measurement and feedback with the vignettes as a way to identify areas of quality gaps and as a bridge to education. With this initiative, the MCC worked with 3 network member institutions to identify areas of improvements in quality for breast cancer; this is now being expanded to lung and gastrointestinal cancers. As we continue to collect serial data, our next steps will be to examine whether this initiative has improved clinical care, increased adherence to clinical pathways, reduced unnecessary testing, become integrated into the culture of practice at MCC and its partners, and demonstrated value to payers.
Biller-Andorno N, Lee TH. Ethical physician incentives — from carrots and sticks to shared purpose. N Engl J Med. 2013;368(11):980–982.
Bohmer RM. The four habits of high-value health care organizations. N Engl J Med. 2011;365(22):2045–2047.
Burns LR, Pauly MV. Accountable care organizations may have difficulty avoiding the failures of integrated delivery networks of the 1990s. Health Aff (Millwood). 2012;31(11):2407–2416.
Butcher L. Will clinical pathways work? Biotechnol Healthc. 2010;7(3):16–20.
Casey DE Jr. Why don’t physicians (and patients) consistently follow clinical practice guidelines? JAMA Intern Med. 2013;173(17):1581–1583.
Dresselhaus TR, Peabody JW, Lee M, et al. Measuring compliance with preventive care guidelines: standardized patients, clinical vignettes, and the medical record. J Gen Intern Med. 2000;15(11):782–788.
Farina K. How payers and oncologists really feel about oncology pathways. Am J Manag Care. 2012;18(4):SP188.
Hoverman JR, Cartwright TH, Patt DA, et al. Pathways, outcomes, and costs in colon cancer: retrospective evaluations in two distinct databases. J Oncol Pract. 2011;7(3 suppl):52s–59s.
IOM (Institute of Medicine). Delivering High Quality Cancer Care: Charting a New Course for a System in Crisis. Washington, DC: IOM. 2013.
James BC, Savitz LA. How Intermountain trimmed health care costs through robust quality improvement efforts. Health Aff (Millwood). 2011;30(6):1185–1191.
Meyer GS, Nelson EC, Pryor DB, et al. More quality measures versus measuring what matters: a call for balance and parsimony. BMJ Qual Saf. 2012;21(11):964–968.
McDonald KM, Sundaram V, Bravata DM, et al. Care coordination. In: Shojania KG, McDonald KM, Wachter RM, Owens DK, eds. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies, vol 7. Technical Review 9. AHRQ Publication No. 04(07)-0051-7. Rockville, MD: Agency for Healthcare Research and Quality; 2007.
Mountford J, Shojania KG. Refocusing quality measurement to best support quality improvement: local ownership of quality measurement by clinicians. BMJ Qual Saf. 2012;21(6):519–523.
Neubauer MA, Hoverman JR, Kolodziej M, et al. Cost effectiveness of evidence-based treatment guidelines for the treatment of non–small-cell lung cancer in the community setting. J Oncol Pract. 2010;6(1):12–18.
Peabody J, Shimkhada R, Quimbo S, et al. Financial incentives and measurement improved physicians’ quality of care in the Philippines. Health Aff (Millwood). 2011;30(4):773–781. Erratum in: Health Aff (Millwood). 2011;30(6):1217.
Peabody JW, Huang X. A role for specialists in resuscitating accountable care organizations. Harvard Business Review. November 2013.
Peabody JW, Huang X, Shimkhada R, Rosenthal M. Managing specialty care in an era of heightened accountability: emphasizing quality and accelerating savings. Am J Manag Care. 2015;21(4):284–292.
Peabody JW, Luck J, Glassman P, et al. Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA. 2000;283(13):1715–1722.
Peabody JW, Luck J, Glassman P, et al. Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med. 2004;141(10):771–780.
Peabody JW, Shimkhada R, Quimbo S, et al. The impact of performance incentives on child health outcomes: results from a cluster randomized controlled trial in the Philippines. Health Policy Plan. 2014;29(5):615–621.
Porter ME. What is value in health care? N Engl J Med. 2010; 363(26):2477–2481.
Porter ME, Teisberg EO. How physicians can change the future of health care. JAMA. 2007;297(10):1103–1111.
Smith TJ, Hillner BE. Ensuring quality cancer care by the use of clinical practice guidelines and critical pathways. J Clin Oncol. 2011;19(11):2886–2897.
Uemura M, Morgan R Jr, Mendelsohn M, et al. Enhancing quality improvements in cancer care through CME activities at a nationally recognized cancer center. J Cancer Educ. 2013;28(2):215–220.
Box 1 Breast cancer CPV vignette case descriptions

Case 1 (age 52): Abnormal mammogram (BIRADS 3) but no abnormal findings on history and PE; patient on hormone replacement. Diagnosis: dense breasts, benign.

Case 2 (age 55): Abnormal mammogram (BIRADS 4), no abnormal findings on history and PE. Diagnosis: atypical ductal hyperplasia, benign.

Case 3 (age 56): Presents with new breast mass, apparent breast cancer with enlarged axillary node. Diagnosis: diffuse large B cell NHL, Stage IIIAE; no breast cancer.

Case 4 (age 60): Apparent metastatic disease in a woman s/p Stage II at 24-month follow-up. Diagnosis: hyperparathyroidism from adenoma; no progression of breast cancer.

Case 5 (age 57): Presents with a small (<2 cm) mass with abnormal mammogram, HER2 equivocal. Diagnosis: Stage IA breast cancer.

Case 6 (age 63): Diagnosed with CHF, presents with new moderate-size 2.5 cm mass with abnormal mammogram. Diagnosis: Stage II breast cancer (clinical).

Case 7 (age 38): Presents with new moderate-size 3 cm mass, no history of mammography, HER2+ disease. Diagnosis: Stage II breast cancer (clinical).

Case 8 (age 38): Presents with moderate-size 2.5 cm mass, triple-negative disease, s/p incomplete excisional biopsy. Diagnosis: Stage II multifocal breast cancer.

Case 9 (age 42): Treated for HER2+ Stage II breast cancer (chemo, RT); presents at 36 months with headache and supraclavicular node. Diagnosis: Stage II breast cancer with recurrent disease.

Case 10 (age 42): Presents with new-onset >5 cm mass, which on biopsy is ER/PR+, HER2− invasive breast cancer. Diagnosis: Stage III breast cancer (clinical). Diagnostic teaching points: tumor >5 cm with negative axillary ultrasound; sentinel lymph node biopsy needed (positive); need for mets workup (negative). Treatment teaching points: neoadjuvant chemotherapy with ACT followed by lumpectomy and ALND (3/22 positive nodes); post-op radiation therapy; delayed breast reconstruction preferred; follow-up mammography q6 months; RT if patient decides on mastectomy over lumpectomy.

Case 11 (age 60): Status post Stage III 10 years ago with new-onset back pain. Diagnosis: recurrent breast cancer, metastatic to thoracic spine and ribs with compression fracture but no cord compression. Diagnostic teaching points: metastatic workup (includes full spine MRI) positive for T-5 and rib disease; other met workup negative; cardiac evaluation required. Treatment teaching points: consultations with neurosurgery, interventional radiology, and radiation oncology; treatment options for spine metastasis with no apparent cord compression (vertebro/kyphoplasty, radiation, bisphosphonates); hormonal therapy with another aromatase inhibitor (previously on anastrozole); patient/physician decision on chemotherapy. Unnecessary items: work-up for pulmonary embolism or anemia (anemia consistent with chronic disease); steroids (no cord compression/neurologic deficits).

Case 12 (age 42): Treated for Stage IIIB HER2+ breast cancer 2 years ago, now presents with new-onset headaches. Diagnosis: Stage IIIB with recurrent disease to brain and liver. Diagnostic teaching points: normal neurologic exam; cranial MRI/CT shows new parietal mass; abdominal CT shows 2–3 cm mets to the liver; other met workup negative. Treatment teaching points: tissue confirmation with CT-guided biopsy of liver, followed by complete staging; treatment for brain mets (dexamethasone and RT); offer systemic treatment with pertuzumab, trastuzumab, docetaxel; emphasize palliative and supportive care. Unnecessary items: surgical resection of brain and liver metastases; third-line chemo (lapatinib, capecitabine).

Abbreviations: ALND=axillary lymph node dissection; BIRADS=Breast Imaging Reporting and Data System (categories 0–4); CNS=central nervous system; FISH=fluorescence in situ hybridization; FNA=fine needle aspiration; MRM=modified radical mastectomy; Oncotype DX=Oncotype DX Breast Cancer Assay.
John W. Peabody, MD, PhD, FACP
President, QURE Healthcare
450 Pacific Avenue
San Francisco, CA 94123
Funding source: Moffitt Cancer Center
Disclosure statement: Peabody is president of QURE Healthcare, which owns the quality measurement tool used in the initiative described.
Acknowledgements: The authors wish to acknowledge Riti Shimkhada, PhD, for her exceptional assistance with the drafting and preparation of this manuscript.