Special Series: Quality Care Symposium ORIGINAL CONTRIBUTION
Stanford Health Care; and Stanford Cancer Institute, Stanford, CA
Corresponding author: Julie Bryar Porter, MSc, Stanford Health Care, 875 Blake Wilbur Dr, CC-2211, Stanford, CA 94305; e-mail: [email protected].
Disclosures provided by the authors are available with this article at jop.ascopubs.org.
DOI: https://doi.org/10.1200/JOP.2017.021139; published online ahead of print at jop.ascopubs.org on July 20, 2017.
Improving Care With a Portfolio of Physician-Led Cancer Quality Measures at an Academic Center

Julie Bryar Porter, Eben Lloyd Rosenthal, Marcy Winget, Andrea Segura Smith, Sridhar Belavadi Seshadri, Yohan Vetteth, Eileen F. Kiamanesh, Amogh Badwe, Ranjana H. Advani, Mark K. Buyyounouski, Steven Coutre, Frederick Dirbas, Vasu Divi, Oliver Dorigo, Kristen N. Ganjoo, Laura J. Johnston, Lawrence David Recht, Joseph B. Shrager, Eila C. Skinner, Susan M. Swetter, Brendan C. Visser, and Douglas W. Blayney
QUESTION ASKED: Can an academic oncology practice measure and improve the quality and value of care as measured by a portfolio of multidisciplinary metrics?
SUMMARY ANSWER: We asked physicians to take the lead in choosing measures of cancer care quality that could be extracted from the electronic health record (EHR) with minimal manual augmentation. By allowing physicians to select the pertinent measures, by providing timely and accurate feedback and a small financial incentive, and by setting realistic and achievable targets, we demonstrated improvements in all measures over the study period. All cancer care programs (CCPs) met their predetermined targets and earned the financial incentive.
WHAT WE DID: We initiated a project to implement and test the initial steps of a multiyear quality improvement program in a large, academic multidisciplinary cancer center. We guided CCP leadership in choosing measures that (1) were meaningful and important to the care team and patients, (2) had data elements available in existing electronic databases, (3) had not been broadly measured at our center, (4) were multidisciplinary, and (5) would not significantly increase clinician workload. We provided monthly adherence reports to each CCP physician leader (an example is shown below for use of the EHR staging module). The leaders shared the feedback with their team. A financial incentive was provided if the CCP met its predetermined target.
WHAT WE FOUND: A total of 5,604 patients or visits were included in the denominators for the metrics. The development and continuous refinement of the measurement system required two full-time equivalent roles, including quality analysts and data specialists. Our quality program had three essential elements that led to its success: (1) we engaged physicians to choose the quality measures and to prespecify goals, (2) we used automated extraction methods for rapid and timely feedback on improvement and on progress toward achieving goals, and (3) we offered the CCP a financial team-based incentive if prespecified goals were met.
BIAS, CONFOUNDING FACTOR(S), REAL-LIFE IMPLICATIONS: Our new quality program was successful in embedding and centralizing our physician leaders in its design and implementation. Although many of the metrics were easily extractable from the EHR, others required manual effort to abstract records and perform quality control on all negative results. This manual review effort was important to maintain credibility by the care teams, who could then focus on their clinical improvement efforts. The metrics chosen were measures of care processes, with outcome measures to be added in future years. Staging data completeness, which is available as a discrete data field in the EHR of this organization, will serve as the foundation for measuring value-driven care and providing our teams with data to build patient cohorts. Building and measuring specific patient cohorts will help our clinicians develop clinical outcome improvement programs, will guide program expansion, and will help ensure consistency of care delivery across our geographic network of care.
ReCAPs (Research Contributions Abbreviated for Print) provide a structured, one-page summary of each paper highlighting the main findings and significance of the work. The full version of the article is available online at jop.ascopubs.org.
Copyright © 2017 by American Society of Clinical Oncology jop.ascopubs.org 1
Downloaded from ascopubs.org by Stanford University Medical Center on August 3, 2017, from 171.065.089.189. Copyright © 2017 American Society of Clinical Oncology. All rights reserved.
ASSOCIATED CONTENT
Appendix available online
Improving Care With a Portfolio of Physician-Led Cancer Quality Measures at an Academic Center

Julie Bryar Porter, Eben Lloyd Rosenthal, Marcy Winget, Andrea Segura Smith, Sridhar Belavadi Seshadri, Yohan Vetteth, Eileen F. Kiamanesh, Amogh Badwe, Ranjana H. Advani, Mark K. Buyyounouski, Steven Coutre, Frederick Dirbas, Vasu Divi, Oliver Dorigo, Kristen N. Ganjoo, Laura J. Johnston, Lawrence David Recht, Joseph B. Shrager, Eila C. Skinner, Susan M. Swetter, Brendan C. Visser, and Douglas W. Blayney
Abstract

Purpose
Development and implementation of robust reporting processes to systematically provide quality data to care teams in a timely manner is challenging. National cancer quality measures are useful, but the manual data collection required is resource intensive, and reporting is delayed. We designed a largely automated measurement system with our multidisciplinary cancer care programs (CCPs) to identify, measure, and improve quality metrics that were meaningful to the care teams and their patients.

Methods
Each CCP physician leader collaborated with the cancer quality team to identify metrics, abiding by established guiding principles. A financial incentive was provided to the CCPs if performance at the end of the study period met predetermined targets. Reports were developed and provided to the CCP physician leaders on a monthly or quarterly basis, for dissemination to their CCP teams.

Results
A total of 15 distinct quality measures were collected in depth for the first time at this cancer center. Metrics spanned the patient care continuum, from diagnosis through end of life or survivorship care. All metrics improved over the study period, met their targets, and earned a financial incentive for their CCP.

Conclusion
Our quality program had three essential elements that led to its success: (1) engaging physicians in choosing the quality measures and prespecifying goals, (2) using automated extraction methods for rapid and timely feedback on improvement and progress toward achieving goals, and (3) offering a financial team-based incentive if prespecified goals were met.
INTRODUCTION

Impetus for Quality Care Delivery
The cancer care delivery system, which operates at the intersection of basic and translational science, clinical medicine, and patient and family needs, is complex and multidisciplinary. All stakeholders agree that the system must deliver better health care, at a higher quality and at an appropriate
cost. Physicians and provider teams can directly influence the quality of cancer care delivered to patients; this is one component of value.1 New payment models (eg, the Medicare Access & CHIP Reauthorization Act of 2015)2 encourage achievement of this "triple aim"3 through a focus on payment for value delivered rather than on volume of services.
Quality Measures at National and Health System Levels
We propose a three-level construct for quality measures (Fig 1). Broadly, there are measures that either are widely endorsed by national organizations such as the National Quality Forum,4 are in the process of adoption by government and by other payers such as the Center for Medicare and Medicaid Services,2 or are endorsed by professional societies such as ASCO or the American Society for Therapeutic Radiologic Oncology for voluntary use by their members and others.5,6
We call the next level of quality measures dashboard measures. Dashboard measures are also broad and are generally used for comparison both across and within health care systems and by providers of care. As the name implies, these measures are often used to create dashboards to track improvement or deterioration in performance over time. Examples include patient satisfaction measurement systems, such as those reported by Press-Ganey,7 or the observed-to-expected mortality ratios promulgated by the University HealthCare Consortium (now Vizient).8
Quality Measures at the Oncology Care Team Level
The third level of measurement consists of those measures that oncologists may find more directly meaningful and relevant to their patient care. These measures are often applicable specifically to their subspecialty or practice setting and are thus relevant for evaluating the team's performance. One lesson from a previous experience in implementing ASCO's Quality Oncology Practice Initiative (QOPI) measurement system in a large multidisciplinary academic cancer center was that oncologist engagement in quality improvement was essential to drive improvement and to sustain broad-based quality care.9
On the basis of this construct and previous experience, we initiated a project to implement and test a robust quality improvement program in a large, academic, multidisciplinary National Cancer Institute–designated comprehensive cancer center. The three fundamental aspects of the program were engaging physicians in selecting metrics and targets, automating data extraction and providing timely provider feedback, and providing group financial incentives to meet targets. We report the methods used to engage physicians, the resources needed, the success to date, and the next steps in our program.
METHODS

Setting
Patient care in our center is provided by oncologists (medical, surgical, and radiation), with closely aligned advanced practice
[Figure 1 depicts quality measurement at three levels; moving from the national level down to the oncology team level, denominator size decreases and there are fewer opportunities to benchmark, while granularity and relevance to clinical care increase:]

National, wide-scale quality metrics: nationally endorsed and accepted; measured across health systems; includes CMS Hospital Compare, Physician Quality Reporting System, QOPI, and payer pay-for-performance measures.

Dashboard, system-wide quality metrics: reflect priorities of the health system; measured within health systems; includes patient satisfaction, documentation completion, mortality rates, and physician-level performance.

Oncology team quality metrics: clinician-identified and driven; specific measures identified locally to directly improve patient care in real time.
Fig 1. Quality measurement at three defined levels. CMS, Center for Medicare and Medicaid Services; QOPI, Quality Oncology Practice Initiative.
providers (APPs), nurse coordinators, and associated administrative staff, organized into 13 disease-site or treatment-specific teams known as cancer care programs (CCPs). During fiscal year 2016 (FY16 [September 1, 2015, to August 31, 2016]), we had 140 physicians and 58 full-time equivalent (FTE) APPs at our Palo Alto center, where we had 5,397 analytic patient cases and delivered 20,092 new patient visits (each patient usually had new patient visits to multiple subspecialists). We provided 51,177 infusions and 27,337 radiation treatments.
Stanford Health Care (SHC) uses the Epic electronic health record (EHR), Version 2015 (Epic Systems, Verona, WI). For quality measurement purposes, we used the SHC Enterprise Data Warehouse and the Stanford Cancer Institute (SCI) research database, which includes curated EHR and Stanford cancer registry data. In addition, we used investigator-maintained, disease-specific databases in the neuro-oncology, blood and marrow transplantation, and hematology CCPs.
Physician Engagement Activities
The clinical and administrative executive leadership of the SCI and SHC designed a program to build a portfolio of cancer quality metrics in FY16. The medical director of cancer quality met individually with the physician leader of each CCP to develop potential metric ideas for their CCP, with the guiding principles that final metrics must (1) be meaningful and important to the care team and patients, (2) have data elements available in existing databases, (3) not have been broadly measured at this cancer center, (4) be multidisciplinary, and (5) not increase clinician workload significantly.
The cancer quality team reviewed and additionally refined the metric ideas with SHC and the SCI analytic teams. The executive leadership team reviewed and approved the final metrics.
A monetary incentive was provided by SHC for each CCP that met its predetermined and mutually agreed-on metric target at the end of FY16. The incentive funds were provided to the CCP (not to individuals) for reinvestment in their program, with strong encouragement to spend funds on quality improvement activities. Potential distributions were $75,000 to $125,000 per CCP, depending on patient volume and number of physicians in the CCP. To set the targets, the team reviewed the results of the metric for the final quarter of fiscal year 2015, then added a percentage for improvement on the basis of attainability and challenge to the CCP. The final quarter of FY16 was used to determine achievement of the target performance.
CCPs in which tumor stage was relevant, and for which the EHR contained a specific staging module (ie, breast, cutaneous, GI, gynecologic, head and neck, lymphoma, sarcoma, thoracic, and urologic CCPs), were offered staging module adherence as a metric. Individual new patient disease and staging information is entered in coded data fields in the Epic staging module and is finalized with a signature by a physician or an APP. Adherence to this documentation standard was scored as successful if completed within 60 days of the initial cancer treatment at Stanford, regardless of modality of treatment. The physician who provided the initial treatment was responsible for compliance, although any provider on the patient's care team was encouraged to complete staging information. Targets were calculated by doubling the fiscal year 2015 baseline performance, with a minimum of 40% and a maximum of 70% adherence.
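The staging target rule described above (double the FY15 baseline, bounded between 40% and 70%) can be expressed as a short calculation. This is a minimal sketch for illustration; the function name and the example baseline values are ours, not the article's:

```python
def staging_target(baseline_pct: float) -> float:
    """Staging module adherence target: 2 x FY15 baseline,
    clamped to the 40%-70% band described in the Methods."""
    return min(max(2 * baseline_pct, 40.0), 70.0)

# Illustrative baselines (hypothetical, not the article's per-CCP data):
for baseline in (11, 25, 31, 45):
    print(f"baseline {baseline}% -> target {staging_target(baseline)}%")
```

A CCP starting at 11% would thus be assigned the 40% floor rather than 22%, and a CCP starting at 45% would be capped at 70% rather than 90%.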
To increase adherence to the staging module, the quality team distributed to each CCP physician leader, at the end of the month, an interim report demonstrating the CCP's performance during the past month. Included in the report were the medical record numbers of those patients whose staging module had not yet been completed, by attributing physician.
CCP physician leaders shared the interim results with the entire team or with individual physicians who provided incomplete data. Final monthly reports that included CCP performance over time were provided to all CCPs, and each CCP physician leader received a report with unblinded physician-level performance, including patient-level data.
A similar reporting feedback loop was conducted for the other quality metrics. Specifically, each CCP physician leader was provided a routine report of his or her metric performance. Reports included the overall rate of success, physician-level results, and patient-level detail. Physician leaders shared the results with the entire CCP through e-mail reminders or in presentations at CCP meetings. Improvement efforts were driven by the CCP physician leaders, who engaged with underperformers in their group and identified opportunities for increased participation by the entire care team to increase performance.
RESULTS
In total, 15 distinct quality metrics were identified, developed, and measured for 13 CCPs (Table 1) in FY16. Nine CCPs chose staging module adherence, two chose chemotherapy in the last 2 weeks of life, and the remaining 13 metrics were unique to a specific CCP, for a total of 24 CCP-specific metrics. All metrics were process measures, and six were based on nationally
recognized measures. The remaining nine metrics were based on evidence-based guidelines; they were identified as clinically meaningful and important to the care team and the patients but are not nationally endorsed. Seven metrics were relevant to patient quality of life, seven to treatment decision making, and one to timeliness of care. Importantly, the metrics spanned the period from cancer diagnosis through survivorship or end of life (Table 2). All the metrics related to treatment decision making and timeliness of care fell under the heading of diagnosis, whereas the quality-of-life metrics fell under the heading of treatment or the heading of survivorship or end of life.
A total of 5,604 patients or visits (depending on the measure) were included in the denominators for the metrics. Data from five different databases were accessed to measure and track results for all metrics on a monthly or quarterly basis. The development and continuous refinement of the quality metrics required the time of two FTE roles, divided among a quality manager and three data architects over the course of 9 months. The management and dissemination of the metric reports to the 13 CCPs required an average of 8 hours per month.

The staging module adherence metric was rolled out in phases by clusters of CCPs over the course of 4 months. Within 6 months, the nine CCPs were receiving monthly staging module adherence reports. Baseline performance was 31%, which rose to 81% by study end, for a total of 1,506 records staged out of 2,406 eligible over the year. The lowest CCP adherence at study end was 57% (an increase from 11%) and the maximum was 100%, which was reached by two CCPs. A major driver for staging module adherence was the encouragement for APPs to enter and finalize staging information in the module.
DISCUSSION
Our quality program had three essential elements: (1) engaging physicians in choosing the quality measures and prespecifying goals, (2) using automated extraction methods for rapid and timely feedback on improvement and progress toward achieving goals, and (3) offering a financial team-based incentive if prespecified goals were met.
Our new quality program was successful in embedding and centralizing our physician leaders in its design and implementation. This led to increased interest and engagement, while achieving the goal of improving care as measured by improvements in their selected metrics.
Measurement of quality performance has been accomplished, including implementation of the QOPI measurement tool within an academic medical center,9 in a statewide network sponsored by a regional payer,10 in a statewide affiliation of practices centered on an academic medical center,11 and in a government-sponsored affiliation of oncology practices.12 Retrospective analysis of quality-of-care measures within a system, such as at the Veterans Health Administration,13,14 is also possible. Unfortunately, a gap exists in the use of measures to improve the quality of care, demonstrate measurable outcome improvement, and sustain the improvement. Engaging physicians and patient care teams to improve the quality of care that they deliver and to change behavior is difficult.15
Stanford Cancer Center has participated in QOPI since 2011 and was certified in 2013. Although the QOPI site and benchmarking results are useful in identifying improvement opportunities, our physicians were relatively removed from the process, including the selection and development of the metrics and the collection of the data. In addition, the results were provided only twice a year for a subset of our CCPs, were based on a small sample of our patients, and were not well communicated across CCPs.
Physician leaders were surveyed, and they noted that this program was either useful or very useful to improving patient care, and all were very satisfied with their experiences. They also noted that the most important driver was their engagement and leadership in the selection and improvement efforts for their metrics, more so than the financial incentive or data transparency. Data and comments are presented in the Appendix (online only).
As measures were developed and analyzed for the first time in our setting, it was unknown what the baseline performance would be. Some CCPs showed high performance from the beginning of the measurement period, such as in the measurement of epidermal growth factor receptor mutation testing for newly diagnosed patients with lung adenocarcinoma. The formal measurement verified that evidence-based care was being delivered in that area, knowledge that was not available before this measurement. We acknowledge that our results may be unique to our academic medical center, but note that our clinical care program has multiple stakeholders and multiple competing demands on the EHR and information technology reporting staff and on our clinicians' time and attention. We posit that our findings of early clinician engagement in the quality process, timely and accurate
Table 1. Physician-Selected Cancer Quality Metrics, by CCP

| CCP | Analytic Cases, Annualized FY16 | Measure | Metric Purpose | FY16 Metric Denominator | FY16 Target (%) | Baseline Results (Q4 FY15; %) | Target Period Results (Q4 FY16; %) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Multiple* | 4,820 | Staging module adherence | Drives treatment decisions | 2,406 | 40-70 | 31 | 81 |
| Blood and marrow transplantation | — | Referrals to survivorship by day 100 | Quality of life | 115 | 75 | 66† | 86 |
| | | Patient visits to survivorship by day 180 | Quality of life | 87‡ | 75 | 63† | 76 |
| Cutaneous | 428 | Hand and foot pain recorded | Quality of life | 252 | 90 | 66 | 100 |
| | | Radiation dermatitis pain recorded | Quality of life | 252 | 90 | 68 | 90 |
| GI | 1,007 | Mismatch repair testing of newly diagnosed patients with colorectal cancer | Drives treatment decisions | 420 | 95 | 88 | 98 |
| Gynecologic oncology | 260 | Referrals of all newly diagnosed patients with ovarian cancer to genetic counseling | Drives treatment decisions | 149 | 60 | 39 | 82 |
| Hematology | 301 | Cytogenetic testing of newly diagnosed patients with acute myeloid leukemia, acute lymphoblastic leukemia, or multiple myeloma | Drives treatment decisions | 220 | 90 | 81 | 98 |
| | | Molecular testing of newly diagnosed patients with acute myeloid leukemia | Drives treatment decisions | 45‡ | 85 | 95† | 95 |
| Lymphoma | 348 | Hepatitis B testing before rituximab administration | Drives treatment decisions | 113 | 95 | 83 | 100 |
| Neuro-oncology | 99 | Chemotherapy in the last 2 weeks of life | Quality of life | 33 | 10 | 9 | 0§ |
| | | Hospice enrollment at time of death§ | Quality of life | 27 | 75 | 100 | 100 |

(Table 1 continued below)
performance feedback, the setting of achievable targets, and a small financial incentive award to the program can be effectively replicated in other settings, including smaller community-based practices and hospital-based cancer programs.
Quality measures, including the widely used QOPI system, were designed in the era of paper-based recording, and measurement requires by-hand abstraction of data16 from relatively small samples of patients. It was important to our care teams that we measure entire, defined populations of patients, and this was a major impetus to select only metrics that were pulled by automated extraction, which would require little to no manual abstraction.
Although outcome measures were not excluded from the program, all metrics for the portfolio ended up being process metrics because of the limited resources available relative to what is typically required to build and improve outcome measures, such as time, information technology support, and agreement on which outcome measure to select. The development and management of the metrics were resource intensive, requiring approximately two FTEs of data architects, quality analysts, and other EHR experts. Although many of the metrics were easily extractable from the EHR, others required manual effort to abstract records and perform quality control on all negative results (to ensure that missing data were not scored incorrectly as negative or nonadherent, because clinical data are captured in the EHR in a variety of fields and text notes) and to abstract records for the death-related metrics. This manual review effort was important to maintain credibility by the care teams, who could then focus on improvement efforts.
Close collaboration among the cancer quality team, physician leaders, and data architects was critical to ensure high-quality metric development. Although metric selection was a critical milestone of engagement, the involvement of the physician leader in continuously improving the metric was arguably the most important factor in developing additional engagement by the CCPs. For example, the original staging module associated with lymphoma, cutaneous, and head and neck cancer was incomplete, inaccurate, and poorly constructed, and needed refinement by a local disease-level
Table 1. Physician-Selected Cancer Quality Metrics, by CCP (continued)

| CCP | Analytic Cases, Annualized FY16 | Measure | Metric Purpose | FY16 Metric Denominator | FY16 Target (%) | Baseline Results (Q4 FY15; %) | Target Period Results (Q4 FY16; %) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Radiation oncology | — | At least 2 nights for treatment plan physics quality assurance before planned start of nonemergent treatment | Timeliness of care | 955 | 75 | 62 | 76 |
| | | Patients receiving > 10 fractions for palliative bone metastases | Quality of life | 314 | 5 | 9 | 0 |
| Sarcoma | 129 | Chemotherapy in the last 2 weeks of life§ | Quality of life | 13 | 10 | 0 | 0§ |
| Thoracic | 499 | Epidermal growth factor receptor testing of newly diagnosed patients with lung adenocarcinoma | Drives treatment decisions | 203 | 90 | 98 | 98 |

Abbreviations: CCP, cancer care program; FY15, fiscal year 2015; FY16, fiscal year 2016; Q, quarter.
*Breast, cutaneous, GI, gynecologic, head and neck, lymphoma, sarcoma, thoracic, and urologic CCPs.
†Baseline period is first quarter of available data.
‡Incomplete data.
§Lower is better.
expert's leadership. The engagement and leadership of these three CCPs led to increased rates of staging module adherence.
Automated, timely, high-quality reporting allowed our physician leaders to provide their CCPs with unblinded, recent data that were immediately actionable by individual providers. All providers and CCP staff were incentivized to meet targets, because the financial incentive would be provided to the CCP. Including adherence to quality measures as part of a physician compensation plan for an 11-member academic health center-based hematology-oncology division has been reported to be effective.17 Financial awards here were distributed to the CCP team for use at their discretion and were not meant for individual compensation. There seemed to be more communication and collaboration by care teams in an effort to make these metrics successful, because the entire team was incentivized to meet their CCP goal. For example, APPs increased their involvement in staging module input, medical assistants ensured that pain was documented for cutaneous patients, and nurse coordinators checked that hepatitis B testing was complete for patients with lymphoma who were starting rituximab. In addition, at the gynecologic oncology CCP tumor board, the multidisciplinary team systematically reviewed whether genetic referrals were completed for newly diagnosed patients with ovarian cancer.
Our findings may be useful in promulgating team science in oncology.18
Staging data, which are now available as a discrete data field, will serve as the foundation for measuring value-driven care and providing our teams with data to build patient cohorts by disease, stage, and other clinical characteristics that measure the care being delivered. Building these specific patient cohorts will help the cancer center in the development of clinical trials that are based on potential accrual information, in the business development of new or expanding programs, and in ensuring that consistent care is being delivered across our network of care by analyzing use data for these cohorts.
Although the first iteration of this portfolio of quality metrics was successful, lessons learned have led to enhancements for the FY17 portfolio. First, CCPs have been strongly encouraged to select outcome, rather than process, metrics, especially ones that will define and increase the value of care. Many CCPs have chosen to measure unplanned care events (ie, emergency department visits, unplanned admissions within our system) for patients they have treated in the preceding 6 months. The logic of the unplanned care report is based on the staging module, which will also permit CCPs to build specific cohorts and measure the care used for a defined period. This information will drive efforts to reduce identified unnecessary variation and ensure consistent, predictable care.
Table 2. CCP Quality Metrics Across the Care Continuum

| Diagnosis and Treatment Planning | During Active Treatment | Survivorship or End of Life |
| --- | --- | --- |
| Mismatch repair testing of newly diagnosed patients with colorectal cancer | Hand and foot pain recorded | BMT referrals to survivorship by day 100 |
| Referrals of all newly diagnosed patients with ovarian cancer to genetic counseling | Radiation dermatitis pain recorded | BMT patient visits to survivorship by day 180 |
| Cytogenetic testing of newly diagnosed patients with acute myeloid leukemia, acute lymphoblastic leukemia, or multiple myeloma | Patients receiving > 10 fractions for palliative bone metastases | Chemotherapy in the last 2 weeks of life |
| Molecular testing of newly diagnosed patients with acute myeloid leukemia | | Hospice enrollment at time of death |
| Hepatitis B testing before rituximab administration | | |
| At least two nights for treatment plan physics quality assurance before planned start of nonemergent treatment | | |
| Epidermal growth factor receptor testing of newly diagnosed patients with lung adenocarcinoma | | |

Abbreviations: BMT, blood and marrow transplantation; CCP, cancer care program.
Furthermore, in consideration of sustainability, we have consolidated CCPs around specific metrics, such as unplanned care, to limit the number of unique metrics that require development and management.
Second, because these metrics were determined to be meaningful to the care teams and their patients, we will verify and solicit feedback regarding future metric ideas from our cancer patient and family advisory council, composed of former and actively treated patients with cancer and family members or caregivers of patients. An important opportunity exists to better engage patients and family members in defining quality of care. One example of this may be patient-reported outcome measures, which we will incorporate as resources allow.
Third, benchmarking opportunities are a missing component in using locally identified metrics. We invite peer and local cancer centers to explore measuring with us and collaborating on identifying best practices.
Finally, an increasing challenge for many organizations is how to monitor and maintain quality of care across growing networks of care. Although these measures were completed for our Palo Alto center, the next steps are to promulgate these metrics at our other network sites of care and to identify opportunities to improve across the network.
Acknowledgment
We thank Ruchille B. Ruiz for administrative support and thank Terri Owen, Karen Tong, James Martin, Jimmy Lau, Raj Behal, MD, and Chris Holsinger, MD, for their contributions.
Authors’ Disclosures of Potential Conflicts of Interest
Disclosures provided by the authors are available with this article at jop.ascopubs.org.
Author Contributions
Conception and design: Julie Bryar Porter, Eben Lloyd Rosenthal, Sridhar Belavadi Seshadri, Douglas W. Blayney
Collection and assembly of data: Julie Bryar Porter, Andrea Segura Smith, Yohan Vetteth, Eileen F. Kiamanesh, Amogh Badwe, Lawrence David Recht, Joseph B. Shrager, Douglas W. Blayney
Data analysis and interpretation: Julie Bryar Porter, Marcy Winget, Andrea Segura Smith, Eileen F. Kiamanesh, Amogh Badwe, Ranjana H. Advani, Mark K. Buyyounouski, Steven Coutre, Frederick Dirbas, Vasu Divi, Oliver Dorigo, Kristen N. Ganjoo, Laura J. Johnston, Joseph B. Shrager, Eila C. Skinner, Susan M. Swetter, Brendan C. Visser, Douglas W. Blayney
Manuscript writing: All authors
Final approval of manuscript: All authors
Accountable for all aspects of the work: All authors
Corresponding author: Julie Bryar Porter, MSc, Stanford Health Care, 875 Blake Wilbur Dr, CC-2211, Stanford, CA 94305; e-mail: [email protected].
AUTHORS’ DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
Improving Care With a Portfolio of Physician-Led Cancer Quality Measures at an Academic Center
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO’s conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/journal/jop/site/misc/ifc.xhtml.
Julie Bryar Porter
No relationship to disclose
Eben Lloyd Rosenthal
No relationship to disclose
Marcy Winget
No relationship to disclose
Andrea Segura Smith
No relationship to disclose
Sridhar Belavadi Seshadri
No relationship to disclose
Yohan Vetteth
No relationship to disclose
Eileen F. Kiamanesh
No relationship to disclose
Amogh Badwe
No relationship to disclose
Ranjana H. Advani
Consulting or Advisory Role: Genentech/Roche, Juno, Bristol-Myers Squibb, Spectrum Pharmaceuticals, Sutro, NanoString, Pharmacyclics, Gilead Sciences, Bayer HealthCare Pharmaceuticals, Cell Medica
Research Funding: Millennium Pharmaceuticals (Inst), Seattle Genetics (Inst), Genentech (Inst), Bristol-Myers Squibb (Inst), Pharmacyclics (Inst), Janssen Pharmaceuticals (Inst), Celgene (Inst), Forty Seven (Inst), Agensys (Inst), Merck (Inst), Kura (Inst), Regeneron Pharmaceuticals (Inst), Infinity Pharmaceuticals (Inst)
Mark K. Buyyounouski
Patents, Royalties, Other Intellectual Property: Up-To-Date
Steven Coutre
Consulting or Advisory Role: Genentech, Celgene, Pharmacyclics, Gilead Sciences, AbbVie, Novartis, Janssen Oncology
Research Funding: Celgene (Inst), Novartis (Inst), Pharmacyclics (Inst), Gilead Sciences (Inst), AbbVie (Inst), Janssen Oncology (Inst)
Travel, Accommodations, Expenses: Celgene, Gilead Sciences, Novartis, AbbVie, Pharmacyclics, Janssen Oncology
Frederick Dirbas
Stock or Other Ownership: Cianna Medical, Hologic
Vasu Divi
No relationship to disclose
Oliver Dorigo
No relationship to disclose
Kristen N. Ganjoo
Honoraria: Novartis, Daiichi Sankyo (I)
Consulting or Advisory Role: Novartis, Daiichi Sankyo
Laura J. Johnston
No relationship to disclose
Lawrence David Recht
No relationship to disclose
Joseph B. Shrager
Consulting or Advisory Role: Becton Dickinson, Varian Medical Systems
Research Funding: Varian Medical Systems
Patents, Royalties, Other Intellectual Property: Patent on a method of preventing muscle atrophy
Eila C. Skinner
No relationship to disclose
Susan M. Swetter
No relationship to disclose
Brendan C. Visser
No relationship to disclose
Douglas W. Blayney
Honoraria: Apotex, Mylan, TerSera, Cascadian Therapeutics, Varian Medical Systems
Research Funding: Amgen (Inst), BeyondSpring Pharmaceuticals (Inst)
Appendix
Six months after conclusion of implementation, we surveyed physician leadership to get their opinions. There were nine responses (out
of 13 cancer care program [CCP] leaders):
1. When asked how satisfied they were on a scale from 1 (very unsatisfied) to 5 (very satisfied), the mean response was 4.56.
2. When asked how useful the program was to improve quality of patient care, four said somewhat useful, four said very useful, and one
was unsure.
3. When asked how important the financial incentive was to the CCP, six said somewhat important, and three said very important.
4. When asked what the key drivers were, in order of importance:
a. Eight of nine said that physician participation and leadership in selecting a metric meaningful to the patient and care team was most important
b. Seven of nine said transparent data and competition was second most important
c. Six of nine said the financial incentive was least important (of the listed choices; respondents could also note and rank other drivers).
5. Weaknesses identified by respondents included:
a. Continued challenges with sustainability
b. Criticism of colleagues when imperfect data were provided
c. Need for more involvement from administrative leadership to ensure that targets require effort to achieve; conflict of interest when quality and clinical teams identify their own targets
d. Limited resources to create and carry out improvement projects
e. Challenges in identifying quality metrics that can easily be measured across entire CCP
f. Difficulty in measuring outcomes rather than process. Need to continue to focus on measuring how our efforts translate into
patient benefit.