
Page 1: CMS Proposed Reporting


© 2008 The Advisory Board Company • 17631

I. Perfecting Data Selection

Special Report: Metric Selection Playbook

• Step #1: Apply a Reality Check

• Step #2: Be Consistent with National Efforts

• Step #3: Evaluate Impact Opportunity

• Step #4: Ensure Metric Balance

• Step #5: Exhaust Benchmarking Opportunities

Page 2: CMS Proposed Reporting

The New Quality Mandate

National Organizations Making (Some) Progress

Although oncology has largely been spared the national scrutiny of quality performance in recent years, tracking clinical quality remains a top-of-mind issue for cancer programs. Yet choosing a meaningful list of oncology-related metrics is a daunting challenge for many administrators. Fortunately, national organizations have begun to provide some guidance. In April 2007, the National Quality Forum approved five quality measures related to breast and colon cancer. The endorsement of a single set of quality measures was the product of various initiatives undertaken in recent years by four national organizations with a stake in cancer quality: the NQF, ASCO, NCCN, and the CoC.

A Step in the Right Direction

1. NQF Project Initiated (2004)

• NQF Quality of Cancer Measures Project initiated in 2004

• Aimed at validating measures for diagnosis and treatment of breast and colorectal cancers, and symptom management of end-of-life care

2. NCCN/ASCO Panels Convene (Early 2006)

• Members of ASCO and NCCN met to review existing validated measures, relevant data, and NCCN and ASCO guidelines

• Indicators and data reviewed were derived from the National Initiative for Cancer Care Quality, which was completed in 2006

3. Organizations Collaborate on Measures (Late 2006)

• At conclusion of the ASCO/NCCN project, the NQF project was nearing completion

• Groups convened to discuss measures

4. Measures Harmonized, Endorsed by NQF (2007)

• In April 2007, five harmonized measures were approved by the NQF (NQF press release, April 2007: "NQF Endorses Cancer Quality Measures. . .")

Source: Desch C, et al., "ASCO/NCCN Quality Measures," Journal of Clinical Oncology, 2008, 26: 3631–3637; Oncology Roundtable interviews and analysis.

Page 3: CMS Proposed Reporting


Despite this monumental achievement, garnering national consensus on additional measures is far from a straightforward proposition. While ASCO and NCCN have endorsed additional metrics, the CoC and NQF have questioned the validity of these indicators. In addition, controversy continues to exist regarding some of the previously approved measures, with certain studies questioning whether there is any statistically significant relationship between the number of lymph nodes examined at surgery and a colon cancer patient's likelihood of survival.

Endorsement by ASCO/NCCN, ACoS/CoC, and NQF is noted after each measure.

• Radiation therapy is administered within 1 year (365 days) of diagnosis for women under age 70 receiving breast conserving surgery for breast cancer (ASCO/NCCN, ACoS/CoC, NQF)

• Combination chemotherapy is considered or administered within 4 months (120 days) of diagnosis for women under 70 with AJCC T1cN0M0, or Stage II or III hormone receptor negative breast cancer (ASCO/NCCN, ACoS/CoC, NQF)

• Tamoxifen or third generation aromatase inhibitor is considered or administered within 1 year (365 days) of diagnosis for women with AJCC T1cN0M0, or Stage II or III hormone receptor positive breast cancer (ASCO/NCCN, ACoS/CoC, NQF)

• Adjuvant chemotherapy is considered or administered within 4 months (120 days) of diagnosis for patients under the age of 80 with AJCC Stage III (lymph node positive) colon cancer (ASCO/NCCN, ACoS/CoC, NQF)

• At least 12 regional lymph nodes are removed and pathologically examined for resected colon cancer (ASCO/NCCN, ACoS/CoC, NQF)

• Radiation therapy is considered or administered within 6 months (180 days) of diagnosis for patients under the age of 80 with clinical or pathologic AJCC T4N0M0 or Stage III receiving surgical resection for rectal cancer (ASCO/NCCN, ACoS/CoC; not endorsed by NQF)

• Postoperative adjuvant chemotherapy is considered or administered within 9 months (270 days) of diagnosis for patients under the age of 80 with AJCC Stage II or Stage III rectal cancer (ASCO/NCCN; not endorsed by ACoS/CoC or NQF)

Still a Long Way to Go

Quality Measures Endorsed by Leading Organizations

Source: American College of Surgeons, available at: www.facs.org; National Quality Forum, available at www.qualityforum.org, accessed August 28, 2008; Wong S, et al., "Hospital Lymph Node Examination Rates and Survival After Resection for Colon Cancer," JAMA, 2007, 298: 2149–2154.

Metric Endorsement Among Organizations Still Varied
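Measures like those above are rate-based: a numerator of concordant cases over a denominator of eligible cases. Below is a rough sketch of how a registry analyst might compute concordance for the Stage III colon cancer chemotherapy measure. It is an illustration only, not the official measure specification (which carries additional exclusions, such as documented reasons therapy was "considered" but not administered); the `ColonCancerCase` type and the sample cases are hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ColonCancerCase:
    age_at_diagnosis: int
    ajcc_stage: str                     # e.g. "III"
    diagnosis_date: date
    chemo_start_date: Optional[date]    # None if chemotherapy never started

def in_denominator(case: ColonCancerCase) -> bool:
    """Eligible population: under age 80 with AJCC Stage III colon cancer."""
    return case.age_at_diagnosis < 80 and case.ajcc_stage == "III"

def is_concordant(case: ColonCancerCase) -> bool:
    """Numerator: chemotherapy started within 120 days of diagnosis."""
    if case.chemo_start_date is None:
        return False
    return (case.chemo_start_date - case.diagnosis_date).days <= 120

def concordance_rate(cases):
    eligible = [c for c in cases if in_denominator(c)]
    if not eligible:
        return None
    return sum(is_concordant(c) for c in eligible) / len(eligible)

cases = [
    ColonCancerCase(62, "III", date(2008, 1, 10), date(2008, 3, 1)),  # 51 days: concordant
    ColonCancerCase(71, "III", date(2008, 2, 5), None),               # no chemo: not concordant
    ColonCancerCase(85, "III", date(2008, 3, 1), date(2008, 4, 1)),   # excluded: age 80+
]
print(concordance_rate(cases))  # 0.5
```

Tracking measures in this numerator/denominator form also makes it straightforward to report concordance to the endorsing organizations' registries later.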

Page 4: CMS Proposed Reporting


Lack of Guidance Creating Inconsistency in Metric Selection

Given the lack of consensus on which metrics are the most meaningful to track—in addition to the controversy surrounding existing validated measures—many providers are taking the initiative to develop their own unique sets of cancer quality indicators. While providers should be applauded for their efforts to independently measure the quality of care delivered at their own institutions, results from the 2008 Oncology Roundtable Quality Survey indicate there is significant inconsistency in the specific indicators being monitored across programs.

Significant Variation in Number of Metrics Tracked

"How Many Total Quality-Related Metrics Is Your Oncology Program Currently Tracking?"1, 2

n = 157 Hospitals

Number of metrics (percentage of respondents): 0: 9%; 5: 22%; 10: 33%; 15: 8%; 20: 10%; 25: 5%; 30: 5%; 35: 2%; 40: 3%; 45: 1%; 50+: 3%

1 Respondents were asked to round to the nearest 5.
2 Total distribution does not equal 100% due to rounding.

Source: Oncology Roundtable 2008 Membership Survey.

Page 5: CMS Proposed Reporting


The wide variability in the type and scope of metrics tracked by cancer programs has made it challenging for institutions to benchmark their performance externally. At first glance, additional data from the Oncology Roundtable Quality Survey seem to indicate that finding performance benchmarks is a simple task, with 83 percent of hospitals responding that they are tracking their performance on quality measures against benchmarks. However, a deeper analysis provides evidence that most programs have identified benchmarks for only a small portion of their metrics.

"Does Your Oncology Program Compare Its Performance on Quality Metrics Against Benchmarks?"

n = 140 Hospitals

Yes: 83%; No: 17%

"For What Percentage of Metrics Do You Have Benchmarks?"1

n = 99 Hospitals

Percentage of metrics with benchmarks (percentage of respondents): 10% or less: 8%; 20%: 7%; 30%: 15%; 40%: 10%; 50%: 12%; 60%: 5%; 70%: 9%; 80%: 12%; 90%: 11%; 100%: 10%

Resulting in Limited Benchmarking

1 Total distribution does not equal 100% due to rounding.
Source: Oncology Roundtable 2008 Membership Survey.

Variability Posing Challenges to Benchmarking

Page 6: CMS Proposed Reporting


The first step to establishing a core set of oncology quality measures is to apply a reality check. Metrics should be evaluated for alignment with the larger goals of the service line, hospital, or health system. The reliability of the data source and the ease of collection should also be considered. Lastly, programs should evaluate whether the indicator will resonate with physicians, staff, and other audiences, and whether it is rooted in evidence-based support.

Included on the facing page is an exercise developed to assist cancer program administrators in eliminating metrics that do not pass these five screens. Each set of questions maps to a specific screen and is intended to help determine if the indicator under evaluation meets the criteria for meaningfulness, reliability, feasibility, communicability, and scientific support.

Suggested Screens for Metric Selection

Meaningfulness
• Description: Selected metrics should align with service line and hospital-wide organizational goals
• Rationale: Metrics misaligned with larger priorities are unlikely to receive adequate resources and support; moreover, misalignment may stunt service line growth and development opportunities

Reliability
• Description: Data available from the information system should be accurate, clearly defined, and measure what is intended
• Rationale: Absence of trustworthy data results in suspicion toward performance, often yielding inaction

Collection Feasibility
• Description: Data collection process should be manageable given institutional resources
• Rationale: Metrics that require laborious manual abstraction may drain available resources; similarly, electronic sources not built around specific metrics cannot be easily queried for data

Ease of Communication
• Description: Definition and rationale for the metric should be easy to follow and understand
• Rationale: If the audience does not understand a metric's definition and relevance, the metric will not influence its decision-making process

Scientific Support
• Description: Measure should be rooted in evidence-based literature and support
• Rationale: Questions or controversy over the clinical validity of measures result in physician resistance to metric tracking

Metric Selection Playbook

Five Steps for Optimizing Data Utility

1. Apply a Reality Check
2. Be Consistent with National Efforts
3. Evaluate Impact Opportunity
4. Ensure Metric Balance
5. Exhaust Benchmarking Opportunities

Step #1: Apply a Reality Check

To help organizations overcome the innate challenges associated with identifying a robust set of cancer-related measures, the Oncology Roundtable has created a Metric Selection Playbook. This step-by-step instruction guide is meant to aid cancer programs in choosing the most meaningful set of metrics based on their own institutional realities.

Source: Oncology Roundtable interviews and analysis.

Page 7: CMS Proposed Reporting


“Reality Check” Red Flag Questions

For each question, answer YES or NO:

Meaningfulness

Does your clinical staff consider the metric to be important?

Does your senior administration consider the metric to be important?

Does the metric align with hospital-wide goals?

Does the metric align with oncology service line goals?

Reliability

Is the metric calculated by an automated system?

If not, are you certain the reported data is accurate?

Do managers and clinicians trust the data for decision making?

Collection Feasibility

Are existing information systems capable of generating reports on the metric?

Can staff collect and report the data within a reasonable time frame?

Is the cost of measurement acceptable?

Can this metric be tracked frequently enough to inform action?

Ease of Communication

Is this metric easily explained to and understood by internal audiences, including physicians, managers, and executives?

Can the different constituencies involved come to consensus on the metric definition?

Are physicians and staff aware of the importance of tracking the metric?

Is the metric easily understood by external audiences, including consumers and payers?

Scientific Support

Is the metric derived from clinical guidelines or published standards?

Do physicians agree that the metric is evidence-based and clinically relevant?

Is there consensus in the industry regarding the scientifi c acceptability of the metric?

Note on Use

The questions listed above are designed to help organizations prioritize which oncology-related quality metrics to track. Questions should be applied to each metric under consideration. A majority of "no" answers for any one category suggests the metric should be eliminated from consideration.

Source: Oncology Roundtable interviews and analysis.
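The scoring rule in the note above is mechanical enough to sketch in code: tally the "no" answers per screen and flag any category where they form a majority. A minimal illustration; the screen names mirror the exercise, while the sample answers are hypothetical.

```python
def failed_screens(answers):
    """answers maps screen name -> list of True ('yes') / False ('no') responses.
    Returns the screens where 'no' answers form a majority (red flags)."""
    failed = []
    for screen, responses in answers.items():
        no_count = sum(1 for a in responses if not a)
        if no_count > len(responses) / 2:   # majority of "no" answers
            failed.append(screen)
    return failed

# Hypothetical answers for one candidate metric
answers = {
    "Meaningfulness": [True, True, True, False],
    "Reliability": [False, False, True],    # majority "no": red flag
    "Collection Feasibility": [True, True, False, True],
}
print(failed_screens(answers))  # ['Reliability']
```

A metric returning any failed screen would be a candidate for elimination under the exercise's rule.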

Page 8: CMS Proposed Reporting


Step #1: Apply a Reality Check

Relying on Oncology Expertise

Percentage of Hospitals Citing Stakeholder as Having a High Level of Influence on Metric Selection

n=140 Hospitals

Oncology Administrative Director: 69%
Oncology Medical Director: 63%
Cancer Committee: 58%
Physicians: 43%
Multidisciplinary Tumor Site Teams: 35%

An obvious, yet critical, step to homing in on a manageable list of quality performance metrics is inviting key oncology leaders to be involved in such discussions. As the data from the 2008 Oncology Roundtable Membership Survey demonstrate, the most influential stakeholders for the service line are in fact highly involved in the metric selection process.

Source: Oncology Roundtable 2008 Membership Survey.

Page 9: CMS Proposed Reporting


Forgetting Some Key Players

Percentage of Hospitals Citing Group as Having a High Level of Influence on Metric Selection

Registry: 28%
IT: 2%

n = 140 Hospitals

Benefits to Involvement:

• Offer realistic expectations surrounding accuracy and completeness of data in the registry

• Provide accurate estimates of the time lag of registry data points

• Provide realistic expectations regarding what can and cannot be collected

• Accurately estimate time and resources required for data collection

• Provide insight on whether data can be feasibly collected

• Design systems that can capture the metric, or modify existing information systems to ease data collection

Step #1: Apply a Reality Check

That said, other key players are often left out of the discussion. Additional findings from the Membership Survey reveal only 28 percent of hospitals include registry staff in metric selection, and even fewer engage key IT personnel. This is an unfortunate oversight, as registry and IT staff can provide realistic insight into the feasibility and availability of resources associated with data collection.

Source: Oncology Roundtable 2008 Membership Survey; Oncology Roundtable interviews and analysis.


Page 11: CMS Proposed Reporting


Step #2: Be Consistent with National Efforts

In addition to evaluating metrics against a number of key screens, a second step in the selection process is to be consistent with national efforts. Detailed below are those metrics Roundtable researchers identified as having the highest level of overlap across organizations. Since these are the measures with the greatest consensus to date, it is likely that they will be considered for adoption in future pay-for-reporting or pay-for-performance initiatives. Focusing on these measures allows cancer programs to establish efficient tracking mechanisms, as well as proactively gauge (and improve) performance.

Avoid Reinventing the Wheel

Metrics Endorsed by Leading Organizations

Organizations surveyed: NQF, ASCO/NCCN, CMS (proposed), ASCO-QOPI, NCBC, CoC (e-QuIP or CP3R), NAPBC, and PQRI. The number of endorsing organizations is noted after each measure.

• Radiation therapy is administered within 1 year (365 days) of diagnosis for women under age 70 receiving breast conserving surgery for breast cancer (7 of 8 organizations)

• Combination chemotherapy is considered or administered within 4 months (120 days) of diagnosis for women under 70 with AJCC T1cN0M0, or Stage II or III hormone receptor negative breast cancer (6 of 8)

• Tamoxifen or third generation aromatase inhibitor is considered or administered within 1 year (365 days) of diagnosis for women with AJCC T1cN0M0, or Stage II or III hormone receptor positive breast cancer (all 8)

• Adjuvant chemotherapy is considered or administered within 4 months (120 days) of diagnosis for patients under the age of 80 with AJCC Stage III (lymph node positive) colon cancer1 (6 of 8)

• At least 12 regional lymph nodes are removed and pathologically examined for resected colon cancer (5 of 8)

• Radiation therapy is considered or administered within 6 months (180 days) of diagnosis for patients under the age of 80 with clinical or pathologic AJCC T4N0M0 or Stage III receiving surgical resection for rectal cancer (2 of 8)

• Postoperative adjuvant chemotherapy is considered or administered within 9 months (270 days) of diagnosis for patients under the age of 80 with AJCC Stage II or Stage III rectal cancer (2 of 8)

• Mammography follow-up rates (2 of 8)

• Needle biopsy to establish diagnosis of cancer precedes surgical excision/resection (2 of 8)

• Plan for chemotherapy documented before chemotherapy administered (2 of 8)

• Documentation of ER and PR receptor status for breast cancer patients (2 of 8)

1 CMS defines the measure as adjuvant chemotherapy considered or administered within 4 months of surgery rather than diagnosis.

Source: National Quality Forum, available at www.qualityforum.org, accessed August 28, 2008; American College of Surgeons, available at: www.facs.org, accessed August 28, 2008; National Consortium of Breast Centers, available at: www.breastcare.org, accessed August 28, 2008; CMS, available at: www.cms.hhs.gov, accessed August 28, 2008; Oncology Roundtable interviews and analysis.

Page 12: CMS Proposed Reporting


Step #3: Evaluate Impact Opportunity

The third step in the metric selection process is to evaluate whether the measure is within the hospital's span of control. Detailed below are the percentages of metrics proposed by leading cancer organizations that are controlled by physicians, controlled by the hospital, or jointly controlled by both parties. As the vast majority are controlled by physicians, the ability of the hospital to inflect change on certain measures is highly dependent on service line structure and whether or not physicians are employed by the institution.

Despite the prevalence of physician-controlled metrics, there are a few categories of quality measures, as detailed on the facing page, that are within the purview of most cancer programs. The first category—timeliness of care measures—not only impacts patient satisfaction but is also relevant to most hospital-based breast centers. The second and third categories—inpatient and surgical indicators and patient safety and nursing-sensitive measures—take place within the four walls of the hospital and are highly dependent on the processes and protocols implemented at the institution. Finally, although still largely under the jurisdiction of physicians, documentation is a "must have" area of focus, as it is a prerequisite for evaluating quality performance.

Majority of Indicators Physician-Controlled

Quality Indicators Controlled by Hospitals Versus Physicians, by Organization

NQF-Endorsed: Physician Only 80%; Physician + Hospital 20%
CMS (proposed): Physician Only 50%; Physician + Hospital 41%; Hospital Only 9%
NCBC: Physician Only 79%; Physician + Hospital 21%
QOPI: Physician Only 100%

Source: Oncology Roundtable interviews and analysis.

Page 13: CMS Proposed Reporting


Step #3: Evaluate Impact Opportunity

Identifying What You Can Change

Much Occurring Within Your Four Walls

Timeliness of Care

• Mammography callback rate

• Time between diagnostic mammogram and needle/core biopsy

• Time between needle biopsy and initial breast cancer surgery

• Time between initial breast biopsy (excluding open surgical) and pathology results

Inpatient and Surgical Outcomes

• Breast conservation surgery rate

• Length of stay >14 days after elective lobectomy for lung cancer

• Risk-adjusted morbidity and mortality for esophagectomy for cancer

• Death in an acute care setting

Medical Record Documentation

• Documented plan for chemotherapy, including doses and time intervals, before chemotherapy started

• Smoking cessation counseling provided to cigarette smokers by second visit to office

• Cancer stage documented

• Patients who have a baseline AJCC cancer stage or documentation that the cancer is metastatic in the medical record at least once during the 12-month reporting period

Patient Safety and Nursing-Sensitive Indicators

• Flow sheet for chemotherapy with doses, dates of administration, and blood counts available in the chart

• Chemotherapy medication errors

• Infection rates

• Falls incidence

Source: Oncology Roundtable interviews and analysis.

Page 14: CMS Proposed Reporting


Step #3: Evaluate Impact Opportunity

Dig for Deficiencies

Historical Performance Gaps

• Do we have a history of suboptimal performance in certain areas?

• Have past quality improvement initiatives failed?

Frontline Staff

• Does the frontline staff feel that care is below standard in certain areas?

Physicians

• Would understanding performance on certain metrics affect the testing or treatment provided to patients?

Patients

• Would understanding performance on certain metrics impact whether or not patients purchased services from that entity?

Payers

• Would understanding performance on certain metrics impact whether or not payers include the hospital in their network?

• Would it affect the reimbursement received from payers?

Aside from evaluating whether a measure falls within the jurisdiction of the hospital, an alternative method for evaluating impact opportunity is to consider whether tracking the metric will affect the care being delivered to patients or the decisions being made by internal and external stakeholders. Detailed here are the relevant constituencies to survey and a list of key questions to ask when assessing whether the measure represents an area that will truly impact care processes or perceptions of the organization.

Source: Oncology Roundtable interviews and analysis.

Page 15: CMS Proposed Reporting


Step #3: Evaluate Impact Opportunity

Illustrated here is the story of Phelps Hospital,1 one organization that successfully sought input from frontline staff to uncover opportunities for improvement. When oncology nurses voiced concern that referrals to hospice were occurring too late in the continuum of care, the cancer committee elevated hospice referrals to the dashboard. The first round of data confirmed that indeed, 25 percent of referrals were not occurring until the last week before death, which led Phelps to set a timely hospice referral target and evaluate strategies to improve the referral process.

Trusting Their Instincts at Phelps Hospital1

Frontline Staff Identify, Address Key Performance Gap

1. Concern Over Hospice Referrals

• Frontline staff voice concern over hospice referrals occurring too late in the continuum of care

• Cancer committee decides to focus on decreasing hospice defects

2. Measurement Confirms Suspicions

• Upon measurement, hospital finds that 25% of referrals were defective2

3. Goal Set to Elevate Performance

• Sets goal of less than 15% of referrals within last week of death

Case in Brief: Phelps Hospital

• Academic medical center located in the Midwest

• Cancer committee sought input from frontline staff regarding areas of underperformance

• Hospice referrals identified as area in need of improvement; subsequent measurement proved intuition correct

• Committee sets performance benchmark, implements strategies to elevate performance

Source: Oncology Roundtable interviews and analysis.
1 Pseudonym.
2 Patient was enrolled in hospice within one week of death.
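Under the footnote's definition (a referral is "defective" when the patient is enrolled in hospice within one week of death), the rate Phelps tracked is a simple proportion. A sketch with made-up referral dates; the data and function name are hypothetical.

```python
from datetime import date

def defective_referral_rate(referrals):
    """referrals: list of (hospice_enrollment_date, death_date) pairs.
    A referral is 'defective' when enrollment fell within one week of death."""
    defective = sum(1 for enroll, death in referrals
                    if (death - enroll).days <= 7)
    return defective / len(referrals)

# Hypothetical referral data for one quarter
referrals = [
    (date(2008, 1, 1), date(2008, 3, 1)),    # ~2 months before death: timely
    (date(2008, 5, 1), date(2008, 5, 4)),    # 3 days before death: defective
    (date(2008, 6, 1), date(2008, 8, 15)),   # timely
    (date(2008, 9, 10), date(2008, 9, 12)),  # defective
]
rate = defective_referral_rate(referrals)
print(f"{rate:.0%} defective (goal: under 15%)")  # 50% defective (goal: under 15%)
```

Trending this rate quarter over quarter against the 15 percent goal is what elevating the measure to the dashboard makes possible.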

Page 16: CMS Proposed Reporting


Step #4: Ensure Metric Balance

The fourth step in the selection process is to ensure metric balance. For a cancer program to obtain a well-rounded view of quality performance, it must track indicators across tumor sites, as well as along the full care continuum. That said, as it would be impossible for any one institution to measure performance on care provided to all patients at every point in the process, programs should prioritize those tumor sites with the greatest patient volumes or growth potential.

Indicators Span the Continuum of Care

Breast

• Screening: Callback rate for mammography

• Diagnosis: Needle biopsy to establish diagnosis precedes surgical excision

• Treatment Planning: Documented discussion (prior to treatment) of infertility risks with patients in reproductive years

• Treatment: Combination chemotherapy recommended within 4 months of diagnosis for women under 70 with AJCC Stage I, II or III hormone receptor negative breast cancer

• Pain Management: Pain assessed by the second office visit

• End-of-Life: Patients receiving chemotherapy within last two weeks of death

Colorectal

• Screening: Percentage of patients aged 50 through 80 years who received the appropriate colorectal screening

• Diagnosis: Colorectal cancer patients who have a pT and pN category and histologic grade for their cancer

• Treatment Planning: Treatment discussion and patient consent for administration of chemotherapy documented

• Treatment: Adjuvant chemotherapy recommended within 4 months of AJCC Stage III colon cancer diagnosis for patients with lymph node involvement

• Pain Management: Pain intensity quantified by the second office visit

• End-of-Life: Patient enrolled in hospice or referred for palliative care services before death

Prostate

• Screening: Appropriateness of PSA screening tests

• Diagnosis: Newly diagnosed patients with no PSA in prior 3 months receiving test within 1 month after diagnosis or prior to any treatment

• Treatment Planning: Review of treatment options in patients with clinically localized prostate cancer

• Treatment: Three-dimensional radiotherapy for patients with prostate cancer

• Pain Management: For patients with moderate to severe pain, documentation that pain was addressed

• End-of-Life: For patients not referred in last two months of life, hospice or palliative care discussed

Lung

• Screening: Time from symptom presentation to diagnostic workup

• Diagnosis: Diagnosis at stage I

• Treatment Planning: Patient evaluated for clinical trial enrollment

• Treatment: Adjuvant cisplatin-based chemotherapy recommended within 60 days after curative resection for NSCLC with Stages IIA, IIB, and IIIA disease

• Pain Management: Effectiveness of pain medication assessed on visit following new narcotic prescription

• End-of-Life: Patient enrolled in hospice within 3 days of death

Source: Oncology Roundtable interviews and analysis.
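One way to operationalize this balance check is a site-by-stage coverage grid: any empty cell marks a gap in the continuum. A minimal sketch under hypothetical data; the `tracked` mapping and the site list are illustrative only.

```python
# Care-continuum stages, as in the matrix above
STAGES = ["Screening", "Diagnosis", "Treatment Planning",
          "Treatment", "Pain Management", "End-of-Life"]

# Metrics currently tracked, keyed by (tumor site, care stage) -- hypothetical subset
tracked = {
    ("Breast", "Screening"): ["Callback rate for mammography"],
    ("Breast", "Treatment"): ["Combination chemo within 4 months of diagnosis"],
    ("Colorectal", "Treatment"): ["Adjuvant chemo within 4 months of diagnosis"],
}

def coverage_gaps(tracked, sites):
    """Return the (site, stage) cells of the continuum with no metric tracked."""
    return [(site, stage) for site in sites for stage in STAGES
            if not tracked.get((site, stage))]

gaps = coverage_gaps(tracked, ["Breast", "Colorectal"])
print(len(gaps))  # 9 uncovered cells out of 12
```

Reviewing the gap list against the program's highest-volume tumor sites is one way to decide where additional metrics are worth the collection cost.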

Page 17: CMS Proposed Reporting


Step #4: Ensure Metric Balance

In addition to monitoring quality metrics across the care continuum of various tumor sites, hospitals must also balance quality with other measures that assess the overall health of the service line. Shown here is a sample service line dashboard from Horton Health System,1 which not only balances functional categories (e.g., clinical quality, service, HR, and finance), but also ensures that within any given category, there is a sufficient mix of indicators to provide comprehensive measurement of the cancer program's strategic priorities.

Quality in the Larger Context

Balancing Clinical Quality with Other Organizational Priorities

Horton Health System1 Oncology Dashboard

Clinical Quality (Goal / Target)

• Radiation therapy is administered within 1 year (365 days) of diagnosis for women under age 70 receiving breast conserving surgery for breast cancer: 70% / 85%

• Combination chemotherapy is considered or administered within 4 months (120 days) of diagnosis for women under 70 with AJCC T1cN0M0, or Stage II or III hormone receptor negative breast cancer: 75% / 85%

• Adjuvant chemotherapy is considered or administered within 4 months (120 days) of diagnosis for patients under the age of 80 with AJCC Stage III (lymph node positive) colon cancer: 73% / 85%

• At least 12 regional lymph nodes are removed and pathologically examined for resected colon cancer: 65% / 80%

• Pathology reports viewable in chart at time of consultation: 80% / 100%

Service (Goal / Target)

• Overall patient satisfaction: 90% / 95%

• Patients would recommend physician to a friend: 92% / 100%

• Patient satisfaction with shared decision making for choice of surgical option for breast surgery: 90% / 100%

• Overall physician satisfaction: 92% / 100%

• Overall employee satisfaction: 91% / 100%

Human Resources (Goal / Target)

• RN retention rate: 75% / 100%

• RN vacancy rate: 5% / 1%

• Overtime hours worked by oncology staff: 900 / 600

• Number of employee recordable injuries: 1 / 0

Finance (Goal / Target)

• Total payroll expense: $700 K / $600 K

• Revenue per oncologist: $200 K / $300 K

• Average drug cost: $75 / $50

• Net revenue per FTE provider: $75 K / $100 K

• Average days in accounts receivable: 45 / 30

Include key functional categories; a balanced number of indicators in each category ensures sufficient coverage of priorities.

1 Pseudonym.
Source: Oncology Roundtable interviews and analysis.
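Each dashboard row pairs a goal with a more ambitious target. If the goal is read as the minimum acceptable threshold and the target as the aspiration (an assumption, since the dashboard's status column is left blank in the sample), status can be derived rather than entered by hand. Note the direction flag for indicators where lower is better, such as RN vacancy rate or days in accounts receivable; the current values below are invented.

```python
def status(value, goal, target, higher_is_better=True):
    """Classify current performance against goal and target thresholds."""
    if not higher_is_better:          # e.g. vacancy rate, days in A/R
        value, goal, target = -value, -goal, -target
    if value >= target:
        return "target met"
    if value >= goal:
        return "on track"
    return "below goal"

# Hypothetical current values against the dashboard's goal/target pairs
print(status(78, goal=70, target=85))                           # on track
print(status(40, goal=45, target=30, higher_is_better=False))   # on track
print(status(62, goal=70, target=85))                           # below goal
```

Deriving status this way keeps the dashboard's traffic-light view consistent with whatever thresholds the cancer committee agrees on.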

Page 18: CMS Proposed Reporting


Step #5: Exhaust Benchmarking Opportunities

As a final step in selecting quality metrics, cancer programs should strive to measure performance against benchmarks whenever possible. Unfortunately, this is a task that is often easier said than done, since current processes for identifying benchmarks, including literature reviews and internal audits, tend to be both time and resource intensive.

Due to the challenges associated with traditional methods for benchmarking, cancer programs should give preference to metrics for which credible and objective external benchmarks exist. Fortunately, there are several publicly available programs and data sources that provide comparative performance information on a variety of indicators. The chart on the facing page lists the organizations identified through the research that offer benchmarks for oncology quality metrics, along with the access requirements, submission cycles, and information on the most recent data available.

Traditional Approaches Rife with Challenges

Literature Reviews

• Exhaustive literature reviews are time and resource intensive

• Physician champion and medical staff involvement is critical to acceptance of published benchmarks

• Ongoing, continual reassessment is needed to stay current with new research

Internal Audits

• Inclusive chart audits are extremely resource intensive; at the same time, it is challenging to determine a representative sample size

• Process is entirely dependent on completeness and accuracy of documentation

• Difficult to risk-adjust outcomes measures

Source: Oncology Roundtable interviews and analysis.

Page 19: CMS Proposed Reporting


Step #5: Exhaust Benchmarking Opportunities

A Growing External Arsenal

Sources of Oncology Benchmarks1

• ASCO QOPI: Provides comparative practice results and aggregate data for 73 quality measures. Access: must participate to view data. Submission cycle: biannually. Most recent data: Spring 2008.

• CP3R: Offers comparative performance on concordance with the guideline for adjuvant chemotherapy for Stage III colon cancer. Access: must be accredited by the Commission on Cancer. Submission cycle: annually. Most recent data: 2006.

• e-QuIP: Compares individual hospital performance on concordance with five NQF-endorsed measures in order to establish a baseline for improvement. Access: must be accredited by the Commission on Cancer. Submission cycle: annually. Most recent data: 2006.

• IMPAC Oncology Data Alliance: Provides benchmark and comparative outcome reports. Access: must be an IMPAC Mosaiq EMR client. Submission cycle: annually. Most recent data: 2005.

• IMPAC National Oncology Database (NODB): Offers ad hoc and customized comparative reports about the incidence, distribution, treatment, and outcome of cancer cases. Access: must be an IMPAC registry client. Submission cycle: annually. Most recent data: 2005.

• National Cancer Database: Provides benchmark reports for the 11 solid-organ tumors, allowing users to define queries based on patient gender, age, ethnicity, histology, stage, first-course therapy, type of surgical resection, hospital type, and geographic location; NCDB also offers five-year survival reports stratified by AJCC staging. Access: publicly available. Submission cycle: annually. Most recent data: 2005.

• National Consortium of Breast Centers (NCBC) National Quality Measures for Breast Centers Program (NQMBC): Provides "apples to apples" comparative performance on breast measures to centers of similar size and type. Access: must participate to view data. Submission cycle: biannually. Most recent data: 2008.

• Oncology Roundtable 2004 Benchmarking Clinical Quality and 2005 Clinical Quality Strategy: Various benchmarks. Access: available at no cost to members. Submission cycle: N/A. Most recent data: N/A.

• Society of Thoracic Surgeons (STS) General Thoracic Surgery Database: Collects data on surgical management of primary lung tumors, including complication rates, median operating time, 30-day and in-hospital mortality rates, stage distribution, and median length of stay. Access: must participate to access data; annual participation fee per surgeon. Submission cycle: biannually. Most recent data: 2007.

• Surveillance, Epidemiology, and End Results (SEER): Five-year survival rates, relative survival rates by year of diagnosis, mortality by tumor site, incidence by tumor site. Access: publicly available. Submission cycle: annually. Most recent data: 2005.

Source: Oncology Roundtable interviews and analysis.
1 Full list of indicators available in the Appendix.


56 The New Quality Mandate

An important consideration when utilizing benchmarking data is ensuring the validity of comparisons with like organizations. Detailed here is a benchmarking initiative sponsored by the National Consortium of Breast Centers (NCBC), which allows participating centers to compare performance on 33 discrete quality indicators against organizations with similar services, volumes, and patient demographics.

In utilizing the online tool, participants first define which services are actually offered at their facility and which are referred out. Next, they answer a number of questions that help to categorize them into one of five types of centers: a screening breast center, a diagnostic breast center, a clinical breast center, a treatment breast center, or a comprehensive breast center. These designations later serve as the "like" groups for comparing performance on each quality indicator. Finally, the user enters performance data for up to 33 quality indicators intended to reflect the quality of breast care delivered at the facility.

Case Study: National Consortium of Breast Centers

Comparing "Apples to Apples"

Step 1: User defines service offerings at breast center

Step 2: Additional questions about the center answered, allowing for comparison with "like" centers

Additional Questions

2. Regional and district location

3. Location of patient population

4. Number of mammograms performed annually

5. Number of new breast cancer patients treated or diagnosed annually

6. Patient population demographics

7. Type of ownership, administration, and oversight

Step 3: User enters performance data for up to 33 indicators

Source: National Consortium of Breast Centers, available at: www.breastcare.org, accessed August 28, 2008; Oncology Roundtable interviews and analysis.


Highlighted here is a sample output of the NCBC benchmarking tool. After each six-month submission cycle, the tool provides the mean, median, and mode for each indicator and profiles the organization's performance relative to the top and bottom performers in the cohort. Participants can also view their data over time, allowing for quick identification (and correction) of a slide in performance.

Case Study: National Consortium of Breast Centers

Instant Gratification

Six-Month Snapshot

• Provides aggregate of all responses for one data submission cycle

• Displays score relative to top and bottom performers

Trend Data

• Performance trended and compared over time

• Actual values for each data cycle

Source: National Consortium of Breast Centers, available at: www.breastcare.org, accessed August 24, 2008; Oncology Roundtable interviews and analysis.


Key Takeaways Regarding Metric Selection

Realistic Metric Selection a Must

With endless metric options and limited resources, hospitals must apply a reality check to the metric selection process; assessing metrics against the screens of meaningfulness, reliability, feasibility, scientific support, and communicability is the first step to ensuring organizations select metrics with the greatest utility.

Extend Involvement Outside of Traditional Circles

While oncology staff are essential players in the metric selection game, their involvement alone may be insufficient; including individuals from the tumor registry and the IT department not only helps to weed out metrics that may not be feasible to collect, but also allows for timely and efficient modification of existing data sources and streamlined data collection.

Piggyback on National Efforts

Although consensus on a uniform and robust set of nationally endorsed oncology metrics is still years away, national efforts are not all for naught; oncology programs should continue to monitor the metrics proposed by national organizations and adopt those with the greatest overlap in consensus, as they have passed rigorous scrutiny and will likely be the building blocks of a federal pay-for-performance program.

Minimum Set of Metrics Essential for All

When choosing oncology metrics, hospitals should, at a minimum, commit to a core set of indicators: first, patient safety, as every patient has the right to safe and accurate care; and second, documentation, as without accurate data capture, quality measurement is impossible.

Choose Additional Measures Based on Impact Opportunity

Moving beyond the basics, additional indicators should be screened for the impact they will ultimately have on care quality; choosing metrics within the hospital's control, as well as those that address past deficiencies or dramatically affect key constituencies (patients, physicians, payers), is essential to optimizing the return on investment.

Balance Metric Selection Both Within and Outside of Quality Realm

Given the breadth and depth of indicators available for measurement, hospitals should seek balance among metrics; cancer programs should consider spreading metrics across various tumor sites and points in the care continuum (to ensure a balanced view of program quality) while keeping the total number of quality metrics in check, as quality is just one of many program attributes that must be monitored to ensure the health of the service line.

Leverage External Benchmarking Opportunities

In light of the resources required to monitor the literature and conduct internal audits, oncology programs should utilize external benchmarking opportunities whenever possible; tapping into an existing cancer network or accessing external databases will help programs efficiently and effectively understand their performance and uncover areas for improvement.
