Performance Improvement Projects: Validating Process, Tips, and Hints
Eric Jackson, MA, Research Analyst
October 19, 2009


Page 1:

Performance Improvement Projects: Validating Process, Tips, and Hints

Eric Jackson, MA

Research Analyst

October 19, 2009

Page 2:

Overview of the Training
• Meeting Logistics
• Validation Process
• CMS Protocol Activities
• Tips and Hints
• Resources
• Questions and Answers
• Other Training Topics

Page 3:

Meeting Logistics
• Using GoToWebinar™
• How to ask questions to the team during the presentation
  – Through the questions section in the GoToWebinar application
  – You can use the GoToWebinar raise-your-hand option
• We will review submitted and additional questions at the end of the presentation

Page 4:

First, the Basics of GoToWebinar: The GoToWebinar Attendee Interface

1. Viewer Window
2. Control Panel

Page 5:

Validation Process for PIP
• Request desk materials
  – All PIPs currently in progress or completed since the last EQR should be sent
  – Example: NCQA Activity Forms
• After receiving desk materials, three PIPs are selected for validation
• Perform CMS Protocol Activities by reviewing the desk materials we received

Page 6:

CMS Protocol Activities

• Activity One: Assessing the Study Methodology
• Activity Two: Verify Study Findings (optional)
• Activity Three: Evaluating Overall Validity and Reliability of Study Results

Page 7:

Activity One: Assessing the Study Methodology

• Ten steps with related review questions and scoring
  – 27 components/questions and 116 possible points
• Specific questions can be found in the CCME PIP Validation Overview document

Page 8:

Activity One: Assessing the Study Methodology (cont.)

• Each component is scored by the degree to which it meets the protocol requirements (a scoring sketch follows this list)
  – MET: component fully meets the criteria without any issues
  – PARTIALLY MET: component meets some, but not all, of the criteria
  – NOT MET: component fails to meet most or all of the criteria
  – NA: component does not apply to the project being reviewed
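
The slides name the ratings but not the point weighting behind them, which is defined in the protocol and the CCME PIP Validation Overview document. A minimal scoring sketch, assuming full points for MET, half points for PARTIALLY MET, none for NOT MET, and NA components excluded from the calculation (these weightings are illustrative assumptions, not the CCME rules):

```python
# Illustrative only: the rating-to-points weighting is an assumption,
# not the CCME/CMS scoring rules.
SCORE_FACTORS = {"MET": 1.0, "PARTIALLY MET": 0.5, "NOT MET": 0.0}

def score_components(components):
    """components: list of (points_possible, rating) pairs for one PIP."""
    earned, possible = 0.0, 0
    for points, rating in components:
        if rating == "NA":          # NA components are left out entirely
            continue
        earned += points * SCORE_FACTORS[rating]
        possible += points
    return earned, possible

# Hypothetical example with three components worth 7, 10, and 11 points
print(score_components([(7, "MET"), (10, "PARTIALLY MET"), (11, "NA")]))
# (12.0, 17)
```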

Page 9:

Activity One: Step One

• Review the Selected Study Topic(s)
  – How was the topic of the study selected?
  – Is the topic appropriate?

• 3 questions with 7 possible points

Page 10:

Activity One: Step Two

• Review the Study Question(s)
  – Is the study question documented?

• Single question with 10 possible points

Page 11:

Activity One: Step Three

• Review the Selected Study Indicator(s)
  – Are the study indicators objective and clearly defined?
  – Are the study indicators appropriate?

• 2 questions with 11 possible points

Page 12:

Activity One: Step Four

• Review the Identified Study Population
  – Is the study population well defined?
  – Is the population being captured correctly?

• 2 questions with 6 possible points

Page 13:

Activity One: Step Five

• Review the Sampling Methods
  – Applies only if sampling was used in the project
  – Is a valid sampling method being used?
  – Is the sample large enough?

• 3 questions with 20 possible points

Page 14:

Activity One: Step Six

• Review the Data Collection Procedures
  – Are data sources clearly specified?
  – Was a data analysis plan established in the documentation?
  – Did the instruments used for data collection provide consistent, accurate data?

• 6 questions with 18 possible points

Page 15:

Activity One: Step Seven

• Assess Improvement Strategies
  – Are reasonable interventions being planned and implemented?

• Single question with 10 possible points

Page 16:

Activity One: Step Eight

• Review the Data Analysis and Interpretation of Study Results
  – Was the data analysis plan followed?
  – Are numerical results presented accurately and clearly?

• 4 questions with 17 possible points

Page 17:

Activity One: Step Nine

• Assess Whether Improvement is “Real” Improvement
  – Is there any documented, quantitative improvement?
  – Was the same methodology used for baseline and repeated measurement?

• 4 questions with 12 possible points

Page 18:

Activity One: Step Ten

• Assess Sustained Improvement
  – Is sustained improvement demonstrated in the repeated measurements of the study?

• Single question with 5 possible points
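
The question counts and point values listed for Steps One through Ten can be checked against the Activity One totals on Page 7 (27 components/questions and 116 possible points). A quick check in Python using only the numbers from these slides:

```python
# (questions, possible points) for Activity One, Steps One through Ten,
# taken directly from the preceding slides.
steps = {
    "Step One": (3, 7),    "Step Two": (1, 10),   "Step Three": (2, 11),
    "Step Four": (2, 6),   "Step Five": (3, 20),  "Step Six": (6, 18),
    "Step Seven": (1, 10), "Step Eight": (4, 17), "Step Nine": (4, 12),
    "Step Ten": (1, 5),
}

total_questions = sum(q for q, _ in steps.values())
total_points = sum(p for _, p in steps.values())
print(total_questions, total_points)  # 27 116, matching the Page 7 totals
```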

Page 19:

Activity Two: Verify Study Findings

• Optional Activity
• Requires the plans to produce the data that generated the results of the PIP
• Review would try to mimic the documented results from the data received from the plan

Page 20:

Activity Three: Evaluating Overall Validity and Reliability of Study Results

• Scores are summarized
• Validation Finding is calculated
  o VF = (score the project received / total possible points)
  o Multiply by 100 to report as a percentage

• Final Audit Designation is assigned

Page 21:

Activity Three: Evaluating Overall Validity and Reliability of Study Results (cont)

• Score ranges for the Final Audit Designation (a calculation sketch follows this list)

AUDIT DESIGNATION POSSIBILITIES
– High Confidence in Reported Results (90%–100%): little to no minor documentation problems or issues, none of which lower confidence in what the plan reports.
– Confidence in Reported Results (70%–89%): minor documentation or procedural problems that could impose a small bias on the results of the project.
– Low Confidence in Reported Results (60%–69%): the plan deviated from or failed to follow its documented procedure in a way that data were misused or misreported, introducing major bias in the reported results.
– Reported Results NOT Credible (below 60%): major errors that put the results of the entire project in question.
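
The Validation Finding from Page 20 and the score ranges above combine into a simple calculation. A minimal sketch in Python; the function names and the handling of boundary values are illustrative assumptions, not the CCME tool itself:

```python
# Sketch of the Validation Finding (Page 20) and Final Audit Designation
# ranges (Page 21). Names and boundary handling are illustrative.

def validation_finding(score_received, total_possible):
    """VF = (score the project received / total possible points) x 100."""
    return 100.0 * score_received / total_possible

def audit_designation(vf_percent):
    if vf_percent >= 90:
        return "High Confidence in Reported Results"
    if vf_percent >= 70:
        return "Confidence in Reported Results"
    if vf_percent >= 60:
        return "Low Confidence in Reported Results"
    return "Reported Results NOT Credible"

# Example: a project scoring 98 of the 116 possible points
vf = validation_finding(98, 116)             # about 84.5%
print(round(vf, 1), audit_designation(vf))   # 84.5 Confidence in Reported Results
```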

Page 22:

Tips and Hints
• Remember the Study Questions in your project documentation!
• Document the numerators and denominators along with the rates
• Double-check rate calculations

Page 23:

Tips and Hints (cont.)

• Check your terminology
  o Percent change is not the same as percentage point change (see the sketch below)
    • Example: Baseline = 55%, Re-measurement = 95%
    • Percent Change = (95% - 55%) / 55% ≈ 73% change
    • Percentage Point Change = 95% - 55% = 40 percentage point change
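
The same distinction written as two small helper functions (a minimal sketch; the names are illustrative):

```python
def percent_change(baseline, remeasurement):
    """Relative change: (95 - 55) / 55 = 0.727..., i.e. about a 73% change."""
    return 100.0 * (remeasurement - baseline) / baseline

def percentage_point_change(baseline, remeasurement):
    """Absolute difference: 95 - 55 = 40 percentage points."""
    return remeasurement - baseline

print(round(percent_change(55, 95)))       # 73  (percent change)
print(percentage_point_change(55, 95))     # 40  (percentage point change)
```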

Page 24:

Resources
• Go to: www.thecarolinascenter.org
  – Search for SC EQR
  – Training materials should be one of the top results
• CCME PIP Validation Overview document
• CMS PIP Protocol
  o “Validating Performance Improvement Projects: A protocol for use in conducting Medicaid external quality review activities”

Page 25:

Questions and Answers

Page 26:

Additional Training Topics

Page 27:

Please Remember the Evaluation!

It will display after you end the webinar.