TRANSCRIPT
Performance Improvement Projects Technical Assistance
Prepaid Mental Health Plans
Thursday, March 29, 2007, 10:30 a.m. – 12:15 p.m.
Cheryl L. Neel, RN, MPH, CPHQ
Manager, Performance Improvement Projects
David Mabb, MS
Sr. Director, Statistical Evaluation
Presentation Outline
• PIP Overall Comments
• Aggregate MCO PIP Findings
• Aggregate PMHP Specific Findings
• Technical Assistance with Group Activities
– Study Design
– Study Implementation
– Quality Outcomes Achieved
• Questions and Answers
Key PIP Strategies
1. Conduct outcome-oriented projects
2. Achieve demonstrable improvement
3. Sustain improvement
4. Correct systemic problems
Validity and Reliability of PIP Results
• Activity 3 of the CMS Validating Protocol: Evaluating overall validity and reliability of PIP results:
– Met = Confidence/High confidence in reported PIP results
– Partially Met = Low confidence in reported PIP results
– Not Met = Reported PIP results not credible
Summary of PIP Validation Scores
Percentage Score of
Evaluation Elements Met   HMO   NHDP   PMHP   Total
90% to 100%                 2      0      0       2
80% to 89%                 13      0      6      19
70% to 79%                 13      1      0      14
60% to 69%                  8      3      1      12
Less than 60%              17     16      1      34
Total                      53     20      8      81
Proportion of PIPs Meeting the Requirements for Each Activity
[Bar chart: Aggregate Valid Percent Met (0%–100%) for each of the 53 evaluation elements, numbered 1–53 and grouped by Activities I–X.]
PMHP Specific Findings
• 8 PIPs submitted
• Scores ranged from 37% to 89%
• Average score was 77%
• Assessed evaluation elements were scored as Met 78% of the time
Summary of PMHP Validation Scores
Percentage Score of
Evaluation Elements Met   PMHP
90% to 100%                  0
80% to 89%                   6
70% to 79%                   0
60% to 69%                   1
Less than 60%                1
Total                        8
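The validation scores above are simple ratios: each evaluation element is rated Met, Partially Met, Not Met, Not Applicable, or Not Assessed, and a PIP's score is the share of assessed elements rated Met. A minimal sketch of that arithmetic, using hypothetical ratings rather than actual HSAG data:

```python
# Sketch of PIP validation scoring: percent of assessed elements rated
# Met, where Not Applicable / Not Assessed elements are excluded from
# the denominator. The ratings below are hypothetical illustration data.
from collections import Counter

ratings = (["Met"] * 14 + ["Partially Met"] * 2 + ["Not Met"] * 2
           + ["Not Applicable"] * 4)

counts = Counter(ratings)
assessed = [r for r in ratings if r not in ("Not Applicable", "Not Assessed")]
score = 100 * counts["Met"] / len(assessed)

print(f"{counts['Met']} of {len(assessed)} assessed elements Met ({score:.0f}%)")
# prints: 14 of 18 assessed elements Met (78%)
```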
Study Design
Four Components:
1. Activity I. Selecting an Appropriate Study Topic
2. Activity II. Presenting Clearly Defined, Answerable Study Question(s)
3. Activity III. Documenting Clearly Defined Study Indicator(s)
4. Activity IV. Stating a Correctly Identified Study Population
Activity I. Selecting an Appropriate Study Topic - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–6; bar labels: 88%, 88%, 88%, 100%, 100%, 88%.]
Activity I. Selecting an Appropriate Study Topic
Results:
• 92 percent of the six evaluation elements were Met
• 8 percent were Partially Met or Not Met
• None of the evaluation elements were Not Applicable or Not Assessed
Activity I: Review the Selected Study Topic
HSAG Evaluation Elements:
• Reflects high-volume or high-risk conditions (or was selected by the State).
• Is selected following collection and analysis of data (or was selected by the State).
• Addresses a broad spectrum of care and services (or was selected by the State).
• Includes all eligible populations that meet the study criteria.
• Does not exclude members with special health care needs.
• Has the potential to affect member health, functional status, or satisfaction.
Bolded evaluation elements show areas for improvement
Activity II. Presenting Clearly Defined, Answerable Study Question(s) - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–2; both 0%.]
No PIP studies received a Met for both evaluation elements
Activity II. Presenting Clearly Defined, Answerable Study Question(s)
Results:
• 0 percent of the two evaluation elements were Met
• 100 percent were Partially Met or Not Met
• None of the evaluation elements were Not Applicable or Not Assessed
Activity II: Review the Study Question(s)
HSAG Evaluation Elements:
• States the problem to be studied in simple terms.
• Is answerable.
Bolded evaluation elements show areas for improvement
Activity III. Documenting Clearly Defined Study Indicator(s) - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–7; bar labels: 88%, 75%, 0%, 88%, 100%, 0%, 100%.]
Activity III. Documenting Clearly Defined Study Indicator(s)
Results:
• 59 percent of the seven evaluation elements were Met
• 21 percent were Partially Met or Not Met
• 20 percent of the evaluation elements were Not Applicable or Not Assessed
Activity III: Review Selected Study Indicator(s)
HSAG Evaluation Elements:
• Is well defined, objective, and measurable.
• Is based on practice guidelines, with sources identified.
• Allows for the study question to be answered.
• Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
• Has available data that can be collected on each indicator.
• Is a nationally recognized measure, such as HEDIS®, when appropriate.
• Includes the basis on which each indicator was adopted, if internally developed.
Bolded evaluation elements show areas for improvement
Activity IV. Stating a Correctly Identified Study Population - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–3; bar labels: 88%, 0%, 88%.]
Activity IV. Stating a Correctly Identified Study Population
Results:
• 58 percent of the three evaluation elements were Met
• 13 percent were Partially Met or Not Met
• 29 percent of the evaluation elements were Not Applicable or Not Assessed
Activity IV: Review the Identified Study Population
HSAG Evaluation Elements:
• Is accurately and completely defined.
• Includes requirements for the length of a member's enrollment in the managed care plan.
• Captures all members to whom the study question applies.
Bolded evaluation elements show areas for improvement
Group Activity
Study Implementation
Three Components:
1. Activity V. Valid Sampling Techniques
2. Activity VI. Accurate/Complete Data Collection
3. Activity VII. Appropriate Improvement Strategies
Activity V. Presenting a Valid Sampling Technique - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–6; all 100%.]
Activity V. Presenting a Valid Sampling Technique
Results:
• 3 out of the 8 PIP studies used sampling.
• 38 percent of the six evaluation elements were Met.
• 0 percent were Partially Met or Not Met.
• 63 percent of the evaluation elements were Not Applicable or Not Assessed.
Activity V: Review Sampling Methods
* This section is only validated if sampling is used.
HSAG Evaluation Elements:
• Consider and specify the true or estimated frequency of occurrence. (N=8)
• Identify the sample size. (N=8)
• Specify the confidence level to be used. (N=8)
• Specify the acceptable margin of error. (N=8)
• Ensure a representative sample of the eligible population. (N=8)
• Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=8)
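The elements above (estimated frequency of occurrence, sample size, confidence level, margin of error) are tied together by the standard sample-size formula for a proportion. A minimal sketch, assuming a 95% confidence level, a ±5% margin of error, and a hypothetical eligible population of 411; the function name and defaults are illustrative, not an HSAG specification:

```python
# Illustrative sample-size calculation for a proportion, with a finite
# population correction. All parameter values are example assumptions.
import math

def sample_size(population, p=0.5, z=1.96, margin=0.05):
    """Minimum sample size to estimate a proportion p within +/- margin
    at the confidence level implied by z (1.96 ~ 95%)."""
    # Base size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction shrinks n for small populations
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(411))        # 199: small populations need fewer records
print(sample_size(1_000_000))  # 385: the classic figure for large populations
```

Using p = 0.5 is the conservative choice when the true frequency of occurrence is unknown, since it maximizes the required sample.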
Populations or Samples?
Generally,
– Administrative data uses populations
– Hybrid (chart abstraction) method uses samples identified through administrative data
Activity VI. Specifying Accurate/Complete Data Collection - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–11; bar labels: 100%, 100%, 86%, 100%, 100%, 100%, 100%, 100%, 0%, 86%, 43%.]
Activity VI. Specifying Accurate/Complete Data Collection
Results:
• 55 percent of the eleven evaluation elements were Met
• 10 percent were Partially Met or Not Met
• 35 percent of the evaluation elements were Not Applicable or Not Assessed
Activity VI: Review Data Collection Procedures
HSAG Evaluation Elements:
• Clearly defined data elements to be collected. (55 percent Met)
• Clearly identified sources of data. (77 percent Met)
• A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected. (62 percent Met)
• A timeline for the collection of baseline and remeasurement data. (57 percent Met)
• Qualified staff and personnel to collect manual data. (13 percent Met; 77 percent N/A)
• A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications. (13 percent Met; 77 percent N/A)
Bolded evaluation elements show areas for improvement
Activity VI: Review Data Collection Procedures (cont.)
HSAG Evaluation Elements:
• A manual data collection tool that supports interrater reliability. (13 percent Met; 77 percent N/A)
• Clear and concise written instructions for completing the manual data collection tool. (13 percent Met; 77 percent N/A)
• An overview of the study in the written instructions.
• Administrative data collection algorithms that show steps in the production of indicators.
• An estimated degree of automated data completeness (important if using the administrative method).
Bolded evaluation elements show areas for improvement
Where do we look for our sources of data?
Baseline Data Sources
• Medical records
• Administrative claims/encounter data
• Hybrid
• HEDIS
• Survey data
• MCO program data
• Other
Activity VII. Documenting the Appropriate Improvement Strategies - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–4; bar labels: 100%, 100%, 0%, 0%.]
Activity VII. Documenting the Appropriate Improvement Strategies
Results:
• 44 percent of the four evaluation elements were Met
• 3 percent were Partially Met or Not Met
• 53 percent of the evaluation elements were Not Applicable or Not Assessed
Activity VII: Assess Improvement Strategies
HSAG Evaluation Elements:
• Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes.
• System changes that are likely to induce permanent change.
• Revised if original interventions are not successful.
• Standardized and monitored if interventions are successful.
Bolded evaluation elements show areas for improvement
Determining Interventions
Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?
First Do A Barrier Analysis
What did an analysis of baseline results show?
How can we relate it to system improvement?
• Opportunities for improvement
• Determine intervention(s)
• Identify barriers to reaching improvement
How was intervention(s) chosen?
• By reviewing the literature
– Evidence-based
– Pros & cons
– Benefits & costs
• Develop list of potential interventions -- what is most effective?
Types of Interventions
• Education
• Provider performance feedback
• Reminders & tracking systems
• Organizational changes
• Community-level interventions
• Mass media
Choosing Interventions
• Balance
– potential for success with ease of use
– acceptability to providers & collaborators
– cost considerations (direct and indirect)
• Feasibility
– adequate resources
– adequate staff and training to ensure a sustainable effort
Physician Interventions: Multifaceted Most Effective
• Most effective:
– real-time reminders
– outreach/detailing
– opinion leaders
– provider profiles
• Less effective:
– educational materials (alone)
– formal CME programs without enabling or reinforcing strategies
Patient Interventions
• Educational programs
– Disease-specific education booklets
– Lists of questions to ask your physician
– Organizing materials: flowsheets, charts, reminder cards
– Screening instruments to detect complications
– Direct mailing, media ads, websites
Evaluating Interventions
• Does it target a specific quality indicator?
• Is it aimed at appropriate stakeholders?
• Is it directed at a specific process/outcome of care or service?
• Did the intervention begin after the baseline measurement period?
Interventions Checklist
• Analyze barriers (root causes)
• Choose & understand target audience
• Select interventions based on cost-benefit
• Track intermediate results
• Evaluate effectiveness
• Modify interventions as needed
• Re-measure
Group Activity
Quality Outcomes Achieved
Three Components:
1. Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
2. Activity IX. Evidence of Real Improvement Achieved
3. Activity X. Data Supporting Sustained Improvement Achieved
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–9; bar labels: 50%, 0%, 50%, 100%, 50%, 100%, 50%, 50%, 50%.]
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
Results:
• 14 percent of the nine evaluation elements were Met
• 8 percent of the evaluation elements were Partially Met or Not Met
• 78 percent of the evaluation elements were Not Applicable or Not Assessed
Activity VIII: Review Data Analysis and Interpretation of Study Results
HSAG Evaluation Elements:
• Is conducted according to the data analysis plan in the study design.
• Allows for generalization of the results to the study population if a sample was selected.
• Identifies factors that threaten internal or external validity of findings.
• Includes an interpretation of findings.
• Is presented in a way that provides accurate, clear, and easily understood information.
Bolded evaluation elements show areas for improvement
Activity VIII: Review Data Analysis and Interpretation of Study Results (cont.)
HSAG Evaluation Elements:
• Identifies initial measurement and remeasurement of study indicators.
• Identifies statistical differences between initial measurement and remeasurement.
• Identifies factors that affect the ability to compare initial measurement with remeasurement.
• Includes the extent to which the study was successful.
Bolded evaluation elements show areas for improvement
Changes in Study Design?
Study design should be the same as baseline:
– Data source
– Data collection methods
– Data analysis
– Target population or sample size
– Sampling methodology
If changed, the rationale must be specified and appropriate.
Activity IX. Evidence of Real Improvement - PMHP Overall Score
[Bar chart: Percent Met for evaluation elements 1–4; bar labels: 100%, 50%, 50%, 50%.]
Activity IX. Evidence of Real Improvement
Results:
• 16 percent of the four evaluation elements were Met
• 9 percent were Partially Met or Not Met
• 75 percent of the evaluation elements were Not Applicable or Not Assessed
Activity IX: Assess the Likelihood that Reported Improvement is “Real” Improvement
HSAG Evaluation Elements:
• The remeasurement methodology is the same as the baseline methodology.
• There is documented improvement in processes or outcomes of care.
• The improvement appears to be the result of intervention(s).
• There is statistical evidence that observed improvement is true improvement.
Bolded evaluation elements show areas for improvement
Statistical Significance Testing
Time Period   Measurement Period   Numerator   Denominator   Rate or Results   Industry Benchmark   Statistical Testing and Significance
CY 2003       Baseline                   201           411             48.9%                  60%   N/A
CY 2004       Re-measurement 1           225           411             54.7%                  60%   Chi-square = 2.8; p-value = 0.09387

Not significant at the 95% confidence level.
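The test behind the table above can be reproduced with the standard library alone. This is a sketch (the helper name is our own, not from the presentation): a Pearson chi-square test on the two rates, with the df = 1 p-value obtained from the erfc identity.

```python
# Pearson chi-square test (no continuity correction) comparing a
# baseline rate of 201/411 with a remeasurement rate of 225/411,
# matching the figures in the table above.
import math

def chi_square_2x2(num1, den1, num2, den2):
    """Chi-square statistic and p-value (df = 1) for two proportions."""
    table = [[num1, den1 - num1], [num2, den2 - num2]]
    total = den1 + den2
    row = [den1, den2]
    col = [num1 + num2, total - (num1 + num2)]
    # Sum of (observed - expected)^2 / expected over the 2x2 cells
    chi2 = sum((table[i][j] - row[i] * col[j] / total) ** 2
               / (row[i] * col[j] / total)
               for i in range(2) for j in range(2))
    # For df = 1, the chi-square tail equals the two-sided normal tail
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

chi2, p = chi_square_2x2(201, 411, 225, 411)
print(f"chi-square = {chi2:.1f}, p = {p:.4f}")  # chi-square = 2.8, p = 0.0939
```

Since p exceeds 0.05, the improvement from 48.9% to 54.7% is not statistically significant at the 95% confidence level, exactly as the slide concludes.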
Activity X. Data Supporting Sustained Improvement Achieved - PMHP Overall Score
No PMHP reached this Activity
Activity X. Data Supporting Sustained Improvement Achieved
Results:
• 0 percent of the one evaluation element was Met
• 0 percent was Partially Met or Not Met
• 100 percent of the evaluation element was Not Applicable or Not Assessed
Activity X: Assess Whether Improvement is Sustained
HSAG Evaluation Elements:
• Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.
Bolded evaluation elements show areas for improvement
Quality Outcomes Achieved
Baseline → 1st Year (Demonstrable Improvement) → Sustained Improvement
Sustained Improvement
• Modifications in interventions
• Changes in study design
• Improvement sustained for 1 year
HSAG Contact Information
Cheryl Neel, RN, MPH,CPHQ
Manager, Performance Improvement Projects
602.745.6201
Denise Driscoll
Administrative Assistant
602.745.6260
Questions and Answers