TRANSCRIPT
Assessment and Feedback on Clinical
Processes: Questioning Results and Conclusions
Mariam “Aria” Kwamin Health Analysis Department
Navy and Marine Corps Public Health Center
2012 Navy Medicine Audit Readiness Training Symposium
Disclaimer
The views expressed in this presentation are those of the author and do not necessarily reflect the official policy or position of the Department of the Navy, Department of Defense, or the U. S. Government.
Learning Objectives
Introduce NMCPHC’s Health Analysis Department
Analyzing medical informatics through performance measurement
How results can be misinterpreted
How NMCPHC develops performance metrics
Health Analysis Department
Who We Are
Department within the Population Health Directorate of the Navy and Marine Corps Public Health Center, Portsmouth, VA
Epidemiologists
Program Manager
Technical Affairs Officer
Biostatistician
Physician Lead
Navy Tumor Registry Consultant
Health Analysis Department
Mission
Provide epidemiologic expertise and leadership to improve the value of Navy healthcare through evidence-based methods and clinical health analysis.
Health Analysis Department
Functions
Increase implementation of evidence-based care throughout Navy Medicine
Develop performance metrics focusing on the improvement of clinical processes
Review policy, programs, and publications relating to metrics and management of health services outcomes
Develop MTF-specific tools and services to promote efficient resource allocation and services
Goal: Improve Processes and Promote Positive Health Outcomes
Analyzing Medical Informatics Through Performance Measurement
Who Wants Healthcare Quality Measured?
Purchaser of Care: US Government: interested in direct vs. purchased care cost, quality of care, clinical efficiency
Governing Bodies: BUMED: Identify areas of excellence, as well as opportunities for improvement; determine whether there are procedures or areas that put patients, MTFs, or providers at risk
Providers: Healthcare personnel care about encounters for a certain disease and how well that cohort is being managed; how to identify and prevent diseases based on certain conditions; and how to use evidence-based information in their practice
Health Care Improvements Using Performance Measurements
Performance Measurements: Measure & Data Collection
Performance Measurements: Design, Assess, Improve
Design/Redesign: A systematic process involving appropriate scientific methodology and data availability; redesign the methodology as needed to accurately assess and answer questions about a program.
Assess: Translate available data into aggregated analyses from which conclusions can be drawn about performance and the improvement process.
Improve: Evaluated after the analysis phase, with recommendations on how improvements can be achieved.
Information Flow
[Diagram: information flows from real-time data collection in transactional databases (examples: CHCS, AHLTA, DEERS, PDTS, CDR), through storage and data enhancement in the MDR data marts, to end-user applications for queries and analysis (M2, former CDM, ICDB, COHORT, MHSPHP, CarePoint, PHN Dashboard)]
Types of MHS Data Availability

Data Type              CHCS     M2          MDR
Allergies              Yes      No          No
Lab                    In/Out   Financial   Financial
Radiology              In/Out   No          Financial
Appointments           Yes      Yes         Yes
Enrollment             Yes      Yes         Yes
ICD & CPT              In/Out   Out <= 10   Date of Injury
Pharmacy               Yes      Yes         Yes
Network Care           No       Yes         Yes
Readiness Labs         No       No
Height/Weight Vitals   No       No          Yes
Quality of Database
Data Availability: Metric development is dependent on the type of data that is available.
Data Validity: Usefulness of the measures in performance assessment and improvement.
Data Completeness: Useful in gathering retrospective data over longer time periods more quickly than in a prospective study.
Data Reliability: Data used for analysis is only as good as the accuracy of the data entry that produced it.
Poor data creates poor interpretations and undermines even the most sophisticated assessment tools.
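The first two dimensions can be screened mechanically before any metric is computed. The sketch below is illustrative only; the record fields, the "head injury" code families, and the rules are hypothetical examples, not an actual MHS schema:

```python
# Illustrative data-quality screen for a batch of encounter records.
# Field names and validity rules are hypothetical, not an MHS schema.

REQUIRED_FIELDS = ("patient_id", "encounter_date", "icd_code")
VALID_ICD_PREFIXES = ("8", "9", "S", "T")  # hypothetical diagnosis families

def completeness(records):
    """Fraction of records with every required field populated."""
    complete = sum(
        1 for r in records
        if all(r.get(f) for f in REQUIRED_FIELDS)
    )
    return complete / len(records) if records else 0.0

def validity(records):
    """Fraction of coded records whose ICD code starts with an expected prefix."""
    coded = [r for r in records if r.get("icd_code")]
    valid = sum(1 for r in coded if r["icd_code"].startswith(VALID_ICD_PREFIXES))
    return valid / len(coded) if coded else 0.0

records = [
    {"patient_id": "A1", "encounter_date": "2012-01-05", "icd_code": "850.0"},
    {"patient_id": "A2", "encounter_date": "2012-01-06", "icd_code": ""},
    {"patient_id": "A3", "encounter_date": "2012-01-07", "icd_code": "V70.0"},
]

print(f"completeness: {completeness(records):.2f}")  # 2 of 3 records complete
print(f"validity: {validity(records):.2f}")          # 1 of 2 coded records valid
```

Running screens like these before analysis flags the incomplete and invalidly coded records that would otherwise distort any downstream metric.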
Misinterpretation of Results
Quality of Data:
Chance variability may falsely identify outliers for praise or blame
Changes in data recording over time: reports may show apparent improvement or deterioration
Errors in the transfer process from one data system to another
Poor data handling procedures and processes may cost organizations a lot of money overall
Source: Modified from Quality improvement research: understanding the science of change in healthcare
The Impact of Misinterpreting Results
Potential to negatively impact an MTF, region, or Service
Monetary:
Decrease in available budget
Inaccurate demand forecasting for business decisions
Productivity:
Under- or overestimation of the frequency with which a procedure or service was delivered (Quality of Care)
Inaccurate staff availability/utilization estimates (Access to Care)
Surveillance:
Under- or overestimation of disease/condition prevalence
NMCPHC Developing Performance Metrics
SMART Test:
S = Specific: clear and focused to avoid misinterpretation
M = Measurable: can be quantified and compared to other data
A = Attainable: achievable, reasonable, and credible under the conditions expected
R = Realistic: fits into NAVMED constraints
T = Timely: doable within the time frame given
Quality Of Performance Metric
Source: Modified from DEVELOPING PERFORMANCE METRICS – UNIVERSITY OF CALIFORNIA APPROACH
Classification of Performance Metrics
Source: Modified from DEVELOPING PERFORMANCE METRICS – UNIVERSITY OF CALIFORNIA APPROACH
NMCPHC Example #1: Evaluation of a WII Program
Project Comprehensive Aesthetic Restorative Effort (C.A.R.E.) process flow chart (NMW and NME)
Sites: NMC San Diego; Walter Reed National Military Medical Center; NMC Portsmouth

[Flow chart: all patients are eligible for Project CARE: Plastic Surgery, Dermatology, ENT/Facial Plastics, Oculoplastics, and Oral Surgery patients, and Comprehensive Combat & Complex Casualty Care Center (C5) participants. Depending on the site, the patient becomes a Project CARE participant and the process of care is coordinated by a case manager, by a case manager or the Project CARE project manager, or by a medical doctor.]
Project C.A.R.E. Decision Tree
[Decision tree: Start → AD patient? → Wounded, Ill, or Injured? → Signature Diagnosis? → Sufficiently Complex? A "Yes" at each step continues to the next question, and a "Yes" at the final step makes the patient Eligible; a "No" at any step makes the patient Not Eligible.]
How to identify Project C.A.R.E. participants?
MEPRS code ELA2 was the proposed code to capture project C.A.R.E. participants.
ELA2 is a MEPRS code that case managers use when they consult with their wounded warrior patients.
According to the DoD Coding Guidance for case management services, ELA2 is designated for Global War On Terrorism (GWOT) funded warriors in transition.
However, there is an issue with using this MEPRS code as the designated code to identify project C.A.R.E. participants. Why?
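The mismatch can be seen in a toy example: filtering encounters on ELA2 alone captures every GWOT warrior-in-transition case-management visit, not just Project C.A.R.E. participants, and misses participants whose care is not coordinated by a case manager. The records below are hypothetical, constructed only to illustrate the over- and under-capture:

```python
# Hypothetical encounter records illustrating why MEPRS code ELA2 cannot
# uniquely identify Project C.A.R.E. participants: case managers use ELA2
# for all GWOT-funded warriors in transition, not only Project CARE.

encounters = [
    {"patient_id": "P1", "meprs": "ELA2", "program": "Project CARE"},
    {"patient_id": "P2", "meprs": "ELA2", "program": "other WII case management"},
    {"patient_id": "P3", "meprs": "ELA2", "program": "other WII case management"},
    # A participant whose care is coordinated by a medical doctor may
    # never receive an ELA2-coded visit at all (MEPRS code hypothetical).
    {"patient_id": "P4", "meprs": "BDA1", "program": "Project CARE"},
]

captured = [e["patient_id"] for e in encounters if e["meprs"] == "ELA2"]
actual = [e["patient_id"] for e in encounters if e["program"] == "Project CARE"]

print(captured)  # ['P1', 'P2', 'P3'] -- ELA2 sweeps in non-participants
print(actual)    # ['P1', 'P4']       -- and misses P4 entirely
```

A code used as a participant identifier has to be both specific to the program and applied on every path of the process flow; ELA2 is neither.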
Project C.A.R.E. process flow chart (NMW and NME)
Sites: NMC San Diego; Walter Reed National Military Medical Center; NMC Portsmouth

[Flow chart: the same eligibility and coordination flow as before, annotated with MEPRS codes: case-manager-coordinated visits map to ELA2, while the other coordination paths are marked "?", having no designated code.]
Project C.A.R.E. Recommendations
Surveillance: How to track project C.A.R.E. patients?
Measure the process of project C.A.R.E.: a designated code for project C.A.R.E. patients
Measure the outcome of project C.A.R.E.: the Derriford Appearance Scale (DAS24)
NMCPHC Example #2: Traumatic Brain Injury Metrics
mTBI Metrics
Wounded, Ill, and Injured (WII) Program: Navy Medicine effort to monitor and improve the care offered to wounded, ill, and injured service members and their families
mTBI Metrics
TBI Screening: Percent of Coded Head Injury/Trauma Patients Coded as Screened for TBI
Co-Occurring Conditions Screen: Percent of Coded mTBI Patients Coded as Screened for Co-Occurring Conditions
Six-Week Follow-Up Visit: Percent of Coded mTBI Patients with Follow-Up within Six Weeks
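Each of these metrics follows the same percent-of-coded-patients pattern. A minimal sketch of that pattern, in which the record fields, flags, and 42-day window logic are hypothetical stand-ins rather than the actual MDR/M2 schema:

```python
# Sketch of the "percent of coded patients" pattern behind the mTBI
# metrics. Record fields and flags are hypothetical, not an MHS schema.
from datetime import date

patients = [
    {"id": "T1", "head_injury_coded": True,  "tbi_screen_coded": True,
     "index_date": date(2012, 3, 1), "followup_date": date(2012, 3, 20)},
    {"id": "T2", "head_injury_coded": True,  "tbi_screen_coded": False,
     "index_date": date(2012, 3, 5), "followup_date": None},
    {"id": "T3", "head_injury_coded": False, "tbi_screen_coded": False,
     "index_date": date(2012, 3, 9), "followup_date": None},
]

def percent(numerator, denominator):
    """Numerator and denominator are patient lists; returns a percentage."""
    return 100.0 * len(numerator) / len(denominator) if denominator else 0.0

# TBI Screening: percent of coded head injury/trauma patients who are
# also coded as screened for TBI.
denom = [p for p in patients if p["head_injury_coded"]]
num = [p for p in denom if p["tbi_screen_coded"]]
print(f"TBI screening rate: {percent(num, denom):.0f}%")  # 50%

# Six-week follow-up: percent of the same coded cohort with a follow-up
# visit within 42 days of the index encounter.
followed = [
    p for p in denom
    if p["followup_date"] and (p["followup_date"] - p["index_date"]).days <= 42
]
print(f"Six-week follow-up rate: {percent(followed, denom):.0f}%")  # 50%
```

Note that both numerator and denominator depend entirely on coding: a patient who was screened but never coded as screened, like a head injury never coded at all, is invisible to the metric.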
mTBI Metrics (cont’d)
Metric Definitions: The Department of Veterans Affairs (VA) and Department of Defense (DoD) Concussion and mTBI Clinical Practice Guideline (CPG)
Coding Guidance: Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE PH/TBI)
Navy Medicine TBI subject matter experts (SMEs)
General Coding Guidance & CPG Recommendation
mTBI Metrics (cont’d)
CPG Recommendation: "Regardless of the time that has elapsed since injury, management should begin with the patient's first presentation for treatment," and head injury cases should be screened for TBI.
All Head Injury Cases: 5%
Active Duty Only: 7%
mTBI Metrics (cont’d)
Not using codes?
Not screening for TBI?
Actual process different than the recommendation?
Conclusion
Introduce NMCPHC's Health Analysis Department
Analyzing medical informatics through performance measurement
How results can be misinterpreted
How NMCPHC develops performance metrics
QUESTIONS