Use of routinely collected service delivery and M&E indicator data for timely feedback

Denis Nash, PhD, MPH
Associate Professor of Epidemiology
Director, ICAP M&E Unit
Mailman School of Public Health, Columbia University, NYC, USA
[email protected]
Common M&E Challenges in Scale-up (1)

- Large number of sites, with relevant information residing with multiple individuals
  - e.g., sites, districts, partner country teams, partner HQ, etc.
- Increasingly complex array of services to report on and evaluate
  - Collection, management, and use of indicator data within country
- Traditionally siloed areas of reporting for program activities that are integrated at the site level
  - e.g., care and treatment, PMTCT, TB/HIV, testing & counseling
- Separate M&E reports for each program area
- Comprehensive program evaluation? Triangulation?
- MOH vs. donor reporting requirements
- Many important aspects of implementation and program quality are not captured in conventional, routinely collected M&E indicators
  - Generally, M&E systems do not take context into account
Common M&E Challenges in Scale-up (2)

- Providing timely data processing and feedback of information to implementation staff for program improvement
  - National level (i.e., technical and management staff, IPs)
  - District level
  - Site level (and below)
- Program improvement ultimately happens, and most often starts, at the site level
- Integrated data management
  - An adequate database to house M&E indicator data is essential
  - Capture, store, process, and use reported data in a streamlined and efficient way
  - Dynamic and flexible to accommodate changes in indicators
- Data quality
  - Missing or incomplete data
  - Incorrect data
- Demand for indicators that reflect quality of care/program
  - M&E indicators do not typically measure quality of care/program
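The data-quality problems above (missing or incomplete data, incorrect data) lend themselves to simple automated checks at the point of data capture. A minimal sketch, with a hypothetical record layout and field names (not ICAP's actual tooling):

```python
def check_record(record):
    """Return a list of data-quality flags for one reported indicator record.

    The required fields and consistency rules below are illustrative
    examples of the kinds of checks an M&E system might run.
    """
    flags = []
    # Missing or incomplete data
    required = ["site_id", "period", "patients_enrolled", "patients_on_art"]
    for field in required:
        if record.get(field) in (None, ""):
            flags.append(f"missing: {field}")
    # Incorrect data: out-of-range or internally inconsistent values
    enrolled = record.get("patients_enrolled")
    on_art = record.get("patients_on_art")
    if isinstance(enrolled, int) and enrolled < 0:
        flags.append("incorrect: patients_enrolled is negative")
    if isinstance(enrolled, int) and isinstance(on_art, int) and on_art > enrolled:
        flags.append("incorrect: patients_on_art exceeds patients_enrolled")
    return flags

# Example: a complete record with one internal inconsistency
report = {"site_id": "S001", "period": "2010-Q1",
          "patients_enrolled": 120, "patients_on_art": 150}
print(check_record(report))  # ['incorrect: patients_on_art exceeds patients_enrolled']
```

Running checks like these before data leave the site (rather than at HQ) shortens the correction loop considerably.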
Feeding data back to programs in the form of information

- Scale-up, the sheer number of sites, and geographic spread make regular and timely feedback challenging, especially at the site level
- Information is needed at multiple levels
  - For implementation teams at national and district levels
    - Which sites should scarce mentoring and implementation support resources focus on?
    - Are efforts to maximize quality of care having an impact?
  - For site staff
    - How is our site doing? Where can we improve?
    - Are our efforts to improve things working?
- Difficult to do without some form of automation (e.g., reports) and decentralization of information (e.g., web-based)
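The "which sites to focus on" question above is, at its core, a ranking problem over routinely reported indicators. A minimal sketch, using a hypothetical 12-month retention indicator and an illustrative target (the site names and threshold are not from the presentation):

```python
# Rank sites by one quality indicator so scarce mentoring resources
# go to the lowest performers first. Values are hypothetical
# 12-month retention proportions per site.
sites = {
    "Site A": 0.91,
    "Site B": 0.64,
    "Site C": 0.78,
    "Site D": 0.55,
}

THRESHOLD = 0.70  # illustrative national target

# Sites below target, worst first
priority = sorted(
    (name for name, retention in sites.items() if retention < THRESHOLD),
    key=lambda name: sites[name],
)
print(priority)  # ['Site D', 'Site B']
```

In practice a dashboard would combine several indicators, but the logic of automated triage against a target is the same.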
Number of sites by country, as of March 31, 2010
(Source: ICAP Site Census, March 2010)

| Country       | Sites |
|---------------|-------|
| Tanzania      | 405   |
| Kenya         | 153   |
| Mozambique    | 138   |
| Nigeria       | 137   |
| Cote d'Ivoire | 107   |
| South Africa  | 73    |
| Ethiopia      | 68    |
| Rwanda        | 57    |
| Swaziland     | 47    |
| Lesotho       | 34    |

Total number of sites supported by ICAP: 1,219
Priority indicators by site
Examples of feedback tools used by ICAP

- Mainly aimed at providing feedback from ICAP-NY to ICAP country teams on reported data
- Some tools can also be used to feed data back to districts and sites

Examples:
- ICAP URS dashboards and reports
- Maps (static and interactive)
- PFaCTS reports
- Quarterly eUpdate
- Patient-level data reports
Patient-level data
ICAP patient-level data warehouse elements

- Enrollment Table: basic demographic information (age, sex, enrollment date), prior ARV use, point of entry, transfer status
- Visit Table: visit date, WHO stage, height, weight, Hb, ALT, next scheduled visit date
- CD4 Table: CD4 test date, CD4 count, CD4 percent
- ART Table: ART regimen, regimen start and end dates, reason(s) for switching ART regimen
- Medication Table: TB screening date and result, TB medication reason (treatment or prophylaxis) and dates, CTX & fluconazole
- Status Table: patient disposition status (dead, transferred, withdrew, LTF, stopped ART, etc.) and status date
- Pregnancy Table: visit date, weeks gestation at visit, due date, actual pregnancy end date

Baseline: one row per patient. Follow-up data: one row per measure per patient.

*Measures at key points of interest (e.g., enrollment, ART initiation) are calculated based on visit dates.

Databases are anonymized using an automated tool. Data use is governed by MOH-approved protocols.
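The table layout above can be sketched as a relational schema. The column names below are inferred from the slide and are illustrative only, not ICAP's actual schema; only three of the tables are shown:

```python
import sqlite3

# Illustrative relational sketch of the warehouse layout described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE enrollment (          -- baseline: one row per patient
    patient_id      TEXT PRIMARY KEY,   -- anonymized identifier
    age             INTEGER,
    sex             TEXT,
    enrollment_date TEXT,
    prior_arv_use   INTEGER,
    point_of_entry  TEXT,
    transfer_in     INTEGER
);
CREATE TABLE visit (               -- follow-up: one row per visit per patient
    patient_id      TEXT REFERENCES enrollment(patient_id),
    visit_date      TEXT,
    who_stage       INTEGER,
    height_cm       REAL,
    weight_kg       REAL,
    hb              REAL,
    alt             REAL,
    next_visit_date TEXT
);
CREATE TABLE cd4 (                 -- follow-up: one row per CD4 measure
    patient_id  TEXT REFERENCES enrollment(patient_id),
    test_date   TEXT,
    cd4_count   INTEGER,
    cd4_percent REAL
);
""")

# A measure at a key point of interest (e.g., CD4 at enrollment) is then
# derived by joining follow-up rows to the baseline table on dates.
conn.execute("INSERT INTO enrollment VALUES ('p1', 34, 'F', '2010-01-05', 0, 'VCT', 0)")
conn.execute("INSERT INTO cd4 VALUES ('p1', '2010-01-05', 210, 14.0)")
row = conn.execute("""
    SELECT c.cd4_count FROM enrollment e
    JOIN cd4 c ON c.patient_id = e.patient_id
              AND c.test_date = e.enrollment_date
""").fetchone()
print(row[0])  # 210
```

The baseline/follow-up split (one row per patient vs. one row per measure per patient) is what makes the "measures at key points" calculations a simple join on dates.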
Patient-level data feedback reports

- Multi-site feedback reports
  - Combine and compare data across multiple sites
  - One for adult patients and one for pediatric patients
- Site-specific feedback reports
  - General feedback report: summary of information on currently enrolled patients
  - Standards of care (SOC) report: quality-of-care indicators
- Reports are:
  - 100% automated and in PDF format
  - Generated and shared with sites within two weeks of submission of a database
  - Currently generated in NYC at ICAP HQ
  - Report generation tools can be deployed, owned, and maintained by MOHs where capacity exists or where it can be developed
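The automated SOC report boils down to computing quality-of-care indicators per site from patient-level rows and then rendering them. A minimal sketch of the computation step, with a hypothetical indicator (percent of enrolled patients with a documented baseline CD4) and hypothetical field names:

```python
from collections import defaultdict

def soc_indicator(patients):
    """Percent of enrolled patients with a documented baseline CD4, by site.

    Illustrative only: the indicator, field names, and rounding are
    assumptions, not ICAP's actual SOC definitions.
    """
    totals = defaultdict(int)
    with_cd4 = defaultdict(int)
    for p in patients:
        totals[p["site"]] += 1
        if p.get("baseline_cd4") is not None:
            with_cd4[p["site"]] += 1
    return {site: round(100.0 * with_cd4[site] / totals[site], 1)
            for site in totals}

patients = [
    {"site": "Site A", "baseline_cd4": 210},
    {"site": "Site A", "baseline_cd4": None},
    {"site": "Site B", "baseline_cd4": 480},
]
print(soc_indicator(patients))  # {'Site A': 50.0, 'Site B': 100.0}
```

Because the computation is deterministic and fully scripted, the same tool can run at ICAP HQ or be handed to an MOH team, which is what makes the two-week turnaround and the deployment model above feasible.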
Multi-site report
Site-specific general feedback report
PDF format, 100% automated
Site-specific SOC report
Dissemination of patient-level data reports
M&E Indicator data
Integrated data at site level
Filterable home page and program area dashboards
Example of care and treatment dashboard table
Conclusions
- Timely feedback and dissemination of routinely collected service delivery and M&E data is an increasing challenge, especially as the number of sites increases (i.e., scale-up)
  - National, district, site, IPs
- Database tools, automation, and decentralization of information are critical
  - They improve data quality and the utility of information
- Capacity building on interpreting and applying disseminated data to program improvement is needed
Thank you!