
Page 1: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Joseph Telfair, DrPH, MSW/MPH
Professor

Department of Public Health
School of Health and Human Performance
University of North Carolina at Greensboro

Greensboro, NC (USA)
[email protected] ♦ (336) 334-3240

Page 2: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

OVERVIEW OF PRESENTATION

Best Practices/Evidence: MCHB Perspective
Setting the Stage: Why Important, Definitions and Key Concepts
Performance Measurement: Selecting and Constructing Measures
Process Monitoring: Developing a Monitoring System
Concluding Remarks
Questions and Discussion

Page 3: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Tell me... I forget

Show me... I remember

Involve me... I understand

Chinese Proverb

Page 4: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Of Relevance

The MCHB has developed key strategies: broad, cross-cutting approaches the Bureau uses to reach the five-year (and beyond) goals in the Bureau Strategic Plan. Goal 4 of the Strategic Plan is: "Improve the Health Infrastructure and Systems of Care."

One key strategy used to support this goal is: "Using the best available evidence, develop and promote guidelines and practices that improve services and systems of care."

Page 5: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Best Practices/Evidence: MCHB Perspective

(http://mchb.hrsa.gov/about/stratplan03-07.htm)

Page 6: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Best Practices/Evidence (1)

MCHB/AMCHP defines "best practices" as a continuum of practices, programs, and policies ranging from promising to evidence-based to science-based

EVALUATION of best practices requires the identification and establishment of evidence

Page 7: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Evaluating Evidence

Evidence can be evaluated in four categories:
Research
Expert Opinion
Field Lessons
Theoretical Rationale

All best practice approaches reported have a strong conceptual/theoretical rationale. However, the strength of evidence from research, expert opinion, and field lessons falls within a spectrum.

Page 8: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Strength of Evidence Spectrum

Promising Best Practice Approaches:
Research +
Expert Opinion +
Field Lessons +
Theoretical Rationale +++

Proven Best Practice Approaches:
Research +++
Expert Opinion +++
Field Lessons +++
Theoretical Rationale +++

Page 9: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Strength of Evidence Spectrum

Promising Best Practice Approaches:
Little research
A beginning of agreement in expert opinion
Very few field lessons evaluating effectiveness

Proven Best Practice Approaches:
Supported by strong research
Extensive expert opinion from multiple authoritative sources
Solid field lessons evaluating effectiveness
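[Editor's illustration, not part of the original slides.] To make the spectrum concrete, here is a minimal Python sketch that places an approach on the Promising-to-Proven spectrum from its per-category grades. The function name, the grade encoding, and the intermediate "Promising/Proven" rule are illustrative assumptions; only the category names and endpoint labels come from the slides.

```python
# Hypothetical sketch: classify a best-practice approach on the MCHB
# strength-of-evidence spectrum from grades "+" through "+++".
def classify_approach(grades: dict[str, str]) -> str:
    """grades maps the four evidence categories to "+", "++", or "+++"."""
    # Slide 7: all reported approaches have a strong theoretical rationale.
    if grades.get("theoretical_rationale") != "+++":
        return "not reportable (theoretical rationale must be +++)"
    empirical = ("research", "expert_opinion", "field_lessons")
    if all(grades[c] == "+++" for c in empirical):
        return "Proven best practice approach"
    if all(grades[c] in ("++", "+++") for c in empirical):
        return "Promising/Proven"  # assumed midpoint of the spectrum
    return "Promising best practice approach"

# A promising approach: little research, beginning expert agreement,
# very few field lessons, but a strong theoretical rationale.
print(classify_approach({
    "research": "+",
    "expert_opinion": "+",
    "field_lessons": "+",
    "theoretical_rationale": "+++",
}))  # -> Promising best practice approach
```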

Page 10: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Grading Evidence (1)

Research/Evaluation
• + A few studies in public health reporting effectiveness (Promising)
• ++ Descriptive review of scientific literature supporting effectiveness (Promising/Proven)
• +++ Systematic review of scientific literature supporting effectiveness (Proven)

Page 11: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Grading Evidence (2)

Expert Opinion
• + An expert group or general professional opinion supporting the practice (Promising)
• ++ One authoritative source (such as a national organization or agency) supporting the practice (Promising/Proven)
• +++ Multiple authoritative sources (including national organizations, agencies, or initiatives) supporting the practice (Proven)

Page 12: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Grading Evidence (3)

Field Lessons/Promising Practices
• + Successes in state practices reported without evaluation documenting effectiveness (Promising)
• ++ Evaluation by a few states separately documenting effectiveness (Promising/Proven)
• +++ Cluster evaluation of several states (group evaluation) documenting effectiveness (Proven)

Page 13: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Grading Evidence (4)

Practice-based Conceptual/Theoretical Rationale
• +++ Only practices that are linked by strong causal reasoning to the desired outcome of improving the health and total well-being of priority populations will be reported (Proven)

Page 14: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Best Practices/Evidence (2)

MCHB has established that family and community participation and engagement are key to the development of effective, quality health systems and services

Testing of Best Practices to Build Evidence: Deduction to Verification to Induction (and repeats)

Requires a Practical Approach to Evaluation

Page 15: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Setting the Stage

Page 16: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

WHY? (1)

Four primary reasons:

To develop and maintain an effective program and service delivery process at the state and local level

To enhance staff's understanding of the factors that contribute to the extent to which, and in what ways, the specific aims, program service targets, and evaluation objectives are being followed

Page 17: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

WHY? (2)

Four primary reasons (cont.):

To assure staff and stakeholders by putting in place a process for determining whether or not the program and service delivery activities are succeeding as planned

Page 18: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

WHY? (3)

Four primary reasons (cont.):

To build best-practices data by assessing the application of "the best available evidence" (MCHB modified) at four levels:
Existing Research/Evaluation
Expert Opinion
Field Lessons/Promising Practices
Practice-based Conceptual/Theoretical Rationale

Page 19: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (1)

Definition: "Evaluation or program measurement (PM) is a systematic process for staff and institutions to obtain information on the service delivery process, its outcomes, and the effectiveness of its work, so that they can improve the process and describe its accomplishments" (Mattessich, 2003, p. 3) [modified]

Page 20: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (2)

Definition: Program monitoring is the process of assessing progress toward achievement of a service delivery process's objectives to determine whether the process was implemented as planned (Peoples-Sheps & Telfair, 2005; see handout)

Page 21: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (3)

Evaluation or PM involves a comparison of the staff’s planned processes and outcomes with selected standards in order to assess accomplishments

Evaluation or PM involves the application of social science methods to determine whether assessed efforts are the cause of observed results

Page 22: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (4)

Evaluation or PM relies on both qualitative and quantitative methods, and often a triangulation of the two, to produce informative results

Page 23: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (5)

Program monitoring is carried out by assessing the extent to which a program is implemented as designed, which involves tracking progress toward achievement of the program's objectives (Peoples-Sheps & Telfair, 2005)

It is a very traditional form of assessment that is generally considered an administrative function and integral to the ongoing operations of every program (Kettner et al., 1999).

Page 24: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (6)

Definition: A performance measure is a specific, quantitative or qualitative representation (measure) of a capacity, process, or outcome deemed relevant to the assessment of program performance (Peoples-Sheps & Telfair, 2005)

Page 25: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Definitions and Key Concepts (7)

Both program monitoring and performance measures depend on strong, meaningful measures of program and service delivery process performance

Page 26: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

PRACTICE EXERCISE

Questions 1- 4

Page 27: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Performance Measurement

Page 28: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (1)

Deciding what to measure is an essential first step

The aspects of the service delivery process that are measured attract attention and generate action (Hatry, 1999).

Conversely, aspects not measured may go unnoticed until a crisis brings them to the surface (e.g., discovery of inadequate data collection efforts that did not allow for population or service targets to be met)

Page 29: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (2)

If the staff takes the time to think through what is needed, they are much less likely to miss something important

To cover all of the bases, start with the monitoring and evaluation’s specific aim(s) or hypothesis(es) to identify the main program and service delivery efforts and expected outcomes

Page 30: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (3)

To construct performance measures, three tasks must be undertaken:
identifying concepts to be measured
selecting or constructing measures
locating or developing data sources
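[Editor's illustration, not part of the original slides.] As a minimal sketch, the three tasks can be captured in a single record per measure; the class and field names, and the TB example values, are illustrative assumptions rather than anything prescribed by the presentation.

```python
# Hypothetical representation of the three construction tasks as one record.
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    concept: str       # task 1: the concept to be measured
    measure: str       # task 2: the selected or constructed measure
    data_source: str   # task 3: the located or developed data source

tb_mortality = PerformanceMeasure(
    concept="TB-related mortality in the service population",
    measure="TB mortality rate per 100,000 population per year",
    data_source="Vital statistics death records (assumed source)",
)
print(tb_mortality.measure)
```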

Page 31: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (4)

Performance measures can be formulated in many different ways. They may be:
numbers (number of TB deaths)
rates (TB mortality rate)
proportions or percentages (percentage of days missed at work among persons with TB)

Page 32: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (5)

Performance measures can be formulated in many different ways. They may be (cont.):
averages (average number of emergency department visits per person 18 to 44 years of age in a given year)
categories (team meetings held)

Numbers, percentages, and rates are the most frequently used in MCH
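[Editor's illustration, not part of the original slides.] The arithmetic behind these formulations is simple division; the sketch below shows a number, a rate, a percentage, and an average side by side. All counts are made up for illustration; only the formulation types come from the slides.

```python
# Hypothetical counts illustrating the four quantitative formulations.
tb_deaths = 42                                        # a number
population = 1_200_000
tb_mortality_rate = tb_deaths / population * 100_000  # a rate per 100,000

days_missed, work_days = 150, 2_500
pct_days_missed = 100 * days_missed / work_days       # a percentage

ed_visits, persons_18_to_44 = 5_400, 3_000
avg_ed_visits = ed_visits / persons_18_to_44          # an average

print(f"number: {tb_deaths}")
print(f"rate: {tb_mortality_rate:.1f} per 100,000")
print(f"percentage: {pct_days_missed:.1f}%")
print(f"average: {avg_ed_visits:.2f} visits per person")
```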

Page 33: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (6)

Numbers, percentages, and rates are the most frequently used in MCH

Least used, but often just as critical, are qualitative indicators such as consensus measures, aggregated (agreement/disagreement) statements, and archival text-based descriptors (e.g., policy statements and group opinions from advisory or consumer groups)

Page 34: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (7)

It is often helpful to include numbers and qualitative indicators along with rates and percentages so that the latter measures can be understood in the context of the type of service focus for which they were derived

To select or develop high-quality performance measures, candidate measures are generally assessed according to criteria that represent both scientific rigor and practical relevance

Page 35: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (8)

Responsive measures are able to detect a change

Measures need to be understandable to the audience to whom they will be presented

Regardless of how it is formulated, a measure should have very precise wording, a specific timeframe, and a clearly defined research population (e.g., persons with TB, for quantitative measures) or set of tasks (e.g., steps for securing a needed sample, for qualitative measures)

Page 36: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (9)

A performance measure should be meaningful, valid, reliable, responsive, and understandable and should allow for risk adjustments (errors)

Page 37: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (10)

A valid measure is one that measures what it intends to measure. Validity, like all of the qualities in this list, is measured on a continuum, meaning that some measures have greater validity than others

Page 38: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (11)

Reliable performance measures can be reproduced regardless of who collects the data or when they are collected (assuming the true results have not changed)

Like validity, reliability is viewed as a continuum

Page 39: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (12)

The selection of measures is closely tied to the data or research project information available to construct them

Data or information sources should:
Be of high quality, with standardized definitions (as defined and agreed upon by the research team) and data collection methods, and
Have acceptable levels of validity and reliability on the items of interest

Page 40: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Selecting or Constructing Measures (13)

Data or information sources should (cont.):
Be available within the program service delivery timeframe (e.g., 3 years)
Have costs conforming to budgetary constraints of the program

It is more efficient, but not essential, to construct measures from existing, or secondary, data sources rather than to collect new data specifically for a given set of performance measures
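[Editor's illustration, not part of the original slides.] A hedged sketch of this checklist as code; the field names and the rule that all four quality criteria must hold are assumptions for illustration, not a prescription from the presentation.

```python
# Hypothetical checklist for a candidate data source (slides 39-40).
from dataclasses import dataclass

@dataclass
class DataSource:
    standardized_definitions: bool  # agreed upon by the research team
    valid_and_reliable: bool        # acceptable on the items of interest
    available_in_timeframe: bool    # e.g., within a 3-year delivery window
    cost_within_budget: bool
    secondary: bool                 # existing data: more efficient, not essential

def acceptable(source: DataSource) -> bool:
    # Assumed rule: the four quality criteria must all hold;
    # secondary status is an efficiency bonus, not a requirement.
    return (source.standardized_definitions
            and source.valid_and_reliable
            and source.available_in_timeframe
            and source.cost_within_budget)

print(acceptable(DataSource(True, True, True, True, secondary=True)))  # True
```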

Page 41: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

PROCESS MONITORING

Page 42: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

[Figure not reproduced in transcript. Source: Mattessich, P. W. (2003), p. 10]

Page 43: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (1)

Development of a monitoring system is an essential component of a program and service delivery process measurement plan

The monitoring process described in this presentation identifies the program's objectives, the base from which formulas to measure progress are developed

Page 44: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (2)

The monitoring process described in this presentation (cont.):
relative strength or emphasis of a measure is assigned as necessary
data collection plans are developed
achievement scores are calculated at predetermined intervals

Page 45: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (3)

Start with the Aim-linked Objectives

The objectives of a Specific Aim, each of which consists of a performance measure and a target, serve as the foundation for project monitoring

Fully developed, measurable objectives must correspond with the program or service purpose

Performance measures must be developed as the program is being planned

Page 46: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (4)

Each objective should have an explicit date by which the target is to be achieved (see example next slide)

With objectives clearly and precisely stated, the next challenge is to develop a system through which progress towards meeting the program’s targets can be monitored

Page 47: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Performance measures and targets:

Performance measure: Percentage of adults in Village 2, by desired gender and age, within normal range
Target: A 7% increase over baseline (estimated at 80%)

Performance measure: Average amount of time spent collecting staff comments per week by program assistants
Target: Four hours

Performance measure: Number of adults from Village 2 in the project shuttled to and from the city for the purpose of data gathering
Target: Thirty adults sampled 80% of the allocated study days per month
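[Editor's illustration, not part of the original slides.] A minimal sketch of these objective rows as data. Since, as noted above, each objective should carry an explicit target date, a date field is included; the class and field names and the date value are illustrative assumptions.

```python
# Hypothetical representation of the measure/target rows above.
from dataclasses import dataclass
from datetime import date

@dataclass
class Objective:
    performance_measure: str
    target: str
    achieve_by: date  # explicit target date (illustrative value below)

objectives = [
    Objective(
        "Percentage of adults in Village 2, by desired gender and age, "
        "within normal range",
        "A 7% increase over baseline (estimated at 80%)",
        date(2026, 12, 31),  # assumed date
    ),
    Objective(
        "Average time spent collecting staff comments per week "
        "by program assistants",
        "Four hours",
        date(2026, 12, 31),  # assumed date
    ),
]
print(len(objectives), "objectives defined")
```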

Page 48: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (5)

The information derived from monitoring shows which program objectives need more attention in the future and whether any of them require less intensive work

If the process has fallen short on some objectives, this information should trigger an in-depth search for the reasons expected targets were not achieved

Page 49: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (6)

The table on the next slide shows the components of a monitoring system

The first two columns are identical to those in the previous slide showing performance measures and targets

Page 50: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (7)

The remaining three columns represent the basic elements of a monitoring system, as it builds on the program's Specific Aim-linked objectives

See Expanded Matrix (Handout)

Page 51: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Components of a monitoring system (columns: Performance Measure, Target, Formula to Measure Progress, Results at End of Year 1, Achievement Score):

Performance measure: Percentage of adults in Village 2, by desired gender and age, within normal range
Target: A 7% increase over baseline (estimated at 80%)
Formula to measure progress: percentage over baseline with BMI within normal range ÷ 7
Results at end of Year 1: 1.75%
Achievement score: 0.25

Performance measure: Number of adults from Village 2 in the project shuttled to and from the city for the purpose of data gathering
Target: Thirty adults sampled 80% of the allocated study days per month
Formula to measure progress: number of adults sampled 80% of study days ÷ 30
Results at end of Year 1: 24
Achievement score: 0.80

Performance measure: Average amount of time spent collecting comments per week by program assistants
Target: Four hours
Formula to measure progress: number of hours spent collecting comments ÷ 4
Results at end of Year 1: 3.2 hours
Achievement score: 0.80
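[Editor's illustration, not part of the original slides.] The achievement scores in the table follow directly from the actual ÷ target formula defined on the slides below; this small sketch restates only the table's own numbers and reproduces its scores.

```python
# Recompute the table's achievement scores: score = actual / target.
rows = [
    ("Percentage points over baseline (BMI in normal range)", 1.75, 7.0),
    ("Adults sampled 80% of study days",                      24.0, 30.0),
    ("Hours spent collecting comments per week",              3.2,  4.0),
]
for name, actual, target in rows:
    print(f"{name}: {actual / target:.2f}")
# -> 0.25, 0.80, 0.80, matching the Achievement Score column above
```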

Page 52: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (8)

Formulas

The first step in developing a monitoring system is to construct formulas to reflect progress toward achievement of the objectives' targets.

The formula is based on the principle that a score of 1.00 is complete accomplishment

Page 53: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (9)

Formulas (cont.): For example, a score of 0.99 or lower signifies that the performance measure fell short of the target; a score that exceeds 1.00 indicates greater than expected achievement

Page 54: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (10)

Formulas (cont.): Three types of formulas can serve this purpose. When the target is a percentage, proportion, or a simple count, the most informative and frequently used formula involves dividing the level of actual achievement at a specified time by the level given in the target:

Achievement score = actual value ÷ targeted value

Page 55: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (11)

Data collection plan

The first three columns of the previous table should be completed with the project's initial plan.

To create a fully operational monitoring system, one more step is required: the data items and sources necessary to construct the performance measures should be identified

Page 56: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (12)

This step should not be skipped, even if some data sources seem obvious: it is far too common to discover that researchers incorrectly assumed the necessary data would be available and accessible when needed

Page 57: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (13)

Analyses/Interpretation of Results

The information derived from monitoring shows which objectives need more attention in subsequent years and whether any of them require less intensive work

Adjustments in resource allocations can be based on the needs of specific objectives for more or less effort

Page 58: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

Developing a Monitoring System (14)

Analyses/Interpretation of Results (cont.): Careful assessment of the reasons for shortfalls on objectives should be conducted before any reallocation decisions are made.

A review of end-of-year achievement scores provides helpful information for further investigation and subsequent adjustments to the process

Page 59: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

PRACTICE EXERCISE

Questions 5 - 9

Page 60: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION

Page 61: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (1)

Service programs may not reach their targets for a number of reasons

A primary reason is inadequate resources, which may take the form of insufficient funds across the board or misallocation of funds across Specific Aim-linked objectives

It may be possible to detect misallocation if some targets are overachieved whereas others fall short

Page 62: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (2)

Other commonly cited reasons why programs may fall short in achieving objectives include:
a lack of adequate knowledge about feasible target levels
external factors that make it difficult or impossible to reach the target (e.g., inability to find or retain clients who meet the program criteria)
inaccurate measurement of the objective
a conceptual error in the program purpose

Page 63: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (3)

As an evaluation strategy, monitoring has three important shortcomings

First, it does not produce evidence of cause–effect relationships; only evaluation research can do that.

Second, the results of monitoring are limited to a single program; they cannot be extrapolated from one program to another

Page 64: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (4)

As an evaluation strategy, monitoring has three important shortcomings (cont.)

Finally, there are no firm guidelines for interpretation of the scores

Although a score of 0.70 might be considered good and 0.90 might be superior, the most useful interpretations depend on the program's context and purpose (Peoples-Sheps, Rogers, & Finerty, 2002).

Page 65: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (5)

Advantages and Disadvantages of Monitoring as an Evaluation Effort

Program monitoring is a valuable tool for building and establishing evidence

Program monitoring is a valuable tool for planning and management decisions

The process is inexpensive and can be applied readily by anyone with entry-level training or experience

Page 66: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (6)

Advantages and Disadvantages of Monitoring as an Evaluation Effort (cont.)

It includes a flexible set of methods that can be modified to accommodate the needs of each service program at both the state and local levels

Monitoring requires staff to develop objectives that serve as the basis of the service delivery process and then to plan for the necessary data so that the capability for tracking progress is assured

Page 67: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

IN CONCLUSION (7)

Another important advantage is that it encourages the production of information for critical management decisions and for identifying and assessing Promising/Best Practices, in both short- and long-term time frames and across all levels of the service delivery process

Thus, it is compatible with most governmental programmatic guidelines

Page 68: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

"Just because you can quantify something doesn't mean you understand it" (Aubel, 1993, p. 10)

"Not everything that counts can be counted, and not everything that can be counted counts" (Anonymous)

Page 69: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

QUESTIONS and DISCUSSION

Page 70: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

References (1)

Peoples-Sheps, M. D., Byars, E., Rogers, M. M., Finerty, E. J., & Farel, A. (2001). Setting objectives (revised). Chapel Hill, NC: The University of North Carolina at Chapel Hill.

Peoples-Sheps, M. D., & Telfair, J. (2005). Maternal and child health program monitoring and performance appraisal. In J. Kotch (Ed.), Maternal and child health: Programs, problems, and policies in public health (2nd ed., Chapter 16). Boston, MA: Jones & Bartlett Publishers.

Page 71: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

References (2)

Grembowski, D. (2001). The practice of health program evaluation. Thousand Oaks, CA: Sage Publications.

Hatry, H. P. (1999). Performance measurement: Getting results. Washington, DC: The Urban Institute Press.

Kettner, P. M., Moroney, R. M., & Martin, L. L. (1999). Designing and managing programs: An effectiveness-based approach (2nd ed.). Thousand Oaks, CA: Sage Publications.

Page 72: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

References (3)

Durch, J. S., Bailey, L. A., & Stoto, M. A. (Eds.). (1997). Improving health in the community: A role for performance monitoring. Washington, DC: National Academy Press.

Mattessich, P. W. (2003). The manager's guide to program evaluation. Saint Paul, MN: Wilder Publishing Center.

Page 73: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

References (4)

Roberts, A. R., & Yeager, K. (2004). Evidence-based practice manual: Research and outcome measures in health and human services. Oxford University Press.

Aubel, J. (1993). Participatory program evaluation: A manual for involving program stakeholders in the evaluation process. The Gambia: Catholic Relief Services – USCC.

Page 74: Practical Approaches to Evidence-Based Evaluation Practice in Public Health

References (5)

Telfair, J., & Mulvihill, B. A. (2000). Bridging science and practice: The integrated model of community-based evaluation. Journal of Community Practice, 7(3), 37-65.