
Page 1

A “Dose-Response” Strategy for Assessing Program Impact in Naturalistic Contexts

Megan Phillips, Antioch University New England
George Tremblay, Antioch University New England
Michael Duffin, Program Evaluation and Educational Research Associates, Inc.

Presented at the Annual Convention of the Association for Behavioral and Cognitive Therapies, November 18, 2006

Page 2

RATIONALE

With increasing accountability pressures in virtually all service sectors (American Psychological Association, 2005), we need evaluation strategies that can be utilized outside of highly controlled research contexts (cf. Strosahl et al., 1998, on the “manipulated training research method”).

Evaluation of dose-response relationships, while not a “strong” form of causal evidence (McCabe, 2004), has nevertheless been recognized as providing some support for a causal relationship (O’Neill, 2002).

Page 3

REQUIREMENTS

Variability in exposure to intervention

Identification and measurement of targeted outcomes

Page 4

ADVANTAGES

Provides a relatively efficient probe for active program effects, which can warrant further and more rigorous controlled analyses.

Uses a single measurement event while allowing for the collection of a wide range of dose values.

Data can be readily aggregated across time or settings.

Can detect effects that are small but statistically significant.

Page 5

LIMITATIONS

Measurement of dose may be somewhat indirect (e.g., estimates of time exposed to intervention). Evaluators must be open to site-specific operationalization of the dose measure, which may complicate comparison across programs.

Requires a deeper level of understanding of statistics than users of the evaluation data may be accustomed to.

Evaluators need to provide users of the data with some benchmark for interpreting the significance of observed effect sizes.

Page 6

AN ILLUSTRATION

The Place-based Education Evaluation Collaborative (PEEC):

• represents several innovative educational programs that share common themes, such as:

  Enhanced community-school connections
  Increased understanding of and connection to local place
  Increased civic participation

• maintains an ongoing, cross-program, multi-method evaluation effort.

Page 7

EVALUATION QUESTION: Is variability in dose (independent variable) of a place-based education program associated with variability in levels of the behaviors and attitudes that the program is attempting to impact, i.e., a larger response (dependent variable)?

Sample:

• 338 educator and 721 student surveys from 55 schools, collected over one year.

• Representative of a wide range of demographic characteristics, grade ranges, & program intensities.

Page 8

METHOD: Measures

Dose measures

• Composite dose was calculated from survey items including:

  extent of program implementation: measured on a scale of 0 to 4

  total # of hours of exposure to program elements: raw scores in hours rescaled to a 0-to-4 metric comparable to the “program implementation” item

• The distribution of composite dose scores across the sample covered the entire range from 0 to 4, offering suitable variability in the independent variable for dose-response calculations (see the scoring sketch below).
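As a rough illustration of how such a composite could be scored, the Python sketch below rescales exposure hours onto the same 0-to-4 metric and averages the two components. The function names, the 100-hour ceiling, and the equal weighting are assumptions made for illustration, not the evaluators' documented formula.

    # Hypothetical composite dose score (assumed items, ceiling, and weights).
    def rescale_hours(hours, max_hours=100.0):
        """Map raw hours of exposure onto a 0-4 metric comparable to the
        0-4 'program implementation' rating (the 100-hour ceiling is assumed)."""
        return min(hours / max_hours, 1.0) * 4.0

    def composite_dose(implementation_rating, exposure_hours):
        """Average the two dose components (equal weighting is assumed)."""
        return (implementation_rating + rescale_hours(exposure_hours)) / 2.0

    # Example: implementation rated 3 of 4, with 60 hours of exposure -> 2.7
    print(composite_dose(implementation_rating=3.0, exposure_hours=60.0))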

Response measures

• Broad conceptual categories (modules) were developed that matched desired program outcomes. Each module was composed of indices designed to capture specific dimensions of that module.

• Individual survey questions were developed for each index, using items from existing surveys when possible to maximize the validity of comparing current and future results to previously collected data (see the scoring sketch below).
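A minimal sketch of how a module score could be assembled from its indices and Likert items follows; the module, index, and item names and the 1-4 response scale are invented for illustration and do not reproduce the actual survey instruments.

    # Hypothetical module scoring: mean of index scores, each the mean of its items.
    from statistics import mean

    def index_score(responses, item_names):
        """Score one index as the mean of its Likert items."""
        return mean(responses[item] for item in item_names)

    def module_score(responses, indices):
        """Score a module as the mean of its component index scores."""
        return mean(index_score(responses, items) for items in indices.values())

    educator_practice = {                      # assumed index/item structure
        "community_connection": ["q1", "q2", "q3"],
        "local_place_focus":    ["q4", "q5"],
    }
    responses = {"q1": 4, "q2": 3, "q3": 3, "q4": 2, "q5": 4}
    print(module_score(responses, educator_practice))   # -> about 3.17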

Page 9

METHOD: Analysis

Dose-response analysis:

Multiple regression analyses were used to explore the percentage of variance in the outcome variables (modules & indices) that could be accounted for by the predictor variable (program dose).
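With a single predictor, this amounts to regressing each outcome score on composite dose and reading off R-squared. A minimal sketch, assuming the survey data sit in a CSV file with one row per respondent; the file name and column names are hypothetical.

    # Share of variance in one outcome module accounted for by program dose.
    import pandas as pd
    from scipy.stats import linregress

    surveys = pd.read_csv("educator_surveys.csv")   # hypothetical data file
    fit = linregress(surveys["composite_dose"], surveys["educator_practice"])
    print(f"R^2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.4f}")
    # An R^2 of 0.19 would correspond to the 19% reported for educator practice.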

Page 10

RESULTS

Statistically significant relationships (p<.01) were found between program dose and all outcome measures except two student-level indices and one educator-level index.

This analysis allowed for the identification of more and less active ingredients of the program (Figures 1 & 2).
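Screening for more and less active ingredients can be done by fitting the same dose model to every outcome column and ranking outcomes by the variance explained. A sketch along those lines, again with invented file and column names:

    # Rank hypothetical outcome columns by the variance program dose explains.
    import pandas as pd
    from scipy.stats import linregress

    surveys = pd.read_csv("educator_surveys.csv")   # hypothetical data file
    outcomes = ["educator_practice", "community_connection", "civic_engagement"]

    results = []
    for col in outcomes:
        fit = linregress(surveys["composite_dose"], surveys[col])
        results.append((col, fit.rvalue ** 2, fit.pvalue))

    # Larger R^2 suggests a more "active" ingredient; smaller R^2 a less active one.
    for name, r2, p in sorted(results, key=lambda r: r[1], reverse=True):
        print(f"{name:25s}  R^2 = {r2:.2f}  p = {p:.4f}")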

Page 11

Indication of an active ingredient

[Figure 1: scatterplot of Educator Practice (y-axis, 1.0-4.0) against Program Dose (x-axis, 0.0-4.0) with best-fit regression line.]

Figure 1. Overall educator practice was analyzed at the super-ordinate level by combining average Likert scale responses for 12 items. The best-fit multiple regression line shows that 19% of the variability in survey response is predicted by program dose.

Page 12

Indication of a less active ingredient

[Figure 2: scatterplot of Student Attachment to Place (y-axis, 1.0-4.0) against Program Dose (x-axis, 0.0-3.0) with best-fit regression line.]

Figure 2. Student attachment to place was analyzed at the super-ordinate level by combining average Likert scale responses for 15 student survey items. The best-fit multiple regression line shows that 6% of the variability in survey response is predicted by dose.

Page 13

DISCUSSION

Benefits of the dose-response strategy in the PEEC evaluation context:

• Data set can now be cumulative year-to-year.

• Once an initial investment in survey instrument design & administration was made, future evaluation costs should decline.

Limitations of the dose-response strategy in the PEEC evaluation context:

• Relied on self-report data as opposed to more empirically verifiable observations.

• Psychometric properties of the survey instruments have yet to be validated.

Page 14

REFERENCES

American Psychological Association. (2005, August). Policy statement on evidence-based practice in psychology. Retrieved February 19, 2006, from http://www2.apa.org/practice/ebstatement.pdf

McCabe, O. L. (2004). Crossing the quality chasm in behavioral health care: The role of evidence-based practice. Professional Psychology: Research and Practice, 35, 571-579.

O’Neill, R. T. (2002, June). A perspective on exposure-response relationships. Paper presented at the annual meeting of the American Association of Pharmaceutical Scientists, Arlington, VA. Retrieved October 25, 2006, from http://www.fda.gov/cder/offices/biostatistics/oneill_364/oneill_364.ppt

Strosahl, K. D., Hayes, S. C., Bergan, J., & Romano, P. (1998). Assessing the field effectiveness of acceptance and commitment therapy: An example of the manipulated training research method. Behavior Therapy, 29, 35-64.

The data presented here were collected as part of an evaluation conducted by Program Evaluation and Educational Research Associates, Inc., under the supervision of Michael Duffin. The project was undertaken with the support of the Place-Based Education Evaluation Collaborative (PEEC). For more information about PEEC go to: http://www.PEECworks.org/

An electronic version of this poster can be downloaded from: http://www.peecworks.org/PEEC/PEEC_Reports/S0112C7E1-0112C8A6