Evaluation Research
• Appropriate for any study of planned or actual social intervention.
• Goal is to determine whether a social intervention has produced the intended result.
• Results are not always well received.
Examples of Evaluation Research
• Fernandez, Kenneth. 2011. "Evaluating School Improvement Plans and their Influence on Academic Performance." Educational Policy 25, 2: 338-367.
• Fernandez, Kenneth and Timothy Bowman. 2004. "Race, Political Institutions, and Criminal Sentencing: An Examination of the Sentencing of Latino Offenders." Columbia Human Rights Law Review 36, 1 (Fall): 41-70.
• Fernandez, Kenneth and Max Neiman. 1998. "California's Inmate Classification System: Predicting Inmate Misconduct." Prison Journal 78 (December): 406-422.
Policy Analysis vs. Evaluation
• The terms are often used interchangeably.
• Policy analysis is often the broader concept:
  – Interested in both past and future effects
  – Often contains decision criteria
  – Analysis refers to investigating the true merits of various actions and using that information to make informed and logical choices (Stone 2002)
• Policy or program evaluation is usually narrower: the goal is to determine whether a social intervention has produced the intended result.
Types of Measurement in Evaluation Research
• Outcome (response variable)
• Experimental context – aspects of the setting in which an experiment takes place that might affect its results
• Experimental stimulus (interventions)
• Population – demographic variables as well as variables defining the population
Ad hoc Evaluation
• Hearings and agency reports
• Site visits
• Program measures
  – Descriptive statistics
• Comparison with professional standards
  – Thresholds, minimums, maximums
• Surveys of public opinion
Evaluation Research Designs
• Experimental designs
  – Control/treatment groups
• Quasi-experimental designs
  – Time-series designs
  – Correlational designs
  – Cross-sectional designs
• Qualitative evaluations
  – Focus groups, interviews, historical-comparative methods
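The core logic of the experimental design above – comparing outcomes for a treatment group against a control group – can be sketched with a simple difference-in-means test. The data and the `welch_t` helper below are hypothetical, invented only for illustration; they use nothing beyond the Python standard library.

```python
import math
import statistics

def welch_t(treatment, control):
    """Welch's t statistic for the difference in mean outcomes
    between a treatment group and a control group."""
    m1, m2 = statistics.mean(treatment), statistics.mean(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    n1, n2 = len(treatment), len(control)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

# Hypothetical post-intervention outcome scores
treatment = [78, 82, 85, 80, 88, 84]
control = [74, 77, 73, 79, 75, 76]

print(round(welch_t(treatment, control), 2))  # prints 4.18
```

A large positive t value suggests the intervention group outperformed the controls; a full evaluation would of course also check random assignment and compute a p-value.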
Social Indicators Research
• Provides an understanding of broader social processes.
• Researchers are developing more refined indicators.
• Research is being devoted to discovering the relationships among variables within whole societies.
• Examples: infant mortality rates; homicide rates; auto fatalities; voting rates; government spending
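One of the indicators listed above, the infant mortality rate, is conventionally expressed per 1,000 live births. A minimal sketch, using hypothetical figures (the counts below are invented for illustration):

```python
def infant_mortality_rate(infant_deaths, live_births):
    """Infant deaths per 1,000 live births in a given period."""
    return infant_deaths / live_births * 1000

# Hypothetical counts for one year
print(infant_mortality_rate(560, 100_000))
```

Homicide rates, auto fatality rates, and similar indicators follow the same pattern, usually normalized per 100,000 population.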
Examples:
• Dye (1966)
Potential Problems with Evaluations
• Bias toward positive results
• Hawthorne effect
• Overgeneralization
• Ethical problems/withholding treatment
Why Results Are Ignored
• Implications may not be presented in a way that nonresearchers can understand.
• Results sometimes contradict deeply held beliefs.
• Vested interests in a program.
• Ideological or political support for, or opposition to, particular policies or policy types.
Explaining Away Negative Results
• Effects are long range
• Effects are diffuse, multidimensional
• Not enough money
• Problems with research design/methodology
• Conflicting results
Conflicting Results
• Meier, Kenneth J., Robert D. Wrinkle, and J. L. Polinard. 1999. "Representative Bureaucracy and Distributional Equity: Addressing the Hard Question." The Journal of Politics 61, 4 (November): 1025-1039.
• Nielson, Laura B., and Patrick J. Wolf. 2001. "Representative Bureaucracy and Harder Questions: A Response to Meier, Wrinkle, and Polinard." The Journal of Politics 63, 2 (May): 598-615.
• Meier, Kenneth J., Warren S. Eller, Robert D. Wrinkle, and J. L. Polinard. 2001. "Zen and the Art of Policy Analysis: A Response to Nielson and Wolf." The Journal of Politics 63, 2 (May): 616-629.
Conflicting Results
• Greene, Peterson, and Du. 1999. "Effectiveness of School Choice." Education & Urban Society 31, 2: 190-213.
• Witte. 1998. "The Milwaukee Voucher Experiment." Educational Evaluation & Policy Analysis 20, 4: 229-251.