PROBLEM SOLVING WITH SARA
TRANSCRIPT
RESEARCHING PROBLEMS AND ASSESSING RESPONSES
CJS380 – Crime Science | J.A. Gilmer ©
SARA: THE PROBLEM-SOLVING PROCESS
[Diagram: the SARA cycle – SCAN → ANALYZE → RESPOND → ASSESS]
SCANNING
IDENTIFY & PRIORITIZE PROBLEMS
- Identify recurring problems of concern
- Identify consequences of the problem
- Prioritize identified problems
- Develop broad goals
- Confirm that the problem exists
- Determine how frequently the problem occurs and how long it has been taking place
- Select a problem for closer examination
The CHEERS Test
- Community – must experience the harmful events
- Harmful – property loss/damage, injury/death, mental anguish, undermining of police (illegality is not a defining characteristic of problems)
- Expectation – community members expect police to act (not necessarily a majority)
- Events – problems are made up of discrete events
- Recurring – acute or chronic
- Similarity – recurring events must have something in common
http://www.popcenter.org/learning/60steps/index.cfm?stepNum=14
UNDERSTAND YOUR PROBLEM: 5 W + 1 H = Hypothesis
- Who is involved?
- What exactly do they do?
- Why do they do this?
- Where do they do this?
- When do they do this?
- How do they carry out the crime?
Hypothesis – a statement that explains why the problem is occurring
ANALYSIS
RESEARCH THE PROBLEM
- Identify and understand the events and conditions that precede and accompany the problem
- Identify relevant data to be collected
- Research what is known about the problem type
- Take inventory of how the problem is currently addressed, and the strengths and limitations of the current response
- Narrow the scope of the problem as specifically as possible
- Identify a variety of resources that may assist in developing a deeper understanding of the problem
- Develop a working hypothesis about why the problem is occurring
The Five Most Useful Websites
- Center for Problem-Oriented Policing (www.popcenter.org)
- National Criminal Justice Reference Service (NCJRS) Abstracts Database (http://www.ncjrs.gov/abstractdb/search.asp)
- The Home Office | Crime, United Kingdom (http://www.homeoffice.gov.uk/crime/)
- Australian Institute of Criminology (www.aic.gov.au)
DATABASES at Hellman Library via Blackboard
- EBSCO Host (search single or multiple databases): Academic Search Premier, Criminal Justice Abstracts, ERIC, SocINDEX
- Google Scholar
- JSTOR (historical)
- LexisNexis Academic
- ProQuest Academic
- SAGE journals online
- Interlibrary Loan
Log in and try it.
RESPONSE
INTERVENTION
- Brainstorm for new interventions
- Search for what other communities with similar problems have done
- Choose among the alternative interventions
- Outline a response plan and identify responsible parties
- State specific objectives for the response plan
- Carry out the planned activities
POLICE-SPECIFIC PROJECTS
- Goldstein Awards (http://www.popcenter.org/goldstein/): recognize outstanding police officers and police agencies – both in the United States and around the world – that engage in innovative and effective problem-solving efforts and achieve measurable success in reducing specific crime, disorder, and public safety problems.
- Tilley Awards (http://www.homeoffice.gov.uk/crime/partnerships/tilley-awards/): set up by the U.K. Home Office Policing and Reducing Crime Unit (now the Crime and Policing Group) in 1999 to encourage and recognize good practice in implementing problem-oriented policing (POP).
IDENTIFY RESPONSES
- Keep a summary record of responses
- Note the primary source
- Explain how the response works
- Note under what conditions it works best
- Note any special considerations (costs, legal requirements, etc.)

Summary record template:
RESPONSE | SOURCE | HOW IT WORKS | WORKS BEST IF | CONSIDERATIONS
ASSESSMENT
EVALUATE AND ASSESS
KEY QUESTION: DID THE PROBLEM DECLINE ENOUGH TO END THE EFFORT?
- Determine whether the plan was implemented (process evaluation)
- Collect pre- and post-response data (qualitative and quantitative)
- Determine whether broad goals and specific objectives were attained
- Identify any new strategies needed to augment the original plan
- Conduct ongoing assessment to ensure continued effectiveness
EVALUATION VS. ASSESSMENT
EVALUATION – a scientific process for determining whether a problem declined and whether the solution caused the decline. It begins the moment the problem-solving process begins and continues through the completion of the effort.
ASSESSMENT – the final stage of both evaluation and problem solving. It answers the following questions: Did the response occur as planned? Did the problem decline? If so, are there good reasons to believe the decline resulted from the response?
Evaluation runs throughout the problem-solving process (Fig. 1 in Tool Guide No. 1 (2002)).
TYPES OF EVALUATIONS
Process Evaluation
- Did the response occur as planned? Did all response components work?
▪ Involves comparing the planned response with what actually occurred
Impact Evaluation
- Did the problem decline? If so, did the response cause the decline?
▪ To be able to reliably use the response again, it is important to determine whether it caused the decline in the problem
[Table: Interpreting Results of Process and Impact Evaluations – Tool Guide No. 1 (2002)]
CONDUCTING IMPACT EVALUATIONS
Part 1: Measure the problem
- Quantitative – counts and numerical estimates; adds comparability
- Qualitative – e.g., photos, maps, interviews; allows comparisons, but not precision; reinforces quantitative information
Part 2: Evaluation design
- Compare measures systematically
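To make the quantitative side of Part 1 concrete, here is a minimal Python sketch that turns raw incident records into comparable per-period counts. The incident dates are invented for illustration; in practice they would come from calls-for-service or crime records for the problem location.

```python
from collections import Counter
from datetime import date

# Hypothetical call-for-service dates for one problem address
# (all dates are invented for illustration).
incidents = [
    date(2023, 1, 14), date(2023, 1, 29), date(2023, 2, 3),
    date(2023, 2, 21), date(2023, 3, 9), date(2023, 4, 17),
]

# Count incidents per (year, month) so periods are directly comparable.
monthly_counts = Counter((d.year, d.month) for d in incidents)

for period in sorted(monthly_counts):
    print(period, monthly_counts[period])
```

Counting by a fixed period (here, calendar month) is what gives quantitative measures their comparability: the same counting rule applies before and after the response.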
MEASURING THE PROBLEM
- Take the most direct measure of the problem: the more indirect the measure, the less valid it is
- Use multiple measures where possible (arrest, as a measure of impact, may be affected by citizen complaint activity and/or police practice)
- Whether a measure is direct or indirect depends on how the problem is defined: is the focus on "behavior" or on "perception of behavior"?
- Measure the problem systematically and use the same measures throughout
DID THE RESPONSE CAUSE THE CHANGE?
- Is there a Plausible Explanation that the response changed the level of the problem? (Based on detailed problem analysis, backed by research)
- Is there an Association between the presence of the response and a change in the level of the problem?
- Did the response Precede the change in the problem? (Have measures from before and after the response begins)
- Are there No Plausible Alternative Explanations? (Could 'something else' have caused the results found?)
EVALUATION DESIGNS
Pre-post designs: the simplest
- Can establish 'association' and 'temporal order'
- Weak at ruling out alternative explanations
- Cannot assess fluctuations between measurements
Tool Guide No. 1 (2002)
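The pre-post logic reduces to a single before/after comparison, sketched below in Python with invented counts. Note what the sketch can and cannot show: it establishes association and temporal order, but nothing about alternative explanations.

```python
# Minimal pre-post comparison sketch (both counts are hypothetical).
pre_count = 48   # problem events in the 6 months before the response
post_count = 30  # problem events in the 6 months after the response

change = post_count - pre_count
pct_change = 100 * change / pre_count

print(f"Change: {change} events ({pct_change:.1f}%)")
# A decline shows an association with correct temporal order, but a single
# pre/post pair cannot rule out alternatives such as seasonality,
# regression to the mean, or another intervention in the same period.
```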
EVALUATION DESIGNS
Interrupted time series designs: superior
- Repeated measures assess the problem's trajectory before and after the response
- Require time intervals of sufficient duration to derive meaningful conclusions
- Easy to use with routine data
- Stability of impact after the response controls for fluctuation
Tool Guide No. 1 (2002)
EVALUATION DESIGNS
Interrupted time series designs are often not practical:
- Measurement can be expensive or difficult (e.g., surveys)
- Data may be unavailable for many periods before the response
- Decision-makers may not want to wait the time required to establish the results of the response
- If data-recording practices change, inter-period comparisons become invalid
- Hard to interpret when problem events are rare in a time period, forcing the use of fewer intervals of longer duration
- Cannot by themselves account for 'something else' that occurred and caused the level of the problem to change
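The repeated-measures idea behind interrupted time series can be sketched with a few lines of pure Python: compare the level and trend of the problem before and after the response, rather than two single counts. All monthly counts below are invented, and the least-squares slope is a hand-rolled simplification of what a full time-series analysis would do.

```python
def slope(ys):
    """Least-squares slope of ys against time indices 0..n-1 (pure Python)."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical monthly problem counts; the response begins after month 6.
pre = [40, 42, 39, 44, 41, 43]
post = [36, 33, 31, 30, 28, 27]

# Level (mean) and trend (slope) on each side of the interruption.
print("pre  level:", sum(pre) / len(pre), " slope:", round(slope(pre), 2))
print("post level:", sum(post) / len(post), " slope:", round(slope(post), 2))
```

A flat pre-response trend followed by a lower, still-declining post-response trend is the pattern a pre-post design would miss; the repeated measures are what let the evaluator see fluctuation and trajectory rather than two isolated totals.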