TRANSCRIPT
Introduction to ‘EBSI’
Methodologies for a new era summer school
School of Applied Social Studies, University College Cork
20 June 2011
Dr Paul Montgomery
Why Evidence-Based Social Intervention?
• Why practice needs a sound evidence base
• Ethical imperative to do more good than harm
• Best use of limited resources
• Wide variation in practice
What is EBP?
• “Evidence-Based Practice” is a popular term, but what does it mean? What counts as good-quality evidence?
• Brief history
• Challenges for EBP (critiques of EBP)
Why is it important to base practice on good quality research evidence?
If we want to intervene (interfere) in people’s lives, and spend large sums of money doing this, then we have an ethical duty to show that we are basing our interventions on the very best possible available evidence.
If not, we may at best be wasting the precious time, money and hopes of vulnerable clients; at worst, we may be doing more harm than good.
Evidence-Based Practice: a definition
“the conscientious, explicit and judicious use of best currently available evidence, integrated with client values and professional expertise, in making decisions about the care of individuals”
- Can also apply to planning of services -
(adapted from Sackett et al., 2000)
EBP Model
[Diagram: four intersecting elements: clinical state and circumstances; clinical expertise; client preferences and actions; research evidence]
(Haynes, Devereaux, and Guyatt, 2002)
Elements of the definition of EBP
Conscientious: ethical, effective, honest
Explicit: transparent re evidence and other reasons for decisions, with the client
Judicious: considered, prudent
Best, currently available evidence: as rigorous as possible, subject to updating
Client values a key part
Contrasts with authority-based knowledge
Features of EBP
‘Client values’
Client part of decisions – their preferences, experiences, values etc, integrated with evidence and expertise
Share evidence with client, otherwise informed consent meaningless. Need honesty, openness re. state of knowledge.
Empowering if this is done. These principles are applicable at a community level.
Features of EBP
Anti-authoritarian
Not ‘I know best’; lifelong learner, questioner, always updating
Client as part of decision-making team
Sharing knowledge and expertise
Based on respect for client and their knowledge
Ethics
Ethical to do good and avoid harm by using best evidence
Ethical to involve client; fully informed consent requires open & up-to-date information about effectiveness
Many ethical codes require this
Much concern re. ‘conflicts of interest’ among researchers and practitioners
Why is EBP possible now? (Gambrill, 2004)
Recognition of scarce resources: need for ‘good value’ from public services; transparency, accountability. EBM well established
Pressure / activism from consumers and the public. Notions of human rights
Increased attention to harm, mistakes, whistle-blowing, etc.
Internet / information revolution: databases, searching, e-publishing, accessibility
Advances in research methods: systematic reviewing, epidemiology, trial methodology
The ‘5 Core Steps of EBP’
1. Formulating answerable questions
2. Searching the literature
3. Critical appraisal of research
4. Applying findings to practice
5. Evaluation of outcome
Qualitative and related Work
These primary issues develop from detailed (largely) qualitative work
Mechanisms and process issues are similarly explored in these ways
Qualitative work should generally proceed in tandem with the quantitative work presented here
From basic research questions to evidence-based practice
[Flow diagram:]
Nature & prevalence of problem: who is it a problem for?
→ Causal models: risk/protective factors
→ Intervention trials: RCTs (‘efficacy’)
→ Intervention trials: RCTs (‘effectiveness’)
→ Systematic reviews & meta-analyses
→ Practice guidelines
→ Evidence-based practice: judicious application of research to client / organisation
Randomised controlled trial-RCT
‘Gold standard’ research design for evaluating interventions; attempts to minimise sources of bias
Allocates participants at random to intervention and comparison groups (this is the defining feature)
Uses the same, meaningful, reliable assessments before and after the intervention
Double- or single-‘blind’ if possible: reduces a very important source of bias
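The defining feature, allocation by chance alone, can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the original slides: the function name, seed, and equal group sizes are all assumptions. The point is that chance, not the practitioner's judgement, decides who receives the intervention, so known and unknown confounders balance out between groups on average.

```python
import random

def randomise(participants, seed=42):
    """Randomly allocate participants to two equal-sized groups.

    A minimal sketch of simple randomisation: shuffle the whole
    sample, then split it in half. The seed is fixed here only so
    the example is reproducible; a real trial would use a concealed
    allocation sequence.
    """
    rng = random.Random(seed)
    shuffled = participants[:]      # copy, so the input is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Allocate 100 hypothetical participants (IDs 0-99)
intervention, control = randomise(list(range(100)))
```

Blinding would then be layered on top: neither the assessors (single-blind) nor the participants and assessors (double-blind) know which group an ID belongs to.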
Systematic review
An overview or summary of primary studies, carried out according to an explicit set of aims & methods, so the review is reproducible.
e.g. a set of RCTs all addressing a similar question, or a set of studies about prevalence or causes or screening.
- May include meta-analysis: quantitative summation of results combined from several similar studies
- Cochrane Collaboration (& Campbell) publishes thousands, for intervention questions, on the web
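The "quantitative summation" step of a meta-analysis can be illustrated with a small sketch. This is one common approach (inverse-variance, fixed-effect pooling), chosen here for simplicity; the function name and the three example studies are hypothetical, and real reviews often use random-effects models instead.

```python
import math

def fixed_effect_meta(effects, variances):
    """Pool effect sizes from several similar studies.

    Inverse-variance weighting: each study contributes in proportion
    to 1/variance, so larger, more precise studies count for more.
    Returns the pooled effect and its standard error.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = math.sqrt(1.0 / sum(weights))
    return pooled, standard_error

# Three hypothetical studies: effect sizes with their variances
pooled, se = fixed_effect_meta([0.30, 0.10, 0.25], [0.04, 0.02, 0.05])
```

Note how the middle study, with the smallest variance, pulls the pooled estimate towards its own result: precision, not just the number of studies, drives the summary.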
An early, pioneering Randomised Controlled Trial (RCT)
Cambridge-Somerville study
Cabot carried out the first major RCT in social work in 1930s Massachusetts, USA (Powers & Witmer, 1951; McCord, 2001)
- theory driven intervention - based on knowledge of risk factors for crime
Cambridge-Somerville study: design
650 boys under 12 (mean age 10) in 1935; 506 after WW2
At risk of delinquency due to poor, high-crime areas
Placed in matched pairs (similar age, SES, etc.)
Randomly assigned one of each pair to intervention, the other to control group
Intervention lasted 5.5 years on average
Follow-ups: mid-1940s; late 1970s, age 47, 98% traced
Outcomes: records of courts, deaths, mental illness; careful records of contacts and interventions kept
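The matched-pair allocation used in the Cambridge-Somerville design differs from simple randomisation and can be sketched as follows. This is a hypothetical illustration of the general technique, not a reconstruction of Cabot's actual procedure; the function name and seed are assumptions.

```python
import random

def assign_matched_pairs(pairs, seed=1935):
    """Matched-pair random assignment.

    Within each matched pair (e.g. boys similar in age and SES),
    a coin flip decides which member gets the intervention and
    which serves as the control. Matching improves comparability;
    randomisation within pairs removes selection bias.
    """
    rng = random.Random(seed)
    intervention, control = [], []
    for a, b in pairs:
        if rng.random() < 0.5:
            intervention.append(a)
            control.append(b)
        else:
            intervention.append(b)
            control.append(a)
    return intervention, control

# Three hypothetical matched pairs of participant IDs
treated, controls = assign_matched_pairs([(1, 2), (3, 4), (5, 6)])
```

Because each pair contributes exactly one member to each group, the groups are balanced on the matching variables by construction.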
Cambridge-Somerville study: results
6-10 years later: found no differences between groups in behaviour or delinquency rates
Note that two different methods give the same message.
35-year follow-up: at age 47, traced 98% of the sample! Using state records, found intervention boys more likely to have negative outcomes, including serious convictions, deaths by age 35, and serious mental illness, compared to the control group.
Other programmes that harm?
See McCord (2003) paper on the web
Systematic review of ‘Scared Straight’
These programmes gave youngsters a taste of what prison was like; adopted in 38 states
Petrosino et al. (2002), Campbell/Cochrane library
Common interventions that do no good/ modest evidence of harm
Rose et al. Cochrane review of brief crisis intervention following exposure to traumatic events (“de-briefing”)
With youth problem behaviour, interventions are not effective if based on scare tactics or toughness (boot camps), lecturing (DARE), aggregating high-risk youth (as many services do), or one-to-one non-directive mentoring
What can we learn from studying these?
1. What sorts of interventions appear more likely to harm - or to do no good?
2. What are the mechanisms of harm? What actually is going wrong in this intervention?
NB We also want to know this for interventions that go well: what are the active, useful ingredients (mediators of intervention)?
Factors that may make an intervention more likely to harm or do no good (slides from Tom Dishion, Oregon, 2004)
The intervention target is not derived from an empirically derived model or theory (e.g., “Scared Straight” or “DARE” Drug Abuse Resistance Education)
The intervention protocol (target, strategy and context) is not clearly articulated;
The intervention staff are not trained/ supervised well with respect to implementation fidelity or held accountable for outcomes;
Critiques of EBP
Limitations apparently based on misconstrual (‘straw man’): EBP only uses one method; cook-book approach; dictates to professionals; you can’t do RCTs in complex situations; etc.
Social science / intervention is different from medicine: human experience can’t be quantified; other kinds of evidence are just as valid; interventions & contexts are too complex for RCTs
Practical arguments: not feasible for practitioners (time, resources, expertise); not enough evidence
Does EBP work? Does it lead to better outcomes for people?