
Page 1

September 26, 2012

DATA EVALUATION AND ANALYSIS IN SYSTEMATIC REVIEW

Page 2

Stages of Systematic Review

1. Define the Problem
2. Literature Search
3. Data Evaluation
4. Data Analysis
5. Interpretation of Results
6. Presentation of Results

Page 3

Data Evaluation

What retrieved research should be included or excluded from your review?
Are the methods used in the retrieved literature suitable to study your research question?
Are there problems in research implementation?

• Evaluating quality of retrieved literature
• Coding literature for inclusion and exclusion
• Coder reliability and avoiding coding error

Page 4

Data Evaluation: Study Quality

What makes a high-quality study?

Page 5

Data Evaluation: Study Quality

What makes a high-quality study?

• Validity
• Relevance
• Study design
• Reporting quality

Page 6

Data Evaluation: Validity

• Internal validity (experimental validity)
  • Validity of causal inference
  • Does the cause lead to the effect?
  • A randomized controlled experiment is the gold standard
  • Requirements:
    • Cause precedes effect
    • Cause and effect are related
    • Other plausible explanations are ruled out

Page 7

Data Evaluation: Validity

• Threats to internal validity:
  • Ambiguous timeline of cause and effect
  • Other plausible explanations
  • Uncontrolled circumstances
  • Confounding
  • Bias

Page 8

Data Evaluation: Validity

• Example: organic dairy
  • Does organic dairy farming (cause) produce higher quality milk (outcome)?
  • What are the threats to internal validity?
  • Other explanations: differences in diet, climate, breed

Page 9

Data Evaluation: Validity

• External validity
  • Degree to which a causal inference can be generalized
  • How would you be wrong to make a generalization?
  • Does the experiment resemble the real world?
  • Does it apply in other populations? Other regions?
  • Example: organic dairy
    • Can a cause-and-effect relationship between organic dairy farming and milk quality found in one study apply more generally?

• Ecological validity – are study conditions like those found in natural settings?

Page 10

Data Evaluation: Validity

• Construct validity
  • Degree to which operational definitions represent concepts
  • Does the study measure the variable in a valid way?
  • Example: organic dairy
    • What is the concept of milk quality, and how is it measured?

Page 11

Data Evaluation: Validity

• Statistical conclusion validity
  • Validity of the statistical inferences used to assess the strength of the relationship between cause and effect
  • Do the data meet the assumptions of the statistical tests used in the study?
  • Examples:
    • The study uses a t test, but the data are not normally distributed
    • The study uses linear regression, but the variables do not have a linear relationship
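A minimal sketch (not from the slides) of the first example above: checking a distributional assumption before choosing between a parametric test and a non-parametric fallback. The group names and data are invented for illustration.

```python
# Sketch: check normality before choosing a t test or a non-parametric test.
# The "organic" and "conventional" samples here are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
organic = rng.normal(loc=3.9, scale=0.3, size=30)        # hypothetical milk fat %, organic herds
conventional = rng.normal(loc=3.7, scale=0.3, size=30)   # hypothetical milk fat %, conventional herds

# Shapiro-Wilk tests the null hypothesis that a sample is normally distributed.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (organic, conventional))

if normal:
    stat, p = stats.ttest_ind(organic, conventional)      # parametric test
else:
    stat, p = stats.mannwhitneyu(organic, conventional)   # non-parametric fallback
print(f"normality assumption held: {normal}, p = {p:.3f}")
```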

Page 12

Data Evaluation: Study Design

• Which study designs should be included in your review?
• Design influences validity
• Randomized vs. non-randomized
• Cohort
• Case-control
• Cross-sectional
• Case reports

Page 13

Data Evaluation: Relevance

• Degree to which a retrieved study applies to the review question
• High quality ≠ relevant
• Does a retrieved study have features that make it irrelevant to the review?
  • Population studied
  • Methods used
  • Definitions of variables (construct validity)
• Determine criteria for relevance to your question

Page 14

Data Evaluation: Reporting Quality

• How a study is reported affects its inclusion in the review
• Poor reporting quality makes analysis difficult:
  • Incomplete data
  • Missing information
  • Space restrictions in publishing

Page 15

Page 16

Data Evaluation: Strategies

• A priori
  • Rules for which studies will be included are determined ahead of time
  • Rules are set before the data are examined or the outcomes are known
  • Need to consider the implications of all rules
  • Place specific values on rules
  • Provide reasons why the rules remove bias from the review

• Post hoc
  • Determines the impact of study quality on the review in order to make inclusion decisions
  • How will inclusion of certain studies affect the results of the review?
  • Does not rely on arbitrary rules

• Many systematic reviews use a blend of the two strategies

Page 17

Data Evaluation: Strategies

• A priori
  • Exclusion
    • Deciding to exclude all studies that do not meet a certain criterion (e.g., excluding all non-randomized studies)
  • Quality scales
    • The criteria included in a scale have to be adapted to the field and research question
    • Not based on empirical evidence
    • Often there is no evidence base for predicting bias from the quality indicators

Page 18

Data Evaluation: Strategies

• Post hoc
  • Quality is handled as an empirical question
  • Attempts to avoid the problems with a priori assignments
  • How will the review results be influenced if certain studies are included?
  • Can compare the bias introduced by certain types of studies
  • Example: inclusion of non-randomized studies (a sketch of this check follows below)
    • A priori – include only randomized studies
    • Post hoc – does inclusion of the non-randomized studies influence the results? If not, then include them
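A minimal sketch of the post hoc comparison described above, under assumed (invented) effect sizes and variances. The pooling is a simple fixed-effect inverse-variance average, used here only to illustrate the sensitivity check; it is not a method prescribed by the slides.

```python
# Sketch: does including the non-randomized studies change the pooled estimate?
import numpy as np

# (effect size, variance, randomized?) -- hypothetical values for illustration
studies = [
    (0.30, 0.02, True),
    (0.25, 0.03, True),
    (0.45, 0.04, False),
    (0.10, 0.05, True),
    (0.50, 0.06, False),
]

def pooled_effect(rows):
    """Fixed-effect inverse-variance weighted mean of the effect sizes."""
    effects = np.array([e for e, v, _ in rows])
    weights = np.array([1.0 / v for _, v, _ in rows])
    return float((weights * effects).sum() / weights.sum())

all_studies = pooled_effect(studies)
randomized_only = pooled_effect([s for s in studies if s[2]])
print(f"all studies: {all_studies:.3f}, randomized only: {randomized_only:.3f}")
# If the two estimates are similar, including the non-randomized studies
# does not appear to change the result, supporting their inclusion.
```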

Page 19

Data Evaluation: Coding the Literature

• Once you have a set of retrieved studies, code them for inclusion in the final review
• Coding components:
  • Eligibility criteria
  • Study features defined:
    • Eligible study designs
    • Eligible methods
    • Sampling criteria
    • Statistical criteria
    • Geographical and linguistic criteria
    • Time frame

Page 20

Data Evaluation: Coding the Literature

• Develop a coding protocol

• Develop it like a questionnaire:
  • Clearly define what you want to measure – concepts and study characteristics
  • May need multiple coding questions to evaluate each concept

• Develop a matrix of all studies after retrieval
  • A way to organize a post hoc strategy
  • Reflects many characteristics of the retrieved studies
  • May help if you are unsure what to code before examining the studies

Page 21

Data Evaluation: Coding the Literature

• Develop a coding form
  • All reviewers will use it
  • Allows efficient record keeping by the whole team
  • Include a report identification – assign each study a number, etc.

Page 22

Report Characteristics

1. First author

2. Journal

3. Volume

4. Pages

Inclusion Criteria

5. Is this study an empirical investigation of the effects of teacher expectancies?

0. No   1. Yes

6. Is the outcome a measure of IQ?

0. No   1. Yes

7. Are the study participants in grades 1-5 at the start of the study?

0. No   1. Yes

Relevance Screen (Table 7.2, Handbook of Research Synthesis and Meta-Analysis)
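One possible way (not from the slides) to turn the relevance screen above into a structured record, so every coder captures the same items for every report; the field names and example values are hypothetical.

```python
# Sketch: a relevance-screen record with the report characteristics and the
# three yes/no inclusion criteria from the example form above.
from dataclasses import dataclass

@dataclass
class RelevanceScreen:
    study_id: int                       # report identification number
    first_author: str
    journal: str
    volume: str
    pages: str
    empirical_expectancy_study: bool    # item 5
    iq_outcome: bool                    # item 6
    grades_1_to_5_at_start: bool        # item 7

    def include(self) -> bool:
        """A report passes the screen only if all three criteria are met."""
        return (self.empirical_expectancy_study
                and self.iq_outcome
                and self.grades_1_to_5_at_start)

record = RelevanceScreen(1, "Smith", "J. Educ. Psychol.", "90", "12-25",
                         True, True, False)
print(record.include())  # False: fails item 7
```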

Page 23

9. Sampling strategy
   1. Randomly sampled from a defined population
   2. Stratified sampling from a defined population
   3. Cluster sampling
   4. Convenience sample
   5. Can’t tell

10. Group assignment mechanism
   1. Random assignment
   2. Haphazard assignment
   3. Other nonrandom assignment
   4. Can’t tell

11. Assignment mechanism
   1. Self-selected into groups
   2. Selected into groups by others on a basis related to the outcome
   3. Selected into groups by others on a basis not known to be related to the outcome
   4. Can’t tell

Coding for Internal Validity (Table 7.3, Handbook of Research Synthesis and Meta-Analysis)

Page 24

18. IQ measure used in study
   1. Stanford-Binet 5
   2. Wechsler (WISC) III
   3. Woodcock-Johnson III
   4. Other
   5. Can’t tell

19. Score reliability for the given IQ measure: ________________

20. Metric for score reliability
   1. Internal consistency
   2. Split-half
   3. Test-retest
   4. Can’t tell
   5. None given

21. Source of score reliability
   1. Current sample
   2. Citation from another study
   3. Can’t tell
   4. None given

22. Is the validity of the IQ measure mentioned?
   0. No   1. Yes

Coding Construct Validity (Table 7.4, Handbook of Research Synthesis and Meta-Analysis)

Page 25

[Figure source: BMC Medical Research Methodology 2008, 8:21]

Page 26

Data Evaluation: Coding the Literature

• Assess coding reliability
  • Intra- and inter-coder reliability
    • Intra – consistency of a single coder (avoid coder drift)
    • Inter – consistency between coders
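A minimal sketch of one common inter-coder reliability measure, Cohen's kappa, computed on made-up inclusion decisions from two hypothetical coders.

```python
# Sketch: Cohen's kappa for agreement between two coders on a categorical item.
from collections import Counter

coder_a = ["include", "exclude", "include", "include", "exclude", "include"]
coder_b = ["include", "exclude", "exclude", "include", "exclude", "include"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected agreement by chance, from each coder's marginal proportions.
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
expected = sum((counts_a[c] / n) * (counts_b[c] / n)
               for c in set(coder_a) | set(coder_b))

kappa = (observed - expected) / (1 - expected)
print(f"observed agreement {observed:.2f}, kappa {kappa:.2f}")
```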

• Sources of error in coding decisions
  • Deficient reporting in the study
  • Judgments made by coders
  • Coder bias
  • Coder mistakes

Page 27

Data Analysis

What procedures should be used to summarize and integrate the research results?

• Quantitative analysis (meta-analysis)
  • Statistical techniques to synthesize data from studies

• Qualitative analysis
  • Allows synthesis and interpretation of non-numerical data
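A minimal sketch of one early step in a quantitative synthesis: converting each study's reported group means and standard deviations (invented here) into a common effect-size metric, the standardized mean difference, so results can later be pooled.

```python
# Sketch: standardized mean difference (Cohen's d) from reported summary statistics.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a treatment and a control group."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Each tuple: treatment mean, SD, n, control mean, SD, n (made up for illustration)
reported = [(3.9, 0.3, 25, 3.7, 0.3, 30), (4.1, 0.5, 40, 3.8, 0.4, 38)]
effects = [cohens_d(*r) for r in reported]
print([f"{d:.2f}" for d in effects])  # comparable effect sizes, ready to pool
```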

Page 28

Data Analysis: Qualitative

• Methods of qualitative analysis:

1. Content Analysis
   • Synthesis of the content of studies
   • Organization of content with keywords or concepts

2. Meta-Ethnography
   • Ethnography – the study of a whole culture
   • Focuses on the culture as a system, understanding the whole

3. Grounded Theory
   • Formation of a theory from the synthesis of data
   • The reverse of most hypothesis-driven research

Page 29

Content Analysis

Defining categories with keywords from studies

[Figure source: Elo S. & Kyngäs H. (2008) The qualitative content analysis process. Journal of Advanced Nursing 62(1), 107–115]
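A minimal sketch of the keyword-based categorization idea above; the categories, keywords, and abstracts are invented for illustration.

```python
# Sketch: assign studies to content-analysis categories by keyword matches.
categories = {
    "milk_quality": {"protein", "fat", "somatic cell"},
    "animal_welfare": {"welfare", "grazing", "lameness"},
}

abstracts = {
    "study_1": "Organic herds showed higher milk fat and protein content.",
    "study_2": "Grazing time and lameness scores differed between systems.",
}

for study, text in abstracts.items():
    text = text.lower()
    hits = [cat for cat, words in categories.items()
            if any(w in text for w in words)]
    print(study, hits)
```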

Page 30

Meta-Ethnography

[Figure source: BMC Medical Research Methodology 2008, 8:21]

Page 31

Grounded Theory

Page 32