
TRANSCRIPT

Page 1: Michelle Maden michelle.maden@edgehill.ac.uk Quality Assessment

Michelle Maden
michelle.maden@edgehill.ac.uk

Quality Assessment

Page 2:

Aims

Page 3:

Defining ‘quality’

The term ‘quality’ can be used in different ways (CRD, 2009; Higgins & Green, 2011)

Page 4:

What is Quality Assessment (QA)?

Balanced assessment of strengths and weaknesses

Assessment of research process and results

Focus is on internal validity – assessing the risk of bias

Page 5:

Why quality assessment is important

Non-random allocation
– Effect estimates from non-randomised studies may differ from those of randomised studies, but not in a consistent direction (Schulz et al., 1995; Carroll et al., 1996; Deeks et al., 2003; Odgaard-Jensen et al., 2011)
– Effects were larger in studies which also had inadequate allocation concealment

Allocation concealment
– Larger effect sizes are associated with inadequate/unclear concealment (Jüni et al., 2001; Schulz et al., 1995; Odgaard-Jensen et al., 2011; Wood et al., 2008)
– Effects over-estimated by up to 41% (Schulz et al., 1995), with a larger impact on trials reporting subjective outcomes (Wood et al., 2008)

Blinding
– Lack of blinding over-estimates treatment effects (Schulz et al., 1995; Noseworthy et al., 1994)
– Noseworthy et al. (1994) found that unblinded outcome assessors were more likely to score participants in the control group as worse compared to blinded outcome assessors.

Page 6:

Impact of poor quality studies on evidence synthesis

Inadequate/unclear random sequence generation

– Based on 944 trials in 112 meta-analyses, treatment effects were exaggerated by 11% (ROR 0.89, 95% CrI 0.82-0.96) and between-trial heterogeneity was higher (κ = 0.16, 95% CrI 0.03-0.27) (Savović et al., 2012)

Inadequate/unclear allocation concealment
– Based on 1292 trials from 146 meta-analyses, treatment effects were overestimated by 7% (ROR 0.93, 95% CrI 0.87-0.99); effects were greater for trials reporting subjective outcomes (ROR 0.85, 95% CrI 0.75-0.95), with greater between-trial heterogeneity (κ = 0.12, 95% CrI 0.02-0.23) (Savović et al., 2012)

– When studies with inadequate/unclear concealment were excluded, 69% of meta-analyses lost statistical significance (Pildal et al., 2007)

Lack of/unclear double-blinding
– Trials with no or unclear double-blinding exaggerated treatment effects by an average of 13% (ROR 0.87, 95% CrI 0.79-0.96), with greater between-trial heterogeneity (κ = 0.14, 95% CrI 0.02-0.30) (Savović et al., 2012)
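The kind of exclusion-based sensitivity analysis described above can be illustrated with a small sketch: pool log odds ratios by fixed-effect inverse-variance weighting, then re-pool after dropping trials with inadequate/unclear allocation concealment. All trial data below are invented for illustration, not taken from the studies cited on this slide.

```python
import math

# Hypothetical trials: (log odds ratio, standard error, adequate concealment?)
# All numbers are invented for illustration.
trials = [
    (-0.60, 0.25, False),  # inadequately concealed, large apparent benefit
    (-0.55, 0.30, False),
    (-0.20, 0.20, True),   # adequately concealed, smaller effect
    (-0.10, 0.22, True),
]

def pool_fixed_effect(data):
    """Fixed-effect inverse-variance pooled log OR and its 95% CI."""
    weights = [1 / se**2 for _, se, _ in data]
    pooled = sum(w * lor for w, (lor, _, _) in zip(weights, data)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# All trials pooled together
est_all, ci_all = pool_fixed_effect(trials)

# Sensitivity analysis: adequately concealed trials only
adequate = [t for t in trials if t[2]]
est_adeq, ci_adeq = pool_fixed_effect(adequate)

print(f"All trials:    OR = {math.exp(est_all):.2f}, significant: {ci_all[1] < 0}")
print(f"Adequate only: OR = {math.exp(est_adeq):.2f}, significant: {ci_adeq[1] < 0}")
```

With these invented numbers the apparent benefit is statistically significant only while the inadequately concealed trials are included, which is exactly the loss of significance Pildal et al. observed in many real meta-analyses.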

Page 7:

What about Qualitative Evidence Synthesis?

Carroll et al. (2013) undertook sensitivity analysis in two case study reviews and found that the exclusion of poorly reported studies had no effect on the overall findings of a qualitative review.

They recommend that post-synthesis sensitivity analysis be carried out to allow a judgement to be made on the robustness of the review.
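A post-synthesis sensitivity analysis of this kind can be sketched as follows: map each review finding to its supporting studies, then check which findings lose all support when poorly reported studies are excluded. Study and finding names below are hypothetical.

```python
# Hypothetical mapping of synthesis findings to supporting studies.
findings = {
    "barrier: staff time": {"StudyA", "StudyB", "StudyC"},
    "facilitator: training": {"StudyB", "StudyD"},
    "barrier: cost": {"StudyE"},
}

# Studies judged 'poorly reported' at quality assessment (hypothetical).
poorly_reported = {"StudyC", "StudyE"}

def sensitivity(findings, excluded):
    """Return findings that survive, and those left unsupported, after exclusion."""
    survive, unsupported = {}, []
    for finding, studies in findings.items():
        remaining = studies - excluded
        if remaining:
            survive[finding] = remaining
        else:
            unsupported.append(finding)
    return survive, unsupported

survive, unsupported = sensitivity(findings, poorly_reported)
print("Robust findings:", sorted(survive))
print("Dependent on poorly reported studies:", unsupported)
```

A finding supported only by poorly reported studies is flagged, which is the judgement on robustness Carroll et al. recommend making after synthesis.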

Page 8:

How do you do it?

What tools (if any) do you use to assess the quality of studies?


Page 9:

Models of Critical Appraisal

Page 10:

Checklists


CASP RCT Checklist


Page 11:

Checklists

Page 12:

Scales: Jadad Score Calculation

Item | Score
Was the study described as randomised? | 0/1
Was the method used to generate the sequence of randomisation described and appropriate? | 0/1
Was the study described as double blind? | 0/1
Was the method of double blinding described and appropriate? | 0/1
Was there a description of withdrawals and dropouts? | 0/1

Jadad AR, Moore RA, Carroll D, et al. Assessing the quality of reports of randomized clinical trials: is blinding necessary? Control Clin Trials 1996;17:1–12.
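The five-item tally above can be sketched in code. This is a simplified sketch of the score exactly as presented on the slide (one point per ‘yes’, total 0-5); note that the full Jadad instrument also deducts a point where the randomisation or blinding method is described but inappropriate, which is omitted here.

```python
# Jadad items as presented on the slide, each scored 0 or 1.
JADAD_ITEMS = [
    "described as randomised",
    "randomisation method described and appropriate",
    "described as double blind",
    "blinding method described and appropriate",
    "withdrawals and dropouts described",
]

def jadad_score(answers):
    """Sum of yes-answers over the five items (0-5). `answers` maps item -> bool."""
    return sum(1 for item in JADAD_ITEMS if answers.get(item, False))

# Example: randomised, double-blind trial with methods described,
# but no reporting of withdrawals and dropouts
answers = {
    "described as randomised": True,
    "randomisation method described and appropriate": True,
    "described as double blind": True,
    "blinding method described and appropriate": True,
    "withdrawals and dropouts described": False,
}
print(jadad_score(answers))  # 4
```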

Page 13:

Scales

Page 14:

Domain based

BMJ 2011;343:d5928 doi: 10.1136/bmj.d5928

Page 15:

Domain based
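A domain-based tool like the Cochrane risk of bias tool records a judgement per domain rather than a single score. A minimal sketch follows: the domain names mirror the 2011 Cochrane tool, and the ‘worst domain wins’ overall rule shown here is one common convention, not the only option.

```python
# Severity ordering for risk-of-bias judgements.
LEVELS = {"low": 0, "unclear": 1, "high": 2}

# Judgements per domain for one hypothetical trial, following the
# domain structure of the 2011 Cochrane risk of bias tool.
trial_judgements = {
    "random sequence generation": "low",
    "allocation concealment": "unclear",
    "blinding of participants and personnel": "high",
    "blinding of outcome assessment": "low",
    "incomplete outcome data": "low",
    "selective reporting": "low",
}

def overall_risk(judgements):
    """'Worst domain wins': overall risk is the most severe domain judgement."""
    return max(judgements.values(), key=lambda j: LEVELS[j])

print(overall_risk(trial_judgements))  # high
```

Keeping per-domain judgements, rather than collapsing to a score, is what lets a reviewer see *why* a trial is at risk and weight domains differently if that is justified.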

Page 16:

Models of critical appraisal

Compare and contrast the checklists
– Look at the types of ‘quality’ each assesses
– Look at how each assesses items
– Identify at least one strength and one weakness for each of the tools

Page 17:

Things to consider

Role of the reviewer (Booth, 2007)
– Knowledge
– Novice vs. expert

Purpose of tool
How does it assess ‘quality’?
Weighting – is equal weighting fair?
How will you use judgements of quality?
Is the tool valid/reliable?

Page 18:

What are you appraising… conduct vs reporting?

Quality of reporting compared with actual methodological quality of 56 randomised controlled trials (58 reports) conducted by the Radiation Therapy Oncology Group, based on information from published reports, protocols, and verification by the group.

Soares HP et al. BMJ 2004;328:22-24
©2004 by British Medical Journal Publishing Group

Page 19:

And finally…

WHY?
– Methods of statistical analysis cannot compensate for poorly designed methods (Deeks et al., 2003)

HOW?
– How will you define ‘quality’?
– How will you incorporate ‘quality assessment’ in your evidence synthesis?

WHAT?
– What tools will you use?

Page 20:

“Regardless of the approach eventually chosen for the quality assessment stage of the review there is a need to preserve the transparency of the method through careful documentation of decisions made” (Hannes, 2011, p. 12)

QA ought to be transparent, explicit and reproducible.

Page 21:

Further reading

Bai A, Shukla VK, Bak G, Wells G. Quality Assessment Tools Project Report. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2012.

Dreier M, Borutta B, Stahmeyer J, Krauth C, Walter U. Comparison of tools for assessing the methodological quality of primary and secondary studies in health technology assessment reports in Germany. GMS Health Technol Assess 2010;6:Doc07.

Whiting P, Rutjes AWS, Dinnes J, Reitsma JB, Bossuyt PMM, Kleijnen J. Development and validation of methods for assessing the quality of diagnostic accuracy studies. Health Technology Assessment 2004;8(25).

Deeks JJ, Dinnes J, D’Amico R, Sowden AJ, Sakarovitch C, Song F, Petticrew M, Altman DG. Evaluating non-randomised intervention studies. Health Technology Assessment 2003;7(27).

Soares HP, Daniels S, Kumar A, Clarke M, Scott C, Swann S, Djulbegovic B. Bad reporting does not mean bad methods for randomised trials: observational study of randomised controlled trials performed by the Radiation Therapy Oncology Group. BMJ 2004;328:22-24.

West S, King V, Carey TS, et al. Systems to Rate the Strength of Scientific Evidence. Evidence Report/Technology Assessment No. 47 (Prepared by the Research Triangle Institute–University of North Carolina Evidence-based Practice Center under Contract No. 290-97-0011). AHRQ Publication No. 02-E016. Rockville, MD: Agency for Healthcare Research and Quality; 2002.

Pluye P, Robert E, Cargo M, et al. Proposal: a mixed methods appraisal tool for systematic mixed studies reviews. 2011. Available at: http://mixedmethodsappraisaltoolpublic.pbworks.com

Sirriyeh R, Lawton R, Gardner P, Armitage G. Reviewing studies with diverse designs: the development and evaluation of a new tool. Journal of Evaluation in Clinical Practice 2012;18(4):746-752.

Page 22: