USE OF EVIDENCE IN DECISION MODELS: An appraisal of health technology assessments in the UK Nicola Cooper Centre for Biostatistics & Genetic Epidemiology, Department of Health Sciences, University of Leicester, U.K. http://www.hs.le.ac.uk/personal/njc21/ Acknowledgements to: Doug Coyle, Keith Abrams, Miranda Mugford & Alex Sutton


Page 1: USE OF EVIDENCE IN DECISION MODELS:  An appraisal of health technology assessments in the UK

USE OF EVIDENCE IN DECISION MODELS: An appraisal of health technology assessments in the UK

Nicola Cooper
Centre for Biostatistics & Genetic Epidemiology, Department of Health Sciences, University of Leicester, U.K.
http://www.hs.le.ac.uk/personal/njc21/

Acknowledgements to: Doug Coyle, Keith Abrams, Miranda Mugford & Alex Sutton

Page 2

OUTLINE

• Background to empirical research

• Methods & Findings from Study

• Further work

• Next steps

Page 3

• Decision models are increasingly developed to inform complex clinical/economic decisions (e.g. NICE technology appraisals)

• Decision models provide:

• An explicit, quantitative & systematic approach to decision making

• A comparison of at least 2 alternatives

• A useful way of synthesising evidence from multiple sources (e.g. effectiveness data from trials, adverse event rates from observational studies, etc.)

                                     BACKGROUND
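Synthesising evidence from multiple sources usually starts with a pairwise meta-analysis of comparable trials. A minimal sketch of fixed-effect inverse-variance pooling; the trial estimates and standard errors below are hypothetical, not taken from the study:

```python
import math

def pool_fixed_effect(estimates, ses):
    """Fixed-effect inverse-variance meta-analysis of per-trial effect
    estimates (e.g. log rate ratios) with their standard errors."""
    weights = [1.0 / se ** 2 for se in ses]  # precision weights
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical log rate ratios from three trials
pooled, pooled_se = pool_fixed_effect([-0.5, -0.7, -0.6], [0.20, 0.25, 0.30])
```

The pooled estimate sits between the trial estimates, weighted towards the most precise trial, and its standard error is smaller than any single trial's.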

Page 4

• Decision modelling techniques are commonly used for:

i) Extrapolation of primary data beyond the endpoint of a trial;

ii) Indirect comparisons when no ‘head-to-head’ trials exist;

iii) Investigation of how the cost-effectiveness of clinical strategies/interventions changes with the values of key parameters;

iv) Linking intermediate endpoints to ultimate measures of health gain (e.g. QALYs);

v) Incorporation of country-specific data relating to disease history and management.

                                     BACKGROUND

Page 5

• Decision models contain many unknown parameters, & evidence may include published data, controlled trial data, observational study data, or expert knowledge.

• Need to utilise/synthesise the available evidence

• Model parameters can include: clinical effectiveness, costs, disease progression rates, & utilities

• Evidence-based models require systematic methods for evidence synthesis to estimate model parameters with appropriate levels of uncertainty

                                     BACKGROUND
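One way to carry "appropriate levels of uncertainty" into a model is to express a parameter as a distribution rather than a point estimate. A sketch, assuming a hypothetical adverse-event count of 12 events in 200 patients, using the Beta posterior that follows from a binomial likelihood with a uniform prior:

```python
import random

def sample_event_prob(events, n, draws=5000, seed=1):
    """Draw samples of an event probability from Beta(events+1, n-events+1),
    i.e. a binomial likelihood combined with a uniform Beta(1,1) prior."""
    rng = random.Random(seed)
    return [rng.betavariate(events + 1, n - events + 1) for _ in range(draws)]

samples = sample_event_prob(12, 200)
mean_p = sum(samples) / len(samples)  # close to (12+1)/(200+2)
```

Feeding such samples through the model (rather than a single best guess) is what allows probabilistic sensitivity analysis later on.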

Page 6

[Figure: schematic of an evidence-based decision model. Data sources (RCT1, RCT2, RCT3, OBS1, OBS2, routine data, expert opinion) feed an evidence synthesis step (meta-analysis, generalised synthesis, opinion pooling, Bayes theorem, or these in combination), which supplies the model inputs (clinical effect, adverse events, utility, cost) for an economic decision model. The example decision tree asks whether to treat patients with atrial fibrillation with warfarin: warfarin vs. no warfarin, each branching on stroke/no stroke and then bleed/no bleed.]

Page 7

MRC FELLOWSHIP

The use of evidence synthesis & uncertainty modelling in economic evidence-based health-related decision models

• Part 1) To review and critique use of evidence in decision models developed as part of health technology assessments to date

• Part 2) Develop practical solutions for synthesising evidence, with appropriate uncertainty, to inform model inputs:

For example, combining evidence in different formats (e.g. mean and median) and from different sources (e.g. RCT, cohort, registry, etc.).

Page 8

NICE GUIDANCE

• NICE methods guidelines for Health Technology Assessment (2004):

‘all relevant evidence must be identified’

‘evidence must be identified, quality assessed and, where appropriate, pooled using explicit criteria and justifiable and reproducible methods’

and

‘explicit criteria by which studies are included or excluded’

Page 9

USE OF EVIDENCE IN HTA DECISION MODELS (Cooper et al, In press)

• OBJECTIVE:

• Review sources & quality of evidence used in the development of economic decision models in health technology assessments in the UK

• METHODOLOGY:

• Review included all economic decision models developed as part of the NHS Research & Development Health Technology Assessment (HTA) Programme between 1997 and 2003 inclusive.

• Quality of evidence was assessed using a hierarchy of data sources developed for economic analyses (Coyle & Lee 2002) & good practice guidelines (Philips et al 2004).

Page 10

GOOD PRACTICE CRITERIA FOR DECISION MODELS (Philips et al 2004)

• Statement of perspective

• Description of strategies/comparators

• Diagram of model/disease pathways

• Development of model structure and assumptions discussed

• Table of model input parameters presented

• Source of parameters clearly stated

• Model parameters expressed as distributions

• Discussion of model assumptions

• Sensitivity analysis performed

• Key drivers/influential parameters identified

• Evaluation of internal consistency undertaken

Page 11

HIERARCHY OF DATA SOURCES

• Hierarchy of evidence - a list of potential sources of evidence for each data component of interest:

• Main clinical effectiveness

• Baseline clinical data

• Adverse events and complications

• Resource use

• Costs

• Utilities

• Sources ranked on increasing scale from 1 to 6, most appropriate (best quality) assigned a rank of 1

Page 12

HIERARCHY OF DATA SOURCES

A. Clinical effect sizes, & adverse events & complications

Rank 1+: Meta-analysis of RCTs with direct comparison between comparator therapies, measuring final outcomes
Rank 1: Single RCT with direct comparison between comparator therapies, measuring final outcomes
Rank 2+: Meta-analysis of RCTs with direct comparison between comparator therapies, measuring surrogate# outcomes; meta-analysis of placebo controlled RCTs with similar trial populations, measuring final outcomes for each individual therapy
Rank 2: Single RCT with direct comparison between comparator therapies, measuring surrogate# outcomes; single placebo controlled RCT with similar trial populations, measuring final outcomes for each individual therapy
Rank 3+: Meta-analysis of placebo controlled RCTs with similar trial populations, measuring surrogate# outcomes
Rank 3: Single placebo controlled RCT with similar trial populations, measuring surrogate# outcomes for each individual therapy
Rank 4: Case control or cohort studies
Rank 5: Non-analytic studies, for example, case reports, case series
Rank 6: Expert opinion

B. Baseline clinical data

Rank 1: Case series or analysis of reliable administrative databases specifically conducted for the study, covering patients solely from the jurisdiction of interest
Rank 2: Recent case series or analysis of reliable administrative databases covering patients solely from the jurisdiction of interest
Rank 3: Recent case series or analysis of reliable administrative databases covering patients solely from another jurisdiction
Rank 4: Old case series or analysis of reliable administrative databases; estimates from RCTs
Rank 5: Estimates from previously published economic analyses: unsourced
Rank 6: Expert opinion

#Surrogate outcomes = an endpoint measured in lieu of some other, so-called true, endpoint (including survival at the end of a clinical trial as a predictor of lifetime survival)

Page 13

C. Resource use

Rank 1: Prospective data collection or analysis of reliable administrative data conducted for the specific study
Rank 2: Recently published results of prospective data collection or recent analysis of reliable administrative data – same jurisdiction
Rank 3: Unsourced data from previous economic evaluations – same jurisdiction
Rank 4: Recently published results of prospective data collection or recent analysis of reliable administrative data – different jurisdiction
Rank 5: Unsourced data from previous economic evaluation – different jurisdiction
Rank 6: Expert opinion

D. Costs

Rank 1: Cost calculations based on reliable databases or data sources conducted for the specific study – same jurisdiction
Rank 2: Recently published cost calculations based on reliable databases or data sources – same jurisdiction
Rank 3: Unsourced data from previous economic evaluation – same jurisdiction
Rank 4: Recently published cost calculations based on reliable databases or data sources – different jurisdiction
Rank 5: Unsourced data from previous economic evaluation – different jurisdiction
Rank 6: Expert opinion

E. Utilities

Rank 1: Direct utility assessment for the specific study from a sample either: a) of the general population, b) with knowledge of the disease(s) of interest, or c) of patients with the disease(s) of interest; or indirect utility assessment for the specific study from a patient sample with the disease(s) of interest, using a tool validated for the patient population
Rank 2: Indirect utility assessment from a patient sample with the disease(s) of interest, using a tool not validated for the patient population
Rank 3: Direct utility assessment from a previous study from a sample either: a) of the general population, b) with knowledge of the disease(s) of interest, or c) of patients with the disease(s) of interest; or indirect utility assessment from a previous study from a patient sample with the disease(s) of interest, using a tool validated for the patient population
Rank 4: Unsourced utility data from a previous study – method of elicitation unknown
Rank 5: Patient preference values obtained from a visual analogue scale
Rank 6: Delphi panels, expert opinion

Page 14

FLOW DIAGRAM

180 HTAs published 1997-2003
→ 147 out of 180 (73%) considered health economics
→ 48 out of 147 (33%) developed decision models
→ 42 out of 48 (88%) economic evaluation models; 6 out of 48 (13%) cost analysis models

Of the 42 economic evaluation models: 22 out of 42 were NICE appraisals; 26 out of 42 (62%) were decision trees#, 12 out of 42 (29%) Markov models#, and 5 out of 42 (12%) individual sampling models#.

#One HTA reported both decision tree & Markov models, one reported both Markov & individual patient models, and one model type was unclear.

Page 15

GOOD PRACTICE CRITERIA FOR DECISION MODELS (n=42)

YES (%)

Perspective specified 23 (55%)

Description of comparators 42 (100%)

Diagram of model 26 (62%)

Development of model structure & assumptions discussed 5 (12%)

Table of parameters presented 39 (93%)

Source of parameters clearly stated 22 (52%)

Model parameters expressed as distributions 6 (14%)

Model assumptions discussed 39 (93%)

Sensitivity analysis performed 39 (93%)

Key drivers/influential parameters identified 21 (50%)

Statement about test of internal consistency undertaken 1 (2%)

Page 16

RESULTS FROM APPLYING HIERARCHIES OF EVIDENCE (n=42 decision models)

Rank | Clinical Effect Size | Baseline Clinical Data | Adverse Events & Complications (Max / Min) | Resource Use (Max / Min) | Costs (Max / Min) | Utilities
1+ | 16 (38%) | NA | 6 (14%) / 0 (0%) | NA / NA | NA / NA | NA
1 | 7 (17%) | 1 (2%) | 7 (17%) / 1 (2%) | 5 (12%) / 3 (7%) | 1 (2%) / 1 (2%) | 2 (5%)
2+ | 1 (2%) | NA | 0 (0%) / 0 (0%) | NA / NA | NA / NA | NA
2 | 3 (7%) | 21 (50%) | 1 (2%) / 0 (0%) | 8 (19%) / 2 (5%) | 34 (81%) / 6 (14%) | 1 (2%)
3+ | 1 (2%) | NA | 0 (0%) / 0 (0%) | NA / NA | NA / NA | NA
3 | 0 (0%) | 3 (7%) | 0 (0%) / 0 (0%) | 0 (0%) / 0 (0%) | 1 (2%) / 1 (2%) | 9 (21%)
4 | 4 (10%) | 5 (12%) | 6 (14%) / 4 (10%) | 3 (7%) / 3 (7%) | 2 (5%) / 14 (33%) | 6 (14%)
5 | 1 (2%) | 0 (0%) | 0 (0%) / 2 (5%) | 6 (14%) / 4 (10%) | 3 (7%) / 7 (17%) | 1 (2%)
6 | 2 (5%) | 1 (2%) | 5 (12%) / 14 (33%) | 2 (5%) / 11 (26%) | 0 (0%) / 8 (19%) | 2 (5%)
Unclear | 5 (12%) | 11 (26%) | 13 (31%) / 17 (40%) | 18 (43%) / 19 (45%) | 1 (2%) / 5 (12%) | 4 (10%)
N/A | 2 (10%) | 0 (0%) | 4 (10%) / 4 (10%) | 0 (0%) / 0 (0%) | 0 (0%) / 0 (0%) | 17 (40%)

Max=Best source of evidence, Min=Worst source of evidence, N/A=Not applicable

Page 17

[Figure: stacked bar charts (0-100%) showing, for each data component (clinical effect size, baseline clinical data, adverse events & complications (min), resource use (min), costs (min), utilities), the percentage of models at each evidence rank, with ranks grouped as High (Ranks 1-2), Medium (Ranks 3-4), and Low (Ranks 5-6), plus Unclear and N/A.]

Page 18

CONCLUSIONS

• Evidence on the main clinical effect was mostly:

identified & quality assessed (76%) as part of the companion systematic review for the HTA

reported in a fairly transparent & reproducible way.

• For all other model inputs (i.e. adverse events, baseline clinical data, resource use, and utilities):

search strategies for identifying relevant evidence were rarely made explicit

sources of specific evidence were not always reported

Page 19

• Concerns about decision models confirmed by this study:

(1) Use of data from diverse sources (e.g. RCTs, observational studies, expert opinion), which may be subject to varying degrees of bias due to confounding variables, patient selection, or methods of analysis

(2) Lack of transparency regarding the identification of model input data & the key assumptions underlying model structure and evaluation

(3) Bias introduced by the researcher with regard to the choice of model structure & the selection of parameter values input into the model.

CONCLUSIONS

Page 20

• Hierarchies of evidence for different data components provide a useful tool for:

i) assessing the quality of evidence,

ii) promoting transparency, &

iii) identifying the weakest aspects of a model for future work.

• It is acknowledged that highly ranked evidence for certain model parameters may not always be available.

• Value of evidence input into decision models, regardless of position in hierarchy, depends on its quality & relevance to question of interest.

• QUANTITY vs. QUALITY (PRECISION vs. BIAS)

CONCLUSIONS

Page 21

NICE GUIDELINES TO HTA

‘all relevant evidence must be identified’

‘evidence must be identified, quality assessed and, where appropriate, pooled using explicit criteria and justifiable and reproducible methods’

‘explicit criteria by which studies are included or excluded’

• NICE methods guidelines for HTA (2004) lack specific procedural guidance & provide no clear definition of relevant evidence.

Page 22

FURTHER WORK

• How much evidence at any level is ‘sufficient’, & when would there be benefit in identifying evidence at lower levels of the hierarchy?

Page 23

ILLUSTRATIVE EXAMPLE: Effectiveness of aspirin for prevention of stroke in atrial fibrillation compared to placebo

• RCT evidence: Out of 189 RCTs identified in the literature:

• Level 1: 4 direct ‘head-to-head’ RCTs (i.e. aspirin vs. placebo),

• Level 2: 12 indirect RCTs (i.e. aspirin or placebo vs. an alternative treatment, e.g. warfarin), and

• Level ?: 6 ‘unrelated’ RCTs (e.g. warfarin vs. indobufen, warfarin vs. low dose warfarin, etc.)

• All relevant evidence (Levels 1 to 6):

• ?? out of 2518 publications identified

Page 24

NETWORK OF EVIDENCE DIAGRAM (RCTs only)

[Figure: network of RCT evidence. Nodes: Placebo, Aspirin, Warfarin, Low Dose Warfarin, Low Dose Warfarin + Aspirin, Alternate Day Aspirin, Low Molecular Weight Heparin, Ximelagatran, Indobufen, Triflusal, Acenocoumarol. Each edge is labelled with the number of trials directly comparing the two treatments it joins, e.g. 4 aspirin vs. placebo trials.]

Page 25

METHODS FOR COMBINING RCT EVIDENCE

• Direct evidence (i.e. aspirin vs. placebo trials) – straightforward:

- Apply meta-analysis techniques for a pairwise comparison

• Indirect evidence (e.g. aspirin vs. warfarin trials, warfarin vs. placebo trials, etc.) – a little more tricky!

- Need to maintain randomisation by focusing on the relative effects in each RCT (Lumley 2002; Lu & Ades 2004)

- For example: d(Aspirin vs Placebo) = d(Warfarin vs Placebo) – d(Warfarin vs Aspirin)

• Unrelated evidence (e.g. warfarin vs. low dose warfarin) – even more tricky!

- Adds to the between-study variance estimate & sub-links in the network

- For technical details see: http://www.hsrc.ac.uk/Current_research/research_programmes/mixedcomp/Web-4-ref.pdf
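The indirect contrast above can be computed from the two pooled pairwise results, with the variances adding because the two estimates come from separate sets of trials. A sketch with hypothetical pooled values (not the study's numbers):

```python
import math

def indirect_comparison(d_wp, se_wp, d_wa, se_wa):
    """Indirect log effect of aspirin vs. placebo from the warfarin vs.
    placebo (d_wp) and warfarin vs. aspirin (d_wa) pooled log rate ratios.
    Randomisation is preserved because only within-trial contrasts are used."""
    d_ap = d_wp - d_wa
    se_ap = math.sqrt(se_wp ** 2 + se_wa ** 2)  # independent estimates
    ci = (d_ap - 1.96 * se_ap, d_ap + 1.96 * se_ap)
    return d_ap, se_ap, ci

# Hypothetical pooled inputs
d_ap, se_ap, ci = indirect_comparison(-0.9, 0.20, -0.3, 0.25)
```

Note the trade-off the slide alludes to: the indirect estimate widens the standard error relative to either input, whereas combining direct with indirect evidence (as in the results that follow) tightens it.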

Page 26

RESULTS: Aspirin vs. Placebo (log rate ratio scale)

• Direct evidence only (N=4): RR = 0.8 (0.1 to 6.6)

• Direct + Indirect (N=16): RR = 0.5 (0.3 to 0.6)

• Direct + Indirect + Unrelated (N=22): RR = 0.5 (0.4 to 0.6)

Combining direct & indirect (& unrelated) evidence has substantially reduced the uncertainty in the effectiveness estimates.

Page 27

 

[Figure: forest plot of all pairwise comparisons from the mixed treatment comparison, rate ratios on a log scale (0.001 to 10.0): each treatment (WFN, ASP, LDW, LDWA, LMWH, ADA, XML, IBF) shown vs. placebo (PLC), vs. warfarin (WFN), vs. low dose warfarin (LDW), vs. aspirin (ASP), vs. LDWA, vs. XML, vs. LMWH, and vs. IBF, with a line of no difference separating estimates worse than the comparator from those better than the comparator.]

SHOULD WE BE ANSWERING A BROADER QUESTION?

For example:

How do the treatments compare with one another?

OR ….

Page 28

 

RANKING TREATMENTS……… Which treatment is the best?

[Figure: bar chart of estimated ranking probabilities (0 to 0.7) for each treatment: PLC, WFN, LDW, ASP, LDWA, XML, LMWH, IBF, ADA.]
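Rankings like these are typically derived from posterior samples of each treatment's effect: in each simulation draw, rank the treatments, then count how often each one comes out best. A minimal sketch with made-up draws (lower log rate ratio = fewer strokes = better); the treatment names and values are illustrative only:

```python
def prob_best(samples):
    """samples: dict mapping treatment name -> list of posterior draws of its
    log rate ratio vs. a common reference (lower = better).
    Returns the probability that each treatment is ranked best."""
    names = list(samples)
    n_draws = len(next(iter(samples.values())))
    wins = {t: 0 for t in names}
    for i in range(n_draws):
        best = min(names, key=lambda t: samples[t][i])  # best in this draw
        wins[best] += 1
    return {t: wins[t] / n_draws for t in names}

# Made-up posterior draws for two treatments
p = prob_best({"ASP": [-0.6, -0.4, -0.7, -0.5],
               "WFN": [-0.8, -0.3, -0.5, -0.9]})
```

The same loop extends to the probability of each possible rank, which is what a bar chart like the one above summarises.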

Page 29

UNANSWERED QUESTIONS

• How best to identify the relevant evidence?

• How much evidence is sufficient, and when would there be benefit from identifying additional/supplementary evidence?

• How to appropriately assess, and where possible adjust for, the quality of different types of evidence?

- Instruments exist for assessing quality within study designs, but assessment across different study designs is non-trivial (Downs & Black 1998)

• How to appropriately combine/synthesise evidence from different study types? For example,

- meta-analyse all data assuming equal weight,

- observational data as prior for RCT data, or

- hierarchical synthesis model
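The second option above, observational data as a prior for RCT data, can be sketched as a conjugate normal-normal Bayesian update on the log effect scale; the means and standard errors below are hypothetical:

```python
import math

def normal_update(prior_mean, prior_se, data_mean, data_se):
    """Posterior for a log effect: observational evidence as a normal prior,
    RCT evidence as a normal likelihood (a precision-weighted average)."""
    w_prior = 1.0 / prior_se ** 2
    w_data = 1.0 / data_se ** 2
    post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
    post_se = math.sqrt(1.0 / (w_prior + w_data))
    return post_mean, post_se

# Hypothetical: observational prior -0.4 (se 0.3), RCT estimate -0.7 (se 0.3)
post_mean, post_se = normal_update(-0.4, 0.3, -0.7, 0.3)
```

In practice the prior would often be down-weighted (e.g. by inflating its standard error) to reflect the greater risk of bias in observational evidence, which is exactly the judgment the question above asks how to formalise.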

Page 30

WHERE NEXT?

• Two one-day (closed) workshops:

1) “Establishing the current situation” July 2005, Leicester

2) “Appropriate methodology for identifying & combining the evidence” Autumn 2005

• Collaborators: Tony Ades (Bristol), Carole Longson (NICE), Miranda Mugford (East Anglia), Suzy Paisley (Sheffield), Mark Sculpher (York), & Alex Sutton (Leicester)

• MRC HSRC funded workshop “Consensus working group on the use of evidence in economic decision models”

Page 31

REFERENCES

• Ades AE, Welton NJ, Lu G. Introduction to mixed treatment comparisons. http://www.hsrc.ac.uk/Current_research/research_programmes/mixedcomp/Web-4-ref.pdf Accessed May 2005.

• Cooper NJ, Coyle D, Abrams KR, Mugford M, Sutton AJ. Use of evidence in decision models: An appraisal of health technology assessments in the UK to date. Journal of Health Services Research and Policy (in press, 2005).

• Coyle D, Lee KM. Evidence-based economic evaluation: how the use of different data sources can impact results. In: Donaldson C, Mugford M, Vale L (eds). Evidence-based health economics: From effectiveness to efficiency in systematic review. London: BMJ Publishing Group, 2002: 55-66.

• Lu G, Ades AE. Combination of direct and indirect evidence in mixed treatment comparisons. Statistics in Medicine 2004; 23:3105-3124.

• Lumley T. Network meta-analysis for indirect treatment comparisons. Statistics in Medicine 2002; 21:2313-2324.

• Philips Z, Ginnelly L, Sculpher M et al. Review of guidelines for good practice in decision-analytic modelling in health technology assessment. Health Technology Assessment 2004; 8(36).

• National Institute for Clinical Excellence (NICE). Guide to the methods of technology appraisal. London: National Institute for Clinical Excellence, 2004.

Copy of slides available at: http://www.hs.le.ac.uk/personal/njc21/