
Test Accuracy of Cognitive Screening Tests for Diagnosis of Dementia and Multidomain Cognitive Impairment in Stroke

Rosalind Lees, MA; Johann Selvarajah, PhD; Candida Fenton, MSc; Sarah T. Pendlebury, DPhil; Peter Langhorne, PhD; David J. Stott, MD; Terence J. Quinn, MD

Received April 16, 2014; accepted July 31, 2014.
From the Institute of Cardiovascular and Medical Sciences (R.L., P.L., D.J.S., T.J.Q.), Institute of Neuroscience and Psychology (J.S.), MRC/CSO Social and Public Health Sciences Unit (C.F.), University of Glasgow, UK; and NIHR Biomedical Research Centre and Stroke Prevention Research Unit, John Radcliffe Hospital, Oxford, UK (S.T.P.).
Presented in part at the European Stroke Conference, Nice, France, May 6–9, 2014.
The online-only Data Supplement is available with this article at http://stroke.ahajournals.org/lookup/suppl/doi:10.1161/STROKEAHA.114.005842/-/DC1.
Correspondence to Terence J. Quinn, MD, Institute of Cardiovascular and Medical Sciences, University of Glasgow, Room 2.44, New Lister Building, Glasgow Royal Infirmary, 10–16 Alexandra Parade, Glasgow, UK. E-mail [email protected]
© 2014 American Heart Association, Inc.
Stroke is available at http://stroke.ahajournals.org DOI: 10.1161/STROKEAHA.114.005842

Background and Purpose—Guidelines recommend screening stroke-survivors for cognitive impairments. We sought to collate published data on test accuracy of cognitive screening tools.

Methods—The index test was any direct, cognitive screening assessment compared against reference standard diagnosis of (undifferentiated) multidomain cognitive impairment/dementia. We used a sensitive search statement to search multiple, cross-disciplinary databases from inception to January 2014. Titles, abstracts, and articles were screened by independent researchers. We described risk of bias using the Quality Assessment of Diagnostic Accuracy Studies tool and reporting quality using Standards for Reporting of Diagnostic Accuracy guidance. Where data allowed, we pooled test accuracy using bivariate methods.

Results—From 19 182 titles, we reviewed 241 articles, 35 suitable for inclusion. There was substantial heterogeneity: 25 differing screening tests; differing stroke settings (acute stroke, n=11 articles), and reference standards used (neuropsychological battery, n=21 articles). One article was graded low risk of bias; common issues were case–control methodology (n=7 articles) and missing data (n=22). We pooled data for 4 tests at various screen-positive thresholds: Addenbrooke's Cognitive Examination-Revised (<88/100): sensitivity 0.96, specificity 0.70 (2 studies); Mini Mental State Examination (<27/30): sensitivity 0.71, specificity 0.85 (12 studies); Montreal Cognitive Assessment (<26/30): sensitivity 0.95, specificity 0.45 (4 studies); MoCA (<22/30): sensitivity 0.84, specificity 0.78 (6 studies); Rotterdam-CAMCOG (<33/49): sensitivity 0.57, specificity 0.92 (2 studies).

Conclusions—Commonly used cognitive screening tools have similar accuracy for detection of dementia/multidomain impairment, with no clearly superior test and no evidence that screening tools with longer administration times perform better. MoCA at the usual threshold offers short assessment time with high sensitivity but at a cost of specificity; adapted cutoffs have improved specificity without sacrificing sensitivity. Our results must be interpreted in the context of modest study numbers, heterogeneity, and potential bias. (Stroke. 2014;45:3008-3018.)

Key Words: cognitive impairment ◼ dementia ◼ sensitivity ◼ specificity ◼ stroke

Stroke-survivors are at particular risk of cognitive decline. Three-month dementia prevalence is ≥30%, and even minor stroke events have cognitive sequelae.1,2 Poststroke cognitive impairment is associated with increased mortality, disability, and institutionalization.3 The importance of cognitive change is highlighted by stroke-survivors themselves. In a national priority setting exercise, cognitive impairment was voted the single most important topic for stroke research.4

A first step in management of cognitive problems is recognition and diagnosis. Informal clinician assessment will miss important cognitive problems,5 and formal cognitive testing is recommended.6–8 The ideal would be expert, multidisciplinary assessment informed by comprehensive investigations. This approach is not feasible at a population level. In practice, a 2-step system is adopted, with baseline cognitive testing used for screening or triage and specialist assessment, offered depending on the results, to define the cognitive problem.

Although there is general agreement on the merits of poststroke cognitive assessment, there is no consensus on a preferred testing strategy.6–8 Various cognitive screening tools are available, with substantial variation in test used.9,10

The clinical meaning of cognitive problems after stroke will vary according to test context. Cognitive impairment diagnosed in the first days post stroke may reflect a mix of delirium, stroke-specific impairments, and prestroke cognitive decline.2,11,12 In the longer term, assessments aim to make or refute a dementia diagnosis. Common to all test situations is a final diagnosis of presence/absence of clinically important impairments. A screening assessment should detect this clinical syndrome of all-cause, poststroke multidomain cognitive impairment.


Lees et al Cognitive Screening in Stroke 3009

Collation and synthesis of the evidence around cognitive screening tools will help guide policy and practice and highlight knowledge gaps. We sought to perform a systematic review and meta-analysis describing accuracy of screening tools for assessing dementia and multidomain cognitive impairment in stroke-survivors.

Methods
We used systematic literature review and meta-analysis techniques specific to test accuracy13 and followed best practice in reporting.14

Aims
Coprimary aims were to describe the following:

1. Accuracy of cognitive screening tests for clinical diagnosis of multidomain cognitive impairment/dementia in stroke-survivors.

2. Accuracy of brief (<5 minutes completion time) cognitive screening tests against more detailed neuropsychological assessments.

If data allowed, secondary objectives were to compare differing cutpoint scores used to define a threshold of test positivity and to compare the effects of heterogeneity with specific reference to test context and diagnostic reference standard.

Index Test
Index tests of interest were any direct-to-patient cognitive screening tests. We included any screening test where the authors described it as such. We excluded informant-based assessments and tests that require testing equipment considered nonstandard for a stroke service. We recognize that language and visuospatial function are important components of cognition, but we did not include tools designed exclusively to test these domains. We did not include studies that compared screening tools with no reference to a diagnostic gold standard.

Target Condition and Reference Standard
Our target condition of interest was all-cause (undifferentiated) multidomain cognitive impairment post stroke. This rubric recognizes that a diagnosis of important poststroke cognitive problems can be made without necessarily assigning a dementia label or specifying a pathological subtype.

As reference standard, we included clinical diagnosis of dementia made using any recognized classification system. We also included multidomain cognitive problems as detailed on a neuropsychological assessment, provided the test battery was comprehensive and the results were interpreted to give a diagnostic formulation. We did not include single domain cognitive impairment or cognitive impairment no dementia because of inconsistency in operationalization of these syndromes.15 We prespecified subgroup analyses comparing dementia diagnosis and neuropsychological battery-based diagnosis.

For assessment of brief tests, we accepted results from a more detailed multidomain screening assessment as reference standard, for example, comparison of Hodgkinson's abbreviated mental test against Folstein's Mini Mental State Examination (MMSE).

Participants and Setting
Our focus was stroke-survivors. Where the study population was a mix of stroke-survivors and other patients, we included the study if the proportion of stroke-survivors was >75%. We made no distinction between stroke subtypes, but excluded studies of traumatic intracerebral hemorrhage and subarachnoid hemorrhage. We did not include case studies (defined as having fewer than 10 participants).

We included studies conducted in any clinical setting and at any time post stroke. We operationalized time since stroke as hyperacute (first 7 days), acute (8–14 days), post acute (15 days to 3 months), medium term (3–12 months), and longer term (beyond 1 year).
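The timing categories above can be sketched as a simple classifier. A minimal illustration (the function name is ours, not the authors'; months are approximated as 90/365 days, which is an assumption):

```python
def stroke_phase(days_since_stroke: int) -> str:
    """Classify time since stroke per the review's operationalization.

    Assumption for illustration: 3 months ~ 90 days, 12 months ~ 365 days.
    """
    if days_since_stroke <= 7:
        return "hyperacute"    # first 7 days
    if days_since_stroke <= 14:
        return "acute"         # 8-14 days
    if days_since_stroke <= 90:
        return "post acute"    # 15 days to 3 months
    if days_since_stroke <= 365:
        return "medium term"   # 3-12 months
    return "longer term"       # beyond 1 year
```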

Search Strategy
All aspects of searching, data extraction, and study assessment were performed by 2 reviewers (RL, JS) based in different centers and blinded to each other's results. On review of paired data, disagreement was resolved by group discussion.

We developed a sensitive search strategy in collaboration with an Information Scientist (CF) and with assistance from the Cochrane Dementia and Cognitive Improvement Group. Search terms were developed using a concepts-based approach, using Medical Subject Heading terms and other controlled vocabulary. Concepts of interest were stroke, dementia, and cognitive assessment. Our cognitive assessment concept included terms relating to cognitive tests used in stroke, based on previous survey data.9,10 We searched multiple, international, cross-disciplinary electronic databases from inception to January 2014. (Full search strategy is detailed in the online-only Data Supplement I–II.) We checked reference lists of relevant studies and reviews for further titles, repeating the process until no new titles were found.

We screened all titles generated by initial searches for relevance; corresponding abstracts were assessed, and potentially eligible studies were reviewed as full manuscripts against inclusion criteria. As a check of internal validity, a random selection of 2000 titles from the original search was reassessed by a third author (TQ). We tested external validity of our search by sharing lists of included articles with independent researchers (Acknowledgments). One author (TQ) identified exemplar articles relevant to the review question (online-only Data Supplement II), and we assessed whether these titles were detected by the search strategy.

Data Extraction and Management
We extracted data to a study-specific proforma piloted against 2 articles (available from the contact author).

For screening tests that give an ordinal summary score, various cutpoints can be used to define test-positive cases. Where data were given for several thresholds, we extracted separate data for each. Where a study may have included useable data that were not presented in the published article, we contacted the authors directly. If the same data set was presented in >1 publication, we included the primary article.

Quality Assessment
We assessed quality of study reporting using the dementia-specific extension to the Standards for Reporting of Diagnostic Accuracy checklist.16 We assessed methodological quality using the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2).17 We have previously developed QUADAS-2 anchoring statements specific to cognitive assessment.18

Statistical Analysis
We assessed accuracy of screening tests against a dichotomous variable (cognitive impairment/no cognitive impairment). We created standard 2-by-2 data tables describing binary test results cross-classified with the binary reference standard. We did not include case–control studies in pooled analyses. We used accepted cut-offs for multidomain impairment/dementia published in the literature: Addenbrooke's Cognitive Examination-Revised <88, Rotterdam-CAMCOG <33, and MMSE <27 and <25. For the Montreal Cognitive Assessment (MoCA), a cut-off of <26 was used together with the lower cut-off of <22 because the former was developed to detect single-domain impairment.
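The 2-by-2 construction can be illustrated in code. A sketch of our own (not taken from the paper), cross-classifying screen-positive status at a chosen cutoff against the reference-standard diagnosis:

```python
def two_by_two(results, cutoff):
    """Build a (tp, fp, fn, tn) table from (score, impaired) pairs.

    A score below the cutoff counts as screen positive, matching the
    '<' thresholds used in the review (e.g., MoCA <26).
    """
    tp = fp = fn = tn = 0
    for score, impaired in results:
        positive = score < cutoff
        if positive and impaired:
            tp += 1          # screen positive, reference positive
        elif positive and not impaired:
            fp += 1          # screen positive, reference negative
        elif impaired:
            fn += 1          # screen negative, reference positive
        else:
            tn += 1          # screen negative, reference negative
    return tp, fp, fn, tn
```

For example, `two_by_two([(20, True), (28, False)], 26)` counts one true positive and one true negative.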

Where data allowed, we calculated sensitivity, specificity, and corresponding 95% confidence intervals (95% CI) and created test accuracy forest plots (RevMan 5.1, Cochrane Collaboration). We pooled test accuracy data using the bivariate approach (Statistical Analysis Software v9.1; SAS Institute Inc, USA), using a bespoke macro developed with the assistance of a statistical team with an interest in test accuracy.19 Summary metrics of interest were sensitivity/specificity, positive/negative likelihood ratios, and clinical utility index.20 We created summary curves in receiver operating characteristic space with corresponding 95% prediction intervals.
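The per-study summary metrics derive from the 2-by-2 table in a standard way. As an illustration (not the authors' SAS macro), a minimal sketch using the common definitions of the likelihood ratios and of the clinical utility index (CUI+ = sensitivity × PPV; CUI− = specificity × NPV), with normal-approximation (Wald) 95% CIs:

```python
from math import sqrt

def accuracy_metrics(tp, fp, fn, tn):
    """Test accuracy metrics from one study's 2-by-2 table: index test
    (positive/negative) cross-classified with reference standard
    (impaired/not impaired)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)

    def wald_ci(p, n):
        # Normal-approximation 95% CI for a proportion; crude at p near 0 or 1.
        se = sqrt(p * (1 - p) / n)
        return (max(0.0, p - 1.96 * se), min(1.0, p + 1.96 * se))

    return {
        "sensitivity": sens, "sens_95ci": wald_ci(sens, tp + fn),
        "specificity": spec, "spec_95ci": wald_ci(spec, tn + fp),
        "LR+": sens / (1 - spec),  # how much a positive screen raises the odds
        "LR-": (1 - sens) / spec,  # how much a negative screen lowers the odds
        "CUI+": sens * ppv,        # clinical utility index for case finding
        "CUI-": spec * npv,        # clinical utility index for screening out
    }
```

For instance, a study with tp=80, fp=10, fn=20, tn=90 gives sensitivity 0.80, specificity 0.90, and LR+ 8.0. Note the bivariate pooling itself models correlated logit-sensitivity and logit-specificity across studies and is not reproduced here.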

We assessed potential heterogeneity through visual inspection of forest plots. We prespecified 2 factors that may contribute to heterogeneity: timing of assessment and reference standard. We dichotomized studies into acute (classified as hyperacute or acute) or nonacute and described reference standard as clinical (clinical diagnosis of dementia) or neuropsychological battery (multidomain cognitive impairment). We assessed effect by plotting summary receiver operating characteristic curves by covariate. Taking timing of assessment as an example, we calculated sensitivity/specificity restricted to the single cognitive assessment with the greatest number of studies. We described a relative sensitivity/specificity comparing acute and nonacute studies, where a result of unity suggests no difference in that diagnostic property between settings. We did not quantify publication bias because there is no assessment applicable to test accuracy.
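The relative sensitivity/specificity comparison can be sketched as follows. This is a deliberately simplified illustration that pools raw counts, not the bivariate random-effects model the review used, so its numbers would differ from the published estimates; the counts below are hypothetical:

```python
def pooled_sens_spec(tables):
    """Crudely pool 2-by-2 counts (tp, fp, fn, tn) across the studies in
    one subgroup. Illustration only: the review's bivariate model weights
    studies differently."""
    tp, fp, fn, tn = (sum(t[i] for t in tables) for i in range(4))
    return tp / (tp + fn), tn / (tn + fp)

def relative_accuracy(reference_group, comparison_group):
    """Relative sensitivity and specificity of the comparison subgroup vs
    the reference subgroup; a value of 1.0 (unity) suggests no difference."""
    sens_ref, spec_ref = pooled_sens_spec(reference_group)
    sens_cmp, spec_cmp = pooled_sens_spec(comparison_group)
    return sens_cmp / sens_ref, spec_cmp / spec_ref

# Hypothetical counts: an acute subgroup with higher sensitivity and a
# nonacute subgroup with higher specificity.
acute = [(90, 20, 10, 80)]    # pooled sens 0.90, spec 0.80
nonacute = [(63, 8, 27, 72)]  # pooled sens 0.70, spec 0.90
rel_sens, rel_spec = relative_accuracy(acute, nonacute)
# rel_sens < 1 and rel_spec > 1, mirroring the direction of the MMSE result
```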

Where articles fulfilled inclusion criteria but did not have data suitable for this method of analysis, we offer a tabulated/narrative description.

Results
From 19 182 titles, we reviewed 241 full articles, of which 35 (34 data sets)21–56 were suitable for inclusion. The scope of included literature was international, with articles from 16 different countries.

We detailed the study selection process in a Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) flow diagram (Figure 1).

Our validation check suggested the initial search was appropriate because all prespecified articles were found on first search.

Figure 1. Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA) flow diagram. ACE-R indicates Addenbrooke's cognitive examination-revised; AMT, Abbreviated Mental Test; FN, false-negative; FP, false-positive; HSROC, Hierarchical Summary ROC; MMSE, mini-mental state examination; MoCA, Montreal cognitive assessment; R-CAMCOG, Rotterdam-CAMCOG; ROC, receiver operating characteristic; TN, true-negative; and TP, true-positive.


Accuracy of Screening Tools for Diagnosis of Cognitive Impairment
In total, 32 articles (n=3562 participants) were eligible.21–53 We tabulated summary descriptors for studies using a clinical diagnosis reference standard (n=11 articles)22,24,34,37–39,43,47,49,51,52 and those using detailed neuropsychological assessment (n=21; Tables 1–2).21,23,25–33,35,36,40–42,44–46,48,50

There was considerable heterogeneity in study population, setting, and test strategy. Twenty-three different tests were described, the commonest being MMSE (n=16 articles) and MoCA (n=8; Tables 1–2; online-only Data Supplement III).

Screening tests gave a spread of test accuracy, with sensitivity ranging from 14% to 100% and specificity from 0% to 100%, and a trade-off between sensitivity and specificity. Where authors compared >1 test in the same population, the comparator was usually MMSE, and in most articles MMSE was more specific and less sensitive than other tests. Both the Executive Function Performance Test and the Repeatable Battery for Assessment of Neuropsychological Status showed strong correlations with scores on neuropsychological batteries, but data were not suitable for pooled analysis.25–28

Quality and Reporting
One article was graded low risk for all QUADAS-2 domains.21 Common issues of concern were use of case–control methodology (n=7 articles) and potential lack of blinding (n=12; Figure 2; online-only Data Supplement IV). Five articles attempted to include patients with moderate to severe aphasia.21–25

Standards for Reporting of Diagnostic Accuracy assessment suggested consistent areas of poor reporting, particularly around the handling of missing data (n=22 articles) and descriptions of training and expertise of assessors (n=25 articles; online-only Data Supplement V).

Meta-Analyses
We were able to pool test accuracy data for 4 tests: Addenbrooke's Cognitive Examination-Revised (threshold score ≤88); MMSE (thresholds ≤24 and ≤26); MoCA (thresholds <26 and <22); and Rotterdam-CAMCOG (threshold <33). No test had sensitivity and specificity that were significantly different from the others. MoCA at the traditional threshold was sensitive at the cost of specificity; specificity improved if thresholds were adjusted (Table 3; Figure 3; online-only Data Supplement VI).

Heterogeneity
We assessed effect of timing and reference standard using MMSE data. Comparing 6 acute studies20,21,31,37,38,43 and 6 nonacute studies22,30,34,39,46,50 gave a relative sensitivity of 0.73 (95% CI, 0.58–0.93) and relative specificity of 1.12 (95% CI, 1.01–1.25). These data suggest that accuracy varies with assessment timing and context, with acute testing yielding higher sensitivity and lower specificity. From analysis of paired summary receiver operating characteristic curves, overall accuracy was better for those studies where assessment was performed within the acute period (online-only Data Supplement VII). Comparing clinical dementia reference standard21,24,40,47,52 against neuropsychological battery23,32,33,36,39,41,45,46 suggested no difference in test properties dependent on the reference standard used, with relative sensitivity of 0.86 (95% CI, 0.67–1.11) and relative specificity of 1.05 (95% CI, 0.95–1.16).

Table 1. Test Accuracy Data for Studies Where Reference Standard Was Clinical Diagnosis of Dementia

| Study | n Included | n (%) With Dementia | Index Test (Threshold) | Summary Results |
|---|---|---|---|---|
| Brookes34* | 152 | N/A | BMET | Discriminates AD/SVD |
| Cumming 201022 | 149 | 42 (28%) | Cog4 | Sens: 53%; Spec: 84% |
| de Koning 200037 | 300 | 55 (19%) | R-CAMCOG (33) | Sens: 91%; Spec: 90% |
| de Koning 200538 | 121 | 35 (29%) | R-CAMCOG (33) | Sens: 66%; Spec: 94% |
| de Koning52 | 300 | 55 (19%) | CAMCOG (81) | Sens: 91%; Spec: 80% |
| | | | MMSE (25) | Sens: 80%; Spec: 76% |
| Dong 201239 | 300 | 60 (25%) | MoCA (21) | Sens: 88%; Spec: 64% |
| | | | MMSE (24) | Sens: 88%; Spec: 67% |
| Hershey42 | 63 | 13 (21%) | CSCE (20) | Sens: 85%; Spec: 87% |
| Hobson43* | 149 | N/A | PNB (55) | Sens: 71%; Spec: 93% |
| Srikanth24 | 67 | 8 (12%) | MMSE (23) | Sens: 14%; Spec: 100% |
| Tang47 | 142 | 10 (12%) | MDRS (22) | Sens: 82%; Spec: 90% |
| | | | MMSE (18) | Sens: 80%; Spec: 88% |
| Wu49 | 206 | 95 (46%) | MoCA (23) | Sens: 65%; Spec: 79% |

Where >1 test threshold was described, we present the primary data. AD indicates Alzheimer's disease; BMET, brief memory and executive test; CSCE, cognitive capacity screening examination; MDRS, Mattis dementia rating scale; MMSE, mini-mental state examination; MoCA, Montreal cognitive assessment; N/A, not applicable; PNB, preliminary neuropsychological battery; R-CAMCOG, Rotterdam-CAMCOG; and SVD, small vessel disease.

*Case–control study, data not included in pooled analyses.


Accuracy of Brief Tools
We found 3 suitable articles (n=294 participants; online-only Data Supplement VIII).53–55 Two articles were graded high risk of bias and applicability concerns because of case–control methodology and patient inclusion.53,55

Discussion
Previous systematic reviews of cognitive test accuracy have focused on older adults, with narrative results only. Our aim was to provide a stroke-specific literature synthesis to allow evidence-based recommendations on cognitive testing. We were partially successful in this aim. Although there is an extensive, cross-disciplinary literature describing cognitive testing in stroke, the number of articles using the classical test accuracy paradigm of index test versus reference (gold) standard was limited. Eligible articles were characterized by substantial heterogeneity and risk of bias, and the potential to describe summary analyses at individual test level was limited.

Accepting these caveats, we can still offer conclusions from our pooled analysis. There was no clearly superior cognitive screening test. Given the relative consistency in accuracy, choice of test strategy should be informed by other factors,

Table 2. Reference Standard of Cognitive Impairment Based on Multidomain Neuropsychological Battery

| Study | n Included | n (%) With Dementia | Index Test (Threshold) | Summary Results |
|---|---|---|---|---|
| Agrell31* | 64 | N/A | MMSE (23) | Sens: 56%; Spec: 24% |
| Baum26* | 73 | N/A | EPFT | Correlates with NPB |
| Blake32 | 112 | 31 (28%) | MMSE (24) | Sens: 62%; Spec: 88% |
| Bour33 | 194 | 22 (11%) | MMSE (23) | Sens: 96%; Spec: 83% |
| Cartoni35 | 30 | 27 (90%) | MEAMS (3) | Sens: 52%; Spec: 100% |
| Cumming 201336 | 60 | 39 (65%) | MMSE (24) | Sens: 54%; Spec: 81% |
| | | | MoCA (24) | Sens: 97%; Spec: 52% |
| Desmond29* | 72 | 6 (8%) | TICS (25) | Sens: 100%; Spec: 83% |
| | | | MMSE (24) | Sens: 83%; Spec: 87% |
| Dong 201040 | 100 | 60 (60%) | MMSE (24) | Sens: 86%; Spec: 82% |
| | | | MoCA (21) | Sens: 90%; Spec: 77% |
| Godefroy23 | 95 | 64 (67%) | MMSE (24) | Sens: 70%; Spec: 94% |
| | | | MoCA (24) | Sens: 92%; Spec: 58% |
| Grace41 | 70 | 32 (46%) | MMS (79) | Sens: 69%; Spec: 79% |
| | | | MMSE (24) | Sens: 44%; Spec: 84% |
| Green27 | 60 | N/A | RBANS (3) | Sens: 84%; Spec: 90% |
| Jodzio44 | 44 | 24 (55%) | WCST | PPV: 21; NPV: 75 |
| Larson28 | 158 | N/A | RBANS | Correlates with NPB |
| Morris45 | 101 | 51 (84%) | MMSE (24) | Sens: 58%; Spec: 77% |
| | | | ACE-R (88) | Sens: 94%; Spec: 0% |
| Nokleby51 | 49 | 28 (60%) | COGNISTAT (59) | Sens: 59%; Spec: 67% |
| | | | SINS (2) | Sens: 71%; Spec: 67% |
| | | | CDT (9) | Sens: 63%; Spec: 67% |
| Nys50* | 72 | N/A | MMSE (28) | Sens: 100%; Spec: 40% |
| Pendlebury30 | 91 | 19 (21%) | MoCA (25) | Sens: 100%; Spec: 46% |
| | | | MMSE (24) | Sens: 56%; Spec: 96% |
| | | | ACE-R (88) | Sens: 84%; Spec: 85% |
| Pendlebury46 (telephone) | 91 | 19 (21%) | T'phone MoCA (19) | Sens: 89%; Spec: 46% |
| | | | TICSm (25) | Sens: 85%; Spec: 56% |
| Salvadori21 | 80 | 47 (59%) | MoCA (21) | Sens: 91%; Spec: 76% |
| Wolf25 | 20 | N/A | EPFT | Correlates with NPB |
| Wong48* | 68 | N/A | MoCA (21) | Sens: 73%; Spec: 75% |

Where >1 test threshold was described, we present the primary data. ACE-R indicates Addenbrooke's cognitive examination-revised; CDT, clock drawing test; EPFT, executive performance function test; MEAMS, middlesex elderly assessment mental state; MMSE, mini-mental state examination; MoCA, Montreal cognitive assessment; N/A, not applicable; NPB, neuropsychological battery; RBANS, repeatable battery for assessment of neuropsychological status; SINS, screening instrument neuropsychological impairments in stroke; TICS, telephone interview cognitive status; and WCST, Wisconsin Card Sorting Test.

*Case–control study, data not included in pooled analyses.

Downloaded from http://stroke.ahajournals.org/ by guest on May 12, 2018

Lees et al Cognitive Screening in Stroke 3013

such as purpose of testing, feasibility, acceptability, and opportunity cost. Recent guidance and practice have tended to favor novel tests over the traditionally popular MMSE.6,7,56 Although there may be good reasons to favor other tests, our data do not suggest that the MMSE is inferior for the diagnosis of multidomain impairment, albeit the MMSE may lack sensitivity for single domain impairment.15,46,56 There was a trend toward better clinical utility for the Rotterdam-CAMCOG, but the small number of studies and correspondingly wide confidence intervals preclude a definitive recommendation.
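The clinical utility index referred to above combines discrimination and predictive value: positive utility is sensitivity × PPV and negative utility is specificity × NPV.20 A minimal sketch of how the metrics reported in this review relate, using hypothetical 2×2 counts (not drawn from any included study):

```python
# Hypothetical 2x2 test-accuracy table (illustrative counts only).
tp, fp, fn, tn = 45, 12, 8, 85

sensitivity = tp / (tp + fn)   # impaired patients correctly screened positive
specificity = tn / (tn + fp)   # unimpaired patients correctly screened negative
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value

lr_positive = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_negative = (1 - sensitivity) / specificity   # negative likelihood ratio

# Clinical utility index (sensitivity x PPV; specificity x NPV), per reference 20.
cui_positive = sensitivity * ppv
cui_negative = specificity * npv

print(round(sensitivity, 2), round(specificity, 2))
print(round(cui_positive, 2), round(cui_negative, 2))
```

With these illustrative counts, the positive clinical utility index (≈0.67) falls in the same range as the pooled values reported in Table 3.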

Preferred test properties will depend on test purpose, particularly the level of cognitive impairment to be detected. It could be argued that for initial screening, sensitivity is preferred over specificity. In this case, the MoCA and the Addenbrooke's Cognitive Examination-Revised may be preferred. The high false-positive rate for multidomain impairment obtained using MoCA <26 is to be expected because this threshold was chosen for detection of single domain impairment/mild cognitive impairment.15,46 Our findings suggest that an adjusted cut-off of <22 has improved overall test properties for multidomain impairment in stroke, as suggested previously.15,46

We were pragmatic in our choice of index test and reference standard. Our screening test rubric included relatively short assessments (MoCA) through to fairly lengthy batteries (Repeatable Battery for Assessment of Neuropsychological Status). We did not find significant improvements in sensitivity/specificity comparing shorter and longer screens (MoCA and Addenbrooke's Cognitive Examination-Revised). This may suggest that increasing the length of test batteries does not necessarily improve test accuracy.

There is no universally accepted gold standard for dementia. Rather than restricting to a particular pathological subtype, our focus was all-cause dementia post stroke. We think this approach is in keeping with current clinical practice, where initial screening highlights potential cognitive issues that can subsequently be characterized with more detailed assessments. Literature describing screening tests for diagnosis of specific dementia subtypes was limited, and we were only able to describe test accuracy for all-cause (undifferentiated) dementia. A priori, we decided to include multidomain impairment based on neuropsychological assessment in our reference standard; this approach maximized the potential for pooled analysis and recognizes that clinical diagnosis in certain stroke situations is not always feasible or appropriate. We used multidomain rather than single domain impairment because this is closest to current operational classifications of dementia. We did not specify the content of the neuropsychological assessment and recognize the substantial heterogeneity in batteries used.15

There were many potential sources of heterogeneity (sample size, case–control methodology, investigator training); indeed, a degree of heterogeneity is expected in test accuracy meta-analyses. We were not able to assess all potentially contributory factors across the modest number of included studies and accept that this limits our findings. We chose to focus on reference standard and timing of assessment as the most important possible sources of heterogeneity. We found that test properties varied with timing and seemed to favor earlier assessment, albeit our rubric of acute assessment included studies from the first hours to ≤14 days post ictus. Comparative analyses of test properties require careful interpretation; a conservative interpretation would be

Table 3. Pooled Test Accuracy for 4 Cognitive Screening Tests Against a Reference Standard of Cognitive Impairment

| Test (Threshold) | Articles (Patients) | Cognitive Impairment, n (%) | Sensitivity (95% CI) | Specificity (95% CI) | Positive Likelihood Ratio (95% CI) | Negative Likelihood Ratio (95% CI) | Clinical Utility Index |
| ACE-R (<88/100) | 2 (192) | 52 (27%) | 0.96 (0.90–1.0) | 0.70 (0.59–0.80) | 3.19 (2.24–4.54) | 0.06 (0.01–0.22) | +0.67; −0.67 |
| MMSE (<25/30) | 12 (1639) | 483 (30%) | 0.71 (0.60–0.80) | 0.85 (0.80–0.89) | 4.73 (3.63–6.17) | 0.34 (0.25–0.47) | +0.46; −0.70 |
| MMSE (<27/30) | 5 (445) | 195 (44%) | 0.88 (0.82–0.92) | 0.62 (0.50–0.73) | 2.33 (1.72–3.17) | 0.19 (0.13–0.29) | +0.65; −0.54 |
| MoCA (<22/30) | 6 (726) | 289 (39%) | 0.84 (0.76–0.89) | 0.78 (0.69–0.84) | 3.75 (2.77–5.08) | 0.20 (0.15–0.29) | +0.59; −0.63 |
| MoCA (<26/30) | 4 (326) | 131 (40%) | 0.95 (0.89–0.98) | 0.45 (0.34–0.57) | 1.73 (1.43–2.10) | 0.10 (0.04–0.23) | +0.60; −0.36 |
| R-CAMCOG (<33/49) | 2 (421) | 90 (21%) | 0.81 (0.57–0.93) | 0.92 (0.87–0.95) | 10.18 (6.41–16.18) | 0.20 (0.07–0.52) | +0.59; −0.86 |

For clinical utility index, +, positive; −, negative. ACE-R indicates Addenbrooke's cognitive examination-revised; MMSE, mini-mental state examination; MoCA, Montreal cognitive assessment; and R-CAMCOG, Rotterdam-CAMCOG.
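The pooled likelihood ratios in Table 3 can be translated into post-test probabilities via the odds form of Bayes' theorem. A sketch, using the pooled MMSE (<25/30) likelihood ratios and assuming a 30% pre-test probability (the prevalence in that pooled sample):

```python
def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Convert a pre-test probability to a post-test probability via odds."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Pooled MMSE (<25/30) values from Table 3; 30% assumed pre-test probability.
p_after_positive = post_test_probability(0.30, 4.73)  # positive screen, LR+
p_after_negative = post_test_probability(0.30, 0.34)  # negative screen, LR-

print(round(p_after_positive, 2), round(p_after_negative, 2))
```

Under these assumptions, a positive MMSE screen raises the probability of multidomain impairment from 30% to roughly two thirds, whereas a negative screen lowers it to roughly one in eight, illustrating the modest rule-out power implied by an LR− of 0.34.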

Figure 2. Assessment of risk of bias and applicability concerns using the revised Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). Assessment at individual study level is available in online-only Data Supplement.


3014 Stroke October 2014

Figure 3. Summary receiver operating characteristic (ROC) curve and forest plot describing test accuracy studies of (A) Folstein's mini-mental state examination (MMSE) at threshold of <25/30; and (B) MMSE at threshold of <27/30. (Continued)


Figure 3 (Continued). Summary ROC curve and forest plot describing test accuracy studies of (C) Montreal cognitive assessment (MoCA) at threshold of <26/30; and (D) MoCA at threshold of <22/30. The filled circle represents the summary (pooled) test accuracy; unfilled circles represent individual articles. The solid line is the summary ROC curve; the broken line is the 95% confidence interval (CI) around the summary point.


that a test suitable for longer-term poststroke assessment may not be suitable for early assessment and that cognitive testing can (and should) be performed in the acute stroke unit.

Brief assessment tools are attractive for use in busy stroke settings, but only if they have acceptable accuracy. In this regard, the limited number of studies looking at brief assessments is unfortunate. Based on the data available, screening using the clock drawing test or abbreviated mental test may have a role for initial assessment, but they should not be considered a substitute for subsequent multidomain testing.

Limitations of Included Studies
Our review highlights issues in the design and reporting of cognitive test accuracy studies. There was little reporting of missing or indeterminate data; however, we know that substantial numbers of stroke patients are unable to complete multidomain screening tools. The same concern holds for a reference standard based on extensive neuropsychological testing. Limiting test accuracy data to those able to complete testing will bias results, tending to inflate test accuracy. We would encourage use of the intention to diagnose approach, where the traditional 2×2 test accuracy table is expanded with cells representing those unable to complete the index test and reference standard.57
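The intention to diagnose approach can be sketched numerically. In this illustration (counts hypothetical, and the convention of counting untestable patients as screen positive is one possible analytic choice, not necessarily that of reference 57), the shift between completer-only and intention-to-diagnose estimates shows how restricting to completers changes apparent accuracy:

```python
# Hypothetical counts: standard 2x2 cells plus patients unable to
# complete the index test, split by reference-standard result.
tp, fp, fn, tn = 40, 10, 10, 60
unable_impaired, unable_unimpaired = 15, 5  # could not complete index test

# Conventional analysis restricted to completers:
sens_completers = tp / (tp + fn)

# Intention-to-diagnose variant: count untestable patients as screen
# positive, on the reasoning that inability to complete a cognitive
# screen is itself a concerning finding warranting further assessment.
sens_itd = (tp + unable_impaired) / (tp + fn + unable_impaired)
spec_itd = tn / (tn + fp + unable_unimpaired)

print(round(sens_completers, 2), round(sens_itd, 2), round(spec_itd, 2))
```

Other conventions (for example, a separate "non-evaluable" category in a 3×2 table, as reference 57 describes) are possible; the key point is that untestable patients appear in the denominators rather than being silently dropped.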

A related issue concerns the generalizability of the included subjects. In our quality assessment, we scored several articles as having inappropriate exclusions, including studies that excluded patients with moderate to severe aphasia or inability to consent. We recognize the challenges of cognitive testing in this group, who by definition have at least single domain impairment; however, excluding patients with communication problems or frank confusion from test accuracy research will limit external validity.

Strengths and Limitations of Review
Our review provides a contemporary synthesis of the rapidly evolving field of cognitive screening in stroke. We offer a sensitive search strategy following best practice in conduct and reporting58 and with multiple embedded internal and external validity checks. The review has limitations; our focused question excluded potentially informative articles, for example, where screening tests are compared against each other.59 We did not assess other important test metrics, such as responsiveness to change or reliability, or other cognitive states, such as (single domain) mild cognitive impairment. These questions would need their own review. We recognize that cognitive assessment in stroke is evolving; 15 (47%) of the included articles in our primary review were published since 2010. Novel stroke cognition assessments are emerging, but no test accuracy data were available at the time of review.

Future Research
Our subgroup analysis suggests that test accuracy will vary depending on time since stroke, but we could not suggest an optimal time for assessment. Early assessment in the acute stroke unit has practical advantages and could allow for timely intervention; however, few studies assessed patients in the acute period. Given the issues with generalizability and missing data, future studies may wish to describe feasibility and acceptability of testing, as well as classical test accuracy metrics.

Conclusions
All our results must be interpreted with caution as included studies had substantial heterogeneity and potential for bias. However, we can offer some practical advice to clinicians. There is no clearly superior cognitive screening test approach and no evidence that newer or longer screening tests necessarily offer better test accuracy than established screens with shorter administration time. The strategy for cognitive testing in stroke must be informed by the purpose of the test and by other metrics, such as feasibility, acceptability, and opportunity cost. If the purpose of the initial screening is to pick up all potential cases to allow further assessment, then the MoCA may be the preferable test because it offers high sensitivity with shorter administration time than other equally sensitive tests. For MoCA use in stroke, screen positive thresholds may need to be adapted if the aim is to assess for dementia/multidomain impairment, and test accuracy may vary with time since the stroke event.

Acknowledgments
We are grateful for expert assistance from the Cochrane Dementia and Cognitive Improvement Group and to Cochrane authors who shared relevant papers, specifically Dr Ingrid Arevalo Rodriguez, Dr Sarah Cullum/Dr Daniel Davis, and Professor Nadina Lincoln. We are grateful to all study authors who responded to queries or shared data.

Sources of Funding
This project was supported by a Chest Heart and Stroke Scotland research award. J. Selvarajah is supported by a National Health Service Research Scotland fellowship, Chief Scientist Office Scotland. D.J. Stott has received support toward cognitive assessment research from the Chief Scientist Office and the Orr Bequest. T.J. Quinn received travel support from a Royal College of Physicians and Surgeons of Glasgow Scholarship and the Graham Wilson Travel Fund.

Disclosures
None.

References
1. Pendlebury ST, Wadling S, Silver LE, Mehta Z, Rothwell PM. Transient cognitive impairment in TIA and minor stroke. Stroke. 2011;42:3116–3121.
2. Pendlebury ST, Rothwell PM. Prevalence, incidence, and factors associated with pre-stroke and post-stroke dementia: a systematic review and meta-analysis. Lancet Neurol. 2009;8:1006–1018.
3. Leys D, Hénon H, Mackowiak-Cordoliani MA, Pasquier F. Poststroke dementia. Lancet Neurol. 2005;4:752–759.
4. Pollock A, St George B, Fenton M, Firkins L. Top ten research priorities relating to life after stroke. Lancet Neurol. 2012;11:209.
5. Chodosh J, Petitti DB, Elliott M, Hays RD, Crooks VC, Reuben DB, et al. Physician recognition of cognitive impairment: evaluating the need for improvement. J Am Geriatr Soc. 2004;52:1051–1059.
6. Hachinski V, Iadecola C, Petersen RC, Breteler MM, Nyenhuis DL, Black SE, et al. National Institute of Neurological Disorders and Stroke-Canadian Stroke Network vascular cognitive impairment harmonization standards. Stroke. 2006;37:2220–2241.
7. Royal College of Physicians. National Clinical Guidelines for Stroke. Clinical Effectiveness and Evaluation Unit. London: Royal College of Physicians; 2012.
8. SIGN Guideline 118. Management of patients with stroke: rehabilitation, prevention and management of complications and discharge planning. Scottish Intercollegiate Guidelines Network (SIGN); 2010.


9. Lees R, Fearon P, Harrison JK, Broomfield NM, Quinn TJ. Cognitive and mood assessment in stroke research: focused review of contemporary studies. Stroke. 2012;43:1678–1680.

10. Lees RA, Broomfield NM, Quinn TJ. Questionnaire assessment of usual practice in mood and cognitive assessment in Scottish stroke units. Disabil Rehabil. 2014;36:339–343.

11. Hénon H, Lebert F, Durieu I, Godefroy O, Lucas C, Pasquier F, et al. Confusional state in stroke: relation to preexisting dementia, patient characteristics, and outcome. Stroke. 1999;30:773–779.

12. Shi Q, Presutti R, Selchen D, Saposnik G. Delirium in acute stroke: a systematic review and meta-analysis. Stroke. 2012;43:645–649.

13. Higgins JPT, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions Version 5.0.2. The Cochrane Collaboration; 2009.

14. Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, et al. Meta-analysis of observational studies in epidemiology. JAMA. 2000; 283:2008–2012.

15. Pendlebury ST, Mariz J, Bull L, Mehta Z, Rothwell PM. Impact of different operational definitions on mild cognitive impairment rate and MMSE and MoCA performance in transient ischaemic attack and stroke. Cerebrovasc Dis. 2013;36:355–362.

16. Noel-Storr AH, McCleery JM, Richard E, Ritchie CW, Flicker L, Cullum S, et al. Reporting standards for studies of diagnostic test accuracy in dementia. Neurology. 2014;83:364–373.

17. Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, et al; QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155:529–536.

18. Davis DHJ, Creavin ST, Noel-Storr A, Quinn TJ, Smailagic N, Hyde C, et al. Neuropsychological tests for the diagnosis of Alzheimer’s disease dementia and other dementias: a generic protocol for cross-sectional and delayed-verification studies. Cochrane Database Syst Rev. 2013;3:CD010460.

19. Takwoingi Y, Deeks JJ. MetaDAS: A SAS macro for meta-analysis of diagnostic accuracy studies. User Guide Version 1.3. http://srdta.cochrane.org/. Accessed January 2014.

20. Mitchell AJ. Sensitivity×PPV is a recognized test called the clinical utility index. Eur J Epidemiol. 2011;26:251–252.

21. Salvadori E, Pasi M, Poggesi A, Chiti G, Inzitari D, Pantoni L. Predictive value of MoCA in the acute phase of stroke on the diagnosis of mid-term cognitive impairment. J Neurol. 2013;260:2220–2227.

22. Cumming TB, Blomstrand C, Bernhardt J, Linden T. The NIH stroke scale can establish cognitive function after stroke. Cerebrovasc Dis. 2010;30:7–14.

23. Godefroy O, Fickl A, Roussel M, Auribault C, Burnicourt JM, Lamy C, et al. Is the Montreal cognitive assessment superior to the mini-mental state examination to detect poststroke cognitive impairment? Stroke. 2011;42:1712–1716.

24. Srikanth V, Thrift AG, Fryer JL, Saling MM, Dewey HM, Sturm JW, et al. The validity of brief screening cognitive instruments in the diagnosis of cognitive impairment and dementia after first-ever stroke. Int Psychogeriatr. 2006;18:295–305.

25. Wolf TJ, Stift S, Connor LT, Baum C. Feasibility of using the EFPT to detect executive function deficits at the acute stage of stroke. Stroke. 2010;36:405–412.

26. Baum CM, Connor LT, Morrison T, Hahn M, Dromerick AW, Edwards DF. Reliability, validity, and clinical utility of the Executive Function Performance Test: a measure of executive function in a sample of people with stroke. Am J Occup Ther. 2008;62:446–455.

27. Green S, Sinclair E, Rodgers E, Birks E, Lincoln N. The Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) for post stroke cognitive impairment screening. Int J Ther Rehab. 2013;20:7.

28. Larson E, Kirschner K, Bode R, Heinemann A, Goodman R. Construct and predictive validity of the repeatable battery for the assessment of neuropsychological status in the evaluation of stroke patients. J Clin Exp Neuropsychol. 2005;27:16–32.

29. Desmond DW, Tatemichi TK, Hanzawa L. The telephone interview for cognitive status. Reliability and validity in a stroke sample. Int J Geriat Psychiat. 1994;9:803–807.

30. Pendlebury ST, Welch SJ, Cuthbertson FC, Mariz J, Mehta Z, Rothwell PM. Telephone assessment of cognition after transient ischemic attack and stroke: modified telephone interview of cognitive status and telephone Montreal Cognitive Assessment versus face-to-face Montreal Cognitive Assessment and neuropsychological battery. Stroke. 2013;44:227–229.

31. Agrell B, Dehlin O. Mini mental state examination in geriatric stroke patients. Validity, differences between subgroups of patients, and relationships to somatic and mental variables. Aging (Milano). 2000;12:439–444.

32. Blake H, McKinney M, Treece K, Lee E, Lincoln NB. An evaluation of screening measures for cognitive impairment after stroke. Age Ageing. 2002;31:451–456.

33. Bour A, Rasquin S, Boreas A, Limburg M, Verhey F. How predictive is the MMSE for cognitive performance after stroke? J Neurol. 2010;257:630–637.

34. Brookes RL, Hannesdottir K, Lawrence R, Morris RG, Markus HS. Brief Memory and Executive Test: evaluation of a new screening test for cognitive impairment due to small vessel disease. Age Ageing. 2012;41:212–218.

35. Cartoni A, Lincoln NB. The sensitivity and specificity of the middlesex elderly assessment of mental state (MEAMS) for detecting cognitive impairment after stroke. Neuropsychol Rehabil. 2005;15:55–67.

36. Cumming TB, Churilov L, Linden T, Bernhardt J. Montreal cognitive assessment and mini-mental state examination are both valid cognitive tools in stroke. Acta Neurol Scand. 2013;128:122–129.

37. de Koning I, Dippel DW, van Kooten F, Koudstaal PJ. A short screening instrument for poststroke dementia: the R-CAMCOG. Stroke. 2000;31:1502–1508.

38. de Koning I, van Kooten F, Koudstaal PJ, Dippel DW. Diagnostic value of the Rotterdam-CAMCOG in post-stroke dementia. J Neurol Neurosurg Psychiatry. 2005;76:263–265.

39. Dong Y, Venketasubramanian N, Chan BP, Sharma VK, Slavin MJ, Collinson SL, et al. Brief screening tests during acute admission in patients with mild stroke are predictive of vascular cognitive impairment 3-6 months after stroke. J Neurol Neurosurg Psychiatry. 2012;83:580–585.

40. Dong YH, Sharma VK, Chan BPL, Venketasubramanian N, Teoh HL, Seet RC, et al. The Montreal cognitive assessment (MoCA) is superior to the mini-mental state examination (MMSE) for the detection of vascular cognitive impairment after acute stroke. J Neurol Sci. 2010;299:15–18.

41. Grace J, Nadler JD, White DA, Guilmette TJ, Giuliano AJ, Monsch AU, et al. Folstein vs modified Mini-Mental State Examination in geriatric stroke. Stability, validity, and screening utility. Arch Neurol. 1995;52:477–484.

42. Hershey LA, Jaffe DF, Greenough PG, Yang SL. Validation of cogni-tive and functional assessment instruments in vascular dementia. Int J Psychiatry Med. 1987;17:183–192.

43. Hobson JP, Leeds L, Meara RJ. The feasibility of cognitive screening of patients with ischaemic stroke using the preliminary neuropsychological battery. Psychol Health. 2003;18:655–665.

44. Jodzio K, Biechowska D. Wisconsin card sorting test as a measure of executive function impairments in stroke patients. Appl Neuropsychol. 2010;17:267–277.

45. Morris K, Hacker V, Lincoln NB. The validity of the Addenbrooke’s Cognitive Examination-Revised (ACE-R) in acute stroke. Disabil Rehabil. 2012;34:189–195.

46. Pendlebury ST, Mariz J, Bull L, Mehta Z, Rothwell PM. MoCA, ACE-R, and MMSE versus the National Institute of Neurological Disorders and Stroke-Canadian Stroke Network Vascular Cognitive Impairment Harmonization Standards Neuropsychological Battery after TIA and stroke. Stroke. 2012;43:464–469.

47. Tang WK, Mok V, Chan SS, Chiu HF, Wong KS, Kwok TC, et al. Screening of dementia in stroke patients with lacunar infarcts: comparison of the mattis dementia rating scale and the mini-mental state examination. J Geriatr Psychiatry Neurol. 2005;18:3–7.

48. Wong A, Xiong YY, Kwan PW, Chan AY, Lam WW, Wang K, et al. The validity, reliability and clinical utility of the Hong Kong Montreal Cognitive Assessment (HK-MoCA) in patients with cerebral small vessel disease. Dement Geriatr Cogn Disord. 2009;28:81–87.

49. Wu Y, Wang M, Ren M, Xu W. The effects of educational background on Montreal Cognitive Assessment screening for vascular cognitive impairment, no dementia, caused by ischemic stroke. J Clin Neurosci. 2013;20:1406–1410.

50. Nys GM, van Zandvoort MJ, de Kort PL, Jansen BP, Kappelle LJ, de Haan EH. Restrictions of the Mini-Mental State Examination in acute stroke. Arch Clin Neuropsychol. 2005;20:623–629.

51. Nøkleby K, Boland E, Bergersen H, Schanke AK, Farner L, Wagle J, et al. Screening for cognitive deficits after stroke: a comparison of three screening tools. Clin Rehabil. 2008;22:1095–1104.

52. de Koning I, van Kooten F, Dippel DWJ, van Harskamp F, Grobbee DE, Kluft C, Koudstaal PJ. The CAMCOG: a useful screening instrument for dementia in stroke patients. Stroke. 1998;29:2080–2086.

53. Johnson-Greene D, Touradji P, Emmerson LC. The Three Cities Test: preliminary validation of a short bedside memory test in persons with acute stroke. Top Stroke Rehabil. 2009;16:321–329.


54. Lees R, Corbet S, Johnston C, Moffitt E, Shaw G, Quinn TJ. Test accuracy of short screening tests for diagnosis of delirium or cognitive impairment in an acute stroke unit setting. Stroke. 2013;44:3078–3083.

55. Wong A, Mok VC, Yim P, Fu M, Lam WW, Yau C, et al. The executive clock drawing task (CLOX) is a poor screening test for executive dysfunction in Chinese elderly patients with subcortical ischemic vascular disease. J Clin Neurosci. 2004;11:493–497.

56. Pendlebury ST, Cuthbertson FC, Welch SJ, Mehta Z, Rothwell PM. Underestimation of cognitive impairment by Mini-Mental State Examination versus the Montreal Cognitive Assessment in patients with transient ischemic attack and stroke: a population-based study. Stroke. 2010;41:1290–1293.

57. Schuetz GM, Schlattmann P, Dewey M. Use of 3×2 tables with an intention to diagnose approach to assess clinical performance of diagnostic tests. BMJ. 2012;345:e6717.

58. Quinn TJ, Fearon P, Noel-Storr AH, Young C, McShane R, Stott DJ. Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE) for the diagnosis of dementia within community dwelling populations. Cochrane Database Syst Rev. 2014;4:CD010079.

59. Blackburn DJ, Bafadhel L, Randall M, Harkness KA. Cognitive screening in the acute stroke setting. Age Ageing. 2013;42:113–116.


Test Accuracy of Cognitive Screening Tests for Diagnosis of Dementia and Multidomain Cognitive Impairment in Stroke
Rosalind Lees, Johann Selvarajah, Candida Fenton, Sarah T. Pendlebury, Peter Langhorne, David J. Stott and Terence J. Quinn

Stroke. 2014;45:3008-3018; originally published online September 4, 2014; doi: 10.1161/STROKEAHA.114.005842
Stroke is published by the American Heart Association, 7272 Greenville Avenue, Dallas, TX 75231.
Print ISSN: 0039-2499. Online ISSN: 1524-4628. Copyright © 2014 American Heart Association, Inc. All rights reserved.

The online version of this article, along with updated information and services, is located on the World Wide Web at: http://stroke.ahajournals.org/content/45/10/3008
Data Supplement (unedited) at: http://stroke.ahajournals.org/content/suppl/2014/09/19/STROKEAHA.114.005842.DC1



Supplementary Materials

I Detailed summary of search strategy

II Papers used to assess external validity of search

III Summary of included studies

IV QUADAS-2 based assessment at individual study level

V STARDdem descriptors of reporting

VI Forest plots of test accuracy data

VII Summary ROC exploring effect of covariates (setting and reference standard)

VIII Summary of studies describing brief cognitive tests


Supplemental Methods

I Search strategy

We searched the following databases from inception to January 2014. We applied no language or date restrictions.

ALOIS (Cochrane Dementia and Cognitive Improvement Group); ARIF (University of Birmingham); CINAHL (EBSCOhost); Embase (OvidSP); LILACS (Bireme); Medline (OvidSP); MEDION (Netherlands); Psychinfo (OvidSP) and the DARE, NHS EED, HTA databases (CRD).

We handsearched recent publications (2010 onwards) in key journals including conference proceedings (Age and Ageing; Cerebrovascular Diseases; International Journal of Stroke; Lancet Neurology; Stroke).

We contacted groups with a research interest in stroke test accuracy (see Acknowledgments). We utilised the "related article" feature in PubMed and examined key studies in the citation databases of Science Citation Index and Scopus.

We supplemented our sensitive search with a purposive search, focussed on four prevalent cognitive screening tools: AMT, MMSE, Montreal Cognitive Assessment (MoCA) and Addenbrooke's Cognitive Examination Revised (ACE-R).


Search strategy (as used in “Medline”)

Concept a) Stroke

1.Brain Ischemia.ti,ab (exploded)

2.Cerebrovascular*.ti,ab (exploded)

3.Stroke.ti,ab (exploded)

4. or/1-3

Concept b) Cognitive disorders and tests

1.3-stage commands.ti,ab

2.Abbreviated mental test.ti,ab

3.AMT.ti,ab

4.Addenbrooke’s cognitive examination revised.ti,ab

5.ACE-R.ti,ab

6.Arizona battery for communication disorders of dementia.ti,ab

7.Assessment of motor and processing skills.ti,ab

8.AMPS.ti,ab

9.Block tapping.ti,ab

10.Brixton tests.ti,ab

11.California verbal learning test.ti,ab

12.CVLT.ti,ab

13.Cambridge cognitive examination revised.ti,ab

14.CAMCOG.ti,ab

15.Chessington occupational therapy Neurological assessment battery.ti,ab

16.COTNAB.ti,ab

17.Clock drawing.ti,ab

18.Cognitive linguistic quick tester.ti,ab

19.CLQT.ti,ab

20.Cognistat.ti,ab

21.Doors and people test.ti,ab


22.Galveston orientation and amnesia test.ti,ab

23.GOAT.ti,ab

24.Hayling test.ti,ab

25.Intersecting pentagons.ti,ab

26.Loewenstein occupational therapy cognitive assessment.ti,ab

27.LOTCA-G.ti,ab

28.Lothian aphasia stroke cognitive assessment.ti,ab

29.LASKA.ti,ab

30.Mental status questionnaire .ti,ab

31.MSQ.ti,ab

32.Mini mental state examination.ti,ab

33.MMSE.ti,ab

34.Montreal cognitive assessment.ti,ab

35.MoCA.ti,ab

36.Measure of cognitive linguistic ability.ti,ab

37.MCLA.ti,ab

38.OT cognitive screening tool.ti,ab

39.Perceive, recall, Plan and Perform.ti,ab

40.PRPP.ti,ab

41.Picture cards.ti,ab

42.Repeatable battery for the assessment of the neuropsychological status.ti,ab

43.RBANS.ti,ab

44.Rivermead behavioural memory test.ti,ab

45.RBMT.ti,ab

46.Rivermead perceptual assessment battery.ti,ab

47.RPAB.ti,ab

48.Screening Instrument for neuropsychological Impairments in Stroke.ti,ab

49.SINS.ti,ab

50.Short orientation memory and concentration test.ti,ab


51.SOMC.ti,ab

52.Verbal fluency test.ti,ab

53.Wessex head injury matrix.ti,ab

54.WHIM.ti,ab

55.Alzheimer Disease.ti,ab (exploded)

56.Cognition.ti,ab

57.Cognition Disorders.ti,ab (exploded)

58.Dementia.ti,ab (exploded)

59.Memory.ti,ab (exploded)

60.Vascular dementia.ti,ab

61. or/1-54

62. or/55-60

63. 61 or 62

Concept c) cognitive screening

1.Mass Screening.ti,ab (exploded)

2.Mental Status Schedule.ti,ab

3.Neuropsychological Tests.ti,ab (exploded)

4.Predictive Value of Tests.ti,ab

5.Psychiatric Status Rating Scales.ti,ab (exploded)

6.Psychological Tests.ti,ab

7.Reproducibility of Results.ti,ab

8.ROC Curve.ti,ab

9.Sensitivity.ti,ab (exploded)

10.Specificity.ti,ab (exploded)

11.Severity of Illness Index.ti,ab (exploded)

12. or/1-11

(Concept b OR concept c) AND concept a
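The final combination line can be read as plain set algebra. A minimal sketch of that Boolean logic, using made-up record-ID sets (not actual retrieval counts from any database):

```python
# Hypothetical record-ID sets standing in for the three search concepts.
stroke = {1, 2, 3, 4, 5}        # Concept a: stroke terms (or/1-3)
tests = {2, 3, 6}               # Concept b: cognitive disorders and tests
screening = {4, 7}              # Concept c: cognitive screening terms

# (Concept b OR Concept c) AND Concept a
retrieved = (tests | screening) & stroke
print(sorted(retrieved))        # [2, 3, 4]
```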


II Ten papers used for external validity check of search strategy

Appels BA, Scherder E. The diagnostic accuracy of dementia-screening instruments with an administration time of 10 to 45 minutes for use in secondary care: a systematic review. American Journal of Alzheimer's Disease and Other Dementias. 2010;25:301-316.

Blake H, McKinney M, Treece K, Lee E, Lincoln NB. An evaluation of screening measures for cognitive impairment after stroke. Age and Ageing.2002;31:451-456.

Bour A, Rasquin S, Boreas A, Limburg M, Verhey F. How predictive is the MMSE for cognitive performance after stroke? Journal of Neurology. 2010;257:630-7.

Crawford S, Whitnall L, Robertson J, Evans JJ. A systematic review of the accuracy and clinical utility of the Addenbrooke's Cognitive Examination and the Addenbrooke's Cognitive Examination-Revised in the diagnosis of dementia. International Journal of Geriatric Psychiatry.2011;27:659-669.

Cullen B, O’Neill B, Evans JJ, Coen RF, Lawlor BA. A review of screening tests for cognitive impairment. Journal of Neurology, Neurosurgery and Psychiatry.2007;78:790-9.

de Koning I, van Kooten F, Dippel DWJ, van Harskamp F, Grobbee DE, Kluft C, Koudstaal PJ. The CAMCOG: a useful screening instrument for dementia in stroke patients. Stroke.1998;29:2080-2086.

Kwa VIH, Limburg M, Voogel AJ, Teunisse S, Derix MMA, Hijdra A. Feasibility of cognitive screening in patients with ischaemic stroke using the CAMCOG: a hospital-based study. Journal of Neurology. 1996;243:405-409.

Nokleby K, Boland E, Bergersen H, Schanke AK, Farner L, Wagle J, Wyller TB. Screening for cognitive deficits after stroke: a comparison of three screening tools. Clinical Rehabilitation. 2008;22:1095-1104.

Srikanth V, Thrift AG, Fryer JL, Saling MM, Dewey HM, Sturm JW, Donnan GA. The validity of brief screening cognitive instruments in the diagnosis of cognitive impairment and dementia after first-ever stroke. International Psychogeriatrics.2006;18:295-305.

Young J, Meagher D, Maclullich A. Cognitive assessment of older people. British Medical Journal. 2011;343: d5042 doi: 10.1136/bmj.d5042


Supplemental tables

Table III Summary of included studies

a) Limited to “acute” settings b) Limited to “non-acute” settings


Study | Setting | Time since stroke | Includes aphasia | Index test(s) | Index test rater | Diagnostic test(s) | Diagnostic test rater
Bour 2010 [1] | Neurology ward | Acute | no | MMSE | Psychologist | NPB | Psychologist
Cumming 2010 [2] | Unspecified | Acute | unspecified | MMSE, MoCA | Not specified | NPB | Not specified
Dong 2012 [3] | ASU | Hyperacute | no | MoCA, MMSE | Not specified | NPB | Psychologist
Dong 2010 [4] | ASU | Hyperacute | no | MoCA, MMSE | Not specified | Clinical | Not specified
Godefroy 2011 [5] | ASU | Acute | yes | MoCA, MMSE | Not specified | NPB | Not specified
Green 2013 [6] | ASU | Acute | unspecified | RBANS | Researcher | NPB | Researcher
Jodzio 2010 [7] | Neurology ward | Acute | no | WCST | Not specified | NPB | Not specified
Morris 2012 [8] | ASU | Acute | yes (mild) | ACE-R, MMSE | Physician/Psychologist | NPB | Psychologist
Nys 2005 [9] | ASU | Acute | yes | MMSE | Not specified | NPB | Not specified
Salvadori 2013 [10] | ASU | Hyperacute* | yes (mild) | MoCA | Researcher | NPB | Not specified
Wu 2013 [11] | ASU | Unspecified | no | MoCA | Physician | Clinical | Not specified


Study | Setting | Time since stroke | Includes aphasia | Index test(s) | Index test rater | Diagnostic test(s) | Diagnostic test rater
Agrell 2000 [12] | Rehabilitation | Post acute | no | MMSE | Physician | NPB | Psychologist
Baum 2008 [13] | Community | Post acute | unspecified | EFPT | Not specified | NPB | Not specified
Blake 2002 [14] | Hospital (other) | Unspecified | unspecified | MMSE, SST, RCPM | Not specified | NPB | Not specified
Brookes 2012 [15] | Rehabilitation | Post acute | unspecified | BMET, MMSE, CDR | Not specified | Clinical | Not specified
Cartoni 2007 [16] | Rehabilitation | Unspecified | no | MEAMS | OT | NPB | Psychologist
Cumming 2013 [17] | Community | Long term | yes | MMSE, Cog4 | Physician/Psychiatrist | Clinical | Physician/Psychiatrist
de Koning 2000 [18] | Neurology ward | Medium term | no | (R)-CAMCOG | Physician/Psychologist | Clinical | Adjudication panel
de Koning 2005 [19] | Hospital (other) | Medium term | no | R-CAMCOG | Researcher | Clinical | Adjudication panel
de Koning 1998 [20] | Outpatients | Post acute | yes (mild) | CAMCOG, MMSE | Not specified | Clinical | Adjudication panel
Desmond 1994 [21] | Outpatients | Unspecified | unspecified | TICS | Not specified | NPB | Not specified
Grace 1995 [22] | Rehabilitation | Unspecified | no | MMSE, 3MS | Not specified | NPB | Not specified
Hershey 1987 [23] | Outpatients | Unspecified | no | CCSE, FAQ | Mixed | Clinical | Physician
Larson 2005 [24] | Rehabilitation | Medium term | yes (mild) | RBANS | Not specified | NPB | Not specified
Nokleby 2008 [25] | Rehabilitation | Post acute | yes | Cognistat, SINS, CDT | Physician/Psychologist | NPB | Psychologist
Pendlebury 2012 [26] | Outpatients | Long term | no | MMSE, MoCA, ACE-R | Physician | NPB | Mixed
Srikanth 2006 [27] | Outpatients | Post acute | yes | S-MMSE | Not specified | Clinical | Not specified
Tang 2005 [28] | Outpatients | Post acute | unspecified | MDRS, MMSE | Researcher | Clinical | Psychiatrist
Wolf 2010 [29] | Rehabilitation | Hyperacute | yes | EFPT | Not specified | NPB | Not specified
Wong 2009 [30] | Rehabilitation | Post acute | unspecified | MoCA | Researcher | NPB | Researcher


N/A=not applicable

MCI=Mild Cognitive Impairment; NPB=Neuropsychological battery

BMET=Brief Memory and Executive Test; CCSE=Cognitive Capacity Screening Examination; CDT=Clock drawing test; EFPT=Executive Function Performance Test; MEAMS=Middlesex Elderly Assessment of Mental State; MMSE=Mini-Mental State Examination; MoCA=Montreal Cognitive Assessment; PNB=Preliminary Neuropsychological Battery; RBANS=Repeatable Battery for Assessment of Neuropsychological Status; R-CAMCOG=Rotterdam CAMCOG; SINS=Screening Instrument for Neuropsychological impairments in Stroke; TICS=Telephone Interview for Cognitive Status; WCST=Wisconsin Card Sorting Test


Table IV

Summary of studies describing brief (less than 5 minutes’ administration time) screening assessments

Study | n included | Setting (timing) | Index tests | Reference standard(s) | Summary results
Johnson-Greene 2009 [31] | 115 | ASU (Hyperacute) | TCT | MMSE, HVLT | TCT correlates with HVLT and MMSE and discriminates cases from controls
Lees 2013 [32] | 111 | ASU (Hyperacute) | 4AT, AMT, CDT, Cog-4 | MoCA | 4AT performs well at standard MoCA thresholds; CDT performs well at lower MoCA thresholds
Wong 2004 [33] | 68 | Outpatients (Unspecified) | CDT | MMSE, WCST | CDT correlates with MMSE and WCST and discriminates cases from controls

ASU=Acute Stroke Unit

4AT=4 A test; AMT=Abbreviated mental test; CDT=Clock drawing test; TCT=Three Cities Test; HVLT=Hopkins Verbal Learning Test; MMSE=Mini Mental State Examination; MoCA=Montreal Cognitive Assessment; WCST=Wisconsin Card Sorting Test


Supplemental Figures

Figure V Risk of bias and applicability concerns assessed at study level using the revised Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2).


[QUADAS-2 figure not reproducible from the text extraction. Each included study was rated – (High), ? (Unclear), or + (Low) across four risk-of-bias domains (Patient Selection, Index Test, Reference Standard, Flow and Timing) and three applicability-concern domains (Patient Selection, Index Test, Reference Standard).]


Figure VI STARDdem reporting guidance as applied to included studies

Methods (Participants, Test Methods, Stats) and Results (Participants, Test Results, Estimates)

STARDdem item 1 2 25 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

Agrell 2000 N Y Y Y N Y Y Y Y Y Y Y Y N Y Y Y Y Y N N Y N Y N
Baum 2008 Y Y Y Y Y N Y Y Y Y N N Y Y N Y Y Y Y N N N N Y Y
Blake 2002 N Y Y Y N Y Y Y Y Y Y N Y N N Y N Y Y N N Y N Y N
Bour 2010 N Y Y Y N Y Y Y Y Y Y Y Y N Y Y Y Y Y Y Y Y N Y N
Brookes 2012 N Y Y Y Y N Y Y Y Y Y N Y Y N Y N Y Y Y N N N Y Y
Cartoni 2007 Y Y Y Y N Y Y Y Y Y Y N Y N Y Y Y Y Y Y N Y N Y N
Cumming 2010 N Y Y Y Y N Y Y Y Y Y N Y N Y Y Y Y Y Y N Y N Y N
Cumming 2013 N Y Y Y N Y Y Y Y Y Y N Y Y Y Y Y Y Y Y Y Y Y Y Y
de Koning 2000 N Y Y Y N Y Y Y Y Y N N Y N Y Y Y N Y Y Y Y Y Y N
de Koning 2005 N Y Y Y N Y Y Y Y Y N N Y N Y Y N Y Y N Y Y Y Y N
de Koning 1998 N Y Y Y N Y Y Y Y Y Y Y Y N Y Y Y Y Y N N Y Y Y N
Desmond 1994 Y Y Y Y N Y Y Y Y Y N N N Y N Y Y Y Y N N Y Y Y Y
Dong 2012 N Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y Y N Y Y Y N
Dong 2010 N Y Y Y Y Y Y Y Y Y N N Y N N Y N Y Y N N Y Y Y N
Godefroy 2011 N Y Y Y N Y Y Y Y Y N N Y N Y Y N Y Y N N Y N Y N
Grace 1995 Y Y Y Y N Y Y Y Y Y N N Y N N Y N Y Y N N Y N Y N
Green 2013 N Y Y Y N Y Y Y Y Y Y Y Y N N Y Y Y Y Y N Y N Y N
Hershey 1987 N Y Y Y Y N Y Y Y Y Y Y Y N N Y N N Y N N Y N Y N
Hobson 2003 N Y Y Y N Y Y Y Y Y N N Y Y Y Y Y Y Y N N Y N Y Y
Jodzio 2010 N Y Y Y Y Y Y Y Y Y N N Y N Y Y Y Y Y N N Y N Y N
Larson 2005 N Y Y Y N Y Y Y Y Y N N Y N N Y Y Y Y N N Y Y Y N
Morris 2012 N Y Y Y N Y Y Y Y Y Y Y Y N N Y Y Y Y N N Y Y Y N
Nokleby 2008 N Y Y Y Y N Y Y Y Y Y N Y N N Y Y Y Y Y N Y Y Y N
Nys 2005 N Y Y Y N Y Y Y Y Y N N Y N N Y N Y Y N N Y N N N
Pendlebury 2012 N Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y Y N N Y N Y N
Salvadori 2013 N Y Y Y N Y Y Y Y Y Y Y Y N Y Y Y Y Y Y N Y N Y N
Srikanth 2006 N Y Y Y N Y Y Y Y Y N Y Y N Y Y Y Y Y N N Y N Y N
Tang 2005 N Y Y Y Y Y Y Y Y Y Y Y Y Y N Y Y Y Y N N Y N N Y
Wolf 2010 N Y Y Y Y N Y Y Y Y N N Y N N Y Y Y N N N Y N N N
Wong 2009 N Y Y Y Y N Y Y Y Y N Y Y Y N Y N Y Y N N Y N N Y
Wu 2013 N Y Y Y Y N Y Y Y Y Y Y Y N Y Y N Y Y N N Y N Y N
Pendlebury 2013 N Y Y Y N Y Y Y Y Y N N Y N Y Y Y Y Y N N Y N Y N


Figure VII Forest plots of test accuracy data for cognitive screening tools in stroke

Forest plot of ACE-R (threshold 88) for diagnosis of dementia / cognitive impairment (n=2 studies)

Forest plot of R-CAMCOG (threshold 25) for dementia / cognitive impairment (n=2 studies)
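The forest plots display per-study sensitivity and specificity. As a minimal sketch of how these quantities are derived from a 2x2 table of index test versus reference standard (the counts below are hypothetical, not taken from any included study):

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 diagnostic accuracy table."""
    sensitivity = tp / (tp + fn)  # proportion of impaired patients correctly flagged
    specificity = tn / (tn + fp)  # proportion of unimpaired patients correctly cleared
    return sensitivity, specificity

# e.g. a screen flagging 45/50 impaired and clearing 40/50 unimpaired patients
sens, spec = sens_spec(tp=45, fp=10, fn=5, tn=40)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# sensitivity=0.90, specificity=0.80
```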


Figure VIII Summary ROC curves exploring effect of co-variates on test accuracy


a) Effect of time since stroke, comparing “acute” testing with “non-acute” testing

The filled circle/diamond represents the summary (pooled) test accuracy; unfilled circles/diamonds represent individual papers. The solid line is the summary ROC curve; the broken line is the 95% confidence interval around the summary point.


b) Effect of reference standard employed, comparing “clinical” diagnosis of dementia against diagnosis made using “neuropsychological battery”

The filled circle/diamond represents the summary (pooled) test accuracy; unfilled circles/diamonds represent individual papers. The solid line is the summary ROC curve; the broken line is the 95% confidence interval around the summary point.
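One established way to construct such a summary ROC curve is the Moses-Littenberg regression. The sketch below uses synthetic study points, not the review's pooled estimates, and is a simpler method than the bivariate/hierarchical models often preferred for formal pooling:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def moses_littenberg(points):
    """Fit D = a + b*S by least squares, where for each study
    D = logit(TPR) - logit(FPR) (log diagnostic odds ratio) and
    S = logit(TPR) + logit(FPR) (a proxy for test threshold)."""
    S = [logit(t) + logit(f) for t, f in points]
    D = [logit(t) - logit(f) for t, f in points]
    n = len(points)
    s_bar, d_bar = sum(S) / n, sum(D) / n
    b = (sum((s - s_bar) * (d - d_bar) for s, d in zip(S, D))
         / sum((s - s_bar) ** 2 for s in S))
    a = d_bar - b * s_bar
    return a, b

def sroc_tpr(a, b, fpr):
    """Expected sensitivity (TPR) at a given 1-specificity (FPR)
    on the fitted summary ROC curve."""
    return inv_logit((a + (1 + b) * logit(fpr)) / (1 - b))

# Three synthetic (sensitivity, 1-specificity) study points
a, b = moses_littenberg([(0.85, 0.20), (0.90, 0.30), (0.95, 0.45)])
print(round(sroc_tpr(a, b, 0.25), 2))  # summary sensitivity at FPR 0.25
```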


REFERENCES

1. Bour A,Rasquin S,Boreas A,Limburg M,Verhey F. How predictive is the MMSE for cognitive performance after stroke? Journal of Neurology.2010;257:630-7.

2. Cumming TB,Blomstrand C,Bernhardt J,Linden T. The NIH stroke scale can establish cognitive function after stroke. Cerebrovasc Dis.2010;30:7-14.

3. Dong Y,Venketasubramanian N,Chan BP,Sharma VK,Slavin MJ,Collinson SL et al. Brief screening tests during acute admission in patients with mild stroke are predictive of vascular cognitive impairment 3-6 months after stroke. Journal Neurology Neurosurgery and Psychiatry.2012;83:580-5.

4. Dong YH,Sharma VK,Chan BPL,Venketasubramanian N,Teoh HL,Seet RC et al. The Montreal Cognitive Assessment (MoCA) is superior to the Mini-Mental State Examination (MMSE) for the detection of vascular cognitive impairment after acute stroke. Journal of the Neurological Sciences.2010;299:15-8.

5. Godefroy O,Fickl A,Roussel M,Auribault C,Burnicourt JM,Lamy C et al. Is the Montreal Cognitive Assessment Superior to the Mini-Mental State Examination to Detect Poststroke Cognitive Impairment? Stroke.2011;42:1712-6.

6. Green S,Sinclair E,Rodgers E,Birks E,Lincoln N. The Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) for post stroke cognitive impairment screening. International Journal of Therapy and Rehabilitation.2013;20:7.

7. Jodzio K,Biechowska D. Wisconsin card sorting test as a measure of executive function impairments in stroke patients. Applied Neuropsychology.2010;17:267-77.

8. Morris K,Hacker V,Lincoln NB. The validity of the Addenbrooke's Cognitive Examination-Revised (ACE-R) in acute stroke. Disability and Rehabilitation.2012;34:189-95.

9. Nys GM,van Zandvoort MJ,de Kort PL,Jansen BP,Kappelle LJ,de Haan EH. Restrictions of the Mini-Mental State Examination in acute stroke. Archives of Clinical Neuropsychology.2005;20:623-9.

10. Salvadori E,Pasi M,Poggesi A,Chiti G,Inzitari D,Pantoni L. Predictive value of MoCA in the acute phase of stroke on the diagnosis of mid-term cognitive impairment. J Neurol.2013;260:2220-7.

11. Wu Y,Wang M,Ren M,Xu W. The effects of educational background on Montreal Cognitive Assessment screening for vascular cognitive impairment, no dementia, caused by ischemic stroke. Journal of Clinical Neuroscience.2013;20:1406-10.

12. Agrell B,Dehlin O. Mini mental state examination in geriatric stroke patients. Validity, differences between subgroups of patients, and relationships to somatic and mental variables. Aging (Milano).2000;12:439-44.

13. Baum CM,Connor LT,Morrison T,Hahn M,Dromerick AW,Edwards DF. Reliability, validity, and clinical utility of the Executive Function Performance Test: a measure of executive function in a sample of people with stroke. The American Journal of Occupational Therapy.2008;62:446-55.

14. Blake H,McKinney M,Treece K,Lee E,Lincoln NB. An evaluation of screening measures for cognitive impairment after stroke. Age and Ageing.2002;31:451-6.

15. Brookes RL,Hannesdottir K,Lawrence R,Morris RG,Markus HS. Brief Memory and Executive Test:evaluation of a new screening test for cognitive impairment due to small vessel disease. Age and Ageing 2012;41:212-8.

16. Cartoni A,Lincoln NB. The sensitivity and specificity of the Middlesex Elderly Assessment of Mental State (MEAMS) for detecting cognitive impairment after stroke. Neuropsychological Rehabilitation.2005;15:55-67.

17. Cumming TB,Churilov L,Linden T,Bernhardt J. Montreal Cognitive Assessment and Mini-Mental State Examination are both valid cognitive tools in stroke. Acta Neurologica Scandinavica.2013;128:122-9.

18. de Koning I,Dippel DW,van Kooten F,Koudstaal PJ. A short screening instrument for poststroke dementia:the R-CAMCOG. Stroke.2000;31:1502-8.

19. de Koning I,van Kooten F,Koudstaal PJ,Dippel DW. Diagnostic value of the Rotterdam-CAMCOG in post-stroke dementia. Journal Neurology Neurosurgery and Psychiatry.2005;76:263-5.

20. de Koning I,van Kooten F,Dippel DWJ,van Harskamp F,Grobbee DE,Kluft C,Koudstaal PJ. The CAMCOG: a useful screening instrument for dementia in stroke patients. Stroke.1998;29:2080-2086.

21. Desmond DW, Tatemichi TK, Hanzawa L. The Telephone Interview for Cognitive Status.Reliability and Validity in a Stroke Sample. International Journal of Geriatric Psychiatry.1994;9:803-7.

22. Grace J,Nadler JD,White DA,Guilmette TJ,Giuliano AJ,Monsch AU et al. Folstein vs modified Mini-Mental State Examination in geriatric stroke. Stability, validity, and screening utility. Archives of Neurology.1995;52:477-84.

23. Hershey LA,Jaffe DF,Greenough PG,Yang SL. Validation of cognitive and functional assessment instruments in vascular dementia. International Journal Psychiatry Medicine.1987;17:183-92.

24. Larson E,Kirschner K,Bode R,Heinemann A,Goodman R. Construct and predictive validity of the repeatable battery for the assessment of neuropsychological status in the evaluation of stroke patients. J Clin Exp Neuropsychol.2005;27:16-32.

25. Nokleby K,Boland E,Bergersen H,Schanke AK,Farner L,Wagle J et al. Screening for cognitive deficits after stroke: a comparison of three screening tools.Clinical Rehabilitation 2008; 22:1095-104.

26. Pendlebury ST,Mariz J,Bull L,Mehta Z,Rothwell PM. MoCA, ACE-R, and MMSE Versus the National Institute of Neurological Disorders and Stroke-Canadian Stroke Network Vascular Cognitive Impairment Harmonization Standards Neuropsychological Battery After TIA and Stroke. Stroke.2012;43:464-9.


27. Srikanth V,Thrift AG,Fryer JL,Saling MM,Dewey HM,Sturm JW et al. The validity of brief screening cognitive instruments in the diagnosis of cognitive impairment and dementia after first-ever stroke. International Psychogeriatrics.2006;18:295-305.

28. Tang WK,Mok V,Chan SS,Chiu HF,Wong KS,Kwoc TC et al. Screening of dementia in stroke patients with lacunar infarcts: comparison of the Mattis dementia rating scale and the mini-mental state examination. Journal of Geriatric Psychiatry and Neurology.2005;18:3-7.

29. Wolf TJ,Stift S,Connor LT,Baum C. Feasibility of using the EFPT to detect executive function deficits at the acute stage of stroke. Stroke.2010;36:405-12.

30. Wong A,Xiong YY,Kwan PW,Chan AY,Lam WW,Wang K et al. The validity, reliability and clinical utility of the Hong Kong Montreal Cognitive Assessment (HK-MoCA) in patients with cerebral small vessel disease. Dementia and Geriatric Cognitive Disorders 2009; 28:81-7.

31. Johnson-Greene D,Touradji P,Emmerson LC. The Three Cities Test: Preliminary Validation of a Short Bedside Memory Test in Persons with Acute Stroke. Topics in Stroke Rehabilitation.2009;16:321-9.

32. Lees R,Corbet S,Johnston C,Moffitt E,Shaw G,Quinn TJ. Test accuracy of short screening tests for diagnosis of delirium or cognitive impairment in an acute stroke unit setting. Stroke.2013;44:3078-83.

33. Wong A,Mok VC,Yim P,Fu M,Lam WW,Yau C et al. The executive clock drawing task (CLOX) is a poor screening test for executive dysfunction in Chinese elderly patients with subcortical ischemic vascular disease. Journal of Clinical Neuroscience.2004;11:493-7.