
Health informatics: Where’s the evidence?

Authors: Philip Scott, Deborah Prytherch, Jim Briggs

Centre for Healthcare Modelling and Informatics

University of Portsmouth

May 2010

Commissioned by the UK Faculty of Health Informatics

Abstract

Three cases of practical benefit from health informatics in the UK are presented. The evidence base

for health informatics is critically analysed. Three important factors that limit generalisation from

health informatics studies are: (1) the complexity of the healthcare environment, (2) the local

ownership and enthusiasm of participants and (3) the emergent change arising from systemic

adaptation. Much evidence in health informatics is missing, incomplete or lost. Two weaknesses

underlying the evidence problems are the lack of a knowledge-sharing culture in the NHS and an

absence of simple evaluation standards. The report recommends: an active communications plan

(including use of a repository and evidence digests); higher profile promotion of health informatics

professional training and education; development of a ‘lite’ model of informatics evaluation; and

research to determine effective incentives for sharing knowledge.

1. Introduction
2. Case studies
3. Evidence quality in health informatics
4. Recommendations
5. Conclusions
Declaration of interests
Acknowledgments
Appendix – Stakeholder analysis
References


1. Introduction

It is a long-lamented fact that UK healthcare abounds with evidence of good and bad practice in

health informatics that is never reported beyond its immediate project context. UK health

informatics largely remains a field without a memory.

This report, commissioned by the UK Faculty of Health Informatics, has two distinct but interlinked

objectives:

• To identify and report at least two specific examples of the practical benefit of informatics in

various domains of health and social care in the UK;

• To make recommendations about how the informatics evidence base can be improved in

content and disseminated more widely and effectively.

We first present the selected case summaries and then offer a discussion of the current state of

evidence in health informatics and our proposals for its enhancement.

2. Case studies

Methods

The first objective required a specific search for relevant case studies that offer transferable

exemplars of good practice. The selection criteria we chose were that the case studies:

• provided evidence of improving the overall efficiency, effectiveness and service user experience

of health or social care services;

• included examples from the standpoint of at least two of the following groups: commissioners,

acute providers, primary care providers, health promotion services, public health services,

voluntary services, private sector providers of health and care, and carers;

• were preferably based on quantitative and qualitative data;

• showed clear evidence of engagement with providers of informatics services, health and care

practitioners and service users within the given settings.

We searched for case studies in two ways: (1) A request for case studies was circulated via various

health informatics-related email lists and websites; (2) A literature search was carried out. As the

Faculty’s brief constrained us to a report of modest length, we applied the criteria pragmatically to

select examples with the most general application or the most novel features.

The queries used in the literature search comprised combinations of the following keywords:

"Health informatics", "Ehealth", "UK", "impact", "evidence", "evaluat#", "benefit". We decided to

limit the search to studies published in the period 2005-2010 to increase relevance. Searches on

these terms gave 256 results from NHS Evidence and 120 from Google Scholar. Many of the papers

were outside our selected publication period or described anticipated rather than realised benefit.

We selected one paper (case study 1).
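The search strategy can be illustrated programmatically. The report does not state exactly how the keywords were combined, so the sketch below assumes each domain term is paired with "UK" and one impact-related term, writing the '#' truncation wildcard as '*':

```python
from itertools import product

# Hypothetical reconstruction of the search strategy: each domain term is
# combined with "UK" and one impact-related term. The exact query logic is
# an assumption; the '#' truncation wildcard is written here as '*'.
domain_terms = ['"Health informatics"', '"Ehealth"']
impact_terms = ['"impact"', '"evidence"', '"evaluat*"', '"benefit"']

queries = [f'{d} AND "UK" AND {t}' for d, t in product(domain_terms, impact_terms)]
for query in queries:
    print(query)  # eight query strings to run against NHS Evidence and Google Scholar
```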

We received twenty responses to our call for case studies. We long-listed nine of these for further

analysis and finally selected two (case studies 2-3).

Case Study 1: Home telemonitoring for heart failure

The HOME-HF project in North London [1] was evaluated using a randomized controlled trial on the

benefits of home telemonitoring for ‘typical heart failure patients’ (usually elderly but well-treated)

who had recently been discharged from hospital. The study compared patients who received the

usual follow-up care in the hospital cardiology department with those who had daily home

telemonitoring. The telemonitoring measured weight, blood pressure, heart rate and oxygen


saturation and asked four questions about symptoms. Data was transmitted by telephone line to a

base station in the hospital and reviewed daily by a nurse. Deviation outside defined parameters

triggered clinical intervention. The home monitoring was continued for a maximum of six months.
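The alerting logic described here can be sketched as a simple threshold check. The study did not publish its alert parameters, so the limits, field names and values below are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One day's home telemonitoring measurements."""
    weight_kg: float
    systolic_bp: int
    heart_rate: int
    spo2_percent: float

# Illustrative limits only; the study's actual alerting parameters were not published.
LIMITS = {
    "weight_gain_kg": 2.0,        # gain over baseline suggesting fluid retention
    "systolic_bp": (90, 160),
    "heart_rate": (50, 110),
    "spo2_min": 92.0,
}

def parameters_out_of_range(today: Reading, baseline_weight_kg: float) -> list[str]:
    """Return the parameters deviating outside the defined limits;
    a non-empty result would prompt the reviewing nurse to intervene."""
    flags = []
    if today.weight_kg - baseline_weight_kg > LIMITS["weight_gain_kg"]:
        flags.append("weight")
    low, high = LIMITS["systolic_bp"]
    if not low <= today.systolic_bp <= high:
        flags.append("blood pressure")
    low, high = LIMITS["heart_rate"]
    if not low <= today.heart_rate <= high:
        flags.append("heart rate")
    if today.spo2_percent < LIMITS["spo2_min"]:
        flags.append("oxygen saturation")
    return flags

print(parameters_out_of_range(Reading(82.5, 150, 118, 95.0), baseline_weight_kg=80.0))
# ['weight', 'heart rate']
```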

The telemonitoring did not reduce total hospital admission rates, but did reduce the proportion of

admissions that were emergencies and the number of outpatient clinic and emergency room visits.

Home monitoring was well accepted by an elderly, multi-ethnic group of patients. The median

incremental cost was about £200 per patient. The authors concluded that they had demonstrated

that home telemonitoring enables early detection of worsening symptoms and empowers patients

to be actively involved in managing their condition. No detail was reported about the technology.

This case study demonstrates quantitative evidence of improvements in service effectiveness in

acute cardiac care. Patients and clinicians were well engaged, but there is no explicit evidence of

informatics practitioners being involved. We believe that this is typical of projects of this kind, which

are regarded as clinical innovation rather than the application of informatics.

Case Study 2: Automated prescribing for MRSA in hospitals

University Hospitals Birmingham NHS Foundation Trust uses a locally developed system called PICS

(Prescribing, Information and Communication System). Over 23,000 new prescriptions and 120,000

drug administration records are entered each week [2].

In 2008, the Trust introduced a new MRSA protocol. Risk factors for MRSA (for example, prior

admission) have to be recorded on PICS for each inpatient. Standard cultures are taken on

admission. As soon as a positive MRSA result is received from the laboratory system, an alert is

created in PICS that automatically generates the appropriate prescription for antibiotics and

antiseptic with strict rules for their administration. Monthly MRSA infection rates were roughly

halved after the introduction of the protocol.

Previous reports have indicated significant errors in MRSA antibiotic prescription [3-4]. The novel

feature of this innovation is the concept of a system-generated prescription, removing the decision

from the individual clinician (unless they choose to override the protocol). The system thus enables a

level of Trust-wide enforcement of a clinical protocol to reduce needless variation in care.
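The event-driven behaviour can be sketched as follows. The drug details, rule text and interfaces are invented for illustration; they are not the Trust's actual PICS protocol:

```python
# Illustrative sketch of a system-generated prescription: a positive MRSA
# result triggers the protocol orders unless the clinician overrides.
MRSA_PROTOCOL_ORDERS = [
    {"order": "antibiotic per local MRSA protocol", "rules": "strict administration schedule"},
    {"order": "antiseptic wash", "rules": "daily until clear"},
]

def on_lab_result(patient_id: str, organism: str, positive: bool,
                  clinician_override: bool = False) -> list[dict]:
    """Generate the protocol prescriptions automatically on a positive MRSA alert."""
    if not (positive and organism == "MRSA"):
        return []                 # no alert, no automatic prescribing
    if clinician_override:
        return []                 # decision handed back to the individual clinician
    return [dict(order, patient_id=patient_id) for order in MRSA_PROTOCOL_ORDERS]

print(on_lab_result("P123", "MRSA", positive=True))
```

The design point is that the safe default is protocol adherence: prescribing happens unless a clinician actively overrides, rather than only when a clinician remembers to act.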

This example has quantitative evidence of benefit in patient outcomes (although strictly speaking it

demonstrates an association not a causal relationship), and clearly resulted from an effective

partnership between clinicians and informaticians.

Case study 3: Seasonal escalation matrix

For the winter of 2009, Dorset Community Health Services developed a tool to enable all partner

services in the health community to update each other on capacity and availability through the

winter period. The tool allows data on bed status from acute and community hospitals to be combined with information on community-based services, ambulance and social care capacity across the county of Dorset. The resulting visual presentation enables decision-makers to see the real-time capacity across the health community at a glance.
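A minimal sketch of such an 'at a glance' view follows; the partner services, RAG statuses and worst-case roll-up are illustrative assumptions rather than details of the Dorset tool:

```python
# Each partner service reports a RAG status; the matrix renders one line per
# service plus the worst case overall. Service names are invented.
STATUS_ORDER = ["GREEN", "AMBER", "RED"]

def escalation_matrix(reports: dict[str, str]) -> str:
    """Render one line per service plus an overall escalation level."""
    lines = [f"{service:<26} {status}" for service, status in sorted(reports.items())]
    overall = max(reports.values(), key=STATUS_ORDER.index)  # worst status wins
    lines.append(f"{'OVERALL':<26} {overall}")
    return "\n".join(lines)

print(escalation_matrix({
    "Acute hospital beds": "AMBER",
    "Community hospital beds": "GREEN",
    "Ambulance service": "RED",
    "Social care placements": "AMBER",
}))
```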

This case demonstrates qualitative benefit in improved access to timely, digestible information,

facilitating early discharge and reducing the time wasted on redundant phone calls to busy wards

and other care providers to determine status. Obtaining data from social services has been

problematic, particularly since central pressure was withdrawn after the peak of the flu period. The

tool has enabled a shared realisation of stress in other services. This has led to continuing

discussions about modifying referral behaviour during peaks of activity, and using longitudinal

reports for commissioners and providers to reflect upon capacity and management of services. An

indirect benefit of this project is demonstrating the value of a skilled informatics team that

understands the clinical and operational needs of the service.


Discussion

These examples demonstrate that the application of health informatics can improve the

effectiveness of healthcare delivery, patient outcomes and the management of services. The process

of gathering case studies also shows that such examples exist but require active collection to achieve

wider visibility.

Three cautions are needed when considering such ‘good news’ stories.

Due to the complex nature of healthcare processes and systems, evidence from such case studies

cannot be simplistically generalised [5]. It is safer and more useful to assess their authenticity,

credibility and transferability than whether they are formally valid and generalisable [6]. So, for

example, it might be reasonable to conclude that home monitoring for cardiology patients is in

principle a good idea. But a successful implementation will have many dependencies such as

staff:patient ratios, multi-disciplinary team culture and efficiency, patient education, technology

usability, reliability and so forth. It is not a matter of ‘deploying a solution’.

Innovations like these are typically led by enthusiastic local experts who are well embedded in

particular niches of the health service ecology. This can be seen as ‘naturally’ applying socio-technical design principles such as user ownership, flexible task specification and the importance of

personal values in design decisions [7]. These principles are far more challenging to apply as normal

practice if the systems behind process changes are viewed as mere commodity installations.

Informatics innovations often have unexpected consequences and lead to emergent change that

arises from systemic adaptation [8-9]. This can easily be overlooked or treated as ‘resistance to

change’ – something to be mitigated rather than evaluated and understood. This is especially the

case if information systems are used to enforce ‘improvements’ in practice (as defined by the

sponsors) when the merits of the given changes or methods of implementation are in dispute [10].

These themes are elaborated in the following section when the larger picture is considered.

3. Evidence quality in health informatics

Categorizing the problems

There are several fundamental problems with the health informatics evidence base, both globally

and for the UK specifically. The significance of these points can be illustrated by analogy to forensic

evidence in a criminal case (see Table 1 below).

Firstly, only a small proportion of informatics innovations in the NHS are properly evaluated or

reported [11]. Even though standard project management methodologies include post-project

reviews [12], it is common knowledge that they seldom occur and are even more rarely placed in

the public domain. We call this the missed evidence problem.

Secondly, the academic literature is not readily accessible to policy makers and service managers

due to its volume and sometimes impenetrable discourse [13]. The purpose of some academic

publication seems to be more about satisfying research-counting exercises than disseminating

knowledge. We call this the lost evidence problem.

Thirdly, the published research evidence in health informatics is of variable quality and limited

provenance [14]. A literature review in 2006 [15] noted that approximately 80% of the evidence

base is from the USA, of which about 27% is from six leading institutions with decades of experience

with home-grown systems. Along with the complex nature of the healthcare environment, this has

prevented the robust inference of general recommendations from systematic reviews [16-17]. We

call this the incomplete evidence problem.


Table 1 – Problem categories for health informatics evidence

| Category | Health informatics evidence problem | Forensic analogy |
| --- | --- | --- |
| Missed evidence | Projects seldom evaluated | Fingerprints not captured, DNA not sampled |
| Lost evidence | Evaluations only available in academic journals and scholarly discourse | Pathology reports misfiled in a massive archive |
| Incomplete evidence | Findings not valid for generalisation | Partial fingerprints, glitches in CCTV recording |

In simple terms, the case for health informatics leaves plenty of reasonable doubt. We suggest that

two key weaknesses underlying all three categories of evidence problem are: (1) the lack of a

knowledge-sharing culture in the NHS and (2) an absence of simple evaluation standards.

Knowledge-sharing culture

There is an active community of health informatics academics and practitioners who voluntarily

share knowledge through publications, events, websites and professional associations. However, it

must be acknowledged that it is often more or less the same group of enthusiasts who are seen

populating conference programmes and organizational committees across the BCS, the Faculty,

ASSIST and UKCHIP.

Outside this relatively small group of volunteers, there is not perceived to be much incentive to

share knowledge [18]. The knowledge sharing that does occur is often linked to a need for a

particular project or a personal interest (which sometimes is done in the volunteer’s own time rather

than as part of their professional duty). There are of course conflicting work pressures and varying

levels of autonomy, interest and experience. This relates to the nature of the profession, discussed

below.

The NHS has recognized the importance of knowledge management and has even had a National

Knowledge Service (NKS, http://www.nks.nhs.uk/) and a Chief Knowledge Officer (CKO). The NKS

website says that its role has been under review since 2007 and (ironically) it is unclear whether

anyone is currently in the CKO role. Furthermore, NHS Evidence has a specialist collection on

knowledge management [19], though it has had no activity since 2009. Online collaborative tools

such as eSpace have received a mixed response.

The secretive approach taken by some UK government agencies in recent years has not helped to

foster a knowledge sharing culture [20]. Commentators who seek to balance the ‘spin’ of agencies

and commercial suppliers tend to be viewed as subversive. Over time, changing social pressures may

overcome this restrictive attitude to information. It remains to be seen whether social networking approaches [21-22] are more than a transitory phenomenon and whether they can find new ways to capture and share knowledge and wisdom with a wider audience, perhaps bypassing some of the existing constraints.

Evaluation standards

Finding ways to improve the standard of evaluating and reporting health informatics innovations has

been a continuing concern [23-26]. The principal standards now competing for the support of the

international health informatics community are STARE-HI, adopted by the International Medical

Informatics Association (IMIA) [27-28], and the method promulgated by the US Agency for

Healthcare Research and Quality (AHRQ) [29]. Both are extensive methodologies that may prove


prohibitively demanding for widespread NHS adoption without substantial additional funding ring-fenced for evaluation purposes.

4. Recommendations

The diagram below summarizes our view of the current state and what we recommend should be done about it. The green boxes represent the categories of evidence problem. The blue ovals on the left represent the causal factors we have identified that we believe are inherent in the way UK healthcare and research are or will be in the near future. We believe these factors simply have to be understood and accepted. The pink ovals on the right represent the causal factors that we think are tractable and amenable to change. The lilac boxes are the actions that we recommend to achieve the feasible changes.

Figure 1 – Current state and recommendations

Communications plan

We recommend an active approach to knowledge dissemination, informed by best practice in communications planning and stakeholder management [30]. A draft stakeholder analysis is presented in the Appendix.

The situation in health informatics closely parallels that of the healthcare professions, where there are simply too many potential sources of information, and evidence summaries and guidelines are needed to bridge the gap between research and practice [31]. There is a need to produce accessible and applicable guidance for policy makers, planners, service managers and practitioners.

Locally owned regional events seem to be effective ways to disseminate knowledge – the March 2010 Clinical Informatics Marketplace in Bristol and the annual Southern Institute of Health Informatics conference in Portsmouth [32] offer good examples.

Another approach worth consideration is to disseminate knowledge ‘packages’ by clinical pathway or specialty [11]. The idea is that by linking the informatics evidence to a particular clinical need or area of practice, it becomes more relevant and therefore more likely to be well received and acted upon. The February 2010 national e-prescribing forum was an exemplar in this respect. Over time, this might lead to the development of health informatics sub-specialties to strengthen and consolidate expertise and help to avoid information overload.


Evidence repository

An important tool to support knowledge dissemination is a central resource to index and access

relevant evidence. The UK Faculty’s research repository [33] has been established for this purpose

and offers a good opportunity to build a recognized and trusted evidence base. There needs to be

debate about governance and access rules to the repository so that it balances openness with

protection of commercially sensitive or organizationally embarrassing information that otherwise

might never be seen at all.

The repository should aim to be more than just a ‘cupboard’ of reports, as some other worthy efforts

seem to have become [34]. Actively gathering, documenting, quality reviewing and disseminating

the evidence and evaluating the effectiveness of its communication requires adequate resourcing.

This calls for further work to develop a business plan for an ‘evidence service’ for health

informatics. Eventually, this may migrate to become a specialist collection within NHS Evidence.
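As a concrete illustration of what 'more than just a cupboard' might mean, the sketch below shows one hypothetical record structure for an evidence service entry, encoding quality review status and an access rule; none of these fields are prescribed by the Faculty repository:

```python
from dataclasses import dataclass, field
from enum import Enum

class Access(Enum):
    PUBLIC = "public"          # openly accessible evidence
    RESTRICTED = "restricted"  # commercially sensitive or organizationally embarrassing

@dataclass
class EvidenceEntry:
    """One indexed item in a hypothetical health informatics evidence service."""
    title: str
    care_setting: str                  # e.g. acute, primary care, social care
    outcome_summary: str               # the benefit (or failure) observed
    quality_reviewed: bool = False     # has the entry passed quality review?
    access: Access = Access.PUBLIC
    keywords: list[str] = field(default_factory=list)

entry = EvidenceEntry(
    title="Automated MRSA prescribing",
    care_setting="acute",
    outcome_summary="monthly MRSA infection rates roughly halved",
    quality_reviewed=True,
    keywords=["decision support", "prescribing"],
)
```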

Professionalism

Health informatics is an unregulated profession in the UK and therefore has no mandatory

educational requirements for practitioners. Educational standards have been proposed by IMIA [35]

and minimum professional standards have been proposed by the UK Council for Health Informatics

Professions [36].

As long as these standards are merely aspirational, anyone can work in and manage health

informatics regardless of their lack of specialist knowledge. This seems likely to reinforce the vicious

circle of uninformed, unevaluated projects. We believe that awareness of the evidence of what has

not worked in health informatics [37] is crucial and emphasizes the need for at least a minimum

educational requirement.

Education, training and development

We believe that to build the professional capability and capacity needed in UK healthcare there is

a need for further promotion of health informatics education, training and development (ETD). This

applies both at the formative stage and in continuing professional development.

We suggest that this should expand beyond traditional academic routes and explore more vocational

approaches such as forms of apprenticeship or ‘boot camps’ for people who want to move quickly

into health informatics work. There may be opportunities for ETD partnerships between NHS Trusts,

suppliers, academia and employment agencies that could be pump-primed by HEIF or KTP funding.

We believe this should be explored with the Department for Business, Innovation and Skills, UKCHIP

and BCS Health. Such partnerships might also foster knowledge-sharing networks.

R&D: Incentives

We believe further work is needed to identify the most useful incentives for busy NHS staff to

participate in active knowledge-sharing in health informatics. As noted in the Faculty paper cited

above [18], existing mechanisms and cultural norms work against free donation of valuable

intellectual property to the common good or public acknowledgement of project ‘failure’ – when it is

in fact the ‘failures’ from which we often learn the most. The AMIA collation of case studies [38],

including ‘good’ and ‘bad’ stories, is a useful pattern for this.

We expect the incentives to link with the professionalism and ETD themes discussed above. We

would also like to see a requirement for organizations to demonstrate participation in the

knowledge economy, for example by including presumption of published evaluation in scrutiny

frameworks and gateway reviews for informatics projects.


R&D: Evaluation lite

We suggest that a ‘lite’ form of evaluation that is achievable for routine usage in resource-stressed

NHS organizations should be developed and mandated. We recommend that the approach to developing

this should be a modified Delphi method of iterative expert consultation as used by the developers

of the IMIA STARE-HI standard [27-28] and that it should not start from scratch but should be based

on the two main international standards and the earlier NHS evaluation method [24]. The EFMI

Evaluation Working Group [39] should also be consulted. The importance of evaluation standards

was endorsed by the conference declaration from the March 2010 EU Ministerial Conference on

eHealth [40].
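For illustration, the consensus step of such a modified Delphi process might look like the sketch below, which keeps, drops or re-circulates candidate evaluation items by median panel rating; the 1-9 scale and cut-offs are assumptions, not details of the STARE-HI development process:

```python
from statistics import median

# Illustrative consensus step for one modified Delphi round: experts rate each
# candidate evaluation item on a 1-9 scale; the 7/3 cut-offs are assumptions.
def delphi_round(ratings: dict[str, list[int]]) -> dict[str, str]:
    decisions = {}
    for item, scores in ratings.items():
        m = median(scores)
        if m >= 7:
            decisions[item] = "keep"
        elif m <= 3:
            decisions[item] = "drop"
        else:
            decisions[item] = "re-circulate"  # back to the panel with group feedback
    return decisions

print(delphi_round({
    "state the study context": [8, 9, 7, 8],
    "report vendor involvement": [5, 6, 4, 7],
}))
# {'state the study context': 'keep', 'report vendor involvement': 're-circulate'}
```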

5. Conclusions

Despite changes in approach to the national programme for IT, health informatics has remained a

central pillar of NHS policy as an enabler of service transformation [41-43]. This is subject to changes

of direction as governments come and go, but in some form will have to endure to achieve the

needed efficiency gains and safety improvements. This is consistent with the strategic objectives of

other programmes in Europe and North America [44-45].

The NHS's aim is that health informatics should interconnect citizens, patients and clinicians with the

right information, reduce operational costs and support new models of care [43]. Yet health

informatics projects have often failed to deliver their anticipated benefits, both in the UK [46-47]

and elsewhere [38, 48-49]. Many of the reasons for failure in complex IT projects are well known but

nonetheless recur [50]. We believe that our recommendations for an evidence-based approach go

some way toward addressing this situation.

Declaration of interests

PS is a Council member of UKCHIP and actively promotes the need for health informatics

professionalism. However, the paper does not represent an official UKCHIP perspective.

Acknowledgments

The authors thank Dr Susan Clamp of the Yorkshire Centre for Health Informatics for reviewing the

project plan and final report and Dr Paul Woolman of the Scottish eGovernment programme and Dr

Jean Roberts of the University of Central Lancashire for comments on the stakeholder analysis.


Appendix – Stakeholder analysis

The aim of this stakeholder analysis is to inform a communications plan for improving awareness

and dissemination of health informatics evidence to the relevant parties. Evidence is defined here as

both research findings and unpublished case studies that can inform best practice. The purpose of

this document is to present a view of the various stakeholder groups and their respective interests in

the health informatics evidence base [30].

The tables below propose the main topics of interest of each group; the potential providers of the

information needed; the format, channels and frequency suggested for communicating to the

stakeholder group; how we could evaluate whether the communication plan is working; and a red-

amber-green assessment of the current gap in communication for the given topic and stakeholder to

suggest priorities for the communication plan. (The priority assessments do not claim to be

evidence-based; they are an initial subjective judgement offered for stakeholder comment.)

Health informatics stakeholders are here categorized into seven groups:

• Policy makers and planners

• Clinicians

• Healthcare provider management

• IT suppliers (including trade associations and industry-profession liaison groups)

• Academics

• Patients and carers

• Health informatics practitioners.

Obviously, these groups could be analysed into finer-grained subgroups, but these are suggested as

appropriate divisions with distinct information needs. A relatively simple taxonomy is likely to be

more practicable for an effective communications plan.

The topics of interest within health informatics have been grouped into ten categories:

• Information standards (covering records management, clinical terminologies and

interoperability)

• Information analysis and reporting (covering both operational and commissioning usage)

• Knowledge management (covering healthcare evidence dissemination and accessibility)

• Electronic patient data (covering data aggregation, population data, detailed and summary

operational health records)

• Decision support (including prescribing and other ordering, care protocols and pathways)

• Communications support (covering inter-professional and patient/carer communications)

• Managing change (particular healthcare and informatics issues)

• Information governance

• Evaluation of health informatics effectiveness

• Health informatics education (covering all of the above)

(The distinction between “Electronic patient data” and “Decision support” is that the former is

simply passive access to data, whereas the latter offers active guidance about patient care. In

practice, however, the distinction is unlikely to be significant for communications planning.)

This proposal has been informed by the IMIA recommendations on health informatics education

[35], a literature review of the scope of health informatics [51], a systematic meta-narrative review

of electronic patient record research [52], the Department of Health human resources strategy for


health informatics professionals [53], the registration standards of the UK Council for Health

Informatics Professions [54] and an informal stakeholder consultation.

In the following tables, “NHS information bodies” covers the NHS Information Centre for England,

Health Solutions for Wales and equivalent bodies in the other home countries; “IT programmes”

covers Connecting for Health (England), Informing Healthcare (Wales), the HSC ICT Programme

(Northern Ireland) and the Scottish government e-health directorate.


1. Policy makers and planners

This stakeholder group comprises the Health Departments of each of the UK home countries and the various regional NHS management tiers that deal with

planning, strategic direction and commissioning.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies; NHS info bodies | Digest; Evidence summaries | Quarterly or event-driven email with web links; annual printed report and seminar | Adoption of current/emerging standards in policy/plans |
| Information analysis and reporting | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | | | Adoption of best/innovative practice in commissioning |
| Knowledge management | | | | Promotion of best practice in policy/commissioning |
| Decision support | NHS Evidence; UK Faculty HI; IT programmes; Suppliers | | | Adoption of latest evidence in policy/plans |
| Communications support | UK Faculty HI; BCS Health; IT programmes; Suppliers | Guidance | Quarterly email; website | Promotion of best practice in policy/commissioning |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | Case studies | Targeted campaign; website; annual printed report and seminar | Adoption of best practice in policy/plans |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | Guidance | Annual printed report and seminar | Adoption of best practice in policy/plans |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | Targeted campaign; annual printed report and seminar | Policy-driven increase in number and quality of published UK evaluations |
| Education | | Guidance | Targeted campaign; annual printed report and seminar | Policy-driven increase in number of thriving UK health informatics courses |

(In these tables, a blank cell repeats the entry above it, reflecting merged cells in the original layout; the colour-coded red-amber-green ‘current gap’ ratings are not reproducible in text form.)

Page 12: Health informatics: Where’s the evidence? · partnership between clinicians and informaticians. Case study 3: Seasonal escalation matrix For the winter of 2009, Dorset Community

Page 12 of 22

2. Clinicians

This stakeholder group comprises practising clinicians of all healthcare disciplines throughout the NHS and UK private healthcare and their respective

professional bodies.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies | Digest; Guidance; Case studies; Evidence summaries | Event-driven email with web links; occasional podcasts; professional body newsletters; occasional seminars with specialty professional groups | Consultation with Royal Colleges and other clinician groups; survey; focus groups |
| Information analysis and reporting | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | | | |
| Knowledge management | | | | |
| Electronic patient data | | | | |
| Decision support | | | | |
| Communications support | | | | |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | | Targeted campaign; websites; professional body newsletters; occasional seminars with specialty professional groups | Adoption of best practice in local policy/plans/projects; survey; focus groups |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | Targeted campaign; event-driven email with web links; professional body newsletters; occasional seminars with specialty professional groups | Locally driven increase in number and quality of published UK evaluations |
| Education | | | | Increase in number of thriving UK health informatics courses and clinician participation |


3. Healthcare provider management

This stakeholder group comprises managers and leaders in all healthcare provider services throughout the NHS and UK private healthcare.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies | Digest; Guidance; Case studies | Event-driven email with web links; professional body newsletters; regional/SHA seminars; NHS Confederation; CIO fora | Adoption of best practice in local policy/plans/projects; consultation with professional management associations; survey; focus groups |
| Information analysis and reporting | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | | | |
| Knowledge management | | | | |
| Electronic patient data | | | | |
| Decision support | | | | |
| Communications support | | | | |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | | Targeted campaign; websites; professional body newsletters; regional/SHA seminars; NHS Confederation | Adoption of best practice in local policy/plans/projects; survey; focus groups |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | Targeted campaign; event-driven email with web links; professional body newsletters; regional/SHA seminars; NHS Confederation | Locally driven increase in number and quality of published UK evaluations |
| Education | | | | Increase in number of thriving UK health informatics courses with employer support; focus groups |


4. IT suppliers

This stakeholder group comprises large and small suppliers of IT products and services to UK healthcare organizations and includes trade associations and

industry-profession liaison groups.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies | Digest; Guidance; Case studies; Evidence summaries | Event-driven email with web links; websites; trade association and liaison group newsletters and seminars | Adoption of current standards in products and services; consultation with trade associations and liaison groups; supplier participation in standards development |
| Electronic patient data | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes | | | Adoption of best practice in products and services; consultation with trade associations and liaison groups; client feedback |
| Decision support | | | | |
| Communications support | | | | |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | | | |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | Targeted campaign; event-driven email with web links; professional body newsletters | Supplier-sponsored increase in number and quality of published UK evaluations |
| Education | | | | Increase in number of thriving UK health informatics courses with supplier sponsorship |


5. Academics

This stakeholder group comprises UK educators and researchers within health informatics.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies | Digest; Guidance; Case studies; Evidence summaries; Bibliographies | Event-driven email with web links; websites; professional body newsletters; HEA; UK Faculty HI; BCS Health; UKCHIP | Adoption of best practice in UK health informatics courses and academic consultancy; consultation with UK Faculty HI and BCS Health; survey; focus groups |
| Information analysis and reporting | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | | | |
| Knowledge management | | | | |
| Electronic patient data | | | | |
| Decision support | | | | |
| Communications support | | | | |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | | | |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | | Increase in number and quality of published UK evaluations |
| Education | | | | Increase in number of thriving UK health informatics courses |


6. Patients and carers

This stakeholder group comprises patients and their carers within the NHS and UK private healthcare.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Knowledge management | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | Leaflets; Directories; Evidence summaries; Guidance | PPI groups; websites; HealthSpace; occasional press items | Improved patient involvement in improving self-management advice and guidance; consultation with PPI groups; survey; focus groups |
| Electronic patient data | | | | Patient involvement in improving information systems, data quality and inter-professional communication; survey; focus groups |
| Decision support | | | | |
| Communications support | | | | |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP | | | Patient involvement in UK evaluations |


7. Health informatics practitioners

This stakeholder group comprises staff in the NHS and UK private healthcare whose primary work function is health informatics.

| Main health informatics topics of interest | Sources | Format | Channels and frequency | How evaluate? |
| --- | --- | --- | --- | --- |
| Information standards | Standards bodies | Digest; Guidance; Case studies; Evidence summaries | Event-driven email with web links; websites; occasional podcasts; professional body newsletters; occasional seminars | Adoption of best practice in local policy/plans/projects; consultation with BCS Health, UKCHIP, ASSIST and other professional groups; survey; focus groups |
| Information analysis and reporting | NHS Evidence; NHS info bodies; UK Faculty HI; BCS Health; IT programmes; Suppliers | | | |
| Knowledge management | | | | |
| Electronic patient data | | | | |
| Decision support | | | | |
| Communications support | | | | |
| Managing change | OGC; IT programmes; UK Faculty HI; Suppliers | | | |
| Information governance | Info Commissioner; NHS info bodies; IT programmes; BCS Health; UK Faculty HI | | | |
| Evaluation | IMIA; BCS Health; UK Faculty HI; UKCHIP; ASSIST | | Targeted campaign; event-driven email with web links; professional body newsletters; regional/SHA seminars | Locally driven increase in number and quality of published UK evaluations |
| Education | | | | Increase in number of thriving UK health informatics courses and practitioner participation |


Conclusion

The table below proposes the priority areas for attention in a communications plan for health

informatics evidence.

| Main health informatics topics of interest | Stakeholders |
| --- | --- |
| Electronic patient data; Decision support; Communications support | Patients and carers |
| Managing change; Evaluation; Education | Policy makers and planners; Clinicians; Healthcare provider management; IT suppliers; Health informatics practitioners |

Stakeholder comments are welcomed on this draft document.


References

[1] Dar O, Riley J, Chapman C, Dubrey SW, Morris S, Rosen SD, et al. A randomized trial of home

telemonitoring in a typical elderly heart failure population in North West London: results of

the Home-HF study. Eur J Heart Fail. 2009 Mar;11(3):319-25.

[2] Ball S. Decision support and corollary order benefits. National ePrescribing Forum,

Birmingham; 2010.

[3] Lines LD, Weightman NC. Incidence and effect of incorrect prescription and administration of

nasal mupirocin for methicillin-resistant Staphylococcus aureus. Journal of Hospital

Infection. 2005;61(2):180-180.

[4] Moran GJ, Krishnadasan A, Gorwitz RJ, Fosheim GE, McDougal LK, Carey RB, et al.

Methicillin-Resistant S. aureus Infections among Patients in the Emergency Department. N

Engl J Med. 2006;355(7):666-674.

[5] Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health

economic evaluation. BMJ. 2008;336(7656):1281-3.

[6] Robson C. Real world research. 2nd ed. Oxford: Blackwell; 2002.

[7] Clegg C. Sociotechnical principles for system design. Applied Ergonomics. 2000;31(5):463-

477.

[8] Hanseth O. Integration-complexity-risk: the making of information systems out-of-control.

In: Hanseth O, Ciborra C, editors. Risk, complexity and ICT. Cheltenham: Edward Elgar; 2007.

[9] Plsek PE, Greenhalgh T. Complexity science: The challenge of complexity in health care. BMJ.

2001;323(7313):625-628.

[10] Broadhurst K, Wastell D, White S, Hall C, Peckover S, Thompson K, et al. Performing 'Initial

Assessment': Identifying the Latent Conditions for Error at the Front-Door of Local Authority

Children's Services. Br J Soc Work [electronic document]. 2009 [cited 11 December 2009].

Available from: http://bjsw.oxfordjournals.org/cgi/content/abstract/bcn162v1

[11] Eason K. Providing high quality care for all through effective information sharing - lessons

learned across the UK. HC 2010, Birmingham; 2010.

[12] UK Office of Government Commerce. Managing Successful Projects with PRINCE2. London:

OGC; 2005.

[13] Allen RK. The failure of modern textbooks. BMJ. 2010;340:c2132.

[14] Clamp S, Keen J. Electronic health records: is the evidence base any use? In: Bryant J, editor.

Proceedings of Healthcare Computing 2006. Harrogate: BCS Health Informatics Forum.

[15] Shekelle PG, Morton SC, Keeler EB. Costs and benefits of health information technology.

Evidence report/technology assessment No.132. AHRQ Publication No. 06-E006. 2006 [cited

18 August 2006]. Available from: http://www.ahrq.gov/clinic/tp/hitsystp.htm

[16] Car J, Black A, Anandan C, Cresswell K, Pagliari C, McKinstry B, et al. The Impact of eHealth

on the Quality & Safety of Healthcare. [electronic document]. 2008 [cited 12 March 2010].

Available from: http://www.haps.bham.ac.uk/publichealth/cfhep/documents/NHS_CFHEP_001_Final_Report.pdf

[17] Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, et al. Systematic review: impact

of health information technology on quality, efficiency, and costs of medical care. Ann Intern

Med. 2006 May 16;144(10):742-52.


[18] Murphy PJ, Ward R, Solomonides T, Sale S, Haines C. Developing a culture of knowledge

sharing across the NHS that stimulates the application of HI research and best practice.

[electronic document]. 2009 [cited 29 April 2010]. Available from:

http://hsc.uwe.ac.uk/net/research/Data/Sites/1/Final%20Report%20Grant%203%20for%20publication.pdf

[19] NHS Evidence. Knowledge management specialist collection. 2009 [cited 27 April 2010].

Available from: http://www.library.nhs.uk/knowledgemanagement/

[20] Collins T. Government ordered to publish reviews of risky IT projects. Computer Weekly

[electronic document]. 2009 22 June [cited 15 December 2009]. Available from:

http://www.computerweekly.com/Articles/2009/06/22/236526/government-ordered-to-publish-reviews-of-risky-it-projects.htm

[21] Crowdsourcing. 2010 [cited 10 May 2010]. Available from:

http://en.wikipedia.org/wiki/Crowdsourcing

[22] Booth R. Trafigura: A few tweets and freedom of speech is restored. 2009 [cited 10 May

2010]. Available from: http://www.guardian.co.uk/media/2009/oct/13/trafigura-tweets-freedowm-of-speech

[23] UK Department of Health. Project review: objective evaluation. Guidance for NHS managers

on evaluating information systems projects. [electronic document]. 1996 [cited 12 March

2010]. Available from:

http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4009612

[24] Heathfield H, Clamp S, Felton D. PROBE. Project Review and Objective Evaluation for

Electronic Patient Record Projects: UK Institute of Health Informatics; 2001.

[25] Cundy A. Computer systems evaluation (1969-80). In: Hayes G, Barnett D, editors. UK health

computing: recollections and reflections. Swindon: BCS; 2008.

[26] Friedman C, Wyatt J. Evaluation methods in biomedical informatics. 2nd ed. New York:

Springer Verlag; 2006.

[27] Talmon J, Ammenwerth E, Brender J, de Keizer N, Nykänen P, Rigby M. STARE-HI – Statement

on reporting of evaluation studies in Health Informatics. Int J Med Inform. 2009 Jan;78(1):1-

9.

[28] Nykänen P, Brender J, Ammenwerth E, Talmon J, Keizer Nd, Rigby M, et al. Guidelines for

Good Evaluation Practices in Health Informatics (v0.16). [electronic document]. 2008 [cited

September 4 2009]. Available from: http://iig.umit.at/efmi/gep-hi/GEP_HI_v016_Dec012008.doc

[29] Cusack C, Byrne C, Hook J, McGowan J, Poon E, Zafar A. Health Information Technology

Evaluation Toolkit: 2009 Update. Rockville, MD: Agency for Healthcare Research and Quality;

2009 June. AHRQ Publication No. 09-0083-EF.

[30] UK Office of Government Commerce. Managing Successful Programmes. London: OGC;

2003.

[31] Wyatt J. Clinical knowledge and practice in the information age: a handbook for health

professionals. London: RSM Press; 2001.

[32] Centre for Healthcare Modelling and Informatics. Southern Institute of Health Informatics

conference. 2010 [cited 6 June 2010]. Available from:

http://www.chmi.port.ac.uk/sihi/sihi2010/index.htm


[33] UK Faculty of Health Informatics. Research repository. 2010 [cited 10 May 2010]. Available

from: http://www.rmtrack.com/NHSHIRLC/Login.asp?Goto=%2FDefault%2Easp

[34] Ammenwerth E, de Keizer N. A web-based inventory of evaluation studies in medical

informatics. 2006 [cited 7 July 2006]. Available from: http://evaldb.umit.at/index.htm

[35] Mantas J, Ammenwerth E, Demiris G, Hasman A, Haux R, Hersh W, et al. Recommendations

of the International Medical Informatics Association (IMIA) on Education in Biomedical and

Health Informatics. Methods Inf Med. 2010;49.

[36] UK Council for Health Informatics Professions. Registration with UKCHIP. [electronic

document]. 2009 [cited 11 November 2009]. Available from:

http://www.ukchip.org/?q=page/Registration-with-UKCHIP

[37] Huffington Post Investigative Fund. Explore Health IT 'Adverse Event' Reports Submitted to

the FDA. 2010 [cited 10 May 2010]. Available from:

http://huffpostfund.org/stories/pages/database-explore-health-it-adverse-event-reports-submitted-fda

[38] Leviss J, Gugerty B, Kaplan B, Keenan G, Ozeran L, Rose E, et al., editors. H.I.T. or miss:

lessons learned from health information technology implementations. Chicago, IL: AHIMA

Press; 2010.

[39] Ammenwerth E, Nykänen P, Brender J. EFMI Working Group for Assessment of Health

Information Systems. 2010 [cited 6 June 2010]. Available from: http://iig.umit.at/efmi/

[40] World Health IT Conference, EU Ministerial Conference on eHealth. Conference Declaration:

European Co-operation on eHealth. 2010 [cited 6 June 2010]. Available from:

http://www.ehealthweek2010.org/sala-de-prensa/prensa/conference-declaration-

european-co-operation-on-ehealth

[41] UK Department of Health. Health informatics review. 2008 [cited 15 August 2008]. Available

from: http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_086073

[42] UK Department of Health. The operating framework for the NHS in England 2010/11.

[electronic document]. 2009 [cited 17 December 2009]. Available from:

http://www.dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/@dh/@en/@ps/@sta/@perf/documents/digitalasset/dh_110159.pdf

[43] UK Department of Health. Informatics planning 2010/11. [electronic document]. 2009 [cited

12 March 2010]. Available from: http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_110335

[44] EU eHealth ERA team. eHealth priorities and strategies in European countries: Towards the

Establishment of a European eHealth Research Area. [electronic document]. 2007 [cited 12

March 2010]. Available from: http://ec.europa.eu/information_society/activities/health/docs/policy/ehealth-era-full-report.pdf

[45] Canada Health Infoway. Fulfilling the Promise - Canada Health Infoway Inc. Annual Report

2005-2006. 2006 [cited 16 February 2007]. Available from: http://www.infoway-inforoute.ca/Admin/Upload/Dev/Document/Annual%20Report%2005-06%20EN.pdf

[46] British Computer Society Health Informatics Forum. The Way Forward for NHS Health

Informatics. [electronic document]. 2006 [cited 15 December 2009]. Available from:

http://www.bcs.org//upload/pdf/BCS-HIF-report.pdf


[47] Greenhalgh T, Stramer K, Bratan T, Byrne E, Mohammad Y, Russell J. Introduction of shared

electronic records: multi-site case study using diffusion of innovation theory. BMJ.

2008;337:a1786.

[48] Hanseth O, Jacucci E, Grisot M, Aanestad M. Reflexive integration in the development and

implementation of an Electronic Patient Record system. In: Hanseth O, Ciborra C, editors.

Risk, complexity and ICT. Cheltenham: Edward Elgar; 2007.

[49] Aarts J, Doorewaard H, Berg M. Understanding implementation: the case of a computerized

physician order entry system in a large Dutch university medical center. J Am Med Inform

Assoc. 2004 May-Jun;11(3):207-16.

[50] UK National Audit Office. Delivering successful IT-enabled business change. [electronic

document]. 2006 [cited November 9 2009]. Available from:

http://www.nao.org.uk/publications/0607/delivering_successful_it-enabl.aspx

[51] Schuemie M, Talmon J, Moorman P, Kors J. Mapping the domain of medical informatics.

Methods Inf Med. 2009;48(1):76-83.

[52] Greenhalgh T, Potts HW, Wong G, Bark P, Swinglehurst D. Tensions and paradoxes in

electronic patient record research: a systematic literature review using the meta-narrative

method. Milbank Q. 2009;87(4):729-88.

[53] UK Department of Health. Making information count: A human resources strategy for health

informatics professionals. [electronic document]. 2002 [cited 12 March 2010]. Available

from: http://www.dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/@dh/@en/documents/digitalasset/dh_4073084.pdf

[54] UKCHIP. Standards for Registration. [electronic document]. 2009 [cited 12 March 2010].

Available from: http://www.ukchip.org/?q=document/186