
Secure Technology Innovation

Evolvent Technologies Incorporated Presents

BETTER DATA FOR BETTER CARE

Evolvent Headquarters

5111 Leesburg Pike, Suite 506
Falls Church, VA 22041
703.824.6000
Fax: 703.379.2148

Federal Contract Vehicles Prime and Sub

• D/SIDDOMS 3 Small Business Prime
• GSA IT Schedule 70: GS-35F-0364M
• SeaPort-e: N00178-07-D-5055
• VA GITSS
• VA VistA BPA
• ARMY MRMC
• AF-DESP-2
• AF NETCENTS
• ARMY ITES-2s
• AF MASS
• CIOSP2
• GSA Alliant Small Business

Evolvent Technologies, Inc. Locations

Northern Virginia
22980 Indian Creek Dr., Suite 300
Sterling, VA 20166
703.824.6000
Fax: 703.481.7925

Northern Virginia
100 Carpenter Drive, Suite 206
Sterling, VA 20164
703.824.6000
Fax: 703.481.7925

San Antonio
4400 Piedras Drive South, Suite 175
San Antonio, TX 78228
210.268.1400
Fax: 210.735.3404

BETTER DATA FOR BETTER CARE

Unleashing the Power of Technology to Transform the Delivery of Care

Written and compiled by the staff at Evolvent Technologies, Incorporated, Falls Church, Virginia.

BETTER DATA FOR BETTER CARE

© 2010 Evolvent Technologies, Incorporated

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise without the prior written permission of the copyright holder, Evolvent Technologies, Incorporated, Falls Church, Virginia.

050 EVO

ISBN-978-0-9826568-0-8

Published by Evolvent Technologies, Incorporated 5111 Leesburg Pike, Suite 506 Falls Church, VA 22041

For updates, resources and our latest magazine, please visit us at our website

www.evolvent.com

For more information about Evolvent capabilities and what Evolvent can do for you, contact us via e-mail at [email protected] or by phone at 703-824-6000.


Bill Oldham, Evolvent Chief Executive Officer

Mr. Oldham’s career began with delivering real-time financial analytics data for the banking sector on Wall Street and has since carried him to Europe and several postings in the Middle East. A journey that began with postgraduate work in Manchester and Rotterdam produced the experiences that led Bill into healthcare.

A keen awareness that “so much needs to be done” has been Bill’s mantra during his years of leadership at Evolvent Technologies. “When you have the opportunity to combine doing good, creating jobs, and helping our Warfighters and Veterans, it just doesn’t get any better.” Mr. Oldham’s leadership at Evolvent has resulted in some of the most innovative applications of technology ever initiated for our clients in both the Government and private sectors:

• Six Sigma implementation – US Air Force Medicine
• PTSD and TBI evaluations for returning Warfighters
• VA’s first incident response center
• Networked telemedicine for remote desert areas

Bill’s extraordinary energy and talent, directed at finding just the right technology for the task at hand at the lowest possible cost, have received praise from many of Evolvent’s 200+ customers, including:

• Department of Defense Military Health System
• Department of Veterans Affairs
• Bell Helicopter Textron, Inc.
• Danya International
• Naval Medical Information Management Center

In alliance with other healthcare services vendors, Bill’s dedication to the needs and objectives of customers is tempered with “service where it counts” to community and wider-world organizations such as Veterans Employment Programs, Food and Friends’ Slice of Life, M.D. Anderson Cancer Center, John L. Young Center-Homeless Shelter, and others.


Acknowledgements

There are a number of individuals at Evolvent without whom this book would not have been possible. First of all, our core project team consisted of four talented professionals who kept this project on track. Mr. Greg Parish, Jr. served as research manager and project coordinator and did an outstanding job. Greg’s efforts kept us on track, challenged our ideas, and helped us make the content meaningful and rich for our readers. Ms. Kari Larsen, our executive director at corporate, kept us moving and made sure the schedule was tightly tracked. Ms. Jenn Cupka, our Director of Marketing, marshalled the publishing and editing teams to polish the drafts with the core editors. Finally, the core team was completed by our lead technical editor, Mr. Rick Bishop – who took all our files and made a book! My thanks to Greg, Kari, Jenn, and Rick for an amazing effort in so short a time. Your professionalism and hard work are a continuing credit to our corporate family.

We reached out to many of our corporate leaders for writing and research. Our primary writing team included John Daniels, James D. Whitlock, Geoff Howard, Kim Bahrami, Al Hamilton, and myself. Contributing writers and reviewers included: Greg Parish, Jr., Dr. Barry Chaiken, Monty Nanton, Kent Stevenson, Dave Parker, Paul Ramsaroop, and Dave Walton. My thanks to all of these incredible professionals for their time and intellectual direction. We are very proud to have all of them in the Evolvent corporate family.

My thanks also to our client community for the opportunities we share to do good work. It is truly a pleasure working with our colleagues across military medicine and the VA.

Finally, to my family who gives me so much encouragement and strengthens my conviction on a daily basis – it is vitally important we bring better data to enable better care.


Table of Contents

Introduction

Health 2.0 and the Future of the PHR

Consumer Empowerment and Healthcare Innovation

Knowledge Management in Healthcare

Communities of Practice, Knowledge Junctions, and Web 2.0

Effectively Managing Values and Processes

Financial Management and Economic Down-Cycles

Healthcare Key Performance Indicators (KPI)

Reprinted from Evolvent Magazine

Author and Contributor Biographies

List of Illustrations

Figure 1. Percent GDP spent on Healthcare

Figure 2. Projected Growth in Healthcare Spending

Figure 3. Growth in Medicare/Medicaid Spending

Figure 4. Spending Variation

Figure 5. ARRA / HIPAA Timeline

Figure 6. Shift to new Healthcare Paradigm

Figure 7. Integrated Solution

Figure 8. Solution Components


Preface

Our corporate legacy at Evolvent has always been closely involved with military medicine in the U.S. Almost no other part of American healthcare has deployed as much technology as military medicine – and deployed it well. At Evolvent, we are deeply honored to work with literally thousands of talented IT professionals in our Department of Defense Military Health System as well as their colleagues at the Department of Veterans Affairs. Their data challenges on the professional level, though, are legion. Even though we are upgrading a program that has already shared more than 5 million DoD medical records with the VA, many, many challenges remain in the access, sharing, and management of health information at all levels. There are also many success stories.

This book is built on our shared experience in this community. We have seen a lot of amazing work done and learned a lot along the way.

One particular story stands out to me – it is of the typical soldier who has been injured in one of our wars overseas and whose records and images are taped to the evacuation stretcher in hopes that they make it either back to the US or at least to some point of advanced care along the way home. Our teams are working to make sure these images and records go across a network in time and with the right metadata to pull the US-based records and any other relevant information, wherever and whenever it is needed.

One of my colleagues who has many years of military experience and is a recognized national leader claims that how we help the wounded is an issue of national security. As long as we have an all-volunteer fighting force, we have a moral obligation to return our citizen soldiers to their communities in good health or with the resources needed to sustain health. This demands data at the point of need – and thereafter!


Introduction

Better data for better care. Why did a group of health care information professionals get together to write a book on this topic? Well, there are actually many reasons. But to begin with, this is personal.

I’m blessed with a very healthy family for the most part, but, like most everyone else I suspect, we have sick fathers and mothers, grandmothers and children. Very little in life, in my own view, is harder to deal with than having major illness eat away at your family.

In my case, it is my two youngest sons who are the main motivation every day. My middle son was diagnosed at the top clinics in the country as severely autistic in 2007. Less than a year later, his little brother, my youngest, was given the same diagnosis. For much of the last two years, our lives have been about medications, supplements, dietary interventions, different doctors, the next test, what to do with the school system, and a stream of professionals of all types, from Applied Behavioral Analysis therapists to neuropsychologists and everything in between.

All of this spawns an enormous amount of data: books, magazine articles, cookbooks, email lists and discussion forums, and then the clinical records and test results. There is also the therapy-based data of measurement results against the ABLLS (Assessment of Basic Language and Learning Skills) educational tool, which generates an overwhelming amount of statistical data on how each child is doing in their basic communication and cognitive abilities.

And then you always wonder what you’re missing. What information cues did we miss? What email did we not read thoroughly? Is there more research that might have helped?

Sound familiar? I doubt that there is anyone who has engaged our healthcare system in a crisis who has not faced similar challenges. These concerns have led to a transformation in healthcare’s use of technology and in our expectations about how we interact with providers and data.

The deep end of the data dilemma

About a year into our family odyssey, we decided to go see a specialist out-of-state. This specialist had published a great deal and been featured in the media as a nationally recognized innovator in the treatment of cases like ours.

We arrived mid-morning, when the doctor’s office opened, to a scene that could have been out of the 1960s from a data and technology perspective. File folders were everywhere and stacks of papers lined the walls behind the reception desk. The only evident automation was a credit card reader and a computer with a bubble jet printer for receipts.

While many files were in evidence, ours was not. Apparently, the couriered package of my wife’s blood, sweat and tears had not yet arrived or had been misplaced. Given the general disarray, I was inclined to believe the latter.

While we waited and chased our young boys around the waiting area and tried in vain to keep them out of places they weren’t supposed to be (child-proofing is an exact science in our house), my wife tried to get anything faxed that could be and to chase down the missing records.

It was not long until we were brought into the doctor’s office and encountered another series of paper stacks and scribbled notes. This specialist in particular believed in detailed interviews and a diagnostic methodology which was comprehensive. Seemingly, a kindred soul! We recognized at once that the overwhelming amount of data we (I should say at this point that I am using “we” very liberally, since most if not all of the credit belongs to my wife for what “we” know) had collected would be not only necessary but appreciated.


Let me say that this particular specialist knew the data he was looking for, had an appreciation for that data, and had a keen sense of what needed to be observed. In many ways, he was the unlikely embodiment of what providers across medicine are becoming – integrative specialists who appreciate the narrative needed for successfully treating chronic disease. Unfortunately, the use of technology to aid this encounter was missing.

The problem, which was readily apparent to me at the time, was this: the encounter might lead to a better course of treatment and potentially healthier kids – but how would we know? Instead of feeling better as a parent, my anxiety level was actually increasing. We would increase the velocity of activity and data – but how would we deal with it? How would it get processed, and would we just miss more important data?

Fast forward two years. Recent tests (similar to those run two years ago) showed that in fact several things were missed. Now, medicine continues to advance and our clinical understanding takes leaps across time – yet our conditions and our test results were known variables, and still things were missed that should have stood out. Little things like viral strains and thyroid activity. Were we looking for these things or did they just get missed? Who knows.

For two years, we reviewed data set after data set – in person, and over the phone – without substantive clinical intervention. Upon seeking a new specialist, we sought to obtain our medical records. We were told that state law allowed a charge of $75 per patient to obtain our records. When we received these records some three months later, they were incomplete and contained very little of what had actually transpired in each encounter. In spite of the innovative diagnostic methods and the much greater time spent with the specialist – better data and potentially better care did not add up for us in this case. We just paid for our records and moved on.

Some measurable improvement…


At least not just anyone can phone for a record anymore. I remember about ten years ago needing to get a family member transferred from one provider to another and to have their medical records moved. We did this twice within the space of about two months for specialty care. In one case, the response was “sure, here you go” with no ID and no signature – just “here’s the file” and away I went. In the other case, the family member (inconveniently unconscious at the time) had to sign their concurrence for me to even have viewing rights – though I was the primary policyholder!

Now it does seem that access to records and the rules are more streamlined and readily understood – which is not to say consistently applied. In our business, your password on the proverbial Post-It note is the Achilles heel of privacy. Yet, we make progress – at least in locking down data, less so in making data accessible and harnessing knowledge.

Progress of IT in healthcare

All too often we get bogged down at one extreme or the other of the “hype” spectrum in healthcare IT. I find that we are either way too pessimistic or way too optimistic about what is needed and what is achievable. Sometimes it is hard to know what “good” looks like at the intersection of healthcare and technology. For a moment though, I would like to engage your imagination to think about what is possible, even now:

• All of your information at the doctor’s office … before you arrive for your appointment.

• Real-time data about how your body is doing—correlated with the past and potential markers for the future.

• Test results, current health data, and your past history – coupled with images and relevant research as well as possible clinical trial options you qualify for.

• Nutritional and dietary supplement information relevant to your health condition provided in concert with patient education information delivered to your iPhone® or PDA.

Possibilities abound … yet why is it so hard to change?


Healthcare can be a complicated and far-reaching subject. For this reason, solutions to challenges in healthcare require clearly defined parameters. In 2009, Federal lawmakers had the opportunity to encounter the diversity and breadth of issues in the healthcare industry and to explain them to the American public. We have learned from this experience that we have to think about very specific and succinct goals in order to explain how to improve healthcare.

Making a comprehensive argument for improving healthcare is fundamentally challenging because there are always multiple competing interests. These interests are oftentimes not only economic in nature but also political because healthcare policy is relevant to opinions concerning acceptable standards of responsibility for oneself and for others. Any thoughtful conversation about healthcare therefore should have a specific scope for the sake of cogency and out of respect for the politically sensitive nature of the overall debate.

Many healthcare industry stakeholders have focused on cost reduction as their scope for discussion about reform. At first, reducing costs may come across as too callous or too cold a starting point for a discussion about healthcare, which of course pertains to human lives. But the fact of the matter is that cost reduction claims have helped political interests from all sides.

From universal healthcare insurance to setting limits for medical malpractice claims and from provider reimbursement reform to electronic resources and financial incentives that assist patients with preventive care, a variety of policy positions have benefited from being able to claim cost reduction as an outcome in the short term or long term. Cost reduction strategies are a good place to start because almost everyone can agree that the issue of increasing costs in the healthcare system needs to be addressed – yet we agree on very little else it seems.

Another important aspect of focusing on the bottom line in healthcare is that it means profiling the straightforward realities that are faced by patients, providers, and payers. To put it differently, focusing on reducing costs as a perspective for improving healthcare forces us to literally quantify the challenges we face and hence to concentrate on solutions that are simple and measurably effective. When healthcare industry stakeholders are properly and adequately incentivized to focus on reducing the costs they impose on the system, the unambiguous nature of that approach often leads to recognizing solutions to all kinds of challenges such as access and quality. The imperative then becomes fundamentally a data challenge.

Thinking outside the box in healthcare is about designing and applying incentive structures that appeal to the singular purpose of cost reduction within the bounds of better performance. The typical stakeholder in the healthcare industry has several components for consideration in this regard. In the broad sense, the importance of governance and policy incentivizing cost reducing behavior has already been suggested as a key component. But specifically, our discussion has to address the assets unique to each stakeholder that will help them attain cost reduction. In doing so, we will answer the following questions: What are the processes relevant to moving each stakeholder towards reducing costs and achieving their goals? What are the skills and knowledge necessary for carrying out those processes? And which technology solutions will offer the most comprehensive support along the way, balancing outcomes with cost reduction?

These issues and others inform the discussions we share with our clients on a daily basis as we build systems and support the mission of healthcare delivery.

I am very fortunate to have a talented group of writers to join me in this book. Many of the ideas were an outgrowth of our corporate projects, and many more encompass our hopes and dreams for the future of our industry. Here’s a brief preview:

• Health 2.0 and the Future of the PHR – we take a look at the challenges of personal health records and sharing of data between patients and providers.


• Consumer Empowerment and Healthcare Innovation – what more informed consumers and more technologically connected patients mean for healthcare IT and our industry.

• Knowledge Management in Healthcare – the possibilities and meaning of the practice of knowledge management in the field of healthcare, looking at what differences there are from other applications.

• Communities of Practice and Web 2.0 technologies – how the technologies and collaboration elements of knowledge domains can be leveraged.

• Effectively Managing Values and Processes – we dive into the intersection of value, IT, care, and process management to explore the impacts and comparative ROI of different technical strategies in healthcare.

• Financial Management – a look at the impact of economic downturns on healthcare and IT implementations.

• Healthcare Key Performance Indicators – longer term demographic trends and how those are reflected in access, quality, and cost.

Our goal in writing this is to enlighten, challenge, and inform those interested in healthcare with technology ideas, as well as to provide insights into the various elements of the national debate. We hope you find this helpful and look forward to your feedback.

Bill Oldham
Washington, DC
February 2010


Contributors to this Book

Bill W. Oldham Chief Executive Officer

Paul Ramsaroop President & Chief Operating Officer

Geoff Howard SVP, Chief Technology Officer

John Daniels SVP, Chief Information Officer

J.D. Whitlock VP, Technology Programs

Kimberly Bahrami Senior Vice President

Al Hamilton VP, Federal Healthcare Operations

Monty Nanton Senior Vice President, Military Health

David Parker, M.D. Vice President, Chief Medical Officer

Guy Sherburne Senior Vice President

Anna Worrell Vice President, Program Management

Joey Meneses Vice President, IT Operations

Rick Khosla Senior Director, Information Technology

For additional information, please see the biography section in the back of this book.


Health 2.0 and the Future of the PHR

Health 2.0 is participatory healthcare. Enabled by information, software, and community that we collect or create. We the patients can be effective partners in our own healthcare, and we the people can participate in reshaping the health system itself.1 – Dr. Ted Eytan

Patient-focused internet tools have matured significantly in the last couple of years, mirroring and building on the recent explosion in the adoption of social networking tools, and on the slow growth of Care Delivery Organizations (CDO) that are using Electronic Medical Records (EMR). It is now possible to envision how these tools might come together to offer patients a comprehensive suite of functionality within a tightly integrated, portal-like user interface, even if major pieces of this functionality were delivered by multiple third parties, and even if the patient changes their health plan. Imagine a toolset that could, within one user login and user interface:

• Uniquely identify the patient across different CDOs, and even across different Regional Health Information Organizations (RHIO), via a Voluntary Universal Healthcare Identifier (VUHID)

− Deliver benefits and eligibility functionality – even if the patient changes their insurer, because their VUHID could be used to match the current insurer (or even multiple insurers)

• Deliver access to clinical and medication data from different CDOs, as long as each CDO had an EMR that could export clinical data in a standards based format like Continuity of Care Record (CCR) or Continuity of Care Document (CCD) via the Nationwide Health Information Network (NHIN)

− To include test results, immunizations, past office visits, prescriptions, allergies, and health conditions

− Eventually, to include the level of detail found in text from outpatient clinical notes, operative reports, discharge summaries, etc.

• Deliver secure patient-provider communications, tightly integrated with appointing and prescription refill functionality, to streamline the most common processes for both patient and provider

• Deliver trusted health education resources, with subscription to new content, tailored to specific diseases or any other topic of interest to the patient

• Deliver social networking functionality, to enhance communications with providers, family members, or patients with similar diagnoses

• Deliver information on cost and quality measures for different CDOs and providers, to help patients find the best source of additional care when required

• Deliver the functionality of specialized third party applications that integrate with the patient portal and leverage its authentication service and clinical data

Although some of these tools fall outside the scope of what most people think of as a Personal Health Record (PHR) today, the idealized PHR as defined by the Department of Health and Human Services (HHS), Office of the National Coordinator for Health Information Technology includes most of these features. HHS describes a PHR as “An electronic record of health-related information on an individual that conforms to nationally recognized interoperability standards and that can be drawn from multiple sources while being managed, shared, and controlled by the individual.” 2

The key phrase in the definition above is “multiple sources,” or more plainly, clinical data from multiple CDOs. This is the hard part. It requires multiple CDOs, each with their own EMR, to be able to exchange clinical data via a common standard that every EMR is capable of using. Given the slow progress of EMR implementation among CDOs, combined with the slow progress of interoperability standards, it is clear that this vision is not going to become reality overnight. However, this slow progress is coalescing towards the idealized PHR becoming reality. We will first look at the current state of affairs, and then look forward to the likely future of the PHR and Health 2.0.
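To make the idea of a common export standard concrete, here is a minimal sketch, in Python, of how an EMR might serialize a handful of clinical data elements into a simplified, CCR-like XML fragment. The element names are illustrative assumptions only; they do not constitute a conformant ASTM CCR or HL7 CCD document, which would require the full schema and vocabulary bindings.

```python
# Minimal sketch: serializing a few clinical data elements into a
# simplified, CCR-like XML fragment. Element names are illustrative only;
# a real export must conform to the ASTM CCR or HL7 CCD specification.
import xml.etree.ElementTree as ET
from datetime import date

def export_ccr_like(patient_id: str, problems: list[str], medications: list[str]) -> str:
    root = ET.Element("ContinuityOfCareRecord")
    ET.SubElement(root, "DateTime").text = date.today().isoformat()
    patient = ET.SubElement(root, "Patient")
    ET.SubElement(patient, "ActorID").text = patient_id  # e.g. a voluntary universal identifier
    body = ET.SubElement(root, "Body")
    problems_el = ET.SubElement(body, "Problems")
    for p in problems:
        ET.SubElement(problems_el, "Problem").text = p
    meds_el = ET.SubElement(body, "Medications")
    for m in medications:
        ET.SubElement(meds_el, "Medication").text = m
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(export_ccr_like("VUHID-0001", ["Asthma"], ["Albuterol inhaler"]))
```

The point of the sketch is simply that once every EMR can emit (and consume) the same structured format, a PHR can aggregate records from any number of CDOs without custom interfaces for each one.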

Three general categories of PHRs exist today, each with their own pros and cons. Let’s briefly examine each:


Insurer-sponsored PHRs

Insurer-sponsored PHRs are currently the most widely used. They generally do a good job of aggregating all types of claims data from across different CDOs in order to present a comprehensive view of the care delivered to the patient. Some are maturing and adding features like lab results, and the export of data to EMRs to encourage use by providers.

However, because these PHRs are not generally tied to an EMR, and the data is most frequently only claims data, the usefulness of this data is very limited. Additionally, because these PHRs are not generally tied to individual CDOs, they do not provide appointing functionality or communication with a patient’s provider. These features are precisely the kind of conveniences that most patients expect to get before they will bother using a PHR. Lastly, there is generally no interoperability with other insurers’ data, so when a patient changes insurers (as most patients do), there is little continuity of data for the patient.3

Microsoft® HealthVault™ / Google™ Health

For simplicity (and not to choose favorites), let’s call this the HealthVault model. These PHRs attract users with the growing wealth of functionality found in third party applications with pre-built plug-ins. For example, home health monitoring devices with an interface into HealthVault make it very easy for a patient to track their progress (or a bad trend) and share this data with their provider. Some analysts point out that the HealthVault model does not even require patients to use the core functionality of the PHR. If a plug-in application exists for HealthVault that adds value to patient care, and in order to use that application they must use HealthVault, then the patient will likely provide consent for HealthVault to store their data even if they have no intent of ever using HealthVault itself. One value-added application leads to many for this particular patient, and before long they are “data invested” in HealthVault.4 Additionally, given the resources of Microsoft and Google, this model has staying power. For example, Microsoft recently introduced a new front end to HealthVault called MyHealthInfo, which permits a very rich user interface along the lines of a personalized health dashboard.5

The biggest potential strength of the HealthVault model is that it is not tied to any one insurer or CDO, and therefore can more legitimately represent itself as a true longitudinal, cradle-to-grave PHR. To emphasize this point, Peter Neupert, Corporate Vice President of the Health Solutions Group at Microsoft, uses the term “Personal Health Management System” instead of PHR when describing the capabilities of HealthVault. In a recent edition of Health Affairs he writes:

What differentiates personal health management systems from EMRs or PHRs is that they (1) contain data from many different sources, including EMRs and PHRs; (2) give people total control over their data and enable them to add their own information, such as progress on a fitness plan or self-management of chronic conditions; (3) seamlessly connect to the workflows of multiple provider and payer systems; (4) offer secure, programmatic access to these data and processes, to enable the development of new kinds of tools and services; (5) cater to the needs of the “family health manager” – the person most often responsible for a family’s care; and (6) make it possible for people to search for (and share) relevant health information, knowing that this information is credible and actionable.6

Leaving aside the semantics of PHR vs. “Personal Health Management System,” this sounds like PHR nirvana. However, the potential strength of the HealthVault model’s “open” architecture is also its Achilles heel. Because HealthVault has no direct tie to any EMR or other source of clinical data, it is entirely dependent on the progress of interoperability standards, the progress of EMR adoption, and the willingness of CDOs to permit access to “their” (the patient’s) data, in order to become any more than a third party application hub. Today, only a handful of CDOs offer the ability to transfer a patient’s data to HealthVault. If patients rise up en masse and demand that their CDOs enable the transfer of their clinical data to HealthVault, then this model may work sooner rather than later. However, the odds of this happening in the near future are relatively low. Barring a short term revolution in data liquidity, it is difficult to see how the HealthVault model will be able to “seamlessly connect to the workflows of multiple provider and payer systems” in the near future – even if its architecture seems like the most logical long term model.

“Tethered-to-an-EMR,” CDO-sponsored PHRs

When large CDOs have an EMR, and offer a “tethered” PHR, then they are able to deliver their patients a PHR with a built-in interface to clinical data. Additionally, when this kind of CDO is integrated with a payer (as in a staff model HMO), a world of possibilities is opened involving the integration of clinical and business process automation from a patient’s perspective. Perhaps not surprisingly, the best commercial example of this PHR model is Kaiser Permanente’s® My Health Manager, used by about one third of Kaiser’s 8.6 million members.7

Fully deployed in 2007, My Health Manager offers all of the following functionality, which is an excellent start towards the PHR “wish list” at the beginning of this chapter. Using My Health Manager, a Kaiser member can:

• View lab results, immunizations, past office visits, prescriptions, allergies, and health conditions
• View, schedule, or cancel appointments
• Refill prescriptions
• Act on behalf of another family member (with proxy permissions)
• Communicate securely with their providers and pharmacists
• View trusted health and medication information and utilize behavioral change programs for smoking cessation, weight management, nutrition, insomnia, pain, depression, and stress reduction
• Manage their health benefits, including estimating the cost of treatments and viewing medication formularies8


Use of My Health Manager continues to grow, driven by the top three most frequently used features: viewing test results, scheduling appointments, and refilling prescriptions. Based on user surveys, Kaiser has determined that “members find greatest use in a web site that facilitates e-connectivity with their health care team, allows them to view key components of their medical records and conduct clinical transactions online, and provides them with information so that they can make knowledgeable decisions about their health.”9 Kaiser recommends that other PHRs focus on these same priorities to maximize their chance of success. Of course, as we have already discussed, Kaiser’s integrated payer-provider model allows them to “start on second base” compared to the insurer model, the HealthVault model, or most other CDOs.

On the downside: Just like with the insurer model, if a patient leaves Kaiser, they leave behind all the functionality of My Health Manager. They do not necessarily lose their clinical data, however, because Kaiser is working on an interface to port clinical data in My Health Manager to HealthVault.

The Future of the PHR

How do we get from the status quo PHR to the idealized PHR? How do we best combine public policy, technology, incentives, and education to maximize the chance that our health care system can deliver a PHR that is not tied to any one CDO, but nonetheless offers ubiquitous access to all of a patient’s clinical data wherever it resides? To paraphrase the title of this book, how do we leverage better patient data for a better patient experience? Let’s look at each one of the idealized PHR elements in a little more detail, with examples where they exist.

Voluntary Universal Healthcare Identifier (VUHID)

In order to efficiently and safely collect a patient’s clinical data from different CDOs and different EMRs into one PHR, it is necessary to uniquely identify a patient across different CDOs. The best way to do this is to implement a universal healthcare identifier. Demographic matching algorithms that are used now for this purpose are prone to error, with potentially catastrophic results for patient safety. For political reasons, it is unlikely that the Government will attempt to establish a mandatory universal identifier. This leaves a voluntary universal identifier as the best near term solution.
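A toy example shows why demographic matching is fragile. The matching key and records below are hypothetical and far simpler than any production Enterprise Master Patient Index (EMPI) algorithm, but the failure modes are the same in kind: different people can collide on common demographics, and the same person can fail to match herself after a name change.

```python
# Toy illustration of why demographic matching is fragile. The matching
# key and the records are made up; real EMPI algorithms are far more
# sophisticated, but they face the same kinds of failures.
from datetime import date

def match_key(first: str, last: str, dob: date) -> tuple:
    # Naive key: last name, first initial, date of birth
    return (last.strip().lower(), first.strip().lower()[:1], dob)

# Two different patients that collide on the naive key (false match):
a = match_key("John", "Smith", date(1980, 5, 1))
b = match_key("Jane", "Smith", date(1980, 5, 1))
print(a == b)  # True -- their records could be merged in error

# The same patient before and after a legal name change (missed match):
c = match_key("Maria", "Lopez", date(1975, 3, 9))
d = match_key("Maria", "Garcia", date(1975, 3, 9))
print(c == d)  # False -- her history could be split across two charts
```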

Global Patient Identifiers Inc. (GPII) has a proposed mechanism in place to implement a VUHID. It is not in our scope here to discuss the mechanics of how VUHID would work; however, these details are presented at http://gpii.info/

What we are concerned about are the benefits of a VUHID to individual patients, and what kind of public policy might be needed to jump start VUHID into mainstream use. From the gpii.info website, the benefits of VUHID for a patient are:

• Convenience: Once a VUHID identifier is issued, patients should no longer have to repeatedly supply identifying information, even when seeing a new physician for the first time

• No patient identification errors: VUHID reduces Enterprise Master Patient Index (EMPI) demographic matching errors, estimated to occur 8% of the time

• Reduced risk of identity theft: Participating in the VUHID system means that each patient's medical information will be linked using his or her unique identifier — not names, addresses, birth dates, social security number, telephone number, etc. This eliminates the potential breaches of confidentiality associated with demographics-based exchange of such vital data

• Better control of medical privacy: A key benefit of the VUHID system is the ability to provide private identifiers (PVIDs) to an individual. Private identifiers are anonymous. They work by enabling electronic linkage of various types of clinical information without revealing the patient’s identity. For example, suppose a physician orders an AIDS test but does not want anyone to know who that test is for. A PVID can be attached to the sample and the laboratory would return that result to the physician using only that PVID for identification

• Better medical care: In support of the vision of a nationwide health information network (NHIN)—the beginnings of which are seen in the good work of RHIOs and Health Information Exchanges (HIE) around the country—the VUHID system makes it feasible for physicians to have complete and accurate medical information available at the point of care. It has repeatedly been shown that a physician equipped with complete medical information is able to provide more efficient and less error-prone care for their patients. The VUHID system plays a key role in making that optimum care feasible by ensuring that there is never any confusion concerning a patient’s identity10

There are a lot of tough policy issues surrounding health care, but VUHID is not one of them. It can be argued that VUHID (or something very similar) is a prerequisite both to the future of the PHR and to the safe transmission of patient data between CDOs / RHIOs via the NHIN. In our view, HHS should actively encourage the adoption of VUHID by including VUHID interoperability in EMR accreditation requirements.
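To make the private-identifier (PVID) scenario described in the list above concrete, here is a minimal sketch of the concept. The identifier format, classes, and functions are hypothetical illustrations, not GPII’s actual mechanism (those details are at gpii.info): the laboratory handles only an opaque identifier, while the ordering physician holds the mapping back to the patient.

```python
# Illustrative sketch of the PVID idea: clinical data is linked by an
# opaque identifier, so the receiving lab never learns who the patient is.
# Identifier format and data structures are hypothetical, not GPII's design.
import secrets

class PhysicianOffice:
    def __init__(self):
        self._pvid_to_patient = {}  # mapping held only by the ordering physician

    def issue_pvid(self, patient_name: str) -> str:
        pvid = "PVID-" + secrets.token_hex(8)
        self._pvid_to_patient[pvid] = patient_name
        return pvid

    def receive_result(self, pvid: str, result: str) -> str:
        # Re-identify locally; the lab only ever handled the PVID.
        return f"{self._pvid_to_patient[pvid]}: {result}"

def lab_run_test(pvid: str, specimen: str) -> tuple[str, str]:
    # The lab reports against the anonymous identifier only.
    return (pvid, f"{specimen} test: negative")

office = PhysicianOffice()
pvid = office.issue_pvid("J. Doe")
pvid_back, result = lab_run_test(pvid, "HIV antibody")
print(office.receive_result(pvid_back, result))
```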

Access to Clinical Data from Multiple CDOs

The long term answer here is ubiquitous EMRs, all of which can securely port patient data into a PHR at the request of the patient, via the NHIN, which will become the arbiter of data standards and formats. As already discussed, this is a ways off – see chapter 6 for a discussion of the challenges of EMR adoption and NHIN development.

In the meantime, there is technology available today that enables transfer of clinical data from multiple CDOs into a PHR, and accommodates varying states of CDO EMR maturity. Health Postbox Express (HePoEx) is a plug-in to HealthVault that allows CDOs to send clinical data to a patient’s HealthVault account securely. HePoEx supports delivery of records in CCR format, for any CDO that can support CCR. However, it also allows clinical data to be sent in almost any other electronic format as well. If the provider does not have an EMR, the system can handle scanned copies.

To receive their health records, a patient provides their HePoEx ID to their provider, along with their authorization form for Release of Information (ROI). The provider or CDO staff then logs on to www.hepoex.com with their provider account, enters the HePoEx ID provided by the patient, uploads the clinical data / documents, and submits the form. The uploaded data / documents are converted into a single encrypted package and uploaded to HealthVault.

The patient then receives an email with a 16-digit pass code and detailed instructions to retrieve the clinical data. A link embedded in the email takes the patient to HealthVault, where they may enter this pass code. After the pass code is verified, the patient is additionally required to provide the answer to the secret question that they had previously registered with HePoEx. After providing the correct answer (two-factor authentication), the patient is able to import the clinical data into their HealthVault record.
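The retrieval step just described amounts to two-factor verification: a one-time pass code delivered out of band plus a secret answer the patient registered earlier. The sketch below is a generic illustration of that pattern; the function names and flow are assumptions for illustration, not the actual HePoEx or HealthVault code.

```python
# Generic sketch of the two-step check described above: a one-time pass
# code delivered by email plus a previously registered secret answer.
# Function and field names are hypothetical, not the HePoEx/HealthVault API.
import hashlib, hmac, secrets

def hash_answer(answer: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", answer.strip().lower().encode(), salt, 100_000)

# Registration time: patient stores a secret question/answer with the service.
salt = secrets.token_bytes(16)
stored_answer_hash = hash_answer("rex", salt)      # "What was your first pet's name?"

# Delivery time: a one-time 16-digit pass code is emailed to the patient.
pass_code = "".join(secrets.choice("0123456789") for _ in range(16))

def verify_retrieval(entered_code: str, entered_answer: str) -> bool:
    code_ok = hmac.compare_digest(entered_code, pass_code)
    answer_ok = hmac.compare_digest(hash_answer(entered_answer, salt), stored_answer_hash)
    return code_ok and answer_ok   # only then is the encrypted package released

print(verify_retrieval(pass_code, "Rex"))      # True
print(verify_retrieval(pass_code, "fluffy"))   # False
```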

This model is attractive to CDOs because no IT infrastructure setup or advanced training is required. Any provider/CDO staff can request a provider account, open a browser, and start sending health records securely in minutes. This decreases turnaround time for ROI requests, resulting in increased patient satisfaction. For the majority of providers that do not yet have EMRs, it is cost-effective when compared to the time and money spent making paper copies. Automatic audit reports for completed ROIs assist CDOs with Health Insurance Portability and Accountability Act (HIPAA) compliance.11

Often when a patient requests a ROI from a provider, it is specific to a referral. A limitation of the HePoEx model is that it only works if the referral provider agrees to look at the patient’s PHR. If the referral provider insists on paper copies because looking at clinical documentation in a PHR is too much of a disruption to their CDO workflow, then this model does not work. To turn the tables, if patients start taking their business to specialty providers that incorporate this type of business process, and away from those that do not, then CDOs would quickly adjust.


Secure Patient-Provider Communication

According to an August 2009 survey by Lightspeed Research, 1 in 2 Americans would like to be able to email their provider to ask questions and request prescription refills. Women are more likely than men to desire this online communication. In addition to the obvious convenience of saving time, 49% of patients desire written vs. verbal instructions from their provider in order to make it easier to comply with treatment recommendations. 49% of patients are also unwilling to pay for an email consult, which of course explains why integrated payer-provider CDOs like Kaiser are so far ahead of most other CDOs with third party payers in this regard.12

Although many providers are at first resistant to the idea of opening up their practice to email, after they try it, most find that responding to an email is easier than responding to a phone consult, because it is easier to budget their time. For example, if a provider has five minutes available before a scheduled office visit, they may well be able to respond to an email consult, but they would not pick up the phone and respond to a phone consult because they have no idea how long that phone call would take.

In order to comply with HIPAA requirements, it is much easier to enable email / online patient-provider communication in the context of a patient portal specifically designed to accommodate HIPAA. Providers that permit online communication with their patients armed only with an email account are opening themselves up to significant legal liability, because HIPAA requires that the clinical information used for diagnosis in emails be treated as any other clinical data. In the absence of a PHR tethered to an EMR that can handle HIPAA requirements, vendors such as Medfusion and RelayHealth® offer secure patient-provider communication solutions, including e-prescribing and other features. For practices without EMRs, they market themselves as a logical step toward an EMR.
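One practical consequence of treating messages as clinical data is that a portal stores them encrypted at rest and keeps an audit trail, rather than relying on ordinary email. The sketch below illustrates that idea under simplifying assumptions (in-memory storage, the third-party cryptography package’s Fernet recipe); it is not a compliance checklist, and none of the vendors named above are implied to work this way.

```python
# Simplified sketch: portal messages stored encrypted at rest with an
# audit trail, instead of ordinary email. Requires the third-party
# "cryptography" package (pip install cryptography). Illustrative only;
# real HIPAA compliance involves far more than shown here.
from dataclasses import dataclass
from datetime import datetime, timezone
from cryptography.fernet import Fernet

@dataclass
class PortalMessage:
    sender: str
    recipient: str
    ciphertext: bytes
    sent_at: datetime

class SecureMessageStore:
    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())  # in practice: managed key storage
        self._messages: list[PortalMessage] = []
        self.audit_log: list[str] = []

    def send(self, sender: str, recipient: str, body: str) -> None:
        msg = PortalMessage(sender, recipient,
                            self._fernet.encrypt(body.encode()),
                            datetime.now(timezone.utc))
        self._messages.append(msg)
        self.audit_log.append(f"{msg.sent_at.isoformat()} SEND {sender} -> {recipient}")

    def read_inbox(self, user: str) -> list[str]:
        self.audit_log.append(f"{datetime.now(timezone.utc).isoformat()} READ by {user}")
        return [self._fernet.decrypt(m.ciphertext).decode()
                for m in self._messages if m.recipient == user]

store = SecureMessageStore()
store.send("patient.jane", "dr.smith", "Is the new dose working as expected?")
print(store.read_inbox("dr.smith"))
print(store.audit_log)
```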

Hopefully, the medical home initiative and reform toward paying providers to manage conditions instead of visits will incent better and more widespread patient-provider communication in the not too distant future. The technology is ready as soon as the business model is ready.

Deliver Trusted Health Education Resources

Patients want information on their (and their loved ones’) health concerns. They want to stay up to date as medical science evolves and new treatments or medications related to those concerns are released. Providers want their patients to get trusted health information, so they do not have to waste valuable time in the exam room debunking fad treatments.

Additionally, whether or not patients understand anything about RSS (Really Simple Syndication), thanks to Facebook® and LinkedIn®, most are familiar with the concept of “subscribing” to topics / groups / organizations of interest to them and receiving updates in one centralized location. So it is not surprising that PHRs and third party websites are copying this model. To cite one example of many, MyDailyApple.com is a Google Health-integrated service that delivers news and search results tailored to specific health concerns.

In the idealized PHR, a patient subscribes to health topics / disease conditions of interest to them, and is delivered updated content in one centralized location (and/or via email). If they have a question, e.g. “is this new medication for my chronic disease appropriate in my case?” with one mouse click they initiate an email to their provider that references the content they just read. This streamlines the business process not only for the patient but also for the provider, because the provider can more quickly understand the clinical context of the request (as opposed to a query based on, “my brother told me he heard about some new drug on CNN”).
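As a concrete illustration of this subscription model, the short sketch below pulls a patient’s subscribed topics into one place using the third-party feedparser package (pip install feedparser). The feed URLs are placeholders, and the PHR integration itself is an assumption for illustration rather than a description of any existing product.

```python
# Sketch: aggregate updates for a patient's subscribed health topics into
# one place, in the spirit of the "subscribe and read centrally" model
# described above. Uses the third-party feedparser package
# (pip install feedparser). Feed URLs below are placeholders.
import feedparser

subscriptions = {
    "asthma": ["https://example.org/feeds/asthma.xml"],
    "type 2 diabetes": ["https://example.org/feeds/diabetes.xml"],
}

def latest_updates(subs: dict[str, list[str]], per_topic: int = 3) -> dict[str, list[str]]:
    updates = {}
    for topic, urls in subs.items():
        items = []
        for url in urls:
            feed = feedparser.parse(url)          # tolerant of unreachable feeds
            items.extend(f"{e.title} ({e.link})" for e in feed.entries)
        updates[topic] = items[:per_topic]
    return updates

if __name__ == "__main__":
    for topic, items in latest_updates(subscriptions).items():
        print(topic, "->", items or ["no new items"])
```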

Deliver Social Networking Functionality

“Our profession, at its core, is fundamentally flawed relative to how today’s world communicates and functions. The infrastructure of health care needs a total repair from the ground up. It needs to be Facebook-ed and Wiki-ed.” – Dr. Jay Parkinson, Hello Health™

This subject ties directly to and builds on the previous two topic areas. There is enormous potential in bringing social networking capabilities into the communication equation between patients and their providers, between patients and their family members, and between patients with similar health concerns.

Social networks offer CDOs a new tool to connect with healthcare consumers. As consumerism continues to take hold and consumer directed health plans become more commonplace, CDOs need to remain in constant communication with the patients they serve to keep those individuals loyal to their facility or practice. Social networks allow for multifaceted, regular communication with patients that assists in building and maintaining this necessary loyalty. As consumer control over the spending of their healthcare dollars increases, their behaviors will more closely mimic their behavior in purchasing other products and services.13

Hello Health™ is a new model of health care delivery that has been described as “part EMR, part practice-management system, and part social-networking site, complete with profiles and photos of doctors and patients, all in a secure environment that complies with federal privacy standards.” Patients pick a provider via their Facebook-like profile, and pay a monthly enrollment fee (usually on the order of $30-$35) that covers quick questions via email or IM, prescription refills, and viewing their medical records online. Some practices include simple lab tests and generic medications for acute conditions in the enrollment fee. Office visits are guaranteed within 24 hours if necessary, and patients can schedule an appointment themselves online by viewing open slots on their provider’s calendar. Five different modalities of outpatient visit are offered (with different prices set by each practice): house calls, office visit, video visit, phone visit, and IM visit. The Hello Health model does not include accepting insurance, but patients can of course submit office visit invoices for reimbursement if they have out-of-network coverage. Hello Health’s parent company, Myca, sells the Hello Health software platform for use at other practices that want to adopt its model.14

Another way of describing Hello Health is concierge health care meets social networking. A video on the Hello Health website states “The internet has made your life simple and convenient in every other area, so why not your health?” Whether or not Hello Health’s business model succeeds, they have certainly blazed new ground in the integration of social networking and health care delivery.

Another innovative example of healthcare social networking is Emota™ (Emota.net). A startup funded by the National Science Foundation, Emota is working on an “emotional networking” solution to help busy families and care professionals stay aware of, in touch with, and supportive of the elderly. The interface for the elderly is a touch screen, on dedicated hardware always connected to the internet. The presentation is like a “two-way interactive picture frame” that allows the elderly to communicate through touch-based video, voice, and text tools (without any technical skills). The interface for family members and friends is an iPhone app or a web widget that helps them stay aware of the elderly and provide timely emotional and physical support when required. The interface for caregivers is a dashboard that lets them monitor multiple patients and provide support as needed. The Emota technical architecture is built to allow rapid integration with PHRs, remote patient monitoring, and telehealth applications.15

An example of social networking among patients is patientslikeme® (patientslikeme.com). Their goal is to “enable people to share information that can improve the lives of patients diagnosed with life-changing diseases.” Patients with one of 19 (as of November 2009) serious conditions can create a profile, track their symptoms, find patients like themselves, and communicate via forum discussions or private messages. Profile charts let patients see how their treatment is affecting their health over time, and anonymized patient outcome data is used by providers, pharmaceutical companies, and research organizations to drive treatment research and improve medical care.

Deliver Information on Cost and Quality Measures

An important part of the equation for consumer-driven healthcare is the availability of cost and quality measures for providers and CDOs. The potentially life-altering (and certainly bank-account-altering) decision about whether to see the in-network surgeon recommended by your health plan or an out-of-network surgeon recommended by a friend is the kind of decision almost all patients face, and all patients wish they had better data to support their health care decisions. Is there outcomes-based data to support paying more for the out-of-network surgeon? How much more will the out-of-network surgeon cost? How do other patients rate both surgeons?

Angie’s List™ is a well-known consumer website with over a million paying members who rate all kinds of service providers, from painters to plumbers to heart surgeons. Ratings for doctors, dentists, hospitals, and health plans are an increasingly popular part of the website. Just like non-healthcare service providers, healthcare providers are given the opportunity to respond to negative reviews, but Angie’s List had to incorporate the complexity of a HIPAA waiver into the dispute resolution process. Categories of ratings for physicians include availability, office environment, punctuality, staff friendliness, bedside manner, communication, effectiveness of treatment, and billing and administration. Given the track record of Angie’s List, it is likely that healthcare ratings on Angie’s List will become an increasingly influential part of patients’ decision-making process when choosing providers and CDOs.

The problem with Angie’s List is that it includes no quality of care data. There have been many attempts by health plans to rate their physicians on quality of care using claims data, but this is very controversial because it is so difficult to do in a statistically credible way. It may be that valid physician quality ratings will have to wait until wider adoption of EMRs and payment reform makes the right kind and amount of clinical data available on all physicians.

The Informed Patient Institute, http://www.informedpatientinstitute.org, advocates for making more – and more useful – healthcare quality information available to patients. It does not rate providers and CDOs; instead, it “rates the rater” to inform patients which rating organization and mechanism is the best state by state. One of their favorites is the National Committee for Quality Assurance (NCQA), which rates health plans at http://www.ncqa.org in a very consumer-friendly format. NCQA takes a process and technology based approach to evaluating physicians, such as evaluating a provider’s ability to know and use patient histories, follow up with patients and other providers, manage patient populations, use evidence-based care, and employ electronic tools to prevent medical errors.

An idealized PHR could combine information on cost, clinical quality, subjective patient ratings, and process ratings from different sources into an aggregated and statistically valid report card. This vision is likely several years off.
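One simple way to read “aggregated” here is as a weighted combination of normalized measures with explicit handling of missing data. The sketch below is purely illustrative, with made-up weights and inputs; a statistically valid report card would also require risk adjustment and minimum sample sizes.

```python
# Illustrative only: combine cost, clinical quality, patient ratings, and
# process ratings into one provider report-card score. Weights and inputs
# are made up; a statistically valid method would also need risk adjustment
# and minimum sample sizes.
WEIGHTS = {"clinical_quality": 0.40, "patient_rating": 0.25,
           "process_rating": 0.20, "cost_value": 0.15}

def report_card(scores: dict) -> float:
    """Scores are expected on a 0-100 scale; missing measures are skipped
    and the remaining weights are re-normalized."""
    available = {k: w for k, w in WEIGHTS.items() if k in scores}
    if not available:
        raise ValueError("no measures available for this provider")
    total_weight = sum(available.values())
    return sum(scores[k] * w for k, w in available.items()) / total_weight

# Example: no process rating is available for this provider.
print(round(report_card({"clinical_quality": 82, "patient_rating": 90, "cost_value": 70}), 1))
```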

Delivering Specialized Third Party Applications

All kinds of “Health 2.0” web and mobile applications are being developed today to help patients manage chronic diseases, assist the healing/recovery process, manage fitness, diet, etc. Here is a small sampling:

ScanAvert® (ScanAvert.com) delivers instant personalized decision support in the supermarket. Patients register and establish a profile at the ScanAvert website, identifying any allergies, dietary preferences/avoidances, illnesses/conditions, prescriptions, etc. Then they use the camera on their mobile phone to scan bar codes on food items they are considering for purchase. The item’s ingredients are compared to the consumer’s profile. If a product’s composition is incompatible with a customer’s profile, an alert is generated identifying the substance(s), accompanied by proposed compatible substitutes.


This intriguing application would be even better if it were available as a PHR plug-in so that the patient’s diagnosis, allergy, and (changing) medication profile could be seamlessly imported into ScanAvert.
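The decision logic ScanAvert describes – compare a scanned item’s ingredients against a stored profile and propose substitutes – can be sketched in a few lines. Everything below (catalog, profile fields, function name) is hypothetical and only illustrates the kind of check involved, not ScanAvert’s actual implementation.

```python
# Hypothetical sketch of a ScanAvert-style check: compare a scanned item's
# ingredient list with a consumer's profile and suggest compatible
# substitutes. Product data and profile fields are made up.
CATALOG = {
    "012345678905": {"name": "Peanut Butter Granola", "ingredients": {"oats", "peanuts", "honey"}},
    "012345678912": {"name": "Oat & Honey Granola",   "ingredients": {"oats", "honey"}},
}

profile = {"avoid": {"peanuts", "gluten"}}   # allergies / dietary avoidances

def check_scan(barcode: str, profile: dict) -> str:
    item = CATALOG.get(barcode)
    if item is None:
        return "Unknown product"
    conflicts = item["ingredients"] & profile["avoid"]
    if not conflicts:
        return f"{item['name']}: compatible with your profile"
    substitutes = [p["name"] for code, p in CATALOG.items()
                   if code != barcode and not (p["ingredients"] & profile["avoid"])]
    return (f"{item['name']}: contains {', '.join(sorted(conflicts))}. "
            f"Try instead: {', '.join(substitutes) or 'no substitute found'}")

print(check_scan("012345678905", profile))
```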

AccessDNA® (AccessDNA.com) generates a personalized genetics report based on a survey taken by the patient. This report identifies which genetic tests might be worthwhile, lets patients compare thoughts on a discussion board, and allows patients to ask questions of a board certified genetic counselor.

TrialX® (TrialX.com) matches patients to relevant clinical trials based on their health conditions. It is available as a plug-in to both HealthVault and Google Health.

411fit™ (411fit.com) allows users to set weight loss and exercise goals, enter diet and exercise data, and receive daily feedback on their progress. What makes 411fit different from other diet websites is a very rich user interface and social networking features, which allow friends (or personal trainers) to monitor and encourage progress.

Summary

David Kibbe, MD, senior adviser to the American Academy of Family Physicians, believes that it will be PHRs that ultimately drive the sharing of data between RHIOs:

"We can't expect government to build a network. Government didn't build the Internet. Government didn't build PCs … It’s you and me."16

Summary points: First, there is a great deal of innovation going on right now that is dramatically increasing the ability of patients to manage their own care and better communicate with their providers. Second, the synergy between this innovation and the economic trends towards patient empowerment will ultimately become an unstoppable force driving towards the idealized PHR. Third, the idealized PHR (or something close) will deliver enough benefits to enough patients that it truly could become the “data broker” between RHIOs in the event the NHIN is unsuccessful.

1. Ted Eytan, “The Health 2.0 Definition: Not just the Latest, the Greatest!,” Ted Eytan Blog, 13 June 2008, http://www.tedeytan.com/2008/06/13/1089
2. National Alliance for Health Information Technology, Defining Key Health Information Technology Terms, 28 April 2008, http://www.nahit.org/images/pdfs/HITTermsFinalReport_051508.pdf
3. “Information Gap: Can Health Insurer Personal Health Records Meet Patients’ and Physicians’ Needs?,” Health Affairs 28, no. 2 (2009): 337-389
4. Vince Kuraitis, “Four Misconceptions About HealthVault and the Emerging Personal Health Information Ecosystem,” http://e-caremanagement.com/four-misconceptions-about-healthvault-and-the-emerging-personal-health-information-ecosystem-phi-ecosystem/
5. http://my-health-info.health.msn.com/MSN/Default.aspx
6. “Personal Health Management Systems: Applying the Full Power of Software to Improve the Quality and Efficiency of Care,” Health Affairs 28, no. 2 (2009): 390-392
7. “Healthcare At Your Fingertips,” Los Angeles Daily News, 28 October 2009, http://www.dailynews.com/lalife/ci_13661211
8. “If You Build It, Will They Come? The Kaiser Permanente Model of Online Health Care,” Health Affairs 28, no. 2 (2009): 334-344
9. “If You Build It, Will They Come? The Kaiser Permanente Model of Online Health Care,” Health Affairs 28, no. 2 (2009): 334-344
10. VUHID Benefits, http://gpii.info/benefits.php
11. http://hepoex.com
12. http://www.healthpopuli.com/2009/11/americans-want-to-use-email-for-all.html
13. Barry Chaiken, “Using Social Networking to Engage the Healthcare Consumer,” Evolvent Magazine, volume 1 (2009)
14. http://www.myca.com/applications/hello-health
15. http://www.emota.net
16. “Healthcare leaders favor personal networks to RHIOs for data exchange,” Healthcare IT News, 23 July 2008, http://www.healthcareitnews.com/news/healthcare-leaders-favor-personal-networks-rhios-data-exchange


Consumer Empowerment and Healthcare Innovation

For many, it is clear that the high cost of American healthcare continues to have a crippling effect on the competitiveness of U.S. businesses both at home and abroad. As a nation, we now spend almost 18% of our GDP on health care. In 1966, Medicare and Medicaid made up 1% of total Government spending; now that figure is 20% and rising rapidly. Already, the federal Government spends eight times as much on health care as it does on education, 12 times what it spends on food aid to children and families, 30 times what it spends on law enforcement, 78 times what it spends on land management and conservation, 87 times the spending on water supply, and 830 times the spending on energy conservation. The rapid increase in healthcare spending is reducing funds available for such public services as education, public safety, environment, and infrastructure.

As public attention on healthcare continues to escalate nationwide, elected officials are intensifying their efforts toward more effective Government incentives and economic reforms, investing a significant portion of the national treasury in innovative technologies designed to improve U.S. healthcare delivery. While these moves are well intentioned, many believe productive outcomes and lasting results will be short-lived due to an inherent reliance on Government appropriations.

The majority of Government funding is designed to incentivize more meaningful and efficient use of health information technology. According to a study by the eHealth Initiative, a Washington-based industry advocacy organization, “in a nation of 300 million residents living within 3.5 million square miles, there are 193 Health Information Exchanges (HIE) in various stages of development. By self-attestation, only 57 of these HIEs are operational. Most HIEs don't have a sustainable business model, and getting a critical mass of regional stakeholders to cooperate in exchanging their data remains an extremely difficult proposition.”


Recent grant appropriations via the ARRA (American Recovery and Reinvestment Act) have helped to accelerate the growth of HIE programs and networks.

“HIEs exchanging laboratory and radiology reports, and outpatient and emergency episodic data, increased significantly in the past year, according to the survey. And hospitals increasingly are realizing that better communication with physicians is good health care and good business, and HIEs can help them link with physicians.”1

HIE grants are a large component of the effort to bring collaboration to the management of healthcare information. They are viewed as a catalyst for a national infrastructure, a possible “Health Internet” for consumers. “A Health Internet would involve adoption of a standard service-oriented architecture, tailored to the health care environment, but similar to what's used on major online retail sites such as Amazon…A consumer placing a book order on Amazon, for instance, can track the status of the order and real-time shipping status via UPS from the Amazon site.”2

A growing number of healthcare professionals believe that a national information network, capable of handling clinical transactions, already exists in the financial and banking industries. Claims clearinghouses and other electronic data interchange vendors automatically transmit billions of claims and related financial/administrative transactions daily between U.S. provider and payer organizations.

One issue often considered a stumbling block is the privacy and security of personal healthcare information. While the generation of Facebook users readily accepts the reality of shared online information exchanges, other segments of the population are more wary of sharing their personal health information over the internet. Although there is a wide variance of opinions about the function of Government in national healthcare delivery, consumers appear to stand solidly behind the Government when it comes to privacy and security standards for healthcare information exchanges and transactions. According to participants in the 2009 Deloitte Survey, 38% are very concerned about these issues and “6 in 10 consumers feel that the Government should establish standards for how medical information is collected, stored, exchanged and protected.” Leaving the responsibility for privacy and security in the purview of the health plans was a distant second, supported by only 21% of the respondents.

Even with privacy and security issues in the forefront, the trend toward a more participatory “consumer-driven” care system, a term coined by Harvard Business School professor Regina Herzlinger, is sure to accelerate dramatically in the next few years. The younger generation’s predilection for texting, messaging and social networking and the aging population’s insistence on convenience, fitness and wellness will continue to drive an expansion of consumer-oriented healthcare options. The increasing longevity of the baby boomer population, combined with decreasing numbers in other consumer groups, sets the stage for increased healthcare consumerism.

Deloitte conducted a well-known study of consumer attitudes, behaviors, and unmet needs in terms of health, health care, and health insurance. This study revealed that “consumers embrace innovations that enhance self-care, convenience, personalization, and control of their personal health information: such as retail medicine, e-visits with physicians, personal health records, self-monitoring devices, personalized physician referrals, and custom insurance products. They are willing to try new services, change providers, and use their money in different ways to obtain better value. They are highly receptive to technology-enabled care that eliminates redundant paperwork, replaces unnecessary tests, and saves time and money.”

The Deloitte Study also found that most consumers, in “making purchasing decisions, rely on perceptions of service, quality and costs based on their personal experiences with doctors, hospitals, insurance companies and others.” However, it also confirmed that consumers’ use of more objective information is on the rise and that patient-consumers are moving away from making healthcare purchasing decisions based solely on personal perspectives.

Personalizing Healthcare Technology

Consumers understand that health insurance does not necessarily equate to healthcare and that traditional medical services are only one component of their overall state of health. An increasing number of consumers now consider factors such as nutrition, fitness, and mental health to be as much a part of their overall health management system as routine check-ups, health screenings, and medical treatment.

They are seeking healthcare and technology providers that recognize other factors, beyond medical care, that contribute to wellness and quality of life. Because time and distance are major considerations in today’s society, convenience is also a key factor for the majority of consumers. As a result, people are now more frequently turning to the internet in search of direct, online resources that offer value and support beyond the customary insurer-sponsored medical services. Websites such as WebMD® have successfully capitalized on this trend while technology giants such as Microsoft and Google have jumped into the health consumer fray and are offering personalized healthcare management solutions for their customers.

While the public healthcare debate continues, more and more average Americans are beginning to understand that their greatest opportunity for improved healthcare availability and affordability will be derived from their individual and collective influence as consumers of healthcare goods and services. New internet-based healthcare delivery models are offering this growing population of patient-consumers the tools they need to purchase and receive their healthcare in a way that aligns more closely with their individual lifestyles and health concerns, all without the direct intervention of public or private healthcare insurance organizations.


Consumer interest in online healthcare information and services is strong. The 2009 Deloitte survey cited the following statistics:

• 60% of consumers say they looked online for information about treatment options in the past year

• 27% say they looked online for information about the quality of care provided by specific doctors within the last 12 months

• 13% say they looked online for cost information within the last 12 months

• 57% want a secure Internet site that would enable them to access their medical records, schedule office visits, refill prescriptions and pay medical bills

• 42% want access to an online personal health record connected to their doctor’s office

• 55% want to be able to communicate with their doctor via email to exchange health information and get answers to questions

• 9% have a computerized personal health record (PHR)

As the public embraces emerging technology-based healthcare services, demand will only rise for online tools that enable the average patient-consumer to perform basic healthcare transactions. The 2009 Deloitte survey shows that patient-consumers are highly receptive to the following three technology-enabled healthcare offerings:

• Online interactive tool to assist in making decisions about insurance alternatives and options, especially in terms of costs

• A secure internet site that enables consumers to access medical records, schedule office visits, refill prescriptions, and pay medical bills

• The capability to securely communicate with a healthcare professional via email or virtual consultation to get answers to questions, as well as to create and maintain personal health plans

To ensure that consumer dollars for healthcare goods and services are spent more wisely, Government policymakers and industry leaders should aggressively consider the tools and resources to be made available for self-care, provider selection, and health plan choices. Additional consumer-oriented measures that increase transparency in pricing and quality should be considered as well.

The Challenge: Channeling Information and Technology

“Technology has transformed much of our daily lives, in almost all cases by adding quantity, speed and quality while lowering costs. So why is healthcare different?” Aligning rising consumer interest in self-directed care with traditional methods of healthcare delivery is the logical next step, but also the greatest challenge for healthcare and technology provider organizations. Adequate healthcare information technologies are currently available for market consumption. However, due to entrenched medical practices and misaligned economic incentives, effective methods for translating these resources into improved healthcare have not been realized. These obstacles to change include the healthcare insurance industry’s resistance to reform, the HIT industry’s reluctance to genuinely productize standards, the stagnation and politicization of Government health programs, and the lack of sustainable economic models to support these services on a long-term basis.

Most of the current programs offering innovative solutions to channeling and exchanging healthcare information are reliant on grant funding. While grant funding covers the initial costs of developing and piloting disruptive or innovative healthcare information technologies, it does eventually end. When the grant well runs dry with no viable revenue streams to sustain the services or products developed, these initiatives wither slowly on the vine. And so the cycle continues – except in those rare instances where local cooperative or collaborative ventures develop models that utilize the right technology applications for clinical use cases that have clearly understood and recognized benefits for the majority of participating healthcare stakeholders and benefactors.

There is another interesting and fairly universal dilemma associated with the channeling of information and technology to improve healthcare quality and delivery. It is the differing “lens” through which various players in the healthcare industry tend to view and perceive the utilization of information technologies within their professional domains. Healthcare providers, for example, tend to view quality in a technical and clinical sense. They are most concerned with accuracy of diagnosis, appropriateness of therapy, and measurable health outcomes. Their medical training and the pressures of U.S. healthcare law have led them to focus on high-quality treatment outcomes and practices that minimize the risks of medical errors and mistakes. Healthcare payer organizations, however, tend to focus first and foremost on costs. Payer organizations benefit by managing and cutting costs, increasing margins per beneficiary, and raising profits. They will typically focus on issues such as reducing duplicative tests, which saves time and increases access to and utilization of the system.

Then there is the patient – also commonly referred to as the “beneficiary” in DoD and federal healthcare environments. The patient is the patient, and the patient is the beneficiary of the program. But the patient is also becoming the most important consumer of healthcare goods and services. The speed with which this will occur is dependent on how quickly patients recognize and are incentivized (by external forces) to inform themselves and move from a general state of unawareness and apathy regarding their health to a state of being well-informed and actively engaged in managing their own personal health situations.

Whether obtained from the internet or in person, patients desire easy-to-access and easy-to-understand information about their personal healthcare. They expect clear and compassionate communications as well as clinical acumen and skill from their providers. Unlike many in the healthcare industry and medical profession, average patient-consumers do not yet fully understand the benefits that will result from their providers having access to the information that a truly integrated electronic health record can provide. They are only just beginning to be exposed to the reality that appropriately channeled information can generate significant, even life-saving benefits for everyone. If in doubt, just consider the emerging public realization that technology can reduce the number of patient safety errors and omissions and decrease the number of duplicative and painful tests that are time consuming and, in far too many cases, extremely detrimental to a patient’s state of health.

When all is said and done, patients have to be the key consideration, but right now they are not. They are relatively uninformed and, unfortunately, confused. They are dissatisfied with the time and effort required to handle unclear and overly complex paperwork and with the unreasonably high costs they see on the medical bills submitted to their insurance companies for payment. They know they want better, simpler, and more personalized healthcare, but the industry has done a poor job of educating them on what that really means for the average consumer of healthcare services.

Implications for the HIT Industry

In today’s market, there are numerous HIT products and solutions that enable healthcare organizations to share healthcare information when and where needed to meet patient-consumer demands and provider expectations. The market is characterized by multiple initiatives and multiple mandates with few clearly defined paths. These conditions, when combined with intense public interest and political pressure, rising costs and the availability of large amounts of Government funding, create an environment that does not easily lend itself to the meaningful and targeted channeling of information and technology to improve the quality of healthcare.

For decades, Government and industry officials have been debating and discussing the multitude of issues associated with U.S. healthcare. An embarrassingly large portion of these issues remains essentially unresolved today, and there is relatively little agreement on the right answers to the problems that both Republican and Democratic administrations have failed to address successfully. One thing is clear, however. Throwing technology at the problem is not the answer. Perhaps the answer lies in more targeted and collaborative public-private efforts to figure out which problems we are really trying to solve for the healthcare consumer and which technologies are most appropriate for achieving those goals.

Many believe that the most successful businesses will be the ones that most quickly and accurately determine what information is needed, when and where it should be made available, and by whom it should be utilized. Historically, commercial HIT concerns have thrived on the significant profits made from closed or uniquely designed proprietary systems. These businesses have sustained their relatively large profit margins by collectively and individually capitalizing on the inherent lack of cooperation and information sharing between federal, state and local healthcare agency customers. This dynamic provided the HIT industry with the capacity to delay the commoditization of their HIT products and product suites. Most large commercial HIT entities settled into this comfortable pattern for the long haul and managed to grow or sustain their profit base over a prolonged period of time. An increasing number of HIT professionals now believe that the new culture of open source technologies, enterprise web services, and service oriented architectures will break this vicious cycle to finally pave the way for real and meaningful interoperability of healthcare information, data, and services.

Given these circumstances and dynamics, there are crucial questions for HIT companies to consider from the 2009 Deloitte Survey:

• Can HIT companies leverage clinical and administrative health applications to provide personal technology solutions?

• What direct-to-consumer market opportunities exist for medical device and monitoring companies?

• What innovative channels offer increased access to consumer markets?

• What is the optimal marketing strategy? What is the right balance of push through health professionals and pull through direct-to-consumer channels?


As HIT companies develop and evaluate products and services that will generate the most substantial returns, it will be critical that they analyze and understand the societal, cultural, and generational attributes of the consumers that will purchase their specific healthcare goods and services.

Implications for Healthcare Providers

While nothing will ever replace the vital personal physician-patient relationship, the consumer will gradually emerge as the most powerful player in shaping the healthcare services of the future. As consumerism continues to take hold and personalized healthcare becomes more commonplace, organizations and providers will be required to leverage internet technologies for patient communications. More and more, consumers will begin to expect and demand online provider-patient communication, and providers and organizations will be required to offer these services to sustain patient and consumer loyalty.

Shared Decision Making and Responsibility

“I’m a big believer that the most powerful player in understanding and managing costs is going to be the individual consumer.” Michael McCallister, CEO of Humana

Entrepreneurial and traditional healthcare deliverers alike will discover that shared decision-making responsibility is emerging as the new norm. Internet technologies and online offerings will continue to drive new personalized healthcare models that require a greater degree of shared decision-making between patients, providers, and suppliers.

Industry experts believe that the personalization of care will accelerate consumer adoption of increased cost sharing between patients and providers, which, in turn, will generate more affordable healthcare options for all. In other words, the advent of cost-sharing models like the new defined-contribution healthcare plans will also result in higher quality and lower costs. By selectively “packaging” and offering healthcare purchase options that patient-consumers can readily tailor to their own unique health conditions, the patient can receive a higher quality of healthcare and institutions can lower their cost of care.

Implications for the Healthcare Market

At issue is the market reality that large institutions are generally not as well positioned to take advantage of the change required by disruptive technologies. Clayton Christensen, in “The Innovator’s Dilemma,” writes that “to maintain their share prices and create internal opportunities for employees to extend the scope of their responsibilities, successful companies need to continue to grow. But while a $40 million company needs to find just $8 million in revenues to grow at 20 percent in the subsequent year, a $4 billion company needs to find $800 million in new sales. No new markets are that large. As a consequence the larger and more successful an organization becomes, the weaker the argument that emerging markets can remain useful engines for growth.”

Historically, the greatest chance for commercial success lies in finding a new market that places great value on the attributes of the disruptive technologies being introduced. Larger established organizations can reach new markets by developing smaller, more agile autonomous business units that hold intellectual property associated with a successful disruptive technology. While these smaller units or business entities often build and adapt effective home-grown solutions at low cost, their innovations are usually neither scalable nor built to standards that allow for larger, cost-effective integration of systems. This creates opportunities for larger, more established firms to seize strong positions in the new markets enabled by disruptive technologies by giving responsibility for commercializing the disruptive technology to an organization whose size matches the size of the targeted market.3

This phenomenon is playing out in many of the world’s largest healthcare insurer organizations, which are diversifying and developing new portfolios and products through acquisitions. UnitedHealth Group is a prime example. In recent years, this global healthcare insurer purchased multiple niche healthcare product and service vendors to enhance its consumer service offerings. A major pitfall that often occurs in these transitions, however, is that these larger organizations frequently fail to effectively integrate the new acquisitions into their core business and subsequently continue to offer disparate and disjointed solutions.

Past efforts indicate that new technology is more successfully introduced independent of traditional delivery methods. The higher margins demanded by investors in large institutions create a vacuum at low price points. Competitors with disruptive technologies are prepared to enter this void. “Affordable care does not come by expecting the high-end expensive institutions and care-givers to become cheap, but by bringing technology to lower-cost providers and venues of care so that they can become more capable.”4

Remote Monitoring/Telehealth

“The issue we have is not the development of new technology or more information…The challenge is going to be how to channel that information and technology to provide improved healthcare. How do you get that to market…and what are the right applications?” Jonathan Linkous, CEO of the American Telemedicine Association (ATA)

One instance of this shared responsibility that is already being adopted is remote monitoring, thanks to the economic savings and convenience it creates for both provider and consumer. While it relies on technology, it also maintains constant communication between provider and patient, creating a loyalty beneficial to both parties.

In some cases, three-quarters of those surveyed were enthusiastic about devices that enabled contact with medical professionals from the comfort of their home. Remote monitoring is especially of interest to seniors and others with limited mobility. The 2009 Deloitte Survey indicated:


• “68% are interested in home/remote monitoring devices that enable self-monitoring of their condition and electronic reporting of results to their physician; seniors (78%) and consumers with a chronic condition (75%) express highest interest” (Deloitte Survey 2009).

• “64% are interested in home/remote devices that include prompts and reminders to improve adherence to a health improvement or treatment plan; interest is especially high among consumers with chronic conditions (71%) and ranges from 51% of Gen Y consumers to 76% of seniors” (Deloitte Survey 2009).

• “47% support allowing nurses to diagnose problems and administer care for uncomplicated conditions (21% oppose)” (Deloitte Survey 2009).

Implications for Ancillary Care

The economic viability of remote monitoring systems for ancillary care has led independent companies to take market share away from traditional home healthcare deliverers. Recent studies show that technologies that monitor and assess consumers’ health remotely and electronically transmit results will significantly improve current preventive and medical care services. Especially promising are the “first-mover” advantages that home and remote monitoring and assessment technologies are creating in the market. As these products and technologies become cheaper to produce and deliver, we will see the gradual erosion of the traditional barriers of regulation, reimbursement, licensure, provider reluctance, and isolated systems development. As a general rule, products based on disruptive technologies are cheaper, simpler, smaller, and more convenient to use.

Health applications are even appearing on popular handheld communication devices.

• “Remote monitoring is going to change an incredible amount over the next couple of years because the technology has become so cheap. Remote monitoring companies, the independent diagnostic labs, are getting into the business and they represent a real challenge to traditional visiting nurse and home healthcare agencies, as they find some of their market share being picked away by these large companies that are doing remote monitoring using telemedicine and automated systems.” Telethinking with Jonathan Linkous, CEO of ATA

• “We are starting to see applications like the iPhone using a lot of healthcare applications and consumer-based technology” Jonathan Linkous, CEO of ATA

1 Joseph Goedert, Health Data Management Magazine, December 2009
2 Ibid.
3 Christensen, Clayton. The Innovator’s Dilemma. New York, NY: HarperCollins Publishers Inc., 2000.
4 Garrettson, Jim. “Disruptive Healthcare: Clayton Christensen’s Prescription”. Gov Con Executive. October 2009.


Knowledge Management in Healthcare

The increased complexity of health care requires a deepened commitment by all stakeholders to develop a healthcare system engaged in producing the kinds of evidence needed at the point of care for the treatment of individual patients.1 Institute of Medicine Roundtable on Evidence Based Medicine

More than 50 percent of Medicare spending is used to buy visits to physicians, diagnostic tests, and hospitalizations, mostly for patients with chronic illness. The conditions that generate the most spending are congestive heart failure, chronic lung disease, and cancer. Our research shows there are wide variations in what Medicare spends for services to treat chronically ill patients and that higher spending does not achieve better outcomes.2 Dartmouth Atlas Project

Across the core report measures tracked in the 2008 National Healthcare Quality Report, the median level of receipt of needed care was 59%. We can and should do better.3

Forty percent of the time, a patient in the United States who walks into a hospital or exam room does not get the treatment recommended by evidence based medical science. There is clearly a need for better Knowledge Management (KM) in healthcare. In a recent report from the National Research Council (NRC), the modern healthcare industry is described as “…an information and knowledge intensive enterprise.”4 How well does the healthcare industry transform data and information into actionable knowledge? The answer is generally acknowledged to be “poorly,” due to all the data trapped in paper records, the constantly expanding body of medical knowledge that no physician can possibly keep track of without effective Clinical Decision Support (CDS), and the lack of incentives to more effectively share knowledge.

We are confronted with tough questions: How can better data from EMRs be exploited to advance and accelerate clinical research? How can we increase the velocity and accuracy of evidence based validation (or invalidation) of new treatments as superior to the status quo? How can new evidence based treatment recommendations be most effectively communicated to clinicians at the point of care? How can technology be leveraged to provide CMS (and all payers) the best data to assist them in making the best decisions about which treatments, in which conditions, can be cut without a significant adverse effect on outcomes? How can clinicians and administrators leverage traditional KM and Web 2.0 collaborative technologies to effectively share best and worst practices, within and across organizational boundaries?

The answers lie within a tangled web of politics, policy, funding, people, process, and technology. Our scope here is to discuss the technology and touch upon policy, process, and people, so the conversation is guaranteed to be incomplete. Nevertheless, it is a fascinating topic that has great potential to advance medicine in many ways.

KM in Healthcare is Different

“Traditional KM” is all about sharing knowledge as widely and efficiently as possible. While there may be intellectual property or safety issues to consider before blasting new knowledge to the world, generally speaking, in other industries KM is about collecting and sharing all relevant knowledge without an extensive validation process.

However, if an Endocrinologist develops an innovative new way to treat diabetes, we don’t want her to post a best practice document on the internet and have every physician in the world immediately adopt the new practice. Instead we want to see her methodology published in a peer reviewed journal, and we want to see multiple clinical trials done with valid experimental models that clearly demonstrate her new method is an improvement over the old method. Then we want every Family Practice and Internal Medicine provider to adopt this new practice. Moreover, very specifically, we don’t want it to take 5-10 years for this new medical science to become widespread in medical practice (as unfortunately happens with much new medical science). We want to deliver this new knowledge at the point of care, at the right time, in the right way, seamlessly baked into clinical workflow. So we need to carefully manage both the research end of this process, and the CDS end of this process.

On the other hand, administrative and business process best practices in healthcare should be treated like administrative and business process best practices in any other industry. As a result, in healthcare we have two completely different KM models, clinical and non-clinical. Because the disciplines and technology within clinical KM are very different between the knowledge gathering (research) and knowledge dissemination (CDS) phases, it makes sense to address them separately. We have three modes of KM within healthcare to consider:

1. Clinical Research
2. Clinical Decision Support
3. Non-Clinical or Traditional KM

Clinical Research

The traditional model of clinical research involves a clinician, armed with a research hypothesis, writing up a proposal and submitting it to an Institutional Review Board (IRB). After approval and funding, a long and tedious process is involved in identifying research subjects. Then depending on the study, a very labor intensive (and therefore costly) process is involved in acquiring data, often including an extensive manual review of paper medical records. Sometimes months or years can be eaten up by writing a research proposal, getting it approved and funded, and searching for research subjects, only to find out that not enough research subjects can be found that meet the research criteria. The worst part is the research that is not completed, and the medical breakthroughs that are not discovered, and the lives that are not saved, because this process is so hard and so expensive that many good research hypotheses never get off the ground.


There is a better way. Now that large CDOs like DoD, the VA, Kaiser, and many Academic Medical Centers (AMCs) have had EMRs in place for a few years, there exists a steadily growing mountain of clinical data on tens of millions of patients. There are many holes in the data, and depending on the research need, some required data is still on paper or stove-piped away in difficult-to-access specialty systems. Also, the EMR data must be exported to a data warehouse to permit mining the data without shutting down the EMR, and data warehouse projects are expensive. Nevertheless, these clinical data warehouses now exist or are soon coming online at large innovative CDOs, and the answers to thousands of unasked research questions are now orders of magnitude easier to discover.

Gartner Research healthcare analyst Vi Shaffer has coined the term Advanced Clinical Research Information System (ACRIS) to describe the kind of system that will be required to manage clinical research in this new world order. ACRIS is:

… a complex constellation of capabilities that can rapidly assemble data assets for research questions, and provide data mining and research processes support to meet the needs of clinical and translational research and related biostatistics and biocomputation …. The Enterprise Data Warehouse … may be shared between the ACRIS and an enterprise business intelligence system that assembles data from some of the same sources but for the purposes of performance management. However, the requirements for clinical research are very different from — and even more complex than — the requirements for business intelligence.

An ACRIS includes tools that enable:

• Mining of patient data, including that contained in transcribed and other unstructured reports.

• Automatic correlation of data with medical knowledge in published research, providing more effective/efficient secondary research.

• Use of external data and open-source tools, including assistance in translating between ACRIS data models and vocabularies, and those of other institutions, for collaborative research.

• Cohort identification.

• Creation of research study data marts from enterprise and other clinical trial data.

• Facilitation of researcher workflows, including support of the scientific method, grant preparation, internal/external collaboration, and documentation.5

Shaffer points out that the need for ACRIS capabilities is being accelerated by the rapidly advancing interest in genomics and translational research, as well as by the fact that the funders of research are coming to expect the research process velocity and discipline that an ACRIS can deliver.

This all closely mirrors the goal of the National Institutes of Health (NIH) in funding the Clinical and Translational Science Awards (CTSA) program. This consortium includes 46 medical research institutions located across the nation. When fully implemented by 2012, about 60 institutions will be linked together with the goal of collaborating on and energizing the discipline of clinical and translational science. Drawing from the NIH Roadmap for Medical Research and extensive community input, the CTSA program creates an academic home for clinical and translational research. The CTSA program vision statement is to: “Improve human health by transforming the research and training environment to enhance the efficiency and quality of clinical and translational research.”6

Of course, the three modes of KM in healthcare are not mutually exclusive. Even if the subject is clinically focused, Web 2.0 tools can and should be used to collaborate on the business processes surrounding clinical subjects. A great example of this is the CTSA wiki used by consortium members to collaborate and provide value above and beyond conventional collaborative tools. Through an open, community-oriented Web environment, the wiki allows users to actively contribute and modify content in real time. It allows consortium members to collaborate on projects, services, news, program vision, emerging technology, documentation and more.

Let’s briefly examine the experience of one of the CTSA facilities, The Ohio State University Center for Clinical and Translational Science. Vi Shaffer at Gartner feels that OSU is a leader in understanding the critical link between operational medicine and clinical research. They ensure epidemiology expertise is embedded throughout the entire clinical research process. They are focused on developing translational research information architects, to ensure that the data collected during research can be efficiently and effectively “translated” into operational medicine as soon as it is clinically responsible to do so.

OSU had an early start with clinical data warehousing, and the lessons they have learned in the last ten years from the blood, sweat, and tears associated with pioneering this work are now baked into their clinical data warehouse data model. The data model and associated business logic surrounding the clinical research process is a goldmine for other CDOs wanting to copy their success. And copying their success is critical if we want to expand translational research and empower thousands of researchers with all the clinical data that EMRs will produce in the next 5-10 years. Vi Shaffer feels it is important to replicate the success of larger Academic Medical Centers (AMCs) like OSU at second tier AMCs. She states that to do this the market desperately needs accelerators, and the acceleration required is part technical, part change management, and part research operations.7

OSU Medical Center CIO Herb Smaltz has leveraged the expertise on his team and created a spinoff organization aimed at exactly this vision. Healthcare Data Works offers software and consulting to support replicating the OSU data model, various pre-built dashboards aimed at both clinical and administrative domains, and most relevant to our discussion, an application named Advanced Screening for Active Protocols (ASAP).


ASAP leverages the OSU clinical data warehouse model to screen patients against research protocols. Patient recruitment is crucial to the success of clinical trials; however, as discussed above, manually screening patients for the trials is a laborious and time consuming process. By automating the screening criteria provided by the protocol, ASAP increases the efficiency of these processes, making it an effective accelerator for clinical research. Patients can be screened based upon structured data elements as well as the contents of clinical narrative text. Data elements are organized into six query templates (demographics, diagnosis, laboratory tests, procedures, medications, and text reports) and are presented in a familiar eligibility-checklist format. The query templates are flexible enough to express very complex criteria. Results from the query can be drilled down to view the entire medical history of the patient, from appointment details to lab values, diagnoses, and procedures, to free text in the dictated reports.
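
As a rough illustration of this kind of automated protocol screening (not the actual ASAP product, whose internals are not described here), the sketch below expresses a simple eligibility checklist as structured criteria and runs it against a hypothetical, simplified warehouse extract of patient records. All field names, codes, and thresholds are assumptions invented for the example.

    # Hypothetical sketch: screening a simplified patient table against
    # structured eligibility criteria, in the spirit of an ASAP-style tool.
    # All field names and thresholds are illustrative assumptions.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Patient:
        patient_id: str
        age: int
        diagnoses: List[str] = field(default_factory=list)    # ICD-10-style codes
        medications: List[str] = field(default_factory=list)
        last_hba1c: Optional[float] = None                     # most recent lab value

    def eligible(p: Patient) -> bool:
        """Toy protocol: adults 18-75 with a type 2 diabetes code (E11.*),
        an HbA1c of at least 8.0, and no current insulin therapy."""
        return (
            18 <= p.age <= 75
            and any(code.startswith("E11") for code in p.diagnoses)
            and p.last_hba1c is not None and p.last_hba1c >= 8.0
            and not any("insulin" in m.lower() for m in p.medications)
        )

    # A toy "warehouse extract" standing in for a real clinical data warehouse.
    patients = [
        Patient("A001", 62, ["E11.9", "I10"], ["metformin"], 8.4),
        Patient("A002", 45, ["E11.9"], ["insulin glargine"], 9.1),
        Patient("A003", 80, ["E11.9"], ["metformin"], 8.9),
    ]

    candidates = [p.patient_id for p in patients if eligible(p)]
    print("Candidates for recruitment screening:", candidates)   # ['A001']

A production tool would of course run such criteria against millions of warehouse records, including terms mined from narrative text, but the basic shape of the problem is the same: turn a protocol checklist into executable logic.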

Another aspect of the future of KM in clinical research is the ability to conduct computer simulations of healthcare interventions. The Archimedes Model is a full-scale simulation model of human physiology, diseases, behaviors, interventions, and healthcare systems. By using advanced methods of mathematics, computing, and data systems, the model enables managers, administrators, and policymakers to be better informed and to make smarter decisions than has previously been possible. The core of the model is hundreds of equations that represent human physiology and the effects of disease. Attached to these are hundreds more equations and algorithms that realistically simulate the healthcare system including processes such as tests, treatments, admissions, and physician behaviors. Together with population data, the equations are integrated into a single, large-scale simulation model that accurately represents what happens to real people in real healthcare systems.8
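
The Archimedes Model itself is proprietary and vastly more sophisticated, but a deliberately toy sketch can convey the basic idea of equation-driven simulation: a simple equation advances each simulated person’s risk year by year, and an “intervention” modifies that equation, so event rates can be compared across large simulated populations. Every variable name, rate, and threshold below is invented purely for illustration and has no clinical meaning.

    # Toy illustration of equation-driven health simulation (NOT the Archimedes Model).
    # Each simulated person has a single risk variable that drifts upward each year;
    # an intervention slows that drift. All numbers are invented for illustration.

    import random

    def simulate_person(years: int, treated: bool, rng: random.Random) -> bool:
        """Return True if the simulated person has an 'event' within the horizon."""
        risk = rng.uniform(0.05, 0.15)          # baseline annual event probability
        drift = 0.004 if treated else 0.010     # the intervention slows risk progression
        for _ in range(years):
            if rng.random() < risk:
                return True
            risk += drift                       # a simple stand-in for a physiology equation
        return False

    def trial(n: int, years: int, treated: bool, seed: int) -> float:
        rng = random.Random(seed)
        events = sum(simulate_person(years, treated, rng) for _ in range(n))
        return events / n

    print("10-year event rate, untreated:", trial(50_000, 10, False, seed=1))
    print("10-year event rate, treated:  ", trial(50_000, 10, True, seed=2))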

The Archimedes Model has been able to recreate studies done with “real humans” with remarkable accuracy. Critics are concerned that the model is trying to use algorithms to define connections that even medical science does not understand. Archimedes failed to predict that raising “good cholesterol” would actually lead to an increase in heart disease (as did the rest of medical science; any model can only be as good as the science used to build it). Critics are also concerned that Archimedes’ creator, Dr. David Eddy, is keeping the model shrouded in secrecy and not providing the kind of access that other clinical researchers need in order to validate the methodology and understand how all 10,000 of the variables and equations interact.

Dr. Eddy states, “Clinicians tend not to trust models, which is understandable, since many models are junk and it can be difficult to tell which is which.” He goes on to point out that when the model fails, like in the cholesterol study, this is a good thing, because it can help pinpoint holes in medical science that need to be filled in. When they are filled in, this new information becomes another variable or equation in the model and the predictive power of the model becomes that much stronger.

Proponents of Archimedes point out that the model would never be used independently to recommend a new intervention. If the model pointed to a particular treatment or intervention as more efficacious than the status quo, then clinical trials would always be performed to confirm the treatment worked on carbon-based human beings instead of software model-based human beings. The key point is that Archimedes could be used as an accelerator to pinpoint treatment modalities that could then be confirmed with clinical trials, as opposed to “fishing” with clinical trials that can drag on for years and decades.

Unlike clinical trials, Archimedes can easily run a wide variety of modifications to the experimental hypothesis. Unlike clinical trials, Archimedes can conduct experiments that could not be done on humans for ethical reasons. It can model factors like arterial plaque that cannot easily be measured with diagnostic devices. Because it can run 16,500 person-years of simulation per minute, it can conduct large-scale studies with so many required variables that it would be astronomically expensive (and take decades) to complete them with clinical trials.

Dr. Eddy is currently testing a “personalized” Archimedes – the ability to input individual patient data from an EMR into the model in order to run personally tailored (and therefore more accurate) simulations of different treatment options. This would allow a provider to have a much more informed conversation with their patient about which treatment option would be best for them.

Last but not least, Dr. Eddy’s team is working on ARCHeS (ARChimedes Health care Simulator), an online interface that will allow physicians and researchers to access Archimedes and design their own trials.9

Hopefully the future of clinical research looks something like this: Physicians, epidemiologists, and other researchers at AMCs across the country are able to access mature ACRIS-type systems to mine mountains of clinical data, discover new relationships among the data, and propose new hypotheses on better treatments and interventions. They log on to ARCHeS and run these studies on a massive Archimedes platform built to support a large volume of analyses. If confirmed on Archimedes, an ASAP-like application is used to streamline the process of identifying patients that qualify for a clinical trial to confirm the hypothesis. As mentioned in chapter two, PHR plug-ins like TrialX.com could be used by patients to volunteer themselves for clinical trials even if their clinical data is not accessible to the AMC conducting the study. Clinical trials are conducted and repeated to the extent required to satisfy the medical community that the proposed new treatment or intervention is in fact an improvement. A process along these lines would greatly accelerate the advancement of medical science.

Clinical Decision Support

As we have just seen, the intelligent application of information technology in clinical research will be able to significantly expand the volume and quality of new medical science produced. That’s the good news. The challenge now becomes: How are we going to effectively convey all this new knowledge to clinicians in ways they can most effectively use it?

The answer is evidence based Clinical Decision Support (CDS) that is baked into the EMR clinical workflow process, always available but minimally obtrusive to the provider. Ideally, like the Archimedes model just discussed, it is informed by data specific to the patient being treated, so that treatment recommendations can be tailored to the patient to the extent possible. CDS encompasses a variety of tools and interventions such as computerized alerts and reminders, clinical guidelines, order sets, patient data reports and dashboards, documentation templates, diagnostic support, and clinical workflow tools.
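
To make the idea concrete, here is a minimal sketch of one of the simplest interventions just listed, a patient-specific reminder. It is not drawn from any particular EMR product; the patient fields, the rule, and the 180-day threshold are assumptions chosen purely for illustration.

    # Minimal sketch of a patient-specific CDS reminder (illustrative only).
    # Rule: a diabetic patient whose last HbA1c result is more than 180 days old
    # should trigger a reminder at the point of care. Field names are assumed.

    from datetime import date, timedelta
    from typing import Optional

    def overdue_hba1c_reminder(patient: dict, today: date) -> Optional[str]:
        """Return a reminder message if the rule fires, otherwise None."""
        has_diabetes = any(code.startswith("E11") for code in patient["diagnoses"])
        last_test = patient.get("last_hba1c_date")
        if has_diabetes and (last_test is None or today - last_test > timedelta(days=180)):
            return "Reminder: HbA1c is due for this patient (last result is more than 6 months old)."
        return None

    example_patient = {
        "diagnoses": ["E11.9", "I10"],
        "last_hba1c_date": date(2009, 2, 1),
    }

    message = overdue_hba1c_reminder(example_patient, today=date(2009, 12, 1))
    if message:
        print(message)

Real CDS engines layer hundreds of such rules, prioritize and suppress them to avoid alert fatigue, and surface the results inside the clinician’s workflow, which is exactly where the challenges discussed below arise.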

Because CDS is optimized in the context of EMR workflows, it cannot progress any faster than EMR adoption. Some CDOs are beginning to have some success with CDS, but many others have had limited success or have not even begun in earnest. Success is difficult to replicate because of the complexity of clinical decision making, the technical complexity of effectively delivering CDS, and social aspects of incorporating changes into clinical practice that are very different at different CDOs.10

A 2008 Journal of Biomedical Informatics article titled “Grand Challenges in Clinical Decision Support,” authored by a who’s-who of CDS experts, is an excellent roadmap for the technical and process interventions that are required in order to refine and accelerate CDS efforts. Using an iterative, consensus-building process, they have identified a top 10 list of challenges facing CDS. A summarization of this list will make an excellent cornerstone for our CDS discussion:

Improve the human-computer interface: CDS should unobtrusively, but effectively, remind clinicians of things they have truly overlooked and support corrections, or better yet, put key pieces of data and knowledge seamlessly into the context of the workflow or clinical decision-making process.


Summarize patient-level information: The challenge is to intelligently and automatically summarize all of a patient’s electronically available clinical data, both free text and coded, and to create one or more brief (1-2 page) synopses of the patient’s pertinent clinical data.

Prioritize and filter recommendations to the user: The challenge is to rank in priority order, and reduce the number of computer-generated recommendations that a clinician has to deal with to a manageable number, thus reducing the “alert-fatigue” that is a frequent cause of user dissatisfaction.

Combine recommendations for patients with co-morbidities: The majority of elderly patients have multiple co-morbidities and medications that must be addressed, but the majority of clinical guidelines today do not adequately address these co-morbidity and polypharmacy issues.

Use free text information to drive CDS: At least 50% of the clinical information describing a patient’s current condition and stage of therapy resides in the free text portion of the EMR. We need methods of extracting the clinical information contained in this free text into a form that would allow CDS systems to utilize it.

Prioritize CDS content development and implementation: CDS development should be prioritized according to value to patients, cost to the health care system, availability of reliable data, difficulty of implementation, and acceptability to clinicians.

Mine large clinical databases to create new CDS: We need to develop and test new techniques to allow researchers to mine large clinical data repositories in order to expand the global fund of clinical knowledge. We must address the social and political challenges facing researchers as they struggle to create or gain access to these large clinical databases.

Disseminate best practices in CDS design, development, and implementation: The CDS implementation process needs to be expressed and catalogued in a way that allows information from successful sites to be easily found by others.

Create an architecture for sharing executable CDS modules and services: The goal is to create a set of standards-based interfaces to externally maintained CDS services that any EMR could “subscribe to,” in such a way that CDOs can implement new state of the art CDS interventions with little or no extra effort on their part.

Create internet-accessible CDS repositories: The challenge is to build one or more internet-accessible repositories of high quality, evidence-based, tested, CDS knowledge modules. These interventions and services could be easily downloaded, maintained, locally modified, installed, and used on any EMR.11
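
The last two challenges describe an architecture in which an EMR calls out to externally maintained, standards-based CDS services rather than implementing every rule locally. As a rough sketch of what such a “subscription” might look like from the EMR side, the example below posts a small patient summary to a hypothetical external CDS service and displays whatever recommendation cards come back. The endpoint URL, payload fields, and response format are all assumptions for illustration, not an existing standard’s API.

    # Hypothetical sketch of an EMR calling an externally maintained CDS service.
    # The endpoint URL, request payload, and response schema are invented for
    # illustration; they do not correspond to any specific standard or product.

    import requests

    CDS_SERVICE_URL = "https://cds.example.org/services/diabetes-management"   # assumed

    def fetch_cds_recommendations(patient_summary: dict) -> list:
        """Send a minimal patient summary to the external CDS service and
        return its list of recommendation texts (empty list on failure)."""
        try:
            response = requests.post(CDS_SERVICE_URL, json=patient_summary, timeout=5)
            response.raise_for_status()
            return [card["text"] for card in response.json().get("recommendations", [])]
        except requests.RequestException:
            return []   # never block clinical workflow on a CDS outage

    summary = {
        "age": 62,
        "diagnoses": ["E11.9"],
        "medications": ["metformin"],
        "last_hba1c": 8.4,
    }

    for recommendation in fetch_cds_recommendations(summary):
        print("CDS:", recommendation)

The attraction of this design is that the knowledge lives in one centrally maintained service, so a CDO gets updated, evidence-based content without rebuilding rules inside every local EMR.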

Approximately half of these recommendations simply need a combination of technical innovation and market forces to come to fruition. The other half will require a coordinated, centralized effort to be successful; it is therefore important that the federal government play some role in the solution. Just like the Certification Commission for Healthcare Information Technology (CCHIT) approach to certifying EMRs, it is important to strike the right balance between the government and private-sector roles. The government ultimately needs to referee the definition of standards and help establish centralized architectures, but only with strong and continuous private sector input to ensure quality feedback and innovation. It appears there has been a lot of discussion but little action towards a coherent centralized CDS architecture. We hope this will change so that CDOs really can implement state of the art CDS with little effort (at least compared to the effort required to implement the EMRs that the CDS will ride on). No one would argue with the statement that medical science is a public good. Logically, the infrastructure required to most effectively translate this science into clinical practice should also be a public good.

Government-established EMR “meaningful use” criteria are still in development as of this writing and will not be completed by Spring 2010. Nevertheless, it appears there will be CDS requirements in the criteria, which means that CDOs will have to demonstrate at least some level of CDS integrated with their EMR in order to receive ARRA incentive payments. This economic incentive should drive progress with CDS in the short term.

Also in the short term, there are plenty of relatively simple CDS interventions that can be implemented without significant EMR integration expenses. Vendors like UpToDate and MDConsult, providers of evidence based medicine resources via the web, are enabling very lightweight EMR integration models by accepting search queries and patient demographics via URL. For example, if a clinician has made a new diagnosis and wants help with treatment recommendations, one click from within the EMR results in a URL request to UpToDate. Embedded in the URL is the diagnosis the clinician just input into the EMR, so that UpToDate can return search results on the diagnosis without the provider having to open a new browser window and type in the diagnosis on the UpToDate welcome screen. It may only save a few seconds, but those few seconds are critical in the exam room. It may very well mean the difference between the clinician consulting an evidence based resource or not bothering to. Because the programming effort on the EMR end is minimal, this is the kind of CDS that can be quickly implemented.
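
A minimal sketch of this kind of URL-based hand-off is shown below. The base URL and parameter names are placeholders invented for the example, not UpToDate’s or MDConsult’s actual query interface; the point is simply that the EMR can pre-populate the search from data it already holds.

    # Sketch of a lightweight, URL-based hand-off from an EMR to an external
    # evidence-based reference resource. The base URL and parameter names are
    # placeholders, not any vendor's documented interface.

    from urllib.parse import urlencode

    def build_reference_link(base_url: str, diagnosis: str, age: int, sex: str) -> str:
        """Build a pre-populated search URL from data the EMR already has."""
        params = {
            "search": diagnosis,   # the diagnosis the clinician just entered
            "age": age,            # optional demographics to refine results
            "sex": sex,
        }
        return f"{base_url}?{urlencode(params)}"

    # The EMR would open this link in the clinician's browser with one click.
    link = build_reference_link(
        "https://reference.example.org/search",   # placeholder endpoint
        diagnosis="type 2 diabetes mellitus",
        age=62,
        sex="F",
    )
    print(link)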

Another tool available in the short term to improve CDS is Adjusted Clinical Groups (ACGs), developed by a team at Johns Hopkins over the last ten years. ACGs take clinical history and medication data from the patient file and categorize patients into risk-adjusted acuity groups. This allows, for example, prediction of which patients are on their way to becoming “train wrecks” in the future and would therefore be good candidates for case management. In addition, ACGs can be used to refine CDS recommendations. For example, when a patient is given a new diagnosis, the diagnosis plus their ACG profile can be combined to more accurately describe the typical disease progression and how the patient should be treated differently given their ACG profile.12


Using ACGs to boost the power of CDS is the “middle ground” between, on the one hand, CDS that is uninformed by patient-specific data and, on the other hand, CDS that has complete awareness of everything about the patient, including genomics data. The former is widely available now; the latter is in its infancy and is incredibly complex to implement. Using ACGs is the best option we have that could be made widely available today.
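
As a sketch of what this middle ground might look like in practice, the example below adjusts a generic follow-up recommendation based on the patient’s risk tier. It uses an invented tier-to-interval lookup rather than the actual Johns Hopkins ACG grouper, which is commercial software; the tiers and intervals carry no clinical meaning.

    # Illustrative sketch of risk-adjusted CDS: the same new diagnosis produces a
    # different recommendation depending on the patient's risk tier. The tiers and
    # follow-up intervals are invented; they do not come from the ACG system itself.

    RISK_TIER_FOLLOW_UP_DAYS = {   # assumed mapping from risk tier to follow-up interval
        "low": 180,
        "moderate": 90,
        "high": 30,
    }

    def follow_up_recommendation(diagnosis: str, risk_tier: str) -> str:
        days = RISK_TIER_FOLLOW_UP_DAYS.get(risk_tier, 90)
        note = " Consider referral to case management." if risk_tier == "high" else ""
        return f"New diagnosis of {diagnosis}: schedule follow-up within {days} days.{note}"

    print(follow_up_recommendation("type 2 diabetes mellitus", "low"))
    print(follow_up_recommendation("type 2 diabetes mellitus", "high"))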

Speaking of genomics data, experts predict that in the not too distant future, advances in nanotechnology and molecular biology will allow us to screen a single drop of blood for thousands of genetic markers, giving us the ability to predict the likelihood of illness over a patient's lifetime. Significant work is already underway, at Kaiser Permanente and OSU Medical Center among others, to figure out the best way to model this genomic data and include it into CDS. As medical science continues to discover how gene expression relates to specific morbidities and treatments, this capability will become increasingly powerful in its ability to tailor CDS recommendations to specific patients.

Last but not least, physicians need to be very involved with CDS implementations. Dr. Terry Steinberg, CMIO of Christiana Care Health System in Delaware, puts it this way:

‘Hit enter to confirm that the computer is smarter than I am,’ is the way clinicians see CDS – they have a real love-hate relationship with it. They like knowing they have a safety net, but they don’t like it when the computer tells them something they knew already.13

CDS experts agree that CDS is something that has to be “done with physicians, not to them.” CDOs must incorporate the voice and unique culture of their physicians into any CDS implementation, and at the same time ensure that CDS is effective in bringing practice patterns closer in line with evidence based recommendations.

Non-Clinical or Traditional Knowledge Management


Traditional KM is not necessarily any less important than clinical research and CDS. If better KM can help break through the paralysis of politics, organizational stovepipes, and perverse incentives in the business of healthcare, then it is invaluable. If it can accelerate clinical research and CDS by helping practitioners collaborate around these practices, and if it can enable better communication between private sector innovators and government policy writers, then it plays a central role in advancing the science and practice of medicine.

One notable example is the CTSA wiki mentioned earlier in this chapter. Another is Sermo (http://www.sermo.com/), the largest online physician community in the US. Physicians can register on Sermo only after verifying their status as licensed, practicing physicians. They then choose a pseudonym for a username. This pseudonym and their clinical specialty are the only pieces of information that other users will be able to see automatically, making Sermo a credentialed, but anonymous community. Physicians post observations and comments, create and respond to polls, and browse medical articles within the site. If desired, they may create profiles, revealing more information about themselves. Sermo makes money by selling access to physicians' anonymized comments and polling data to financial institutions, health care organizations, and governmental agencies. Clients have the ability to read doctors' comments and create a limited number of postings (identified as Client Postings) to which physicians respond.

Sermo has been successful in creating enough value for its users and clients to attract over 110,000 busy physicians, certainly no small feat. Sermo's enabling of “virtual curbside consults” has no doubt improved care for many thousands of patients. However, Dr. Barry Chaiken, chair of the Healthcare Information and Management Systems Society (HIMSS), feels that Sermo can be a “mish-mash of the clinical, political, and the inane” and that “social networking in Sermo feels disconnected from workflow and workplace reality.” He points out that KM and social networking value is enhanced when communication is among similarly trained physicians.14

Dr. Chaiken's comments point directly to the KM axiom “knowledge sticks to practice.” Communities of Practice (CoPs) are an absolutely critical component of effective KM, and if an online community is so diverse that the conversation cannot be focused on topics of compelling interest to everyone in the community, then a “mish-mash” is the inevitable result. A better approach is an online community purposefully built around multiple CoPs, where any individual user can subscribe and contribute to as many CoPs as they are interested in – while not being distracted by discussions they are not interested in.

In the next chapter we will turn our attention to CoPs, and how they can most effectively leverage information technology to maximize successful Knowledge Management.

1 The Learning Healthcare System: Workshop Summary, IOM Roundtable on Evidence Based Medicine, 2007, http://www.nap.edu/catalog.php?record_id=11903
2 Supply Sensitive Care, Dartmouth Atlas Project Brief
3 2008 National Healthcare Quality Report, Agency for Healthcare Research and Quality, http://www.ahrq.gov/qual/nhqr08/Key.htm
4 Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions; William W. Stead and Herbert S. Lin, Editors; Committee on Engaging the Computer Science Research Community in Health Care Informatics, Computer Science and Telecommunications Board, Division on Engineering and Physical Sciences, The National Research Council of the National Academies; The National Academies Press, January 9, 2009
5 Gartner Research Report, Hype Cycle for Healthcare Provider Applications and Systems, 2009
6 CTSA web site: http://www.ctsaweb.org
7 Interview with Vi Shaffer, 22 Oct 2009
8 http://archimedesmodel.com/
9 Kahn, Jennifer, The Body Synthetic, Wired Magazine, December 2009
10 Osheroff JA et al., A Roadmap for National Action on Clinical Decision Support, Journal of the American Medical Informatics Association, 2007; 14(2): 141-5
11 Sittig, Dean et al., Grand Challenges in Clinical Decision Support, Journal of Biomedical Informatics, 41:387-392, 2008
12 Interview with Gary Nissen, CEO of Health Plus Technologies, 30 October 2009
13 Lawrence, Daphne, Making the Right Decision, Healthcare Informatics, vol 26, no 10, October 2009, pp. 14-18
14 Interview with Dr. Barry Chaiken, 12 November 2009

Communities of Practice, Knowledge Junctions, and Web 2.0

This chapter focuses on three key concepts critical to the success of non-clinical or “traditional” Knowledge Management (KM).

• Communities of Practice (CoPs): Networks of people that share information and knowledge, held together by a common goal and desire to share experiences, insights, and best practices within a topic or discipline using shared norms and processes.

• Knowledge Junctions (KJs): Innovative use of information technology to merge many different forms and formats of knowledge into one Knowledge Junction. Each KJ is focused on the knowledge of interest to a specific CoP, and a large intranet may have several hundred KJs. The knowledge contained in each KJ is made available to users in the form that is most helpful to them, thus going beyond the traditional model of a static, shared document repository.

• Web 2.0: The Web 2.0 revolution is here to stay, and it has transformed how we communicate socially. It should also transform how we communicate professionally, and the best way to do this is to bolt Web 2.0 technologies onto a Knowledge Junction infrastructure, so that the technical barrier to contribute content helpful to a CoP is lowered significantly.

This chapter is structured to offer practical advice, whether to potential leaders of a large strategic KM effort, potential leaders of a CoP, or anyone who wants to help their community more efficiently share knowledge. It ties together recommendations from KM literature, as well as extensive practical experience from a six-year-old KM platform used by the Air Force Medical Service.

Communities of Practice (CoPs)

To facilitate this discussion, we will borrow heavily from an excellent book titled “Cultivating Communities of Practice,” written by thought leaders on this subject. Two successful Air Force KM platforms, Air Force Knowledge Now (AFKN) and the Air Force Medical Service Knowledge Exchange (Kx), were both very purposefully built around an understanding of CoP dynamics.

Significant experience in both the public and private sectors shows that KM initiatives succeed based on the ability of groups of people with similar jobs to “gel,” establish a true community, and share knowledge that ultimately helps both community members and the “greater good” – whether that greater good is the success of a particular CDO or medical science as a whole:

The knowledge of experts ... is much more a living process than a static body of information. Communities of practice do not reduce knowledge to an object. They make it an integral part of their activities and interactions, and they serve as a living repository for that knowledge.1

Sharing knowledge is an unnatural act. You can't just stand up and say, ‘Thou shalt share knowledge’ – it won't work.2

What managers have been missing so far is an understanding of the kind of social structure that can take responsibility for fostering learning, developing competencies, and managing knowledge.3

Because this knowledge, particularly in healthcare, is a living process and not a list of documents, it is important that we understand what CoPs are and the best way to go about cultivating them. Let's start with the basics. A CoP is built on a combination of three elements:

• A domain of knowledge which defines a set of issues

• A domain is not an abstract area of interest but key issues or problems that members commonly experience. If a subject area is too broad, it will not pique anyone's interest.

• Most successful CoPs have a domain that both excites participants and has strategic relevance to a greater good.

A well-developed domain becomes a statement of what knowledge the community will steward. It is a commitment to take responsibility for an area of expertise and to provide the organization with the best knowledge and skills that can be found. In turn, when an organization acknowledges a domain it legitimizes the community's role in stewarding its expertise and capabilities.4

• A community of people who care about the domain

• Community is absolutely critical to creating a social structure that can effectively create and steward knowledge. KJ technology can facilitate the process, but is not by itself a community.

• Community creates the social fabric of learning. Community is the only way to build trust and a willingness to share. People don’t generally want to expose their ignorance to total strangers.

My biggest learning is that it is all about the relationships. If you make the request based on “I owe you a beer,” people will respond.5

Like a neighborhood bar or café, a community becomes a “place” where people have the freedom to ask for candid advice, share their opinions, and try their half-baked ideas without repercussion.6

• Developing shared practice in the community

• A practice is the body of shared knowledge that creates a shared foundation and allows members to work together efficiently. It is a set of frameworks, ideas, tools, information, styles, language, stories, and documents shared by CoP members.

• Knowledge “sticks” to practice. On the one hand it is frequently desirable to “unstick” some of this knowledge and share it throughout the organization. On the other hand, it is precisely this stickiness that allows communities of practice to function.

• Because knowledge sticks to practice, sharing knowledge within your practice but outside your organization is often easier than sharing knowledge within your organization but outside your practice.

Research shows that the single most important factor in the success of a CoP is a leader who is passionate and knowledgeable about the CoP's domain.7 However, the CoP leadership role is different from a traditional leadership role. A CoP leader's primary focus is to link people to resources and other people, not just provide answers. “Cultivate” is the proper analogy. A CoP leader's job is to feed, water, and keep the weeds away, not control the process.

Particularly in the early development of a CoP, it is important to generate excitement about the domain, and make the intrinsic value in participation obvious to prospective CoP members.

People have to see tremendous immediate benefit from knowledge management. They have to see, smell, touch, and taste how it's going to improve their work lives.8 If it is not clear how CoP members benefit directly from participation, the community will not thrive.9

It is especially important for geographically distributed communities of practice to have some sort of regularly scheduled event in order to maintain a “heartbeat” of community. If regularly established meetings already exist, CoP contributions can be tied to these meetings. For example, members could be asked to submit a best practice document on an area of importance to the CoP, which would then be discussed at the next regular meeting. If regularly established meetings do not exist, start a monthly teleconference or web conference – not just for the usual purpose of disseminating information, but also to solicit problem areas from the field and connect those problems to answers that are then made available to all CoP members.

An important organizational benefit of encouraging the growth of CoPs is that active participation in a CoP is related to retention (obviously a relevant point for many CDOs struggling to retain talent). As two different CoP members put it:

You are redeployed so often, the only source of stability is your community of practice.10

I know I will be connected to the same group of (technical specialists) no matter where I am assigned, and this helps me feel more connected to the company as a whole.11

CoP leaders should not be frustrated by a slow start. It is natural for CoPs to take a while to get off the ground. The results are most often worth the wait. It is also natural for CoP members to have very different levels of participation. Most CoPs have a minority of members who are very active and contribute frequently, and a majority of members who look on from the sidelines and rarely contribute (but nevertheless learn and benefit from the experience). CoP leaders should not get discouraged by all the folks on the sidelines; this is a normal CoP phenomenon. Community activities should be designed to allow all members to feel included – some on the periphery will eventually become more involved.

From the perspective of your organization (or even the healthcare industry) as a whole, sharing knowledge between CoPs is vitally important:

Communities are most effective when they connect teams across divisions to catalyze ideas and solutions that would not be discovered otherwise.12

Knowledge Junctions (KJs)

The concept of sharing knowledge between CoPs makes a good transition to the topic of KJs. We don't want just one CoP; we want a forest of CoPs that are greater than the sum of their parts because each feeds off of and contributes to the others. We want to leverage technology to create a platform of common services available to each CoP. Like a cafeteria health benefit plan, individual CoPs can pick and choose the technology and services they require based on the needs of their membership. From an organizational perspective, once the KJ infrastructure is built, the marginal cost of supporting one new CoP with one new KJ is trivial, assuming no new types of data services are required to support the new CoP.

KM simultaneously is and is not “about the technology.” The best technology in the absence of a coherent community of practice will be ineffective. At the same time, coherent CoPs in the absence of well-executed technology, in particular a well-executed user interface, will experience limited success. Given the rapid advance of publicly accessible Web 2.0 technologies in the last couple of years, well-formed CoPs without an effective KJ toolset will probably “take their business elsewhere” and use free tools like Google Groups or Facebook to collaborate. In healthcare, this is clearly not a good solution because of HIPAA and other security concerns. Healthcare organizations that care about effective KM have a responsibility to deliver effective collaboration technology to their CoPs.

A Knowledge Junction model helps to do this because it architecturally recognizes the centrality of the CoP. A Knowledge Junction model makes navigation between the separate content nodes within the web enterprise audience-centric rather than organization-centric. An idealized KJ is able to bring structured data (data in a database) together with unstructured data (documents, static web pages, discussion forums, etc.). This is much easier said than done. An idealized KJ is able to effectively wrap metadata (data about data) around structured data so that the structured data is discoverable via a search query. When this structured data resides in hundreds or thousands of separate databases around the globe, this additionally requires a Virtual Data Network architecture to index the metadata associated with this structured data into a centralized search repository. Once all the metadata is centralized and indexed, it can be discovered via a search query and then “point” back to the authoritative source of the data wherever that may be.
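The sketch below illustrates the indexing idea just described under simplified assumptions: source systems contribute only metadata records, and a search returns pointers back to wherever the authoritative data actually lives. The class and field names are hypothetical, not part of any particular product.

```python
# A minimal sketch of the Virtual Data Network concept: index metadata
# centrally, and let search hits "point" back to the authoritative source.
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    title: str
    keywords: list
    source_system: str   # e.g., a site-level database or PACS
    source_uri: str      # where the authoritative data actually resides

class CentralIndex:
    def __init__(self):
        self._records = []

    def register(self, record: MetadataRecord):
        """Source systems push (or are crawled for) metadata only."""
        self._records.append(record)

    def search(self, term: str):
        """Return pointers to matching content, never copies of it."""
        term = term.lower()
        return [r for r in self._records
                if term in r.title.lower()
                or any(term in k.lower() for k in r.keywords)]

index = CentralIndex()
index.register(MetadataRecord("Chest X-ray, 2009-11-02",
                              ["radiology", "chest", "pneumonia"],
                              "Site 14 PACS", "pacs://site14/study/8812"))
for hit in index.search("pneumonia"):
    print(hit.title, "->", hit.source_uri)
```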

Confused yet? Let’s discuss a healthcare example in order to explain the concept a little more concretely. This example is clinical, but for our purposes here it will explain the Virtual Data Network use of metadata in a way that can be understood in a clinical or non-clinical setting.

The Military Health System's Healthcare Artifacts and Images Management Solution (HAIMS) project tasks Evolvent Technologies, Inc. to design and develop an enterprise-wide imaging system that links all patient artifacts and images to their electronic health records. Radiological images stored in PACS will be linked, and artifacts, to include scanned documents, EKGs, and other file formats, will be efficiently stored and managed. The need for specialties to share images will be met by providing rapid access to image studies and workflow tools, and shared artifacts and images will be accessible across the DoD and VA in real time.

The design will use a sophisticated architecture intended to cost-effectively address complex issues such as a globally dispersed system, network latency, interrupted communications, large image/file sizes, redundancy, and storage. The metadata will be displayed via a web application module where providers will search for available artifacts and images and, via a pick list, can pull and display the artifacts and images on their clinical workstation. This architecture will also enable the scanning and capture of any and all other types of image data, to include paper records provided by out-of-network care, sharing of data with the VA and other Federal agencies in the future, and certainly the historical paper medical record as well. HAIMS will link patient data that addresses the needs of both primary care clinicians and specialists. Specialists will be able to review and interpret artifacts and images and generate reports that will aid providers in their treatment plan. HAIMS will help give providers a more complete medical record, enabling them to provide safer care.13

Virtual Data Network projects like HAIMS can help healthcare organizations significantly improve their ability to collect, analyze, and study their own data, thereby unlocking the knowledge hidden away in data silos and enabling better decision support.

Short of complex Virtual Data Network projects, the focus of KJs is more typically on bringing together different modalities of unstructured data. For example, traditional KM tools are good at keeping documents and web content up to date. However, when a user is searching for this information, the system spits back this “knowledge” in the same format it was created in, which is usually not the ideal format for getting the information into the user's brain. Traditional eLearning, on the other hand, does an excellent job of delivering knowledge in a format best suited to the learner actually retaining the information, but a generally poor job of keeping the content of the eLearning course up to date. If KM and eLearning tools are married into a single KJ structure, the strengths of both disciplines can be combined. In other words, a subject matter expert can monitor all their CoP's communication at a particular KJ, and then update an eLearning course on a regular basis to keep a particular subject up to date. Meanwhile, when an end user does a search on the KM platform for the subject at hand, they can reference the eLearning course, and will be able to more effectively soak up knowledge from reviewing the multimedia course than they could have from reading a long document.
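A minimal sketch of that marriage is shown below, using invented content types and titles: a single query surfaces the reference document, the multimedia course, and the related discussion side by side.

```python
# A toy model of one search spanning several knowledge modalities.
from collections import namedtuple

ContentItem = namedtuple("ContentItem", "kind title keywords")

catalog = [
    ContentItem("document", "Deployment Health Surveillance Guide", {"deployment", "surveillance"}),
    ContentItem("elearning", "Deployment Health Refresher (multimedia)", {"deployment", "health"}),
    ContentItem("discussion", "Questions on pre-deployment labs", {"deployment", "labs"}),
]

def search(term):
    """Return every matching item, grouped by modality."""
    by_kind = {}
    for item in catalog:
        if term in item.keywords:
            by_kind.setdefault(item.kind, []).append(item.title)
    return by_kind

for kind, titles in search("deployment").items():
    print(kind, "->", titles)
```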

Web 2.0 Tools

Many enterprise KM projects experience limited success because they are never able to collect a critical mass of content from end users (as opposed to content managers). Content managers generally have some business need to get information out to end users. The users, however, generally have much less of a business need to communicate back. Most have something to contribute that would be useful to their peers and to the enterprise, but the vast majority are unwilling to do so if there is any technical learning curve whatsoever required to participate.

This scenario results in a mostly one-way conversation. Even if hundreds of content managers are contributing quality content into a robust system indexed by an excellent search engine so that content is discoverable, a top-down-only information flow is not truly effective Knowledge Management.

A Web 2.0 Case Study

This was precisely the dilemma faced by the Air Force Medical Service (AFMS) a couple of years ago. Subject matter expert content managers, most of them in headquarters jobs removed from hospitals and clinics, need to stay in close touch with their “tribe” in the trenches (their Community of Practice) in order for their expertise to stay relevant. Of course conferences, phone calls, and emails are used to maintain this line of communication, but Web 2.0 tools were the missing link.

The AFMS launched a web content management and document management intranet platform called the AFMS Knowledge Exchange (Kx) in 2003. Five years later the Kx had grown to over 400 Knowledge Junctions and 20,000 registered users. Over 50,000 documents are kept up to date by an automated expiration system that requires content manager action to renew documents, and users receive emails announcing new or updated content for communities they choose to subscribe to. This was innovative stuff in 2003 – but not so much in the Web 2.0 world.
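A hedged sketch of how such an expiration mechanism might work is shown below; the annual review interval and the field names are assumptions for illustration, not the actual Kx implementation.

```python
# Documents go stale unless a content manager explicitly renews them;
# the one-year review cycle here is an assumed value.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)

class ManagedDocument:
    def __init__(self, title, content_manager, last_reviewed):
        self.title = title
        self.content_manager = content_manager
        self.last_reviewed = last_reviewed

    def is_expired(self, today=None):
        today = today or date.today()
        return today - self.last_reviewed > REVIEW_INTERVAL

    def renew(self):
        """Only an explicit content-manager action resets the clock."""
        self.last_reviewed = date.today()

def expiration_report(documents):
    """Surface stale documents so their owners are prompted to renew or retire them."""
    return [d for d in documents if d.is_expired()]

docs = [ManagedDocument("Deployment Checklist", "Maj Smith", date(2008, 3, 1)),
        ManagedDocument("Immunization SOP", "Capt Jones", date.today())]
for stale in expiration_report(docs):
    print(f"'{stale.title}' needs review by {stale.content_manager}")
```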

Any of the 20,000 registered Kx users could contribute documents; however, very few (besides the 400 content managers) ever did. Document check-in web forms with required metadata fields – a necessary part of good document management – discouraged end users from “casual” contribution of content. Bottom line: the Kx was a textbook case of a KM platform that needed Web 2.0 tools as a catalyst for end user contribution of content. The user interface needed to be instantly self-explanatory, and the presentation needed to make the benefits of contribution immediately obvious to users.

While the Kx was in dire need of bottom-up intelligence via end user contributed content, it was also ideally suited as a launch pad for discussion forums, blogs, and rich user profiles. Much has been written by experts about the need to coherently exploit Web 2.0 tools inside the enterprise, as opposed to letting every department deploy their own siloed toolsets ad hoc. The whole point is to have one search interface that efficiently returns all content, and while search can always be federated, it is more practical to have all the content inside one enterprise platform. One good way to do this is to bolt Web 2.0 tools on top of an existing enterprise-wide Knowledge Junction framework with established community members/subscribers. A great question buried in a discussion forum that no one sees does no one any good. A great question posted on a discussion forum that is emailed out to 50 existing community subscribers is probably going to generate some thoughtful answers.

The AFMS and Evolvent Technologies (the Kx systems integrator) chose Jive Software's Social Business Software suite as a Web 2.0 supplement to the Kx. Dubbed “Kx Communities,” the full-featured community platform was launched in April 2009 and since then has been used by many AFMS communities of practice to enhance two-way communication and bottom-up intelligence. A search query on the Kx now returns not only web content and documents; it also returns discussion threads, blog posts, rich user profiles, and wiki-like collaborative documents, all on the same search results page.

A basic tenet of Knowledge Management is the concept that “knowledge sticks to practice,” and this is nowhere more true than in healthcare. Clinicians and administrators of any given healthcare specialty generally have a greater need to compare best and worst practices with their geographically separated “tribe-members” than they have to collaborate locally with co-workers in other specialties. Because the AFMS has a lot of small clinics in far-flung places across the globe, many AFMS professionals are one-person shops in their area of expertise. They may see their fellow tribe-members face-to-face once a year at a conference.

Let's discuss an example of typical use of the Kx Community discussion forum. A flight surgeon needed to know how flight surgeons at other Air Force hospitals and clinics were handling the administration of Army flying exams. He posted a discussion thread, designated it as a question, and received nine replies. He rated one answer as “correct” and one answer as “helpful,” so the system reflects that this question has been answered. This type of functionality helps the subject matter expert / content manager for this community easily monitor the status of queries, an invaluable feature in larger communities.
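The sketch below models that question-and-answer workflow in simplified form. The “correct” and “helpful” ratings mirror the example above; the rest of the data model is assumed rather than drawn from the actual platform.

```python
# A simplified model of a rated Q&A thread and its "answered" status.
class Reply:
    def __init__(self, author, text):
        self.author = author
        self.text = text
        self.rating = None          # None, "helpful", or "correct"

class QuestionThread:
    def __init__(self, author, title):
        self.author = author
        self.title = title
        self.replies = []

    def add_reply(self, author, text):
        self.replies.append(Reply(author, text))

    def rate(self, reply_index, rating):
        self.replies[reply_index].rating = rating

    @property
    def answered(self):
        """The thread shows as answered once any reply is marked correct,
        which lets a content manager monitor open queries at a glance."""
        return any(r.rating == "correct" for r in self.replies)

thread = QuestionThread("Flight Surgeon A", "Handling Army flying exams?")
thread.add_reply("Flight Surgeon B", "We schedule them through our aerospace clinic.")
thread.add_reply("Flight Surgeon C", "Our clinic uses the Army's own exam form, then...")
thread.rate(0, "helpful")
thread.rate(1, "correct")
print("Answered:", thread.answered)
```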

This hits on a key point. A big part of success with Web 2.0 is the feature set:

• Not just a discussion forum, but a discussion forum that allows users to rate the quality of answers, and shows the most popular discussion threads on the Kx Community home page.

• Not just a subscription platform, but a system that permits detailed management of email subscription options, so users can tailor the amount of system-generated email in their inbox. They may subscribe to all new content in a given community, or just certain discussion threads, or just certain subject matter experts (see the sketch following this list).

• Not just a blog, but a blogging platform in which every post and every comment has a link to the author’s user profile so there is complete transparency into who is saying what. There is no such thing as an anonymous comment on the Kx. Even if a user has not posted a photo and biographical information into their user profile, important demographic and contact information is always available for them (this information is required to get a Kx account).

• Not just better web content management, but a drag-and-drop, iGoogle-like widget interface, which enables content managers to easily create a rich user experience tailored to the needs of their CoP.
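As a rough illustration of the tiered subscription options above (not the Kx or Jive implementation), a notification filter might look something like this:

```python
# An assumed model of tiered email subscriptions: whole community,
# specific threads, or specific subject matter experts.
class Subscription:
    def __init__(self, user, community=None, thread_ids=None, authors=None):
        self.user = user
        self.community = community              # follow everything posted here
        self.thread_ids = set(thread_ids or []) # or just selected threads
        self.authors = set(authors or [])       # or just selected experts

    def wants(self, item):
        """Decide whether a new content item should generate an email."""
        return (item["community"] == self.community
                or item["thread_id"] in self.thread_ids
                or item["author"] in self.authors)

sub = Subscription("Capt Lee",
                   thread_ids={"probiotics-travel"},
                   authors={"SABC Training Manager"})
new_post = {"community": "Aerospace Medicine",
            "thread_id": "probiotics-travel",
            "author": "Flight Surgeon D"}
print(sub.wants(new_post))   # True: this user follows the thread
```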

In contrast, another successful government KM platform added Web 2.0 tools, but did so with custom code instead of using a best-of-breed platform. This resulted in a feature-poor user interface, and to date their discussion forums and other Web 2.0 tools have not enjoyed broad adoption even though the core functionality of the site continues to be very successful.

One good example of invaluable bottom-up intelligence enabled by Kx Communities is the Self Aid and Buddy Care (SABC) blog. SABC is a program that trains every Airman in the Air Force (not just medics) on battlefield first aid. With the military involved in two wars, it is not hard to understand that there are life and death implications to this program. If Airmen are not effectively trained on the latest battlefield first aid measures with the latest life-saving equipment, informed by the latest lessons learned from Iraq and Afghanistan, then somebody's son or daughter overseas is not going to make it home who otherwise would have. The SABC program's training manager blogs on SABC details, and she frequently solicits specific feedback from the field in order to improve the program. Two recent posts, one about revising SABC policy and one about updating the SABC instructor kit, resulted in dozens of back and forth comments from all across the AFMS and clearly assisted the training manager in making the best management decisions.

As a final example, a senior flight surgeon recently posted a question about the use of probiotics to reduce the incidence of traveler’s diarrhea in aircrews conducting missions in the third world. It was a complex question that was unlikely to have one correct answer, but it was likely that other flight surgeons around the AFMS would have some similar experiences and might be able to help solve his dilemma. The author ended his question with the phrase “awaiting enlightenment.” Kx Communities sent a system generated email to the 200+ subscribers in the Aerospace Medicine community, and three helpful replies from fellow flight surgeons were posted back to the discussion thread that day.

It is a pretty safe bet that knowledge workers all across your enterprise are “awaiting enlightenment” on multiple tacit knowledge questions they face. Are you giving them the Web 2.0 tools they need to locate subject matter experts, get help with their questions, easily share their expertise, and deliver bottom-up intelligence across your organization?

1 Wenger, McDermott, Snyder. Cultivating Communities of Practice: A Guide to Managing Knowledge. Boston: Harvard Business School Press, 2002. p. 9
2 Hubert Saint-Onge, CEO, Konverge and Know
3 Wenger, McDermott, Snyder. p. 11
4 Wenger, McDermott, Snyder. p. 32
5 CoP Facilitator (Wenger, McDermott, Snyder. p. 133)
6 Wenger, McDermott, Snyder. p. 61
7 Wenger, McDermott, Snyder. p. 80
8 Barbara Saidel, CIO, Russell Reynolds Associates
9 Wenger, McDermott, Snyder. p. 17
10 Wenger, McDermott, Snyder. p. 20
11 Wenger, McDermott, Snyder. p. 137
12 Wenger, McDermott, Snyder. p. 193
13 Worrell, Anna, and Daniels, John; Virtual Healthcare Artifact and Imaging Management; Evolvent Magazine, 2009, volume 1

Effectively Managing Values and Processes

In remote parts of the United States, brave men and women stand watch twenty-four hours a day, seven days a week, 365 days each year in control of the world's most destructive weapon of mass destruction, the intercontinental ballistic missile; they like to call it the ICBM. ICBMs are scattered throughout the Dakotas, ready to be launched at a moment's notice using rigorously and continuously practiced and tested processes; processes used to ensure every emergency action message transmitted to the crews is deliberately assessed before any actions are taken leading up to the fateful key turn that irretrievably hurls megatons of destructive power toward targets of war. The United States has gone to great lengths to create this nuclear deterrent force, and has gone to even greater lengths to ensure the ICBM weapon system is effectively managed, not only to ensure its availability when needed, but to prevent accidental, intentional, or any other unauthorized launch.

Just as we treasure the value of freedom in the United States as evidenced by our investment in ICBMs, to a greater extent we, too, must treasure the value of health and our healthcare system. Our ICBM nuclear force, along with the other two legs of our nuclear triad – our submarines and military aircraft – has successfully deterred aggressors from using weapons of mass destruction against us. Some may argue that the events of 9/11 could be considered mass destruction events using commercial airliners as the weapon. But the fact remains that the Department of Defense’s effective management of the nuclear triad has proven successful.

Wouldn't it be good if the same could be said about how effectively we have managed healthcare in the United States? Unfortunately, there are some who believe doctors have done more harm than good.1 The history of medicine describes caregivers who intentionally caused patients to bleed or vomit to heal disease and prevent death. But to their credit, most physicians did the best they could given the knowledge they had at the time. In his article in The New York Times, David Leonhardt recounted that medicine did not make much progress from the time of Hippocrates until the 19th Century.

Today's healthcare information technology companies may be quick to point out that caregivers during that time did not have the advantages of information technology. Advanced information technology is available to caregivers today, yet there still seems to be a struggle with realizing value and best-case outcomes in healthcare. Consider how information technology has affected the finance community; we can travel virtually anywhere in the world and still have access to all of our financial information. But if I do not have the same level of access to my basic medical information when I travel, I put my life at risk anytime I need medical attention, regardless of the skills of the physician providing my treatment. Quite frankly, this makes me intensely uncomfortable.

I limped into a strange hospital for an emergency situation while traveling within the United States. At the time I was unaware that the problem would have been life threatening had I not sought medical care. The hospital I chose was the one nearest to the hotel where I was staying. It looked like a typical hospital on the outside. I entered the emergency department around ten o'clock p.m. to find the waiting area about half full. I checked in with the clerk, who gave me several sheets of paper to complete before they would see me. I took my place among the other patrons and began completing the paperwork: what is my name, where do I live, do I have insurance, and why was I seeking medical attention, in that order. After about ten minutes, I handed the completed paperwork back to the clerk, who instructed me to have a seat and told me someone would be with me shortly. After about an hour, I was called to a room behind the glass partition that separated the clerks from the sick, waiting crowd, and there I underwent the usual five-minute screening, e.g., blood pressure, temperature, pulse, respiration, and a quick problem interview where I verbally reported what I had just written on the papers that were now in the clerk's hands.

I noticed the interviewing clerk seemed to be struggling with the computer system into which the information I provided was being entered. Being the inquisitive information technology professional that I am, I asked the clerk what system they were using and how they liked it. The clerk immediately responded with a heavy sigh and a side-to-side head shake, accompanied by a brief explanation of how hard it was to learn the new system. The clerk proceeded to vent about the struggles they were having with the new system on its first day of use. It turns out I had happened upon an emergency room during its go-live on a new emergency department electronic documentation system.

The clerk seemed exasperated, yet somewhat enthused about using the new system. However, I felt the new system was receiving more attention than my urgent medical condition. After gathering all the information she needed from me, the clerk ordered me to return to the waiting room until I was called back again. Once again, I joined my fellow sick in the waiting room, where I sat for another hour. I was finally called to join the nurse in an exam room, where she performed a brief examination and another interview in which I repeated my complaint for the third time. She had me sit back on the exam table, which was configured in a half-lying position, and instructed me to sit as comfortably as I could because it might be a while before the physician was able to see me. She proceeded to tell me there was only one physician on duty and that they were trying to learn a new computer system that seemed to be slowing everyone down. She said she would check on me occasionally, and she kept her word over the next three hours, poking her head in the door to see if I was still alive.

At last, the young physician joined me in the exam room and performed another examination and interview. I had the opportunity to explain for the fourth time why I had come to see her. She listened intently as she looked at the computer screen and furiously typed and clicked. She did eventually look at me and my specific medical condition and, with some prompting from me, found what seemed to be the source of my pain. This was followed by a quick explanation by her of what my body was doing and that it was a good thing I came to the hospital, because left untreated I could have died. Sounds much more serious than it was, but I had no idea what was brewing. She then ordered the nurse to stab me in my good leg with what I believed to have been the largest needle in any hospital needle inventory. The physician typed and clicked some more on the computer and wished me a speedy recovery.

I walked into the emergency room limping on one leg, and I walked out barely able to walk on my own, without any assistance from the staff. And I still had to drive back to my hotel after having gone without sleep for over 24 hours. I spent eight hours in the emergency room and concluded that the system they installed, and perhaps the way in which it was installed, may not have been handled as efficiently as it could have been. Why did it seem that the emergency room staff was seeing the system for the first time? Did they not receive any training before I walked into the building? Did anyone look at the emergency room workflow to see how the new system would impact patient throughput and staff processes?

My eight-hour experience in this particular emergency department tells me no, and in my opinion this was far from being patient-centered healthcare delivery. Perhaps it was due to the distraction of the go-live implementation of a new system. But a go-live should not be a distraction; in fact, it should be as transparent as possible to both the patient and the staff.

Figure 1. Percent GDP spent on Healthcare

It is no wonder that health outcomes in the United States lag those achieved elsewhere in the world, even though the United States spends more per capita on health care than any other industrialized nation. It is reported that the increasing cost of care reduces access to care and creates a heavy and growing burden on employers and consumers.2 The promotion of health information technology is described as the most commonly mentioned prerequisite for generating sustainable progress toward greater value in healthcare – for processes that improve quality, monitor outcomes, provide clinical decision support, develop evidence, track costs, streamline paperwork, improve care coordination, and facilitate patient engagement.3 It is one thing to promote health information technology to achieve these wonderful goals, but how we implement information technology in healthcare will determine whether or not we actually achieve them.

The topmost focus on improving the nation's healthcare system must be on patient safety and the quality of care provided. However, we cannot undervalue the impact improvements in information technology can have on the bottom line. One particular study, published in a 2005 Health Affairs journal article, indicated a potential savings of $77.8 billion per year across the industry if a national health information exchange network were fully implemented.4

What does fully implemented health information exchange mean? The Health Affairs article described it as, “… electronic data flow between providers (hospitals and medical group practices) and other providers, and between providers and five stakeholders with which they exchange information most commonly: independent laboratories, radiology centers, pharmacies, payers, and public health departments.” Considering how fragmented the United States healthcare system is today, can a fully implemented health information exchange be achieved in a reasonable timeframe, e.g., by 2014? Imagine, if you will, that it is achievable. Then what is the true value of using information technology in healthcare?

The Center for Information Technology Leadership published a paper in 2002 that described an environment in which healthcare executives are under a tremendous amount of pressure to address medical errors, inconsistent quality, inefficiency, declining clinician job satisfaction, and mounting staffing shortages.5 The paper described the results of a study to answer the question, “What are the demonstrated benefits of a given system or application?” The findings revealed an absence of literature assessing the true value of healthcare information technology, and showed that most studies focused on one or two benefits of a given healthcare information technology rather than the three dimensions of financial, clinical, and organizational benefits. “But there is very little hard evidence demonstrating the value of specific HIT investments.”

Research on value should focus on the relationships between HIT and these dimensions:

• Financial

− Cost reductions issuing from decreased administrative clinical staffing and resource requirements (i.e., elimination of paper chart pulls and transcription services).

− Revenue enhancements resulting from improved charge capture and charge entry to billing times.

− Productivity gains stemming from increased procedure volume, reductions in average length of stay, and increased transaction processing rates.

• Clinical

− Service delivery advances from better adherence to clinical protocols and improvements in the stages of clinical decision-making (i.e., initiation, diagnostics, monitoring and tracking, and acting).

− Clinical outcomes improvements represented as reductions in medical errors, decreases in morbidity and mortality, and expedited recovery times.

• Organizational

− Stakeholder satisfaction improvements resulting from decreased wait times, improved access to healthcare information, and more positive perceptions of care quality and clinician efficacy.

− Risk mitigation resulting from decreases in malpractice litigation and increased adherence to federal, state, and accreditation organization standards.

Source: The Three Dimensions of Value for Healthcare IT, Finding the Value in Healthcare Information Technologies, D. Johnston, E. Pan, B. Middleton; Center for Information Technology Leadership, 2002

The Center for Information Technology Leadership defines healthcare information technology value as the sum of financial, clinical, and organizational benefits directly resulting from the implementation of a given health information technology.6 Effective healthcare information technology decision making requires capturing all of the tangible and intangible benefits to derive real evidence of the value of healthcare information technology. We already know how well the industry collects data, but converting that data into information that demonstrates the value of healthcare information technology will help enable the effective management of healthcare values and outcomes.
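As a purely illustrative way of thinking about that definition, the toy calculation below rolls made-up benefit estimates into the three dimensions and a single value figure; the line items and dollar amounts are invented and do not represent CITL's methodology.

```python
# Toy figures only: sum individual benefit estimates into financial,
# clinical, and organizational dimensions, then into one value number.
benefits = [
    ("eliminated chart pulls",      "financial",      120_000),
    ("reduced transcription costs", "financial",       80_000),
    ("fewer adverse drug events",   "clinical",        250_000),
    ("shorter ED wait times",       "organizational",   60_000),
]

totals = {}
for name, dimension, annual_value in benefits:
    totals[dimension] = totals.get(dimension, 0) + annual_value

for dimension, value in totals.items():
    print(f"{dimension:15s} ${value:,}")
print(f"{'total HIT value':15s} ${sum(totals.values()):,}")
```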

Generating increased knowledge about the value of healthcare information technology will help providers make better investment decisions, and in turn, facilitate better decision-making about adopting “best practice” healthcare information technology solutions. The expected result should be higher quality and more efficient healthcare delivery.

A comprehensive representation of technology value is derived by collecting and analyzing data. With the help of information technology, our healthcare industry has become adept at data collection, but it is definitely behind the power curve in effectively analyzing its data. The National Research Council recently described the healthcare industry as “…an information- and knowledge-intensive enterprise.” But the dilemma revolves around the industry's ability – or lack thereof – to transform these data into information and knowledge.7 Although there have been some successful healthcare information technology implementations, even the most successful implementations seem to fall short. Because of the transactional nature of our healthcare system, its information technology systems do not effectively assimilate the data, information, and knowledge necessary to provide optimal, situationally appropriate care to us when we are sick. Worse, these systems fall well short of providing proactive value to help prevent us from becoming sick.

We currently have a healthcare system where providers get lost amid all the data, tests, and monitoring equipment trying to understand a particular patient's health status. I experienced this firsthand during my emergency room visit. This is where we need innovation that focuses on using health information technology to produce the outcomes we need, rather than implementing health information technology for the sake of implementing health information technology.

This idea is also supported by a joint White Paper from the Federation of American Hospitals and Booz Allen Hamilton. The White Paper basically states that we must move beyond a focus on electronic health record adoption to a focus on information flow and communication.8 It is one thing to talk about being able to share information between different healthcare organizations, but if an individual healthcare organization cannot efficiently share its own information internally, how is connecting it to another organization going to help it meet its own goals and objectives?

Take for example the Joint Commission's call for a National Performance Measurement data system.9 The goal of the system is to improve the safety and quality of healthcare provided to the public. Our current national infrastructure will not support such a system, and even if a national infrastructure were in place, most organizations are not ready to exchange information inside their own institution, much less outside the institution in a standardized manner. The result is a situation where data is collected in fragmented ways, resulting in an incomplete view of performance quality for organizations or individual healthcare providers. The Joint Commission report goes on to say information technology “… could alleviate much of the burden associated with data collection as long as the systems have been designed with the requisite functionality to support performance measurement activities.” This is but one example of the value of health information exchange and interoperability. In terms of the bottom line, it has already been estimated that health information exchange and interoperability could save about $77.8 billion per year across the industry if a national health information exchange network were fully implemented.

The ultimate goal is to take all of the stockpiled clinical, administrative, and financial data contained within all of the information systems in use and transform these data into information, into knowledge, and into wisdom, wherever and whenever it is needed – especially at the point of care, and without overwhelming the provider, as is happening today. One of the trends to address this issue is knowledge enablement, better known as knowledge management. How can we reuse the information our organizations have stockpiled to support better clinical, administrative, and financial decision making and maximize the effectiveness of healthcare delivery and the healthcare dollar? How do we organize and make use of all of this stockpiled information, whether it is in explicit or tacit form?

Our industry must embrace health information technology and the strategies of knowledge management that have made other industries so successful. We need to create a true intelligence continuum in healthcare. We need to embed medical knowledge into our clinical workflows to provide true clinical cognitive support to physicians and other care givers at the point of care to improve clinical outcomes. We need to embed medical knowledge into our finance workflows to provide finance professionals with the clinical inputs needed to make truly informed investment decisions. We need to embed medical and financial knowledge into our administrative workflows to provide healthcare executives with just the right clinical and financial information needed to help improve the overall operations of our healthcare organizations. We need to embed medical, financial, and administrative knowledge into our research workflows to provide researchers immediate access to anonymized information to improve their research capabilities and to encourage more research that provides value and results in the best outcomes for our healthcare system.

To gain access to health information and create value, we must embrace health information technology as if we are building a new weapon system to combat the threat of information overload among caregivers and to deter medical errors, inconsistent quality, inefficiency, declining clinician job satisfaction, and mounting staffing shortages. Perhaps we need to build virtual launch control centers to get us on the Data-Information-Knowledge-Wisdom (DIKW) trajectory – transforming healthcare data into healthcare information, knowledge, and wisdom for the entire healthcare system.

Since 2004, two U.S. Presidents have issued a call to have an electronic medical record for every American by 2014. That is only four years from the time of this writing. We must provide demonstrable evidence of value generation and best-case outcomes if we expect healthcare information technology to help us reach that goal.

As healthcare industry thought leaders and subject matter experts tackle these tough value and outcomes questions, we begin to see applications emerge that hold promise in moving the industry forward – closer to where other industries are today in terms of squeezing true value out of information technology and putting it to good use in healthcare. For example, the electronic medical record (EMR) is rapidly becoming the focal point, or cornerstone, for moving toward effective value and outcomes management. The EMR is the legal record created in hospitals and ambulatory environments that serves as the source of data for the electronic health record (EHR). The EHR represents the ability to easily share medical information among stakeholders and to have a patient's information follow him or her through the various modalities of care engaged by that individual. Stakeholders include patients and consumers, healthcare providers, employers, and/or payers and insurers, including the government.

It is difficult to gauge the level of EMR capability in hospitals in the U.S. healthcare information technology market today. As mentioned earlier, we only have four more years to achieve the Presidents' goal. HIMSS Analytics, the authoritative source on EMR adoption trends, devised the EMR Adoption Model to track EMR progress at hospitals and health systems.10

The EMR Adoption Model scores hospitals in the HIMSS Analytics Database (n = 5,100+) on their progress in completing the eight stages to creating a paperless patient record environment. In 2009, the first hospitals in the nation to achieve Stage 7, the highest level of EMR adoption, were recognized in a formal recognition ceremony in Chicago. As of November 2009, 27 hospitals have achieved Stage 7. Will the other 5,073+ hospitals reach this stage before 2014?

Although the EMR is rapidly becoming the focal point, the majority of US hospitals are still in the early stages of EMR transformation. In 2006, 19 percent of US hospitals had not yet achieved Stage 1 and were rated at Stage 0; 21 percent had achieved Stage 1, 50 percent had achieved Stage 2, approximately eight percent had achieved Stage 3, approximately two percent had achieved Stage 4, and less than one percent of hospitals had achieved Stages 5 and 6.11 In the third quarter of 2009, 12.1 percent still had not achieved Stage 1 and were rated at Stage 0, 7.1 percent had achieved Stage 1, 29.8 percent had achieved Stage 2, 40.4 percent had achieved Stage 3, 4.1 percent had achieved Stage 4, 4.8 percent had achieved Stage 5, and 1.2 percent had achieved Stage 6.12 The progress is encouraging, but is it occurring fast enough?

Another aspect of the electronic health record is the personal health record, or PHR. HIMSS defines an ePHR as “ … a universally accessible, layperson comprehensible, lifelong tool for managing relevant health information, promoting health maintenance and assisting with chronic disease management via an interactive, common data set of electronic health information and e-health tools. The ePHR is owned, managed, and shared by the individual or his or her legal proxy(s) and must be secure to protect the privacy and confidentiality of the health information it contains. It is not a legal record unless so defined and is subject to various legal limitations.”13

One of the areas supported by EHRs is clinical decision support. HIMSS defines clinical decision support as a clinical system, application, or process that helps health professionals make clinical decisions to enhance patient care. Austin and Boxerman indicate that clinical decision support applications and systems assist physicians in diagnosis and treatment planning.14

Just as the topmost focus on improving the nation's healthcare system must be on patient safety and the quality of care provided, the topmost focus of clinical decision support must be based on patient-centered, evidence-based, contextual, transparent, and learning principles. Decision makers are currently faced with rules that seem vague and are poorly tailored to the evidence. Getting there, again, is a function of having the right information available on the costs, outcomes, and strength of the information. These principles must be baked into healthcare information systems as best as possible to help assess value and provide guidance for decision-making.15

Telemedicine seems to be another trend that may bring value to the healthcare system, especially for rural and remote areas. Telemedicine is the combination of computer technologies and communications technologies to support healthcare provided to patients at remote locations. It has been of interest for a while because of the benefits it promises in reducing costs and improving access to care, and had it not been for reimbursement, privacy, and security issues, we might have seen more widespread adoption of the technology outside the Department of Defense. Telemedicine has made progress in terms of better technologies, and it is now receiving the attention it needs to increase its adoption rate.

The American Recovery and Reinvestment Act of 2009 called for the establishment of the Health Information Technology Policy Committee. Among the responsibilities of this committee is providing recommendations on telemedicine technologies, with the goal of reducing the travel requirements for patients in remote areas. And it does not stop there. The U.S. Congress appropriated billions of dollars to the Department of Health and Human Services specifically to invest in the infrastructure and tools needed to promote telemedicine.16 The technology is available today. Even the Department of Commerce has taken an interest, exploring the intersections between its own mission and the healthcare industry and landing on telemedicine. Telemedicine has long been of interest to the Department of Defense by virtue of its global mission.

Effectively managing the values and processes in healthcare using information technology promises huge dividends. It promises to reduce medical errors, improve healthcare quality, and save costs along the way. Just as physicians have learned over time not to discount the importance of science in identifying care guidelines and treatments, the healthcare information technology community must also embrace the science behind measuring the value of information technology to the healthcare system. What could we do with the $77 billion saved in healthcare costs?

1 David Leonhardt, Making Health Care Better, The New York Times, November 8, 2009
2 Value In Health Care: Accounting for Cost, Quality, Safety, Outcomes and Innovation, Institute of Medicine of the National Academies, An IOM Learning Healthcare System Workshop Preliminary Discussion Brief, March 2009
3 Ibid
4 The Value of Health Care Information Exchange and Interoperability; J. Walker, E. Pan, D. Johnston, J. Adler-Milstein, D. W. Bates, and B. Middleton; Health Affairs – Web Exclusive, The Policy Journal of the Health Sphere, January 15, 2005
5 Finding the Value in Healthcare Information Technologies; D. Johnston, E. Pan, B. Middleton; Center for Information Technology Leadership, 2002
6 Finding the Value in Healthcare Information Technologies; D. Johnston, E. Pan, B. Middleton; Center for Information Technology Leadership, 2002
7 Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions; William W. Stead and Herbert S. Lin, Editors; Committee on Engaging the Computer Science Research Community in Health Care Informatics, Computer Science and Telecommunications Board, Division on Engineering and Physical Sciences, The National Research Council of the National Academies; The National Academies Press, January 9, 2009
8 Toward Health Information Liquidity: Realization of Better, More Efficient Care From the Free Flow of Information; S. Penfield, K.M. Anderson, M. Edmund, and M. Belanger; Booz Allen Hamilton White Paper, January 2009
9 Health Care at the Crossroads: Development of a National Performance Measurement Data Strategy; The Joint Commission White Paper; 2008
10 http://www.himssanalytics.org/hc_providers/emr_adoption.asp
11 Electronic Medical Records vs. Electronic Health Records: Yes, There Is a Difference; A HIMSS Analytics White Paper by Dave Garets and Mike Davis; Updated January 26, 2006
12 http://www.himssanalytics.org/stagesGraph.html
13 http://www.himss.org/ASP/topics_phr.asp
14 Austin and Boxerman's Information Systems For Healthcare Management, Seventh Edition; G. Glandon, D. Smaltz, D. Slovensky; July 24, 2008
15 Value In Health Care: Accounting for Cost, Quality, Safety, Outcomes and Innovation, Institute of Medicine of the National Academies, An IOM Learning Healthcare System Workshop Preliminary Discussion Brief, March 2009
16 Title 13 of Division A and Title 4 of Division B of the American Recovery and Reinvestment Act of 2009 (ARRA), which was signed into law on February 17, 2009, contains provisions known as the “Health Information Technology for Economic and Clinical Health Act” or the “HITECH Act.”


Financial Management and Economic Down-Cycles

It may be hard to imagine, but some have benefited from the recent economic crisis. The catastrophic downturn of the U.S. and global economy has given healthcare financial analysts an overflowing pool of material from which to draw in the current healthcare finance debate. One cannot deny the increased focus on the impact of the 2007-2009 economic down-cycle on healthcare, a crisis economists have described as the worst financial crisis since the Great Depression.1

The 2007-2009 economic down-cycle did not spare the healthcare sector; most healthcare organizations and medical communities felt its effects. From the smallest physician practices to the largest hospitals, and from the clinical community to the health information technology sector, the industry suffered the consequences of the flawed new financial instruments that led to the virtual collapse of the banking industry.

State and Local Governments

In a November 2008 report to Congress, the Government Accountability Office (GAO) suggested that the root cause of states' long-term financial difficulty rests in the projected rise in health-related costs.2 The report examined state and local government long-term fiscal challenges and projected that growth in expenses will greatly outpace growth in revenues. Closing the projected fiscal gap would require either a 7.6 percent reduction in state and local government expense budgets or an equivalent increase in state and local tax receipts.

This November 2008 GAO report updated a January 2008 report from the same agency, which was based on data that pre-dated the 2007-2009 economic down-cycle (data from 1980 to 2006). The January 2008 report showed that state and local governments had a few years left to address imbalances between future revenues and expenses.3 When the GAO updated its analysis to include data from the beginning of the 2007-2009 economic down-cycle (data from 1980 to 2007), the results showed a dramatic six-year shift, from 2015 to 2009, in the year when expenses would begin to exceed revenues. This means that state and local governments, in aggregate, are no longer covering their expenditures with current receipts and must increase revenues or cut expenses to correct the imbalance. The GAO singled out the increase in health-related costs as the main driver of state and local governments' inability to cover expenses with current receipts.

What is the connection between rising health-related costs and the 2007-2009 economic down-cycle? The federal government relies on state and local governments to act as fiscal intermediaries for the delivery of healthcare services to people who earn below a certain income level, based on poverty guidelines published by the U.S. Department of Health and Human Services.4,5 State and local governments rely on the federal government to underwrite the major share of the cost of these healthcare services, but not 100 percent of it. The federal government requires each state to provide a minimum basket of benefits for which the state government is partially fiscally responsible, and states are permitted to offer additional benefits that are likewise partially the fiscal responsibility of the state. The requirement that states provide Medicaid benefits locks states into a federal entitlement program: as benefits expand and healthcare costs increase, states must secure additional revenue to cover any increased costs associated with the program. The option to cut benefits is limited, especially in states that provide only the baseline, federally mandated benefits. Therefore, increases in federal funding for Medicaid require matching increases in funding by state and local governments. This growth in the Medicaid program has pushed the Medicaid portion of states' budgets to an average of 21 percent.6

Before the 2007-2009 economic down-cycle, state and local governments already faced difficult challenges in finding funding for Medicaid services. The down-cycle dramatically exacerbated those challenges. The recession drove the U.S. unemployment rate from a historically low 4.4 percent in March 2007 to 10.2 percent in October 2009.7 As unemployment pushed families below the income threshold for Medicaid eligibility, the number of persons requesting Medicaid benefits swelled. In 2009, 45 of the 50 states reported a gap in the state budget, 17 of them with no pre-existing gap.8 Three of the remaining five states reported an expected budget gap in 2010. In dollar terms, the total of state budget gaps in 2009 was expected to be $46.8 billion,9 and it is expected to more than triple, to $158.5 billion, in 2010.10 Given the pressures posed by Medicaid costs, the budget outlook for state and local governments is bleak, to say the least.

Federal Government

Similar cost pressures from Medicare and Medicaid exist for the federal government. According to the Centers for Medicare and Medicaid Services (CMS), average annual health spending growth is anticipated to outpace average annual growth in the overall economy by 2.1 percentage points per year between 2008 and 2018 (6.2 percent vs. 4.1 percent, respectively).11 National health spending is projected to reach $4.4 trillion by 2018, an amount that represents more than 20 percent of Gross Domestic Product (GDP).12 The 2007-2009 economic crisis and recession exacerbate these cost pressures.
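
The relationship between those two growth rates and the projected 20 percent share of GDP can be checked with simple compound-growth arithmetic. The sketch below is illustrative only; the 2008 baselines for national health spending and GDP (roughly $2.4 trillion and $14.4 trillion) are assumptions, not figures taken from the CMS report cited here.

```python
# Illustrative compound-growth arithmetic; baseline values are assumptions.
health_spending = 2.4e12   # assumed 2008 national health expenditure, dollars
gdp = 14.4e12              # assumed 2008 U.S. GDP, dollars

for _ in range(2009, 2019):     # grow both series year by year through 2018
    health_spending *= 1.062    # 6.2% average annual health spending growth
    gdp *= 1.041                # 4.1% average annual GDP growth

print(f"2018 projected health spending: ${health_spending / 1e12:.1f} trillion")
print(f"2018 projected share of GDP: {health_spending / gdp:.1%}")
# Yields roughly $4.4 trillion and about 20 percent of GDP, in line with the
# CMS projections cited in the text.
```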

The non-public share of national health spending (private health insurance and personal out-of-pocket spending) is expected to decline by almost 2 percentage points between 2007 and 2009, a 15-year low.13 This decline is attributed to slower income growth and a decrease in the number of individuals covered by private health insurance. The federal government's share of national health spending (including Medicare and Medicaid) is expected to grow from 6.4 percent in 2007 to 7.4 percent in 2009, with Medicare alone growing at a rate of 8.0 percent per year.14 If this trend continues, the federal government will fund over 50 percent of all national health spending by 2016.15 But why is the public share of the bill growing? CMS attributes the growth first to the recession and then to the initial surge of baby boomers becoming eligible for Medicare.16

One interesting point emerges from all of these historical and projected statistics. Even though the health spending share of GDP is expected to reach 20.2 percent by 2018, CMS projects the health spending growth rate will continue to slow through 2010. The national health spending growth rate has slowed every year, from 6.9 percent in 2004 to a projected 4.6 percent in 2010.17 As mentioned previously, the Medicare spending growth rate was expected to be 8.0 percent per year for 2008 and 2009. However, Congress has prevented cuts in physician payment rates since 2003, and if that practice continues, CMS projects the Medicare spending growth rate will continue to decline, but not as quickly. This is projected to result in a Medicare spending growth rate of 6.4 percent and a total national health spending growth rate of 4.6 percent in 2010.18 Without physician payment rate cuts through 2018, the resulting national health spending share of GDP is projected to increase to 20.5 percent.19

It is a different story from 2011 onward. CMS projections show national health spending will accelerate beginning in 2011. The national health spending growth rate will increase from 4.6 percent in 2010 to 5.6 percent in 2011 and is projected to keep climbing each year, reaching 7.2 percent by 2018.20 In dollars, national health spending was over $1.7 trillion in 2003 and is projected to grow to $4.4 trillion in 2018.21 Spending per capita was $5,967 in 2003 and is projected to reach $13,100 by 2018.22 Between 2011 and 2018, growth in the public portion of national health spending is attributed largely to Medicare as more and more baby boomers become eligible for the program, with projected Medicare growth rates moving from 6.2 percent in 2011 to 8.6 percent in 2018.23 Medicaid spending projections are slightly different: spending growth is projected to slow from a 9.6 percent increase in 2009 to a projected 7.8 percent in 2012, but Medicaid spending growth will begin climbing again in 2013, reaching 8.9 percent in 2018.24 The slowing pace of Medicaid spending between 2009 and 2012 is attributed to an expected improvement in economic conditions (e.g., lower unemployment rates), but CMS predicts that older and disabled beneficiaries will make up a larger portion of the overall enrolled Medicaid population, causing Medicaid spending growth to rise again through 2018.25 Putting dollar values on these growth rates, Medicare spending is projected to grow from $228.7 billion in 2003 to almost $931.9 billion by 2018,26 and Medicaid spending from $270.3 billion in 2003 to $800.7 billion by 2018.27 The following charts provide a graphical representation of the annual projections reported by CMS. None of these projections take into account changes put in place by healthcare reform legislation. Irrespective of the breadth of reform, it is expected that reimbursement adjustments will take place to reflect the current fiscal realities facing both the federal and state governments.

Figure 2. Projected Growth in Healthcare Spending

Source: National Health Expenditure Projections 2008-2018, Centers for Medicare and Medicaid Services (CMS), U.S. Department of Health and Human Services; February 2009.


Figure 3. Growth in Medicare/Medicaid Spending
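
As a rough, illustrative check on the projections charted above, the implied compound annual growth rates can be recovered from the 2003 and 2018 dollar amounts quoted in the text. This is arithmetic on the quoted figures only, not a reproduction of the CMS methodology.

```python
# Implied compound annual growth rates (CAGR) from the 2003 and 2018 dollar
# figures quoted in the text (in billions); illustrative arithmetic only.
projections = {
    "National health spending": (1_700, 4_400),   # 2003 -> 2018, $ billions
    "Medicare": (228.7, 931.9),
    "Medicaid": (270.3, 800.7),
}

years = 2018 - 2003
for name, (start, end) in projections.items():
    cagr = (end / start) ** (1 / years) - 1
    print(f"{name}: implied CAGR {cagr:.1%} over {years} years")
# Roughly 6.5% for total spending, 9.8% for Medicare, and 7.5% for Medicaid.
```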

Capital Spending Challenge

Federal, state, and local governments are not the only entities affected by economic down-cycles. The recession's impact on hospitals has made it more difficult for them to provide Medicare and Medicaid services. Why? Medicare and Medicaid reimbursement rates do not cover 100 percent of the cost of providing health services; therefore, as more people become eligible for Medicare (the baby boomer population) and Medicaid (the recession-induced increase in people living below the income eligibility threshold), the total uncovered cost of providing these services increases. In addition, reimbursements from private insurers, normally counted on to make up the shortfall in Medicare and Medicaid payments, decrease. In summary, hospitals treat more patients for whom reimbursement rates do not cover costs while treating fewer of the profitable patients covered by private reimbursement. This leads to losses for hospitals. Beyond the impact of Medicare and Medicaid funding from the federal government, the American Hospital Association reported impacts on hospitals' financial health resulting from challenges in capital spending, decreased demand for services, increased support requests from physicians, and a decline in patient and operating margins.28

Hospitals have had to pay more for borrowed money, making it more difficult to finance facility and technology improvements. Hospitals attributed this difficulty to several factors resulting from the 2007-2009 economic down-cycle: increased interest expenses for variable rate bonds, increased collateral requirements, an inability to issue bonds, difficulty refinancing auction rate debt, an inability to roll over or renew credit, an acceleration of debt, and an inability to withdraw funds held by financial institutions.29 For example, hospitals experienced a 15 percent increase in interest payments on loans from 2007 to 2008.30 In late 2008, credit markets froze, making financing very difficult even after interest rates were lowered and money became available. Banks and investment houses did not want to lend funds to each other because they did not trust the accuracy of financial statements, and without basic trust among lenders, credit cannot be extended to borrowers. This forced hospitals at the time to reevaluate plans for capital expenditures and to postpone or cancel projects such as building new facilities, renovating existing ones, or buying and implementing information technology solutions.31

Another challenge that affected hospital budgets relates to the decline in hospital investment portfolios. As at many other entities, healthcare and non-healthcare alike, portfolio losses mounted following the stock market crash of 2008. Hospitals reported a decline from aggregated investment gains of $396.1 million in 2007 to investment losses of $831.5 million in 2008.32

As a result, some hospitals had to increase pension plan funding from other sources.33 Non-profit hospitals with large endowments use funds generated from those endowments to fund both new projects and ongoing operations. With the decline in endowment investments, many hospitals were caught short in funding and had to either curtail or cancel existing and future projects. Hospitals that relied on these endowment-generated funds to finance their operations experienced the most severe disruption of activities, as patient care could not be suspended or delayed.

Volume Challenge

Hospitals reported a moderate to significant decline in elective procedures and admissions and a decline in discharges, inpatient surgeries, and emergency and ambulatory visits.34,35 This decreased demand for services reflects the sharp rise in U.S. unemployment rates, which also increased the demand for uncompensated care and the need for subsidized services.36 Hospitals reported an eight percent increase in uncompensated care from $853.5 million in 2007 to $923.6 million in 2008.37

Physician Support Challenge

Physicians have certainly felt the impact of the 2007-2009 economic down-cycle. The effect on physicians presents another challenge to hospitals, as the number of requests by physicians for financial assistance is on the rise. Hospitals reported a 56 percent increase in physician requests for financial help between 2007 and 2008.38 Specific requests made by physicians included 1) asking for compensation increases for on-call or other services provided to the hospital, 2) seeking hospital employment, 3) seeking to sell their practice, and 4) seeking to partner on equipment purchases.39 Because hospitals depend on the revenue generated by physicians admitting patients, it is critical that the physician community remain stable. Therefore, upon physician requests, hospitals explored available avenues to help physicians keep their practices financially healthy. In spite of this mutual interest, anti-trust legislation limited what hospitals could do to assist physicians.

Financial Health Challenge

The combination of a decrease in patient demand for services and losses from investments challenges hospitals to make changes to maintain their financial health. Hospitals reported a decline in total operating and patient margins from +6.1 percent in 2007 to -1.6 percent in 2008.40 This forced hospitals to search for efficiencies to deal with the decline. Some of the specific ideas reportedly considered included: 1) Cutting administrative costs, 2) Reducing staff, 3) Reducing services, 4) Divesting assets, and 5) Exploring a merger.41

Geographic Impact

The Dartmouth Institute for Health Policy and Clinical Practice Atlas Project has documented geographic variations in the distribution of medical resources across the United States. The project uses Medicare data to provide comprehensive information and analysis about national, regional, and local markets, as well as individual hospitals and their affiliated physicians.42 Using information from this project and other sources, Atul Gawande, MD, set out to determine exactly why the increasing cost of healthcare has become the biggest threat to our nation's economy. Dr. Gawande is a surgeon, a writer for The New Yorker, and a staff member of Brigham and Women's Hospital and the Dana Farber Cancer Institute.43 In his research, he found McAllen, Texas to be one of the most expensive healthcare markets in the country.44 He used McAllen as the focal point of a case study in which he compared its demographics and other factors with those of another Texas location to help him draw conclusions about this cost conundrum. Could factors such as demographics, behaviors, malpractice claims, or care quality be to blame for McAllen's higher costs? The results of his case study, described hereafter, point to overutilization of medical resources.

As previously mentioned, McAllen is described as one of the most expensive healthcare markets in the country, second only to Miami, Florida.45 Medicare spent more than $15,000 per enrollee in McAllen in 2006, roughly twice the national average. Gawande dug a little deeper to determine the factors contributing to McAllen's position as one of the most expensive Medicare regions in the country.

When Gawande studied McAllen, his attention was first drawn to the unhealthy behaviors of its residents, because McAllen reportedly rated 60 percent above the national average for heavy drinking and had a 38 percent obesity rate. To test his hypothesis, Gawande compared McAllen's health status with that of another Texas county with similar demographics about 800 miles away. That comparison county's Medicare expenditures were half those of McAllen.46 Gawande therefore concluded that health status was not the cause of the variation in Medicare costs between the two locations.47 His conclusion is supported by the U.S. Congressional Budget Office (CBO) in its February 2008 report, Geographic Variation in Health Care Spending. The CBO report indicated that an individual's health status is a strong predictor of that person's health spending, but when individuals are aggregated into large regional groups the variation averages out, largely ruling out health status as an explanation of regional spending variations.48,49
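
The CBO's point about aggregation can be illustrated with a small simulation: even when individual spending is highly variable, the averages of large regional groups cluster tightly, so health status alone cannot explain large, persistent differences between regions. The numbers below are synthetic and purely illustrative, not Medicare data.

```python
import random
import statistics

# Purely synthetic illustration of the aggregation argument: individual
# annual spending is highly skewed, but regional averages vary far less.
random.seed(0)

# Skewed individual spending (most people spend little, a few spend a lot).
individuals = [random.lognormvariate(8.0, 1.2) for _ in range(100_000)]
print(f"Individuals: mean ${statistics.mean(individuals):,.0f}, "
      f"std dev ${statistics.pstdev(individuals):,.0f}")

# Group the same people into 50 'regions' of 2,000 each and compare the
# spread of the regional averages.
regions = [individuals[i:i + 2_000] for i in range(0, len(individuals), 2_000)]
regional_means = [statistics.mean(r) for r in regions]
print(f"Regional averages: mean ${statistics.mean(regional_means):,.0f}, "
      f"std dev ${statistics.pstdev(regional_means):,.0f}")
# The spread across regions is a small fraction of the spread across
# individuals, so large regional gaps point to something other than
# differences in individual health status.
```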

Gawande then investigated whether McAllen Medicare beneficiaries obtained above-average clinical outcomes. He theorized that better outcomes might be more costly to obtain and could explain the higher per-capita costs in McAllen. Although McAllen healthcare organizations seemed to have the latest technology, Medicare ranked the hospitals in Gawande's comparison county higher on all but two outcome metrics.50 Gawande therefore concluded that the quality of care in McAllen did not account for the variation in costs between McAllen and the other Texas county.

Gawande's curiosity next led him to look at malpractice issues, as several physicians shared their strong belief that costs were being driven by unnecessary malpractice claims. Quick research led Gawande to discard this as a source of variation: Texas had passed a law a few years earlier capping awards in malpractice claims, and as a result lawsuits were described as practically non-existent in McAllen.51

Finally, Gawande looked at overutilization as a cause of the cost disparity. He called on a colleague at the Dartmouth Institute for Health Policy and Clinical Practice, along with D2Hawkeye, an independent company, and Ingenix, United Healthcare's data-analysis company, to help him analyze commercial insurance data for McAllen.52 What Gawande found was startling. Medicare data comparing McAllen to the other Texas county showed that between 2001 and 2005, critically ill patients in McAllen received almost 50 percent more specialist visits and were two-thirds more likely to see 10 or more specialists in a six-month period.53 In 2005 and 2006, McAllen patients received 20 percent more abdominal ultrasounds, 30 percent more bone-density studies, 60 percent more stress tests with echocardiography, 200 percent more nerve-conduction studies to diagnose carpal-tunnel syndrome, and 550 percent more urine-flow studies to diagnose prostate troubles.54 The list went on, leading Gawande to conclude that overutilization was the reason McAllen, Texas had been tagged as one of healthcare's most expensive places in the country.55

Spending Variation Drivers

The McAllen case study is very interesting in that it pointed to overutilization of medical resources as the root cause of out-of-control health spending. Can that conclusion be generalized across the rest of the country? And what drivers lead to overutilization? The only generalization that can be made is that many drivers are associated with cost variations across geographic locations, and the combinations of those drivers vary by location as much as the cost variations themselves. The American Hospital Association describes the drivers of spending variation as factors of payment rates and service utilization, and the combination and interaction among these factors is unique to each region.56

Figure 4. Spending Variation

Source: Geographic Variation in Health Care Spending: A Closer Look, Trend Watch, American Hospital Association; November 2009; Page 2

More research is needed on each of these specific drivers of spending variation in order to develop effective solutions.57 Can it be determined with any level of certainty or consensus whether health spending is associated with the quality of care?58 Some reports conclude that higher spending does not result in better care, which suggests cost-cutting measures could be invoked without harming the quality of care.59 Conversely, other reports indicate that the quality of care improves with higher health spending.60 The Congressional Budget Office drew the following three conclusions from its analysis of spending and quality of care.

1. Areas with higher-than-expected Medicare spending per beneficiary tend to score no better and, in some cases, score worse than other areas do on process-based measures of quality and on some measures of health outcomes.

2. Patterns of treatment in high-spending areas tend to be more intensive than in low-spending areas. That is, in high-spending areas a broader array of patients will receive costly treatments. Those treatment patterns appear to improve health outcomes for some types of patients, but worsen outcomes for others.

3. Decreasing spending in high-spending areas would reduce geographic variation in health care spending and might or might not harm the quality of care in those areas, depending on how the reduction took place.

Source: Geographic Variation in Health Care Spending; Congress of the United States, Congressional Budget Office; February 2008; Page 20

It seems there is as much geographic variation in the association between health spending and the quality of care as there is in health spending itself. It therefore appears virtually impossible to develop a universal solution that addresses both spending and quality for the entire U.S. health system and to expect that solution to decrease spending and improve outcomes in every location. Future studies should focus on helping us better understand the drivers of health spending and their association with quality outcome measures on a geographic basis. In the meantime, U.S. Senator Max Baucus (D-Mont.) has issued a call to take action now to curb the growth in health spending.

Call to Action

U.S. Senator Max Baucus issued a call to action in 2008, citing the need to control healthcare costs as one of his arguments for moving forward on health reform. In his report, he described how, over the 10-year period from 1999 to 2008, insurance premiums more than doubled for families and employers. This increase is attributed to an expansion in the use of medical technology, higher rates of obesity, the aging of the population, and losses in health delivery industry productivity.61 He also reported the cost of uncompensated care in hospitals and clinics to be more than $56 billion per year.62 Although $56 billion in uncompensated care may seem small, representing only 2.8 percent of the roughly $2 trillion in national health spending, it works out to $121 per month for a New Mexico family of four living on the median family income of $52,034, a substantial amount of money when living from paycheck to paycheck.63

Perhaps spending could be lowered by transforming the healthcare system from a payer-centric approach to one with more emphasis on prevention and focus on the patient. This would include rewards for providers that deliver quality care with superior outcomes. Incentivizing hospital and physician use of health information technology may help jump start the process, but there must be an investment in critical research to identify or develop health information technologies that can lead to higher-value care.

One way to approach improving the quality of healthcare is to promote information transparency through the public reporting of cost and quality metrics. Efforts are underway to finally publish this type of data. The Joint Commission issued a call for a National Performance Measurement data system with the goal of improving the safety and quality of healthcare provided to the public.64 However, our current national infrastructure is not quite ready to support such a system, as the country continues to develop health information exchange models that effectively facilitate the exchange of this information. Even if a national infrastructure were in place, many organizations struggle to gain access to information inside their own institution, making one wonder how they would provide their data outside the institution in a standardized manner. There is no doubt that more information leads to better decision making.

“Consensus will be difficult to achieve, but common ground from which to build can and must be found.” Senator Max Baucus (D-Mont.), Call To Action: Health Reform 2009, November 12, 2008


Cost of Doing Nothing

What will happen if nothing is done to address the rising costs of healthcare and to help the growing population of the uninsured and underinsured? We have known for several years of the urgency to do something about the rising cost of healthcare, and some things have been tried. If the system continues down its current track, the consequences could be devastating.

The costs of doing nothing were calculated by the New America Foundation in a report published in 2008. The authors, Sarah Axeen and Elizabeth Carpenter, used 10 years of historical data to calculate a compound annual growth rate and used it to make projections through 2016.65 The upfront cost of reforming our healthcare system is insignificant compared with the cost of doing nothing at all. Our economy reportedly lost more than $207 billion because of the effects and outcomes related to the uninsured and their tendency to forgo treatment.66 The New America Foundation estimates this is greater than the cost of ensuring that all Americans have quality, affordable health coverage.

The Foundation report also indicated that health insurance will become more expensive for families over time, with affordability not improving unless reform action is taken. For example, the estimated cost of the average employer-sponsored health insurance plan for a family will reach $24,000 by 2016, an 84 percent increase over 2008 premium levels.67 The Foundation estimated this would force families to spend about 50 percent of their income to buy health insurance in 2016 while receiving less robust coverage. The report also estimates the average deductible nationwide will increase 73 percent, to almost $2,700, by 2016 and that average copayments will climb to $30.
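
As an illustration of the compound-annual-growth-rate method the authors describe, the figures quoted above can be worked backward. The 2008 premium level below is inferred from the report's "84 percent increase" claim and is an assumption, not a number taken from the report.

```python
# Back-of-the-envelope check of the premium projection quoted above.
def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1 / years) - 1

premium_2016 = 24_000                # projected family premium (from the report)
premium_2008 = premium_2016 / 1.84   # implied by the 84% increase, about $13,000
growth = cagr(premium_2008, premium_2016, years=8)

print(f"Implied 2008 premium: ${premium_2008:,.0f}")
print(f"Implied annual premium growth: {growth:.1%}")  # roughly 7.9% per year
```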

Recovery from the 2007-2009 economic down-cycle will not solve the fundamental economic challenges presented by our broken healthcare system. Entrenched patterns exist within the delivery and reimbursement of healthcare such that only a major overhaul of the system can address the issue of ever-increasing costs.

Going back to the McAllen story, Gawande concluded his article by linking quality to accountability. He said, “The lesson of the high-quality, low-cost communities is that someone has to be accountable for the totality of care. Otherwise, you get a system that has no brakes. You get McAllen.”68 Whom should we hold accountable? Vendors, providers, payers, politicians, or consumers? Look in the mirror.

1 Three Top Economists Agree 2009 Worst Financial Crisis Since Great Depression; Risks Increase if Right Steps are Not Taken; D. Pendery; Reuters, February 2009
2 Statement for the Record for the Committee on Finance, U.S. Senate: State and Local Fiscal Challenges, Rising Health Care Costs Drive Long-Term and Immediate Pressures; Statement of Stanley J. Czerwinski, Director, Strategic Issues, United States Government Accountability Office; November 19, 2008
3 Report to Congressional Committees: State and Local Governments, Growing Fiscal Challenges Will Emerge during the Next 10 Years; United States Government Accountability Office; January 2008
4 http://www.cms.hhs.gov/MedicaidEligibility/07_IncomeandResoureGuidelines.asp; Centers for Medicare and Medicaid Services, U.S. Department of Health and Human Services
5 http://aspe.hhs.gov/poverty/index.shtml; Poverty Guidelines, Research, and Measurement; Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services
6 Health Care IT Spending Will Climb In Next Five Years; M. K. McGee; InformationWeek, January 14, 2008
7 Labor Force Statistics from the Current Population Survey, U.S. Bureau of Labor Statistics; data extracted December 5, 2009
8 Statehealthfacts.org, a project of the Henry J. Kaiser Family Foundation; http://www.statehealthfacts.org/comparemapreport.jsp?rep=48&cat=1
9 Ibid
10 Statehealthfacts.org, a project of the Henry J. Kaiser Family Foundation; http://www.statehealthfacts.org/comparemapreport.jsp?rep=49&cat=1
11 National Health Expenditure Projections 2008-2018; Centers for Medicare and Medicaid Services (CMS), U.S. Department of Health and Human Services; February 2009
12 Ibid
13 Ibid
14 Ibid
15 Ibid
16 Ibid
17 Ibid, Table 2
18 Ibid
19 Ibid
20 Ibid
21 Ibid, Table 3
22 Ibid
23 Ibid, Page 2
24 Ibid, Table 3
25 Ibid, Page 2
26 Ibid, Table 3
27 Ibid
28 Report on the Economic Crisis: Initial Impact on Hospitals; American Hospital Association; November 2008
29 Ibid, Slide 4
30 Ibid, Slide 5
31 Ibid, Slide 6
32 Ibid, Slide 7
33 Ibid, Slide 8
34 Ibid, Slide 9
35 Ibid, Slide 10
36 Ibid, Slides 11 and 12
37 Ibid, Slide 13
38 Ibid, Slide 14
39 Ibid, Slide 14
40 Ibid, Slide 15
41 Ibid, Slide 16
42 http://www.dartmouthatlas.org/index.shtm; Dartmouth Institute for Health Policy and Clinical Practice, The Dartmouth Atlas of Health Care
43 http://www.gawande.com/bio.htm
44 The Cost Conundrum: What a Texas town can teach us about health care; A. Gawande; The New Yorker, Annals of Medicine; June 1, 2009
45 Ibid
46 Atul Gawande: The Cost Conundrum Redux; A. Gawande; The New Yorker, Annals of Medicine; June 23, 2009
47 The Cost Conundrum: What a Texas town can teach us about health care; A. Gawande; The New Yorker, Annals of Medicine; June 1, 2009
48 Geographic Variation in Health Care Spending; Congress of the United States, Congressional Budget Office; February 2008; Page 12
49 Ibid, Page 13
50 The Cost Conundrum: What a Texas town can teach us about health care; A. Gawande; The New Yorker, Annals of Medicine; June 1, 2009
51 Ibid
52 Ibid
53 Ibid
54 Ibid
55 Ibid
56 Geographic Variation in Health Care Spending: A Closer Look; TrendWatch, American Hospital Association; November 2009
57 Ibid, Page 1
58 Ibid, Page 13
59 Ibid; the American Hospital Association referenced the following literature: Fisher, E., et al. (2009). Health Care Spending, Quality, and Outcomes: More Isn't Always Better. Hanover, NH: The Dartmouth Institute for Health Policy & Clinical Practice; and Gold, M. (July 2004). Geographic Variation in Medicare Per Capita Spending: Should Policy-Makers be Concerned? Princeton, NJ: Robert Wood Johnson Foundation.
60 Ibid; the American Hospital Association referenced the following literature: Cooper, R. (2009). States with More Health Care Spending Have Better-Quality Health Care: Lessons About Medicare. Health Affairs, 28(1), w103-w115.
61 Call To Action: Health Reform 2009; Senate Finance Committee Chairman Max Baucus (D-Mont.); November 12, 2008
62 Ibid, Page 4
63 The median family income estimate for New Mexico was obtained from the U.S. Census Bureau, http://www.census.gov/hhes/www/income/medincsizeandstate.html
64 Healthcare At The Crossroads: Development of a National Performance Measurement Data Strategy; The Joint Commission; 2008
65 The Cost of Doing Nothing: Why the Cost of Failing to Fix Our Health System is Greater than the Cost of Reform; Sarah Axeen and Elizabeth Carpenter; The New America Foundation; November 2008
66 Ibid, Page 4
67 Ibid, Page 5
68 The Cost Conundrum: What a Texas town can teach us about health care; A. Gawande; The New Yorker, Annals of Medicine; June 1, 2009


Healthcare Key Performance Indicators (KPI)

Many smart people are engaged in discussions about the many different factors that will influence the future of healthcare. Narrowing those discussions into a short chapter about the future of healthcare is a daunting task. Most agree healthcare is an extraordinarily complex industry, as evidenced by the challenges confronting the industry today and by how long those challenges have persisted. This chapter attempts to group many of the speculative discussion points found in current literature into what we believe are the three primary areas of concern: access to care, quality of care, and the cost of care. A few key concepts are worth mentioning to set the stage for discussing the access, quality, and cost concern areas.

Because some challenges seem to go unresolved over long periods of time, the primary force of change for those challenges tends to be government intervention. The role of government in health information technology is to provide policy, not to provide innovation. When an industry is unable to overcome or resolve certain challenges, some believe the government should intervene to fix the problem. And intervene it has, by enacting laws such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the American Recovery and Reinvestment Act of 2009 (ARRA), and the Health Information Technology for Economic and Clinical Health (HITECH) Act embedded within the ARRA. These laws give healthcare executives justification for intervention.1 Healthcare managers use them to justify how certain projects and initiatives are prioritized, since the government's intervening mandates call for optimizing organizational resources, reducing costs, improving quality and patient safety, and ensuring the security of information and the privacy of personal health information.

One of the areas that will have a significant impact on the future of the healthcare industry is education. Education affects all three categories of access, quality, and cost. One specific attempted government intervention was the 10,000 Trained by 2010 Act introduced by Representative David Wu (D-OR). The bill requires the Director of the National Science Foundation (NSF) to award competitive grants to institutions of higher education (IHEs) to establish or improve undergraduate and master's degree healthcare information programs, attract students to such programs, and provide them with experience in government or industry related to their studies. It also requires the Director to award grants under the Scientific and Advanced Technology Act of 1992 for the purposes of two existing grant programs that provide funds to associate-degree-granting IHEs for improving education in advanced technology, science, and mathematics and for establishing centers of excellence in such subjects that serve as information clearinghouses and models for other educational institutions. The House of Representatives passed the bill on June 6, 2007; however, no further action occurred on the bill. Representative Wu introduced a new bill of the same name, H.R. 461, on January 13, 2009, and it was referred to the House Science and Technology Subcommittee on Research and Science Education.

The University of Minnesota Academic Health Center (AHC) also recognized the need for intervention in healthcare education programs. In 2004, the state convened meetings with participants from 12 of its communities to discuss the health professions workforce, issues in the educational pipeline, and their relationship to the future of healthcare and economic vitality.2 The report found five challenges relative to the changing demographics of the state.

Aging Population and Workforce

The report indicated that Minnesota's population is growing and aging, with the state's population projected to grow by 12 percent by 2010. The largest expected population increase is in the 65- to 84-year-old age range. Retirees appear to be moving to smaller, rural communities, where health professionals are in short supply and face an increasingly aging population and a smaller-than-average increase in young adults. The health professions workforce is aging along with the state's population: thirty percent of the current health professions workforce in northwest Minnesota is 50 years of age or older. This translates into reduced working hours, increased worker disability, and, in some cases, older health professionals spending winters in warmer climates. It is estimated that by 2020, Minnesota will need 1,359 additional physicians to keep healthcare access at its 2000 level.

Workforce Retention

It is difficult for new health professionals moving to rural communities to make personal connections within the community beyond their professional colleagues. This results in high turnover rates and increased recruiting costs for the community, which is why some communities turn to call-schedule flexibility, spousal employment opportunities, and affordable housing to attract and retain new health professionals.

Intergenerational Changes

New generations of health professionals seem to be placing greater value on developing a work/life balance by seeking flexible work weeks and schedules to accommodate participation in family and recreational activities.

Diversity

The report indicates Minnesota is a diverse state and that this diversity affects healthcare delivery. Statewide, the Somali, Hmong, Sudanese, and Hispanic populations are growing rapidly, and the Hispanic population reportedly grew 166 percent during the 1990s. As a result, sixty different languages are spoken in some rural school districts, leading health professionals to look for new resources, such as health interpreters, to support the delivery of care to these groups.

Dr. (Lieutenant Colonel, U.S. Army) Hon Pak presented his own set of external factors and drivers affecting future healthcare delivery.3 Like the University of Minnesota, he spoke about the aging baby-boomer population and the explosion in chronic conditions (e.g., diabetes, high blood pressure, and heart disease). He mentioned pay-for-performance initiatives and consumer empowerment because of their influence on the creation of new ways to gain access to care, participate in care, and move away from the traditional paternalistic model of medicine. He also described an increasing demand for healthcare that is contributing to substantial healthcare inflation and to the economic pressure to do more with less.

The Healthcare Information and Management Systems Society (HIMSS) presented its list of factors that will affect the healthcare information technology industry over the next five years.4 Specifically, Paramore et al. indicated that the federal drivers for the future of healthcare include the American Recovery and Reinvestment Act (ARRA) signed by President Obama on February 17, 2009.5 As mentioned previously, embedded within the ARRA is another piece of legislation, the Health Information Technology for Economic and Clinical Health (HITECH) Act.6 The HITECH Act included three sets of provisions relative to health information technology.

The Office of the National Coordinator for Health Information Technology (ONCHIT) was created by an Executive Order from President Bush in 2004.7 Under this Executive Order, the National Coordinator was instructed to develop, maintain, and direct a strategic plan to guide the nationwide implementation of interoperable health information technology in the public and private healthcare sectors. Because the office was created by executive order, it could have ceased to exist upon the election of a new president. Through the ARRA, the office has been made permanent in order to carry out the role of national health information technology coordination.

The ARRA also authorized more than $33 billion in health information technology funding, covering the health information technology infrastructure, electronic health record adoption, training, dissemination of best practices, telemedicine, and the inclusion of health information technology in clinical education. Never before has this much money been authorized to support health information technology. For example, before the ARRA the budget for ONCHIT was roughly $50 million; it is now roughly $2 billion. What a way to start a trend and accelerate improvements.

The ARRA also provides funds to states for low-interest loans to help providers finance health information technology, and it includes Medicare and Medicaid financial incentives intended to encourage doctors, hospitals, health clinics, and other entities to adopt and use certified electronic health records. However, in addition to adopting and using the technology, entities must demonstrate "meaningful use" of it in order to be eligible for the incentives. Entities that choose not to adopt and use certified electronic health records face financial penalties.

The ARRA also contains provisions that expand the HIPAA privacy and security requirements as an additional government intervention to strengthen the protection of personal health information. These provisions established a federal security breach notification requirement for health information that is not encrypted or otherwise made indecipherable. The law requires that individuals be notified of any unauthorized disclosure or use of their health information, and it extends the business associate requirements to certain entities that were not originally covered under HIPAA. Those entities are now subject to the same privacy and security rules as providers and health insurers, must be treated as business associates under HIPAA, and require business associate agreements.8

The ARRA also gave patients the right to request an accounting of disclosures of their health information made through an electronic record. It requires the Secretary of Health and Human Services to promulgate regulations regarding the accounting of disclosures that take into account both the "interests of individuals" in learning the circumstances under which their protected health information is being disclosed and the administrative burden of accounting for those disclosures.

With regard to the sale and marketing of Protected Health Information (PHI), the ARRA provides new restrictions on marketing that uses PHI and on the circumstances under which an entity can receive payment for PHI. The Act also gives individuals the right to access certain information about themselves in electronic format, modifies the distribution of certain civil monetary penalties collected, and provides for enforcement of HIPAA by state attorneys general and local law enforcement.

Separate from the ARRA, the U.S. federal government issued a final rule in January 2009 that requires conversion from the HIPAA Version 4010 transaction code sets to Version 5010 by January 1, 2012, and conversion from ICD-9 to ICD-10 by October 1, 2013.9

When the requirements of the ARRA and the final rule for updating the HIPAA transaction and code sets are overlaid onto the same calendar, it paints a picture of a perfect storm brewing for providers and their partners (e.g., integrated delivery systems, hospitals, primary care and specialty practices, pharmacy benefit managers, laboratories, radiology centers, health plans, clearinghouses, healthcare system vendors, system integrators, and all other business associates).10
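
One way to picture this "same calendar" overlay is to sort the statutory and regulatory milestones mentioned in this chapter by date, as in the sketch below. Only dates stated in the text are included; an organization would add its own internal milestones (testing windows, trading-partner cutovers, and so on), which are omitted here.

```python
from datetime import date

# Milestones taken from the dates mentioned in this chapter; an organization's
# own readiness milestones (testing, training, cutover) would be added here.
milestones = [
    (date(2009, 2, 17), "ARRA (including the HITECH Act) signed into law"),
    (date(2012, 1, 1), "HIPAA transactions converted from Version 4010 to 5010"),
    (date(2013, 10, 1), "Conversion from ICD-9 to ICD-10 code sets"),
]

for when, what in sorted(milestones):
    print(f"{when:%b %d, %Y}: {what}")
```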


Figure 5. ARRA / HIPAA Timeline

These key concepts (justified government intervention, education, the establishment of the Office of the National Coordinator for Health Information Technology, the passage of the ARRA, and privacy and security issues) span all three primary areas of concern. The first key performance indicator to tackle in this chapter is access to healthcare.

KPI 1 - Access

Shared Decision-Making & Responsibility for Patient Healthcare

In a series of papers describing what a decade of successful change in healthcare could look like by 2019, altfutures.com described three underlying problems that must be addressed with respect to healthcare delivery. The first problem mentioned was that too many Americans lack health coverage and accessible medical care.11 The paper suggests a shift is due in the way we think healthcare should be delivered.12


Figure 6. Shift to new Healthcare Paradigm

Source: 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com; http://www.altfutures.com/2019_Healthcare_That_Works_For_All

Specifically, the paper suggests the healthcare system of 2019 is centered on the 'Health Home.' It describes the Health Home as a collaborative team of health professionals led by a primary care specialist (e.g., a family physician, internist, pediatrician, or advanced practice nurse) with special training. The team is responsible for identifying health risks, promoting early intervention, and assuring that all of the patient's health issues are addressed. The team is also expected to help patients make sense of care interventions from multiple providers and payment systems. The team approaches chronic diseases using evidence-based modalities that help maintain a healthy quality of life and prevent further complications. The goal is for the care provided to be equitable, culturally sensitive, and customized to the needs and values of the patient. To get there, the paper suggests, societal consensus on providing healthcare access for all must move from concept to reality.


The American College of Rheumatology (ACR) took the same position on providing access to healthcare for all. In a position paper published in February 2009, the ACR reported that arthritis and other rheumatic and musculoskeletal conditions are the most common chronic diseases, with arthritis shown to be the leading cause of disability in the United States.13

The ACR also noted a significant decrease in the percentage of people (workers and dependents) with employment-based health insurance, from 70 percent in 1987 to 59 percent in 2006, which the report described as the lowest level of employment-based insurance coverage in more than a decade. The paper attributed this decline to the 37.7 million workers who were uninsured for preventable reasons: not all businesses offer health benefits, not all workers qualify for coverage, and many workers cannot afford insurance even when coverage is offered. This poses a threat to the health and safety of those who are uninsured or underinsured because they lack access to preventive care. The snowball continues to grow: disease is diagnosed at later stages, and once diagnosed, patients risk receiving less therapeutic care or none at all, which increases morbidity and mortality rates compared with the insured. Ultimately, this translates into higher healthcare costs and reduced worker productivity. The ACR's conclusion was that providing universal access to insurance is necessary to ensure access to affordable healthcare.14

The ACR therefore supports universal access to affordable insurance coverage through a choice of competing private or public comprehensive health insurance plans. It also supports the establishment of a nationally standardized minimum benefit package across public and private health plans, similar to the Federal Employee Health Benefit Plan. To realize these goals, the ACR recommends a future healthcare system in which pre-existing condition clauses are eliminated from all health plans and health plans are truly portable.15


Steven Siegel described a trip to Costco and related it to his vision for the future of healthcare. His first vision came to him in the parking lot. Costco is known as a bargain store, and Steven observed automobiles of many makes, models, and conditions, leading him to conclude that "everyone likes a bargain" and that people are even willing to pay an annual fee for access to those bargains. Therefore, Steven concluded that the future healthcare system must have no barriers to access.16

Social Networking and Healthcare

Social networking is beginning to make its mark on healthcare. One company believes networked devices combined with connected people equate to healthier communities.17 The notion that we are too busy to be healthy may have some merit. The Fast Company paper argues the notion cannot be attributed to a lack of awareness that we need to be healthy; rather, we feel a much greater sense of pressure because we are more connected than ever before in the history of healthcare. We carry cell phones, iPhones, BlackBerrys, laptops, netbooks, and numerous other personal digital devices that keep us connected virtually anywhere in the world. The paper postulates that advances in wirelessly connected devices and social networking platforms will make the job of a "family health manager" much easier, more meaningful, and more effective.18

People who are connected could take advantage of networked devices not only to manage their own care but also to manage the care of their family members. Wireless devices are already hitting the market and are in use by some of the more mature health systems (e.g., Kaiser Permanente).19 Examples provided by the paper include:

• Using near real-time health data access to provide rolling assessments and to alert patients to changes in their health risk based on biometric assessment and monitoring (blood pressure, weight, sleep, etc.); a minimal sketch of this kind of rolling assessment appears after this list.

• Using wireless scales and activity monitors to gather information about health and behaviors and feed this information into desktop software, web applications, and social networks.
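
Below is a minimal sketch of the first example: a rolling biometric assessment that raises an alert when a trend crosses a risk threshold. The class name, the seven-day window, and the 140 mmHg threshold are illustrative assumptions, not details from the Fast Company paper or any particular device.

```python
from collections import deque
from statistics import mean

class RollingBloodPressureMonitor:
    """Illustrative rolling assessment; window and threshold are assumptions."""

    def __init__(self, window=7, systolic_alert=140):
        self.readings = deque(maxlen=window)   # keep only the last `window` readings
        self.systolic_alert = systolic_alert

    def add_reading(self, systolic_mmhg):
        """Record a reading; return an alert string if the rolling average
        crosses the assumed risk threshold, otherwise None."""
        self.readings.append(systolic_mmhg)
        rolling_avg = mean(self.readings)
        if rolling_avg >= self.systolic_alert:
            return (f"Alert: rolling average systolic {rolling_avg:.0f} mmHg "
                    f"is at or above {self.systolic_alert} mmHg")
        return None

# A week of readings from a hypothetical wireless cuff.
monitor = RollingBloodPressureMonitor()
for reading in [128, 133, 138, 142, 145, 147, 150]:
    alert = monitor.add_reading(reading)
    if alert:
        print(alert)
```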


Of course, privacy and security concern many who are hesitant to use wireless technology; one cannot read about interoperability or information exchange without reading about privacy and security concerns. The emphasis is there; let's hope the technology continues to advance and reaches a point of acceptance that lets us make the most of wireless information technology.

Does social interaction affect the healing process? According to Fast Company, research shows greater social engagement helps people live longer, healthier lives.20 Even the Department of Defense is conducting research to determine how social networking and virtual worlds can help in the treatment of post-traumatic stress disorder and traumatic brain injury in wounded warriors.21 This area of interest still requires more study to determine the real health benefits of interacting in virtual environments; however, virtual communities (e.g., LinkedIn22, Facebook23, Plaxo24, Twitter25, and Second Life26) already allow us to stay connected with those in our social and professional networks around the world and seem to be opening the door to new ways of connecting healthcare teams with patients and with other geographically separated healthcare teams.

“Continuous versus episodic monitoring of health can lead to better health outcomes.”27 In the future healthcare system, there will no longer be rushed visits to the doctor. Technology will allow better monitoring of our health and help us make better-informed health decisions. Network-connected devices and social networking could serve as a safety net by "… making us feel less alone, more empowered, and safer as we navigate the complex world of health and contribute healthier lives."28 Health data collected from mobile applications, embedded sensors, and other devices are converging to create new, leading-edge approaches to healthcare. We will see more and more stakeholders get involved in the wellness of communities using clinically based algorithms, data visualization, and community sharing.


KPI 2 - Quality

Improving Care Coordination among Providers

Going back to the series of papers describing what a decade of successful change in healthcare could look like by 2019, the second underlying problem that must be addressed with respect to healthcare delivery is that our health system is not focused on prevention and on effective, coordinated interventions for chronic diseases. The idea put forth here is one of process improvement. Continuous process improvement can help ensure safety, quality, and value while reducing disease, preventing complications, and minimizing waste. Therefore, it behooves the Health Home team, in collaboration with the patient and family, to ensure only necessary care is performed, avoiding inappropriate end-of-life interventions.29

The Institute of Medicine (IOM) has made multiple recommendations to improve quality. A July 2006 IOM Report Brief suggests that in order for a new model of healthcare delivery to work and prevent medication errors, there must be more two-way communication between caregivers (e.g., doctors, nurses, pharmacists, and other providers) and patients, and patients must take a more active role in the care process.30 Since 2001, the IOM has continued to recommend the use of information technology as a key component of improving the quality of care, to ensure healthcare is safe, effective, patient-centered, timely, efficient, and equitable.31 Kim Slocum, a Fellow of the Healthcare Information and Management Systems Society (HIMSS) and Director for Strategic Planning & Business Development at AstraZeneca LP, suggested that as healthcare shifts from an input-driven to an output-driven model, outcomes assessment tools will drive meaningful changes; without readily available quality information, "empowered" consumers will buy healthcare services solely on the basis of price.32 Therefore, the future model of healthcare delivery should look something like that suggested by Lieutenant Colonel (Dr.) Hon Pak of the US Army.


Dr. Pak described the future model of healthcare delivery as predictive, preventative, preemptive, and participatory.33 Specifically, he quoted work by Dr. Leroy Hood of the Institute for Systems Biology who predicted nanotechnology and molecular biology will allow us to screen a single drop of blood for thousands of genetic markers and diseases giving us the ability to predict the likelihood of illness over a person's lifetime. This information would then be used to take action to prevent the disease from occurring (e.g., begin a preventative medication or receive an immunization in childhood). If prediction and prevention efforts failed, caregivers would preempt disease with highly personalized medicines at the earliest possible stages of illness before symptoms are reported. Ultimately, patients will participate more in the management of their own wellness, prevention, and treatments.

As Steven Siegel continued describing his trip to Costco, his fourth vision came to him in the food section. Steven observed that with so many choices, deciding what to do can be difficult; at Costco, customers are offered free samples to help them decide. He therefore envisioned a future healthcare system in which free or low-cost prevention services serve as an incentive to better health and wellness, effectively helping to reduce overall healthcare utilization.34

KPI 3 - Cost

Transactional v. Transformational Management

Referring back one more time to the series of papers describing what a decade of successful change in healthcare could look like in 2019, the third underlying problem that must be addressed with respect to healthcare delivery is that neither we nor the healthcare industry are incentivized to use our money wisely and make appropriate allocations among the competing needs of society.35 The healthcare system in 2019 may include specialized care and surgical procedures that are paid for with a single comprehensive fee covering everything. For example, the standardized surgical procedure fee would be based on the diagnoses and procedures required and would cover all of the involved providers, including anesthesiologists, surgeons, nurses, and technicians. The fee would also cover all required lab tests, images, drugs, and all hospital or clinic services. In specialty care, a single comprehensive fee would be paid for a defined period of time and would cover all costs associated with management of that disease under the control of the specialist. Under this type of reimbursement system, providers would be forced to manage operations within a fixed fee by standardizing processes toward efficient, high-quality outcomes. This will result in improved outcomes, cost savings, and competition among caregivers based on quality.36

Today, Bridges to Excellence (BTE), a non-profit organization, encourages significant leaps in the quality of care by recognizing and rewarding healthcare providers who meet the IOM's 2001 recommendations for delivering safe, timely, effective, efficient, equitable, and patient-centered care. BTE participants include large employers, health plans, and others who share the goal of improving healthcare quality through measurement, reporting, rewards, and education. Top-performing doctors, based on their commitment to meeting standardized treatment protocols, can earn up to ten percent in annual bonuses paid by participating employers and health plans. They are also highlighted in provider directories to help patients and their families identify doctors with proven outcomes in treating particular illnesses, or whose patient care and support systems are rated as exemplary. As of this writing, 17 states and the District of Columbia have BTE programs, with over 12,000 physicians recognized. Fifteen national health plans have licensed BTE and are incorporating its principles into their own pay-for-performance programs, and more than 80 employers participate in BTE nationally.37

As the pay-for-performance movement continues to grow, there are other trends worth mentioning relative to cost containment. LECG, a global expert services and consulting firm headquartered in Emeryville, California, published what it refers to as megatrends that will influence the future of healthcare:38

• Megatrend 1: There will be a continual shift from employer-based coverage to federally provided coverage.

• Megatrend 2: Per unit reimbursement for healthcare products and services will decline.

• Megatrend 3: Health information technology will mature rapidly.

• Megatrend 4: Standardization of the healthcare industry will be led by the federal government.

• Megatrend 5: Providers will focus on efficiency improvements before they address quality of care improvements.

The short-term effects of these megatrends include a significant reduction in profitability for private insurers and health plans. In the long term, increased interest in plan consolidation is expected as health plans work to reduce administrative costs and increase provider payments.

Failure to Change

“The U.S. is entering into a period where fundamental change in our healthcare system is possible. Fundamental change is needed to make our healthcare system sustainable both fiscally and morally. Our current path is leading to increasing and unsustainable costs while at the same time contributing to the growth of health inequities. A dialogue of what we as a nation truly value in healthcare is needed to begin this reform. The danger of not articulating these values broadly is that healthcare reform will fail due to lack of support, and the system will bankrupt the nation. Generational tensions will rise over the inherent unfairness of poor, young workers paying high taxes to support healthcare for a growing group of relatively wealthy elderly. America will grow weaker economically and politically. People worldwide will lose respect for the moral standing of America, and at home people will lose faith in the country's historic commitment to a value proposition long held to be self evident.”39


1 Austin and Boxerman's Information Systems for Healthcare Management, Seventh Edition; G. Glandon, D. Smaltz, and D. Slovensky; 2008; Association of University Programs in Health Administration (AUPHA) and Health Administration Press (HAP); pages 114-117
2 Facing The Future Of Health Care: A report detailing conversations with twelve Greater Minnesota communities; B. Brandt, J. Stumpf Kertz; University of Minnesota Academic Health Center Office of Education, Minnesota Area Health Education Center
3 Information Technology Problem Statement: Government's Perspective; LTC Hon Pak, USA, MC; MCIM 2007
4 HIT in the Next Five Years: Managing the Mandates of 5010/ICD-10, ARRA, Privacy and Security and Healthcare Reform; M. Paramore, J. Miller, and P. Bradley; HIMSS Financial Systems Education Webinar Series; September 30, 2009
5 The American Reinvestment and Recovery Act of 2009 (Public Law 111-5); February 17, 2009
6 The American Reinvestment and Recovery Act of 2009 (Public Law 111-5), Title XIII
7 Presidential Executive Order 13335: Incentives for the Use of Health Information Technology and Establishing the Position of the National Health Information Technology Coordinator; April 27, 2004
8 The American Reinvestment and Recovery Act of 2009, Title XIII, Subtitle D, Part 1
9 Health Insurance Reform: Modifications to the Health Insurance Portability and Accountability Act (HIPAA) Electronic Transaction Standards, Final Rule, and HIPAA Administrative Simplification: Modifications to Medical Data Code Set Standards To Adopt ICD-10-CM and ICD-10-PCS, Final Rule; January 16, 2009
10 HIT in the Next Five Years: Managing the Mandates of 5010/ICD-10, ARRA, Privacy and Security and Healthcare Reform; M. Paramore, J. Miller, and P. Bradley; HIMSS Financial Systems Education Webinar Series; September 30, 2009
11 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); page 1
12 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); Appendix B, page 5
13 The Future of Health Care in the United States; American College of Rheumatology; February 2009; page 1
14 The Future of Health Care in the United States; American College of Rheumatology; February 2009; page 4
15 The Future of Health Care in the United States; American College of Rheumatology; February 2009; page 4
16 Visions Of The Future Of Health Care: Scenes From A Trip To Costco; S. Siegal; from Visions For The Future of The U.S. Health Care System; pages 55-56
17 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare
18 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare; page 2
19 See multiple Kaiser Permanente case studies at http://xnet.kp.org/future/ahrstudy/index.html
20 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare; page 4
21 http://t2health.org/
22 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare; page 4
23 http://www.facebook.com/
24 http://www.plaxo.com/
25 http://twitter.com/
26 http://secondlife.com/whatis/
27 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare; pages 10-11
28 The Future of Healthcare is Social; J. Kilian and B. Pantuso; Frog Design Fast Company; www.fastcompany.com/futureofhealthcare; page 11
29 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); page 2
30 Institute of Medicine (IOM) Report Brief; July 2006
31 Crossing the Quality Chasm: A New Health System for the 21st Century; Institute of Medicine (IOM) of the National Academies; March 1, 2001
32 Healthcare Information Technology: Some Thoughts On Future Trends; K. Slocum; HIMSS Annual Conference Presentation; March 4, 2004
33 Information Technology Problem Statement: Government's Perspective; H. Pak; Open Source Solutions for Multi-Center Information Management Workshop (MCIM 2007); May 1, 2007
34 Visions Of The Future Of Health Care: Scenes From A Trip To Costco; S. Siegal; from Visions For The Future of The U.S. Health Care System; pages 55-56
35 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); page 2
36 2019 Healthcare That Works For All: Healthcare Delivery in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); page 3
37 http://www.bridgestoexcellence.org/Content/ContentDisplay.aspx?ContentID=163
38 Healthcare Megatrends: The future of healthcare financing and delivery; H. Miller, A. Vandervelde, and G. Russo; LECG, LLC; 2009
39 2019 Healthcare That Works For All: Healthcare Values in 2019; altfutures.com (http://www.altfutures.com/2019_Healthcare_That_Works_For_All); page 4


Reprinted from Evolvent Magazine

Table of Contents

An Introduction to Cloud Computing in the Federal Government 128

METC IT Operations and Support Center 134

Traumatic Brain Injury 140

Security Challenges and Threats 146

Behavioral Health 152

Leveraging Healthcare IT to Achieve Better Value 155

The Future of the Personal Health Record 160

Virtual Healthcare Artifact and Image Management 167

Thought Synch: Proof of Need for a Virtual Data Framework Vision 171

An Information Protection Strategy: Beyond Perimeter Defenses 177

Virtual Data Network 188

The National Response Framework: Where Do Hospitals Fit? 194

Training—Awareness—Motivation 202

Building Enterprise-Scale IT Support Services 208

Faster Help for the Recovering Wounded 212


An Introduction to Cloud Computing in the Federal Government

The term cloud in cloud computing is used as a metaphor for the Internet, which is traditionally depicted as a cloud in computer network diagrams. Cloud computing involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.

Cloud computing customers no longer require knowledge of, expertise in, or control over the Internet-based technology infrastructure that supports them. For them, cloud computing means no upfront investment in servers and other infrastructure and, depending on the service model, reductions in software investment as well. They pay only for what they need when they need it, rather than attempting to predict future business growth and purchasing IT infrastructure ahead of time.

Cloud computing providers deliver business applications online, hosted on virtualized servers, and are able to save their customers significant amounts of money because recent advances in server virtualization and data center technology allow new capability to be delivered more quickly, more flexibly, and less expensively.

“Cloud computing has become the phrase du jour,” says Gartner senior analyst Ben Pring, echoing many of his peers. The problem is that, as with Web 2.0, everyone seems to have a different definition. Here is the National Institute of Standards and Technology (NIST) definition:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.


NIST also classifies cloud computing into three different service models:

1. Cloud Software as a Service (SaaS): The customer is provided the provider's applications running on a cloud infrastructure. The customer does not manage or control the underlying cloud infrastructure or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.

2. Cloud Platform as a Service (PaaS): The customer deploys their own applications, but they must be created using programming languages and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure, but has control over the deployed applications.

3. Cloud Infrastructure as a Service (IaaS): The customer is free to deploy and run arbitrary software, to include different operating systems, but the underlying cloud infrastructure is still managed by the provider. (The sketch following this list summarizes the responsibility split across the three models.)
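
One way to keep the three models straight is to look at which layers of the stack the customer manages and which the provider manages. The short sketch below is our own simplified illustration (the layer names are ours, not NIST language) and simply prints that split for each model.

    # Our simplified illustration of the NIST service models: which layers
    # the customer controls versus the provider. Layer names are assumptions.
    STACK = ["facility", "network", "servers", "virtualization",
             "operating system", "middleware/runtime", "application"]

    CUSTOMER_CONTROLS = {
        "SaaS": ["application settings (limited)"],
        "PaaS": ["application"],
        "IaaS": ["operating system", "middleware/runtime", "application"],
    }

    def describe(model):
        """Print which layers the customer manages under a given model."""
        managed = CUSTOMER_CONTROLS[model]
        provider = [layer for layer in STACK if layer not in managed]
        print(f"{model}: customer manages -> {', '.join(managed)}")
        print(f"{model}: provider manages -> {', '.join(provider)}")

    for model in ("SaaS", "PaaS", "IaaS"):
        describe(model)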

Salesforce.com is by far the best-known example of cloud SaaS among enterprise applications, but SaaS is a common delivery platform for many other Human Resource applications, because so much of HR is so common across different organizations. More recently SaaS has expanded into many other areas, most notably "desktop" applications such as Google Apps and Zoho Office. SaaS is more attractive to start-ups and smaller companies that do not already have large investments in enterprise applications. Larger companies are likely to have systems tailored to their needs in place, which makes it harder to justify the switch to SaaS in most but not all cases.

Evolvent is integrating a cloud SaaS customer service support module into a private SharePoint portal for a "military startup": the new Medical Education and Training Campus (METC) in San Antonio, TX.

Closely related to SaaS, web service providers offer interfaces that enable developers to exploit functionality over the Internet, rather than delivering full-blown applications. A couple of well-known examples are Google Maps and the Postini spam filtering service.

PaaS services are constrained by the provider's design and capabilities, but the customer gets predictability and pre-integration. The iPhone application development model is a good non-cloud example of PaaS: customers can leverage the iPhone’s GPS, motion sensor, touch screen, etc. to quickly write powerful applications – but those applications can only be hosted on an iPhone. Cloud examples of PaaS include Salesforce.com's Force.com platform, Coghead, and the new Google App Engine. For extremely lightweight development, cloud-based mashup platforms abound, such as Yahoo Pipes or Dapper.net.

IaaS services are of particular interest to government agencies because much of the functionality inside government applications is generally not shared with industry as a whole, and industry is therefore less likely to offer it as a SaaS/PaaS service. USA.gov, one of the busiest federal government websites, is hosted on Terremark's "Enterprise Cloud" IaaS platform. In addition to Terremark, other notable IaaS providers include Amazon, IBM, AT&T, Savvis, and Rackspace. Of course, government agencies also have a need to use SaaS/PaaS, as the demand for services registered at GSA's Apps.gov demonstrates.

None of this means that on-premise applications and infrastructure are going away. On a practical level, there are far too many existing applications that can't be cost-effectively rewritten to run on a public cloud. On a strategic level, many applications contain data too sensitive to run on a cloud, even private clouds in some cases. Finally, there are a number of legal, regulatory, and security issues that may not make cloud computing practical for some organizations. A recent InformationWeek Analytics survey revealed that 39% of IT staff respondents stated public clouds would never be used by their organization, and 24% said they would not even consider private clouds.1 The debate about the appropriateness of hosting sensitive data in private clouds is particularly relevant to the federal government. Although federal agencies are leveraging cloud computing for some less sensitive data, FISMA and other federal Information Assurance policies must be updated before cloud computing can be used for more sensitive data.

Cloud computing is not an all-or-nothing proposition. Industry is slowly migrating toward a blended computing model that will combine the best elements of cloud services with internal IT systems – though the internal IT systems will likely use the same virtualized architectures as public cloud services. The customer service support module (cloud services) of the METC SharePoint portal (internal IT) noted above is a good example of this blended model.

Cloud Computing in the Federal Government

Federal CIO Vivek Kundra stated in a December 2009 interview:2

We'll be announcing our cloud computing strategy shortly … We can't continue proliferating data centers at the rate we've been. What we have to do is look at the root cause of the proliferation; part of it is that we haven't really led with delivery of services or platforms. The idea is to create government-wide platforms, whether they're on data or travel or collaboration.

There is a great deal of activity surrounding the subject of cloud computing in the Federal Government. In September 2009 GSA launched Apps.gov, a cloud services “storefront” for federal agencies. The concept is promising, but critics point out that behind the pretty new webpage, not much has changed to speed up GSA’s acquisition process for commercial cloud services. Just because commercial cloud vendors can create a new virtual server in a matter of minutes doesn’t mean the government can buy that virtual server any faster than it buys other services. For an excellent introduction to cloud computing in the Federal Government, watch this 2 minute video: http://bit.ly/4ZFr56.


NASA has initiated an open source platform cloud computing project named Nebula. It offers SaaS, PaaS, and IaaS services to NASA agencies. NASA built its own cloud platform because an analysis determined that no commercial provider could meet NASA’s requirements. However, Nebula is designed to be very interoperable with commercial cloud services so that it can leverage web services provided by them when required. More information about Nebula is available at: http://nebula.nasa.gov/

As noted above, the USA.gov website is now hosted on a commercial cloud platform. Martha Dorris, acting deputy associate administrator for the General Services Administration's (GSA) Office of Citizen Services and Communications, stated that the move will cut USA.gov's infrastructure costs by as much as 90% and significantly improve its flexibility.3 GSA pays for a base level of service on an annual contract and may then flex capacity based on unpredictable demand, paying only for what it needs, when it needs it, for anything above the level of service in the base contract.
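
The consumption-based arrangement described above boils down to a base fee plus burst pricing. The sketch below works through the arithmetic with entirely hypothetical rates and capacity figures (the article does not disclose GSA's actual contract terms); it only illustrates how a bill stays flat in quiet months and grows only with demand above the base.

    # Hypothetical numbers for illustration only; GSA's real rates are not
    # published in this article.
    BASE_ANNUAL_FEE = 120_000.00          # assumed cost of the base level of service
    BASE_CAPACITY_INSTANCES = 50          # assumed server instances included in the base
    OVERAGE_RATE_PER_INSTANCE_DAY = 4.00  # assumed burst price

    def monthly_bill(instance_days_used, days_in_month=30):
        """Base contract plus pay-as-you-go overage for demand above the base."""
        included = BASE_CAPACITY_INSTANCES * days_in_month
        overage = max(0, instance_days_used - included)
        return BASE_ANNUAL_FEE / 12 + overage * OVERAGE_RATE_PER_INSTANCE_DAY

    # A quiet month stays at the base price; a traffic spike pays only for the burst.
    print(monthly_bill(instance_days_used=1_200))   # under the included capacity: 10000.0
    print(monthly_bill(instance_days_used=2_100))   # 600 instance-days of burst: 12400.0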

The Defense Information Systems Agency (DISA) has a cloud computing platform named RACE, which stands for Rapid Access Computing Environment. RACE permits DoD agencies greater agility in developing, testing, and fielding new systems. However, critics point out that RACE is significantly more expensive than commercial cloud providers, which eliminates most of the cost savings associated with the move to cloud computing. Of course DISA has additional security requirements that must be complied with, which explains a portion of the additional cost. Nevertheless, some commercial cloud providers are now able to provide DoD-level security at prices far below those of DISA’s RACE.

Terremark is specifically targeting the federal market and has opened a massive secure datacenter 60 miles outside of Washington, D.C., in Culpeper, Virginia. Security at this facility meets military standards and includes a rapid-response armed security force, fences, a 10-foot earth berm surrounding the entire campus with 150-foot building setbacks, very limited physical access to the hardware, malware analysis, digital forensics, vulnerability assessment and penetration testing, management of firewalls, and more. Because Terremark's physical, environmental, personnel, system, and security processes meet all FISMA control items, the Certification & Accreditation process that federal agencies must comply with can be completed more easily.

A Value Proposition for Federal Agencies

Federal agencies that want to leverage the many benefits of cloud computing may want to consider a team like the following:

• A medium-sized systems integrator like Evolvent that understands software development trends and specializes in helping federal clients build innovative information technology solutions to mission requirements. A systems integrator large enough to be able to pull internal subject matter experts for the job at hand, and small enough that each project benefits from detailed oversight by senior management.

• A commercial cloud provider like Terremark that understands both cloud computing technology and federal security requirements. Terremark is rated in the leader’s quadrant on Gartner’s most recent analysis of cloud IaaS providers.

Teamed together, Evolvent and Terremark offer federal agencies a one-stop shop for the development and execution of web applications that leverage a cloud computing platform.

1 Practical Analysis: Don't Rush To Cloud Computing; InformationWeek; 11 January 2010
2 Federal CIO Kundra Talks IT Strategy; InformationWeek; 19 December 2009
3 Federal Web Portal Moves to Cloud Computing Platform; Government Technology; May 1, 2009


METC IT Operations and Support Center

Background

In 2005, the Defense Base Closure and Realignment Commission (BRAC) recommended that all enlisted military medical training be collocated and consolidated at Fort Sam Houston in San Antonio, Texas. The new school is known as the Medical Education and Training Campus (METC). The 2005 BRAC report stated:

METC will result in reduced infrastructure and excess system capacity, while capitalizing on the synergy of the co-location of similar training conducted by each of the three Services. In addition, the development of a joint training center will result in standardized training for medical enlisted specialties, enhancing interoperability and joint deployability.

A ground-breaking ceremony was held at Ft. Sam Houston on July 10, 2008 to mark the official beginning of facilities construction. METC will be the largest consolidation of service training in DOD history, and is scheduled to open in phases between June 2010 and September 2011. When complete it will be the world's largest medical education and training institution. The average daily student load is expected to be 9,000, with an annual throughput of 47,000 students.

The METC campus requires 2.1 million square feet of new construction, with 1.2 million square feet of instructional and laboratory space. The student campus will include a Processing Center, Post Exchange, recreation center, and gym. It will resemble a typical university campus environment, including six new dormitories with 1,800 rooms each.

IT Operations at METC

Evolvent is providing METC with the following IT integration services: Campus Support Center, desktop support, data center, asset management, wireless, AV/VTC, and Information Assurance. In addition, training and transition support will be provided to METC personnel for the operation of these services and equipment.

Major Manny Dominguez, METC CIO, wants to ensure that the IT infrastructure supporting METC leverages the latest technology in order to deliver the best service and reduce costs. With this in mind, Evolvent will be installing 5,000 Virtual Desktop Infrastructure (VDI) "desktops" prior to classes beginning in June 2010. VDI has many benefits over traditional desktop PCs, including easier maintenance, better security, and cost savings from reduced electrical demand.

Evolvent will assist METC government staff during the implementation period with the following:

• Install, configure, and operate the METC data center

• Evaluate, monitor, and report system performance, to include the generation of run time reports as requested, and the logging and reporting of all downtime

• Implement Government-approved operating procedures and recommend improvements as appropriate

• Manage and optimize the network

• Maintain a backup library and associated records of media used in the backup process; assist the government staff in configuring and optimizing the backup solution

• Assist government staff to plan, coordinate, implement, operate, test, and demonstrate all Information Assurance (IA) efforts

• Track, document, and deliver all software, software configurations, and software licenses used by the Wireless Network System

At completion of system installation and acceptance, Evolvent will provide user training to approximately 50 METC IT personnel who will be assigned the responsibility of IT operations. This training will be conducted via over-the-shoulder observation by government personnel as Evolvent personnel conduct system configuration, installation, troubleshooting, and maintenance activities for all subsystems. Additionally, training materials will be provided for METC IT staff to use in training METC students, staff, and faculty members on use of classroom IT systems.

Campus Support Center

Another critical piece of the METC vision being executed by Evolvent is a single point of entry Campus Support Center for all METC service activities (IT and non-IT). Evolvent’s Campus Support Center solution provides multiple communications channels for staff and students to submit service requests: live chat, web interface, telephone, e-mail, fax, and in person (Figure 1). The Campus Support Center interfaces with all IT services: desktops, communications systems, data center, enterprise applications, Information Assurance, multimedia, and the METC Portal. Significantly, many non-IT services are also supported including Registration, Knowledge Management, Program Management, Academic Services, Staff and Student Services, Corporate Communications, Acquisition, HR/Personnel, Facilities, and Administrative Affairs.

The functionality of the Campus Support Center is enabled by an innovative integration of technology provided by BMC Remedy, T-Metrics, and RightNow CX:

BMC Remedy: In addition to core help desk functionality, BMC Remedy provides the Campus Support Center with interoperability with Enterprise Configuration Management and Asset Management systems. This integration enables more efficient IT operations processes. BMC Remedy is compliant with the Army's LandWarNet NetOps Architecture (LNA) and has previously obtained a Certificate of Networthiness (CoN) and Approval to Operate (AtO) on DoD networks.

T-Metrics: The Campus Support Center uses a JITC-approved Interactive Voice Response (IVR) telephony system from T-Metrics that is compatible with existing telephony equipment at Fort Sam Houston. It includes a contact management system that provides records of all Campus Support Center contacts through multiple communications channels, including the recording of all service calls. In addition, the IVR system provides a web-based interface for administrative functions, real-time reporting, and operational metrics, which will be prominently displayed on three LCD monitors strategically located in the Campus Support Center. With this information, managers and staff can invoke acute response tactics when warranted by task load and develop corrective actions to facilitate long-term improvements. Key parameters provided include: calls in queue, percentage of calls abandoned, average duration of calls, average handling time, average talk time, wait time before an agent answers, and wait time before the caller abandons the call.
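
As an illustration of what sits behind those wallboard numbers, the short sketch below computes several of the listed parameters from a handful of made-up call records. The record layout and sample values are our own assumptions, not the T-Metrics data model.

    # Illustrative only: computing contact center metrics from call records.
    # Each record is (wait_seconds, talk_seconds, abandoned); data is made up.
    calls = [
        (12, 240, False),
        (45, 0,   True),
        (8,  180, False),
        (30, 95,  False),
    ]

    answered = [c for c in calls if not c[2]]
    abandoned = [c for c in calls if c[2]]

    metrics = {
        "calls handled": len(answered),
        "percent of calls abandoned": 100.0 * len(abandoned) / len(calls),
        "average wait before answer (s)": sum(c[0] for c in answered) / len(answered),
        "average talk time (s)": sum(c[1] for c in answered) / len(answered),
        "average wait before abandon (s)": sum(c[0] for c in abandoned) / len(abandoned),
    }

    for name, value in metrics.items():
        print(f"{name}: {round(value, 1)}")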

RightNow CX: Perhaps the most innovative aspect of the Campus Support Center is the self-service (Tier 0) support provided by RightNow CX. Seamlessly integrated into the SharePoint-based METC Portal, RightNow CX provides end-user self-services supported by knowledge management components in multiple file formats including text and multimedia. It will help students find answers to common questions, reducing Campus Support Center emails and phone calls. It will also send surveys after a support interaction to track customer experiences, as well as to students at the end of courses to gather instructor feedback. RightNow CX is delivered to METC via a cloud-based Software as a Service (SaaS) platform called the RightNow Government Cloud. Use of this cloud platform enabled METC to deploy the RightNow CX service within 30 days and save money. In addition, because RightNow CX is delivered as SaaS, future enhancements and new features are delivered seamlessly to METC – no patches or system administration required by METC staff. Major Dominguez states:

“The RightNow Government Cloud is highly secure and DoD compliant. METC is paving the way for other DoD organizations to use cutting-edge technology without investing significant budget and IT resources.”

When Tier 0 support is not enough, requests for Remote Assistance, Desk-side Support, and Expert Support are all routed to the appropriate support tier depending on the work order. The IVR will route user calls to the Campus Support Center staff member with the appropriate skills using caller responses to a set of voice questions. Remote assistance tools are used for diagnosis and resolution of user-reported issues. Field personnel are also available to be dispatched if required to accomplish the work order. Through this integrated set of systems and processes, expert support is made available to users as needed.

Incident escalation consists of four tiers of user support with the following roles (a simple routing sketch follows the list):

• Tier 0: Self help provided through an on-line knowledge base accessible through the METC Portal

• Tier I: Provide basic and general support to user inquiries, incident reporting, and service requests

• Tier II: Provide higher-level technical and analytical support in general areas (communications, reporting, etc.)

• Tier III: Provide subject matter expert (SME) level of support in specific domain areas
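
At its core, a tiered model like this is a routing table. The toy sketch below shows one way request categories and skill needs could be mapped onto the four tiers; the categories and rules are hypothetical, not METC's actual routing logic.

    # A simplified, hypothetical routing table for tiered support.
    ROUTING = {
        "how-do-i question":     "Tier 0 (self-help knowledge base)",
        "password reset":        "Tier I",
        "incident report":       "Tier I",
        "reporting problem":     "Tier II",
        "communications outage": "Tier II",
    }

    def route(category, needs_sme=False):
        """Send a request to a support tier; SME-level or unknown work goes to Tier III."""
        if needs_sme:
            return "Tier III (subject matter expert)"
        return ROUTING.get(category, "Tier III (subject matter expert)")

    print(route("password reset"))                    # Tier I
    print(route("reporting problem"))                 # Tier II
    print(route("VTC codec fault", needs_sme=True))   # Tier III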

Users as well as the Campus Support Center staff will be given visibility into ticket status, from inception to resolution. They will be able to retrieve ticket status via multiple media modes including text and voice. Accurate communication of ticket status and performance trends that show overall user satisfaction are critical to the success of the Campus Support Center.

Summary

The METC vision is to create a collaborative, non-competitive learning environment that reflects, respects, and preserves each service's culture. The different service missions, skills, and capabilities, taken together, are strengths that will shape a better force for the future.

“Evolvent is helping METC create a modern healthcare campus where the focus is on training our military's enlisted medics to save lives and provide the best treatment possible at our world-wide medical facilities. Evolvent is building an innovative campus support center as a one-stop contact center for all questions, so students can spend their time training – not worrying about where to find information.”

Major Manny Dominguez, PhD, USAF, CIO, METC


Traumatic Brain Injury

The imperative on Iraq’s signature military medical issue

Traumatic brain injury (TBI) is said to be one of the “signature injuries” of the conflict in Iraq, and accounts for a larger proportion of troop casualties than it has in previous wars fought by the United States. According to the Defense and Veterans Brain Injury Center, the U. S. military formally diagnosed 2,121 cases of TBI between October 2001 and January 2007. But the incidence of TBI among troops may be much higher than these official statistics suggest, largely because of the increasing use of the signature weapon of the Iraq war: the improvised explosive device (IED). Neurologists say that the Pentagon’s figures are based on the number of recorded penetrative head wounds, and exclude the closed head wounds also caused by IEDs, which are much more difficult to diagnose, and which may far outnumber the other types of brain injuries. The actual numbers of troops with TBI may therefore be much greater than the official estimates. — Health & Medicine, Neuroscience, May 14th 2007

As of June 30, 2007, DoD reported a total of 3,294 soldiers suffering from traumatic brain injuries or TBIs. Of those, 3,094 (or 94%) sustained their injury in OIF, while 200 (6%) sustained their injury in OEF. Of the total injuries in OIF and OEF, 195 (6%) were counted as “severe,” and an additional 180 (5%) were counted as “penetrating.” — CRS Report for Congress, August 17th, 2007.

Any wound affects a soldier and a family at home. This sacrifice continues each day for those who may never be known to any of us on a personal level, but American society will feel the impact of these wounded families and war fighters for generations to come. The nature of many of the wounds from today's conflicts, however, is far from certain, simply because they are often wounds of the mind. How does one quantify the impact, or even measure the extent, of the wound? How does one ensure access to care for a type of injury vastly different from those our systems were built to treat?

While we are all called to service in different times and places, those of us who work each day with our military medical system face a vastly different challenge than what might have been expected just a few years ago. Every conflict brings its unique healthcare concerns and our military medical professionals are some of the nation’s most dedicated public servants, yet a confluence of factors unique to the current Southwest Asian conflicts is driving an unprecedented demand on our system that we are only beginning to face.

The statistics cited at the beginning of this article are frightening indeed when one considers that for much of the last two decades we have suffered fundamental decay in our nation’s healthcare infrastructure. Medical technology has improved dramatically. Pharmaceuticals are delivering amazing results that would have been unimaginable not long ago. Yet our public and private sector care delivery systems suffer from a host of ills, all of which place our nation in a perilous condition from which to engage the returning veteran with traumatic brain injury (TBI) or even post-traumatic stress disorder (PTSD).

Much of the challenge facing military medicine today, however, is to properly identify, measure, diagnose, and establish a treatment protocol or therapeutic regimen that is effective across a spectrum of TBI and PTSD conditions. In this article, we present topical areas of interest from the efforts of a consortium of thought leaders from military medicine, the VA, private industry, and the healthcare professions. Evolvent is a part of this consortium and supports its efforts through technology integration, clinical informatics, and telemedicine expertise. We are proud of this association, as we believe there is no more important issue that we can address to serve those who are wounded and those who are in harm's way.

Evolvent is a founding member of the Traumatic Brain Injury Consortium (TBIC), whose objective is to identify, validate, test, and transition integrated technologies that comprehensively address the complications associated with traumatic brain injuries. TBIC is a consortium of industry, academic, laboratory, and not-for-profit entities committed to helping solve the TBI problem for our country and our veterans. TBIC's senior advisory group consists of veterans' group leaders, retired military medicine Surgeons General, academicians, and business leaders, ensuring independent reviews of TBIC activities. TBIC's current five focus areas include:

• Sensor systems that characterize the forces at time and place of injury

• Advanced bio-markers that provide early identification and initial documentation of pathology

• Diagnostic imaging techniques

• Therapeutic and rehabilitative technologies to facilitate maximum recovery for these patients

• Integration of each of these technologies into health care records, policy making, and medical decision support tools

Figure 7. Integrated Solution

The graphic above represents the relationship of each of these elements and how TBIC brings it all together in a comprehensive set of solution constructs.


Our team understands that much can be done now to help those affected by TBI, while future research and development (R&D) efforts remain equally important. Our commitment is to ensure that every shred of knowledge is leveraged for the patients' welfare while simultaneously addressing Executive and Congressional mandates. The scale of the problem must motivate our industry and leaders to do everything possible to address this signature injury of the Afghanistan and Iraq conflicts.

Sensor Systems: Maturing sensor systems can now characterize the trauma forces at the time of injury via miniature, discrete sensors, some of which fit into earplugs or headgear worn by Soldiers, Marines, Sailors, or Airmen. These systems can be downloaded in the field for immediate quantification and initial medical assessment of injury. TBIC members recognize that characterizing the injuring force and documenting it in integrated medical records and trauma registries are the essential first steps in properly diagnosing, treating, and rehabilitating those injured and suffering from all levels of TBI. Further, by integrating blast force data with subsequent diagnostic and therapeutic investigations, more rapid advances will be realized, since injury severity can be correlated with outcomes. An accurate force characterization establishes the first part of the critical medical infrastructure required to really begin addressing this catastrophic injury. Further, this comprehensive integration of cumulative forces is critical for diagnosis, treatment, and rehabilitation not only for acute injuries, but also for the latent ramifications of cumulative sub-threshold exposures that still injure but only manifest symptoms over time.
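
As a rough illustration of how downloaded sensor data might feed an initial screening decision, the sketch below flags a service member for evaluation based on either a single large recorded overpressure or an accumulation of smaller ones. The thresholds and units are placeholders we chose for illustration; they are not validated medical criteria and are not drawn from TBIC's work.

    # Illustrative only: screening downloaded blast-sensor readings.
    # Both thresholds are hypothetical placeholders, not medical criteria.
    ACUTE_THRESHOLD_PSI = 4.0        # assumed single-event screening level
    CUMULATIVE_THRESHOLD_PSI = 10.0  # assumed cumulative screening level

    def screen_exposures(peak_overpressures_psi):
        """Decide whether recorded blast events warrant a medical referral."""
        cumulative = sum(peak_overpressures_psi)
        acute_events = [p for p in peak_overpressures_psi if p >= ACUTE_THRESHOLD_PSI]
        if acute_events:
            return f"Refer for evaluation: {len(acute_events)} event(s) at or above {ACUTE_THRESHOLD_PSI} psi"
        if cumulative >= CUMULATIVE_THRESHOLD_PSI:
            return f"Refer for evaluation: cumulative exposure {cumulative:.1f} psi"
        return f"No referral: cumulative exposure {cumulative:.1f} psi, no acute events"

    # Example downloads from an earplug- or helmet-mounted sensor
    print(screen_exposures([1.2, 0.8, 5.1]))       # single acute event
    print(screen_exposures([2.0, 2.5, 3.0, 3.1]))  # sub-threshold events that add up
    print(screen_exposures([0.5, 0.7]))            # no referral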

Bio-Markers: Once sensor systems characterize forces in excess of safe levels, advanced screening medical markers deployed in the field will initially help guide combat medics, health care providers, and field commanders as to who needs medical attention or evacuation versus those who can safely continue serving. TBIC is working with our national laboratories and academia to develop sensitive and specific bio-markers for forward-deployed early screening of TBI using a combination of bio-tronics, nanotechnology, and proteomic technologies.

Diagnostic Imaging: Soldiers identified as likely to have suffered a TBI or mTBI by these advanced sensor systems, early bio-markers, or clinical assessments will require definitive diagnostics that declare beyond any doubt the extent of their injuries. New diagnostic imaging techniques using MRI, along with other modalities, have recently finished FDA review and received certification for various similar neuro-pathology applications. TBIC is very excited to incorporate these newest advancements into clinical trials that offer the promise of much earlier definitive diagnosis. Not only are these newest digital imaging techniques more sensitive, but integration has already been successfully demonstrated in moving diagnostic images between the DoD and VA healthcare systems. With this milestone, soldiers being treated can experience a more transparent transition as they move between the two systems; their TBI data can be wherever they receive treatment. This properly documented triad of exposure, diagnosis, and treatment is crucial to our veterans receiving their just disability and compensation ratings. TBIC's commitment to integrating and automating all relevant TBI data into treatment records will greatly facilitate our warriors getting through the system, and just at the time when they are perhaps least able to help themselves.

Rehab / Therapeutics: Finally, TBIC is committed to finding, validating and transitioning any and all treatment modalities that improve any aspect of mTBI, or TBI rehab for our injured warriors. Patients frequently suffer from memory loss, decreased higher order executive functions, depression, labile emotions along with many more related symptoms. There are promising rehabilitative technologies that are being used around the world to accelerate the re-learning frequently required after these types of neuronal injuries. Much more remains to be investigated.


Figure 8. Solution Components

The graphical element above describes the integration of all of these components in our solution construct. Most of these critical Sensor Systems, Advanced Diagnostic and Rehab tools described above are available for transition in the very near term (less than 1 year).

Evolvent and our partners in TBIC are committed to the transition of these technologies after independent validation confirms each technology’s contribution in solving the TBI problem. TBI is a problem that will be with us for a long time. Evolvent and our TBIC partners are committed to making a difference for our war fighters, veterans and their families on this critical issue.


Security Challenges and Threats

Information technology has changed the way we work, communicate with one another, educate, entertain, and interact in just about every aspect of our lives. Although it has changed our lives subtly, it carries powerful implications for the future. We know from historical experience with other technologies that a new technology often produces effects we did not anticipate; this is the hallmark of a disruptive technology. Automobiles significantly improved our mobility but changed the character of our cities. Advancements in medical technology have forced us to confront life-and-death decisions of a kind never faced before.

Computer technology, while improving nearly every aspect of our lives, has created significant security challenges to counter invisible threats – viruses, hackers, terrorists, and foreign military digital warriors who are trained to disrupt or take control of targeted computer technologies. Hostile external and internal threats arrayed against the United States are extensive and sobering, presenting our government and nation’s industry with increasingly serious challenges to the security of technology systems and the information they process and store.

These threats span all types of operational venues to include traditional espionage, terrorism, cyber terrorism, foreign military cyber attacks, organized crime, technology transfers, internal system users, and rapidly developing technological changes. This makes all kinds of sensitive information vulnerable, such as classified government information, industry’s emerging scientific and technological breakthroughs, and financial, medical, and personal information.

Our rising reliance on information technology has increased the value of information systems and infrastructures as targets for exploitation. Recent events in the private, public, and government sectors show that uninformed decisions have led to information loss, compromised personnel data, legal consequences, adverse impacts on military operations, and the need for costly repairs.

In the end, CEOs and government leaders are responsible for protecting organizational assets: people, intellectual property, sensitive information, infrastructures, networks, and computing resources. This has become more important, and increasingly difficult, with the rise in the number and sophistication of cyber threats. It is important to understand the risks, whether created by threats or by advancing technology,1 in order to deal with them logically in a way that leads to the desired security result: a secure infrastructure that protects information in any form, at rest or in transit.

The security challenges and threats faced by the government, industry, and commercial sectors are nearly identical. Except for unique military hardware and space platforms, which all use computer technology, the government uses the same computer hardware and office automation software used by industry and the commercial sectors. Connected together by vast networks, all government, industrial, and commercial technology systems face the same security challenges and are vulnerable to the same threats. Several countries have established Information Warfare (IW) operations whose mission is to defensively counter electronic warfare attacks from other countries and to offensively initiate technology attacks on another country's military, industrial, and other sectors' networks and systems. Malicious code and viruses cannot distinguish between a government, industry, or commercial system, making all systems and networks vulnerable.

For example, the Department of Defense's prohibition of external hardware (e.g., flash drives, thumb drives, and CDs/DVDs) was a reaction to the recent impact of malicious code that had been circulating for the past few years, despite a virus protection file designed to counter it having been issued over a year earlier. Examples of IW operations were illustrated in the wake of the Russian-Georgian conflict, where strong evidence indicates that Russia coordinated a cyber attack against Georgia's Internet infrastructure. Other examples include China being blamed for launching hacking and malicious code campaigns against government, industry, and commercial sectors, while Israel, Hamas, and Iran continue to launch IW attacks against one another.

Although computer technology is providing us with wondrous improvements to our lives, the invisible threats from malicious code and foreign military IW operations present unique security challenges that cannot be resolved or countered by perimeter-based technological security solutions alone. One threat that may seem small in comparison with IW activities and malicious code is the set of vulnerabilities routinely found in software. IW operations and malicious code look for software vulnerabilities that can be used as a doorway to gaining control of a server or computer.

Without software vulnerabilities, the impact of malicious code on computers and networks would be minimal, and IW operations might be limited to denial-of-service attacks that flood the Internet and networks with directed information overload. The threat from software is not just in the application design, where the programmer may have left a "back door" in the code; it also comes from software patches, which are released frequently. Resource-strained IT departments may install high-priority updates for servers and computers first and leave the less threatening patches for later. IW personnel and hackers who understand the impact of resource constraints will focus on exploiting this routine gap, probing for weak points where they can install malicious code and potentially take "undetected" control of several thousand computers.
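
The routine gap described above is exactly what a proactive patch management process is meant to close. The sketch below shows the basic compliance check behind such a process: compare installed versions against required minimums and flag the hosts that need attention first. Host names, package names, and version numbers are invented for illustration.

    # Illustrative patch-compliance audit; all names and versions are made up.
    REQUIRED_MINIMUMS = {
        "database-server": "8.0.4",
        "web-framework":   "2.3.1",
    }

    INVENTORY = {
        "host-a": {"database-server": "8.0.4", "web-framework": "2.3.1"},
        "host-b": {"database-server": "8.0.1", "web-framework": "2.3.1"},
        "host-c": {"database-server": "7.9.9", "web-framework": "2.2.0"},
    }

    def version_tuple(version):
        """Turn '8.0.4' into (8, 0, 4) so versions compare numerically."""
        return tuple(int(part) for part in version.split("."))

    def audit(inventory, minimums):
        """Return hosts still running software below the required patch level."""
        findings = []
        for host, installed in inventory.items():
            for package, minimum in minimums.items():
                if version_tuple(installed[package]) < version_tuple(minimum):
                    findings.append((host, package, installed[package], minimum))
        return findings

    for host, package, have, need in audit(INVENTORY, REQUIRED_MINIMUMS):
        print(f"{host}: {package} {have} is below required {need}; patch first")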

After taking control of multiple servers and computers, hackers are then able to initiate a distributed denial-of-service attack that historically has been able to shut down a network. In January 2003, the "Sapphire" worm, also known as "SQL Slammer," spread rapidly throughout the world. Sapphire was designed to look for a specific buffer vulnerability in Microsoft SQL servers. Microsoft had proactively identified the threat and released the necessary security patch six months prior to Sapphire's appearance; however, the world was slow to respond. The rapid spread of this malicious code clearly illustrated a lack of proactive patch management capability in many organizations across all sectors. Prior to this incident, our security team had implemented a defense-in-depth security process for a government client that included proactive patch management. As a result of the procedures we designed and implemented, Sapphire had no impact on the client's worldwide email and video conferencing operations: no downtime for any system.

Although virus protection is designed to protect systems, typical virus definition files are only designed to detect “known” viruses and do not afford protection against new and evolving malicious code. New virus definition files are normally released only after a recognized negative impact to computers and servers, and usually one to several days after the virus/malicious code has hit the World Wide Web. This obviously presents a significant challenge to securing networks. One of the most neglected threats is the “insider.”

Virus protection software is only effective if it is properly kept up to date. If it is not, computer users with Internet access may download infected programs, music, or pictures, or connect to personal web mail, creating potential backdoors around the organization's firewalls and intrusion protection systems and an unimpeded avenue for intruders and malicious code to gain undetected access to network resources. Additionally, firewalls and intrusion protection systems only help protect the network if someone, an insider, has properly configured the devices and is constantly managing them. In other words, does the insider stand up a firewall and never change the "out-of-the-box" configuration settings? Insiders will not normally act foolishly if they are properly educated on computer threats and the requirement to adhere to government, industry, or commercial entity policies. They must also be made aware of the repercussions of non-compliance: to the network, the cost of repair, and even their career. The challenges to securing systems and networks are tremendous, and our experience has shown that they cannot be resolved with a technological solution alone. It is easy to fall into a limited defense-in-depth approach that is solely designed to create a network infrastructure resistant and resilient to external attacks.

Although this approach does help, it proved unsuccessful this past fall when it failed to prevent a malicious code attack that impacted the Department of Defense. Our experience tells us that a true enterprise "Defense-in-Depth" process must take a holistic view: people (insiders and outsiders), facilities, policies, education, awareness, and technology. The process must also convey a clear organizational understanding and in-depth knowledge of vulnerabilities and threats and the associated risks and impacts on critical lines of business. In the end, the desired goal is to protect systems from inside and outside threats and to secure information, whether that information is government classified, protected health information, industrial proprietary data, or personal information.

Protection from constantly changing threats requires the implementation of sound security practices that are flexible enough to handle these threats and technological changes. Evolvent's core of subject matter experts in IW, physical, personnel, information, and technological security, whose experience dates back to the late 1960s, implements these practices for our clients on a daily basis.

Our knowledge has allowed us to put into effect a holistic defense-in-depth security posture for commercial and government clients. Through a comprehensive "lessons learned" understanding of the changing threats and challenges to security, our security methodology has kept our clients' information secure and flowing smoothly.


Behavioral Health

The National Defense Authorization Act (NDAA) for Fiscal Year (FY) 2006 and 2007 authorized the Secretary of Defense to develop an innovative, internet-based delivery of behavioral health tools to improve or augment military and civilian healthcare systems in providing early diagnosis and treatment of Post-Traumatic Stress Disorder (PTSD) and other mental health conditions.


As the practice of medicine has evolved, so has its need to address the multiple, and sometimes recurring, concussive forces experienced by warriors on the non-linear battlefield. In order to address these behavioral health challenges resulting from PTSD, Traumatic Brain Injury (TBI), and Mild TBI (mTBI), the Military Health System (MHS) must leverage information technology to raise casualty care to transformational levels.

Evolvent has been an integral part of the MHS' effort to reach these levels. We recognize, however, that our support of transformational advancements in the quality and productivity of the MHS requires tools and techniques that bring computer science and engineering to bear on the social and behavioral sciences. Evolvent Technologies, Inc. is currently working three key behavioral health programs for the MHS component called Defense Health Information Management Systems (DHIMS) and for the Army's Medical Command: the Neurocognitive Assessment Tool (NCAT), the Army Telehealth Program, and Virtual Worlds Second Life®, each described in the following paragraphs.


NCAT

This program is designed to help DHIMS improve the delivery of neuropsychological health assessment services to our active duty military personnel. Evolvent’s technical solution was carefully crafted to provide an NCAT capability that enables baseline and post-event screening to occur at both theater and sustaining base echelons of care.

Our solution meticulously incorporates the features and capabilities of the Automated Neuropsychological Assessment Metrics (ANAM) to ensure a delivery consistent with the current infrastructure and operating and integration environment, and an implementation tightly aligned with current DHIMS processes and methodologies. Several years ago, the DoD recognized the pressing need for a computer-based neuropsychological assessment capability and invested in the research and subsequent development of components of the ANAM.

Evolvent’s technical solution was carefully crafted to provide an NCAT capability that enables baseline and post-event screening to occur at both theater and sustaining base echelons of care.

The components of ANAM were then licensed by the U.S. Army to the University of Oklahoma Center for Human Operator Performance for further development. Evolvent has been an integral part of past and recent efforts to develop and deliver this critical capability. With the recent wars in Afghanistan and Iraq, the criticality of deploying this capability has taken on a new sense of urgency and importance.

Our team and our solution are therefore characterized by a personal and professional commitment to delivering a proven, world-class neurocognitive assessment capability. The goal of this effort is to develop the optimal IM/IT integrated stand-alone, connected, and web-based solution to deploy, manage, and operate a neurocognitive assessment tool in Theater, within the United States, and at overseas military treatment facilities, out to the farthest forward treatment echelon practical and allowable.

The completed system will operate on a hardware and software architecture maintained by the Medical Communications for Combat Casualty Care (MC4) component in Theater and will permit functional users to administer, store and retrieve assessment test result data to be used to support both diagnostic and research activities. NCAT assessment results will be centrally stored and available to all authorized NCAT users throughout DoD.
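To make the storage-and-retrieval idea concrete, the sketch below (hypothetical field names, not the actual NCAT or MC4 design) shows a central store of assessment results that only authorized NCAT users may query.

    # Illustrative sketch only: central storage of assessment results with
    # retrieval limited to authorized users. Names are hypothetical.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AssessmentResult:
        service_member_id: str   # e.g., a DoD identification number
        test_type: str           # "baseline" or "post-event"
        score: float

    @dataclass
    class CentralAssessmentStore:
        authorized_users: set = field(default_factory=set)
        _results: List[AssessmentResult] = field(default_factory=list)

        def store(self, result: AssessmentResult) -> None:
            self._results.append(result)

        def retrieve(self, requesting_user: str, service_member_id: str) -> List[AssessmentResult]:
            if requesting_user not in self.authorized_users:
                raise PermissionError("Not an authorized NCAT user")
            return [r for r in self._results if r.service_member_id == service_member_id]

    store = CentralAssessmentStore(authorized_users={"provider_smith"})
    store.store(AssessmentResult("1234567890", "baseline", 98.5))
    print(store.retrieve("provider_smith", "1234567890"))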


Leveraging Healthcare IT to Achieve Better Value

Dr. Brent James, a highly respected healthcare quality expert, believes that throughout most of human history, doctors have done more harm than good. Prior to the adoption of germ theory and the scientific method in the 19th century, caregivers often bled patients or induced vomiting in an attempt to heal them. Since then the scientific method has revolutionized medical science several times over, but there is one important aspect on which we still fall short. After a new treatment has been validated by clinical trials, decisions about which treatment to use in which situation are largely left to the judgment of practitioners, as opposed to a rigorous scientific measurement of their comparative effectiveness.1

Advanced IT is available to caregivers today, yet we still struggle with realizing value and best case outcomes in healthcare. Consider how IT has affected the finance community; we can travel virtually anywhere in the world and still have access to all of our financial information. But if we do not have the same level of access to our basic medical information when away from home, we put our lives at risk should we need medical attention, regardless of the skills of the physician providing treatment.

Healthcare information technology (HIT) is the most commonly mentioned prerequisite for generating sustainable progress toward greater value in healthcare through processes that improve quality, monitor outcomes, provide clinical decision support, develop evidence, track costs, streamline paperwork, improve care coordination, and facilitate patient engagement.2 It is one thing to promote HIT to achieve these goals, but how we implement HIT will determine whether or not we actually achieve them.

The topmost focus on improving the nation's healthcare system must be on patient safety and the quality of care provided. However, we should not undervalue the impact HIT improvements can have on the bottom line. One study published in a 2005 Health Affairs journal article indicated a potential savings of $77.8 billion per year across the industry if a national health information exchange (HIE) network were fully implemented.3 Considering today's fragmented United States healthcare system, can a fully implemented HIE be achieved in a reasonable timeframe, e.g., by 2014? Imagine, if you will, that it is achievable; what, then, is the true value of using IT in healthcare?

The Center for Information Technology Leadership (CITL) conducted a study to answer the question, "What are the demonstrated benefits of a given system or application?" The findings revealed an absence of literature assessing the true value of HIT. Most studies focused on one or two benefits of a given HIT rather than the three dimensions of financial, clinical, and organizational benefits.

The CITL defines HIT value as the sum of financial, clinical, and organizational benefits directly resulting from the implementation of a given HIT.4 Effective HIT decision making requires capturing all of the tangible and intangible benefits to derive real evidence of its value. We already know how well the industry collects data, but converting that data into information that demonstrates the value of HIT will help enable the effective management of healthcare values and outcomes. It will also help providers make better investment decisions, and in turn facilitate better decision-making about adopting “best practice” HIT solutions.
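As a minimal illustration of that definition, the sketch below sums the three benefit dimensions; the dollar figures are invented placeholders, not study results.

    # Minimal illustration of the CITL framing: HIT value as the sum of
    # financial, clinical, and organizational benefits. Figures are hypothetical.
    benefits = {
        "financial": 1_200_000,      # e.g., reduced duplicate testing ($/year)
        "clinical": 800_000,         # e.g., adverse events avoided, monetized
        "organizational": 350_000,   # e.g., staff time and workflow savings
    }

    hit_value = sum(benefits.values())
    print(f"Total HIT value: ${hit_value:,} per year")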

A comprehensive representation of technology value is derived by collecting and analyzing data. With the help of IT, our healthcare industry has become adept at data collection, but it is definitely behind the power curve in effectively analyzing its data. The National Research Council recently described the healthcare industry as "…an information- and knowledge-intensive enterprise." The dilemma revolves around the industry's ability—or lack thereof—to transform this data into information and knowledge.5 Although there have been some successful HIT implementations, even the most successful implementations seem to fall short. Because of the transactional nature of our healthcare system, its IT systems do not effectively assimilate the data, information, and knowledge necessary to provide optimal, situationally appropriate care to us when we are sick. Worse, these systems fall well short of providing the proactive value that could help prevent us from becoming sick.

Providers get lost amid all the data, tests, and monitoring equipment trying to understand a particular patient’s health status. This is where we need innovation that focuses on using HIT to produce the outcomes we need, rather than implementing HIT for the sake of implementing HIT. We must move beyond a focus on electronic medical record (EMR) adoption to focusing on information flow and communication.6 It is one thing to talk about being able to share information between different organizations, but if an individual organization cannot efficiently share its own information internally, how is connecting it to another organization going to help it meet its goals and objectives?

Take for example the Joint Commission's call for a National Performance Measurement data system.7 The goal of the system is to improve the safety and quality of healthcare provided to the public. Our current national infrastructure will not support such a system, and even if a national infrastructure were in place, most organizations are not ready to exchange information inside their own institution, much less outside the institution in a standardized manner. The result is a situation where data is collected in fragmented ways, resulting in an incomplete view of performance quality for organizations or individual healthcare providers.

Our industry must embrace HIT and the strategies of knowledge management that have made other industries successful. We need medical knowledge embedded into our clinical workflows to provide true clinical cognitive support to providers at the point of care in order to improve clinical outcomes. We need to embed medical knowledge into our finance workflows in order to provide finance professionals with the clinical inputs needed to make truly informed investment decisions. We need to embed medical and financial knowledge into our administrative workflows in order to provide healthcare executives with just the right clinical and financial information needed to improve the overall operations and efficiency of our healthcare organizations. We need to embed medical, financial, and administrative knowledge into our research workflows in order to accelerate research that provides the best value and the best outcomes.

Since 2004, two U.S. Presidents have issued a call to have an electronic medical record for every American by 2014. We must provide demonstrable evidence of value generation and best-case outcomes if we expect HIT to help us reach that goal.

As healthcare industry thought leaders and subject matter experts tackle these tough value and outcomes questions, we begin to see applications emerge that hold promise in moving the industry forward in terms of squeezing true value out of HIT. For example, the EMR is rapidly becoming the focal point for moving towards effective value and outcomes management. With that said, the majority of US hospitals are still in the early stages of EMR adoption.

Just as the topmost focus on improving the nation's healthcare system must be on patient safety and quality, the current trend towards clinical decision support systems must be based on patient-centered, evidence-based, contextual, transparent, and learning principles. Decision makers are currently faced with vague rules poorly tailored to the evidence. Getting there is a function of having the right information available on costs, outcomes, and the strength of the underlying evidence. These principles must be baked into healthcare information systems to help assess value and provide guidance for decision-making.8

Effectively managing the values and processes in healthcare using IT promises huge dividends. It promises to reduce medical errors, improve healthcare outcomes, and save costs along the way. Just as physicians have learned over time not to discount the importance of science in identifying care guidelines and treatments, the HIT community must also embrace the science behind measuring the value of IT to the healthcare system. What could we do with $77 billion saved in healthcare costs?

1. David Leonhardt, Making Health Care Better, The New York Times, November 8, 2009
2. Value In Health Care: Accounting for Cost, Quality, Safety, Outcomes and Innovation; Institute of Medicine of the National Academies, An IOM Learning Healthcare System Workshop Preliminary Discussion Brief, March 2009
3. The Value of Health Care Information Exchange and Interoperability; J. Walker, E. Pan, D. Johnston, J. Adler-Milstein, D. W. Bates, and B. Middleton; Health Affairs – Web Exclusive, The Policy Journal of the Health Sphere, January 15, 2005
4. Finding the Value in Healthcare Information Technologies; D. Johnston, E. Pan, B. Middleton; Center for Information Technology Leadership, 2002
5. Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions; William W. Stead and Herbert S. Lin, Editors; Committee on Engaging the Computer Science Research Community in Health Care Informatics, Computer Science and Telecommunications Board, Division on Engineering and Physical Sciences, The National Research Council of the National Academies; The National Academies Press, January 9, 2009
6. Toward Health Information Liquidity: Realization of Better, More Efficient Care From the Free Flow of Information; S. Penfield, K. M. Anderson, M. Edmund, and M. Belanger; Booz Allen Hamilton White Paper, January 2009
7. Health Care at the Crossroads: Development of a National Performance Measurement Data Strategy; The Joint Commission White Paper; 2008
8. Value In Health Care: Accounting for Cost, Quality, Safety, Outcomes and Innovation; Institute of Medicine of the National Academies, An IOM Learning Healthcare System Workshop Preliminary Discussion Brief, March 2009


The Future of the Personal Health Record

How do we get from the typical static PHR of today to the idealized PHR? How do we best combine public policy, technology, incentives, and education to maximize the chance that our health care system can deliver a PHR that is not tied to any single care delivery organization (CDO), but offers easy access to all of a patient’s clinical data wherever it resides? How do we leverage better patient data for a better patient experience?

Voluntary Universal Healthcare Identifier

In order to efficiently and safely collect a patient’s clinical data from different CDOs and different electronic medical record (EMR) systems into one PHR, it is necessary to uniquely identify a patient across different CDOs. The best way to do this is to implement a universal healthcare identifier. Demographic matching algorithms now used for this purpose are prone to error, with potentially catastrophic results on patient safety. For political reasons, it is unlikely the government will attempt to establish a mandatory universal identifier anytime soon. This leaves a voluntary universal healthcare identifier (VUHID) as the best near-term solution.

Global Patient Identifiers, Inc. (GPII) has proposed a mechanism to implement a VUHID. Visit http://gpii.info/ to learn details about how VUHID would work. Our concern in this article is the benefit of a VUHID to individual patients and the public policy that may be needed to bring VUHID into mainstream use.

The benefits of VUHID for a patient are:

Convenience: Once a VUHID is issued, patients should no longer have to repeatedly supply identifying information, even when seeing a new physician for the first time.


Fewer patient identification errors: The VUHID reduces Enterprise Master Patient Index (EMPI) demographic matching errors, estimated to occur 8% of the time.

Reduced risk of identity theft: Participating in the VUHID system means a patient’s medical information is linked using the unique identifier rather than by name, address, birth date, social security number, telephone number, etc. This eliminates the potential for breaches of confidentiality associated with demographics-based exchange of such vital data.

Better control of medical privacy: A key benefit of the VUHID system is the ability to provide private identifiers (PVIDs) to an individual. Private identifiers are anonymous. They work by enabling electronic linkage of various types of clinical information without revealing a patient's identity. For example, if a physician orders an HIV test and does not want anyone to know for whom the test was ordered, a PVID is attached to the sample before it is sent to the laboratory, and the laboratory returns the result to the physician based only on the PVID.
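A minimal sketch of that idea follows, assuming hypothetical identifiers and functions; the mapping from PVID to patient never leaves the ordering physician's system, so the laboratory handles only the anonymous identifier.

    # Conceptual sketch (hypothetical identifiers): a private identifier (PVID)
    # links an order to its result without exposing the patient's identity.
    import secrets

    # The ordering physician's system keeps the PVID-to-patient mapping.
    pvid_to_patient = {}

    def order_lab_test(patient_name: str, test: str) -> dict:
        pvid = secrets.token_hex(8)          # anonymous identifier for this order
        pvid_to_patient[pvid] = patient_name # mapping stays with the physician
        return {"pvid": pvid, "test": test}  # only the PVID goes to the lab

    def lab_report(order: dict, value: str) -> dict:
        # The laboratory sees only the PVID, never the patient's identity.
        return {"pvid": order["pvid"], "test": order["test"], "result": value}

    order = order_lab_test("Jane Doe", "HIV antibody")
    report = lab_report(order, "negative")
    print("Result for", pvid_to_patient[report["pvid"]], "->", report["result"])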

Better medical care: In support of the vision of a Nationwide Health Information Network (NHIN), the VUHID makes it feasible for physicians to have complete and accurate medical information available at the point of care. It has repeatedly been shown that a physician equipped with complete medical information is able to provide more efficient and less error-prone care for their patients. The VUHID plays a key role in making optimum care feasible by eliminating confusion concerning a patient’s identity. We see the beginnings of this capability in today’s health information exchanges (HIE) around the country.

There are a lot of tough policy issues surrounding health care, but the VUHID is not one of them. It can be argued the VUHID (or something very similar) is a prerequisite both to the future of the PHR and to ensuring the safe transmission of patient data between CDOs and HIEs using the NHIN. We would support federal government actions to move the industry towards the adoption of a VUHID system, perhaps by including VUHID interoperability in EMR accreditation requirements.

Access to Clinical Data from Multiple CDOs

The long-term answer here is ubiquitous EMRs that can securely port patient data into a PHR at the patient's request through the NHIN, which would become the arbiter of data standards and formats.

In the meantime, the technology available today that enables the transfer of clinical data from multiple CDOs into a PHR and accommodates varying states of CDO EMR maturity is Health Postbox Express (HePoEx). It is a plug-in to HealthVault that allows CDOs to securely send clinical data to a patient’s HealthVault account. HePoEx supports delivery of records in the Continuity of Care Record (CCR) format for any CDO that can support the CCR format and allows clinical data to be sent in almost any other electronic format. If the provider does not have an EMR, the system can handle scanned copies.

To receive their health records, patients provide their HePoEx ID to their provider, along with the authorization form for Release of Information (ROI). The provider or CDO staff then logs on to www.hepoex.com with a provider account, enters the HePoEx ID provided by the patient, and uploads the clinical data. The data is converted into a single encrypted package and delivered to HealthVault.

The patient then receives an email with a 16-digit pass code and detailed instructions for retrieving the clinical data. After the pass code is verified, the patient is required to provide the answer to the secret question he or she previously registered with HePoEx. After providing the correct answer (two-factor authentication), the patient is able to import the clinical data into his or her HealthVault record.
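The sketch below illustrates the two checks described above – the emailed pass code and the registered secret question – using hypothetical function names; it is not HePoEx's actual implementation.

    # Illustrative sketch of the retrieval checks described above; names and
    # details are assumptions, not the HePoEx implementation.
    import secrets

    def issue_passcode() -> str:
        """Generate a 16-digit pass code to email to the patient."""
        return "".join(str(secrets.randbelow(10)) for _ in range(16))

    def verify_retrieval(supplied_code: str, expected_code: str,
                         supplied_answer: str, registered_answer: str) -> bool:
        code_ok = secrets.compare_digest(supplied_code, expected_code)
        answer_ok = supplied_answer.strip().lower() == registered_answer.strip().lower()
        return code_ok and answer_ok

    passcode = issue_passcode()
    if verify_retrieval(passcode, passcode, "Rex", "rex"):
        print("Second factor verified; clinical data may be imported.")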

This model is attractive to CDOs because no IT infrastructure setup or training is required. Any provider staff member can request an account, open a browser, and start sending health records securely within minutes. This decreases turnaround time for ROI requests, resulting in increased patient satisfaction. For the majority of providers that do not yet have EMRs, it is cost effective when compared to the time and money spent making paper copies. Automatic audit reports for completed ROIs assist CDOs with HIPAA compliance.

Often when a patient requests an ROI from a provider, it is specific to a referral. A limitation of the HePoEx model is that it only works if the referral provider agrees to look at the patient's PHR. If the referral provider insists on paper copies because looking at clinical documentation in a PHR is too much of a disruption to the CDO's workflow, then this model does not work. To turn the tables, if patients start taking their business to specialty providers that incorporate this type of business process, and away from those that do not, CDOs will quickly adjust.

Secure Patient-Provider Communication

According to an August 2009 survey by Lightspeed Research, one in two Americans would like the ability to email their provider to ask questions and request prescription refills. Women are more likely than men to desire this online communication. In addition to the obvious convenience of saving time, 49% of patients prefer written over verbal instructions from their provider because written instructions make it easier to comply with treatment recommendations. Forty-nine percent of patients are also unwilling to pay for an email consult, which explains why integrated payer-provider CDOs like Kaiser are so far ahead of most other CDOs that rely on third-party payers in this regard.

Although many providers are at first resistant to the idea of opening their practice to email, after trying it most find responding to an email easier than responding to a phone consult because it is easier to budget their time. For example, if a provider has five minutes available before a scheduled office visit, he or she may take the time to respond to an email consult, but not pick up the phone and respond to a phone consult, as there is no telling how long that phone call would take.

Hopefully, the medical home initiative and reform towards paying providers to manage medical conditions instead of visits will incentivize better and more widespread patient-to-provider communication in the not-too-distant future. The technology is ready as soon as the business model is ready.

Deliver Trusted Health Education Resources

Patients want information on their (and their loved ones’) health concerns. They want to stay up to date as medical science evolves and new treatments or medications are released related to their health concerns. Providers want their patients to get trusted health information, so they do not have to waste valuable time in the exam room debunking fad treatments.

Thanks to technologies such as Facebook and LinkedIn, most patients are familiar with the concept of “subscribing” to topics of interest and receiving updates in one location. It is not surprising that PHRs and third party websites are copying this model.

If the patient has a question, with one mouse click he or she can initiate an email to the provider that references the content he or she just read. This streamlines the business process not only for the patient but also for the provider, because the provider can more quickly understand the clinical context of the request (as opposed to a query based on, “my brother told me he heard about some new drug on CNN”).


Deliver Social Networking Functionality

"Our profession, at its core, is fundamentally flawed relative to how today's world communicates and functions. The infrastructure of health care needs a total repair from the ground up. It needs to be Facebook'd and Wiki'd." – Dr. Jay Parkinson, Hello Health

Social networks offer CDOs a new tool to connect with healthcare consumers. As consumerism continues to take hold and consumer-directed health plans become more commonplace, CDOs need to remain in constant communication with the patients they serve in order to keep those individuals loyal to their facility or practice. Social networks allow for multifaceted, regular communication with patients that assists in building and maintaining this necessary loyalty. As consumer control over the spending of their healthcare dollars increases, their behaviors will more closely mimic their behavior in purchasing other products and services.

"We can't expect government to build a network. Government didn't build the Internet. Government didn't build PCs… It’s you and me." David Kibbe, MD, American Academy of Family Physicians

PHRs will ultimately drive the sharing of data between HIEs

There is a great deal of innovation going on right now that is dramatically increasing the ability of patients to manage their own care and better communicate with their providers. The synergy between this innovation and the economic trends towards patient empowerment will ultimately become an unstoppable force driving toward the idealized PHR. The idealized PHR (or something close to it) can deliver enough benefits to enough patients that it truly could become the "data broker" between HIEs in the event the NHIN is unsuccessful.


Virtual Healthcare Artifact and Image Management

One of the top challenges echoed throughout the healthcare industry and in the halls of the legislative branch of government is the need for true "interoperability" – the need for free healthcare information exchange and truly informed decision making at the point of care. The technology is available today, but solutions seem to be mysteriously elusive. We have seen some successful Electronic Medical Record (EMR) related information technology implementations, but even the most successful implementations fall short in terms of making ALL of a patient's information available at the point of care. Why can't artifacts and images be made available? Because they typically rest scattered throughout the system in disparate repositories managed by disparate organizations, completely inaccessible to the point-of-care providers who are doing their best with a limited set of information. It is no wonder mistakes are made, it is no wonder people are hurt, and it is no wonder healthcare spending is reaching twenty percent of the gross domestic product. Just as in times of national crisis and need, the Military Health System (MHS) is answering the call of duty to deliver what is needed at the point of care.

To give you an idea of how many points of care we are talking about, consider this: more than 137,000 MHS staff support over 9.1 million beneficiaries at 412 medical clinics, 414 dental clinics, and 65 military hospitals. Whether in Air Force or Army medical clinics and hospitals within the United States, aboard the USNS Mercy hospital ship sailing the seas, on the ground in the remote mountains of Afghanistan, or on a beach under hostile fire, the MHS is determined to find a solution. That is why it is looking to partner with innovative companies like Evolvent Technologies, Inc., who understand the MHS' mission and that of the four Services, to develop these much-needed solutions.

The MHS was interested in a solution that would enhance the Department of Defense's (DoD) Electronic Health Record (EHR) by including aspects of the health record from any media. This military EHR evolution is expected to help bridge the gap between DoD and other Federal agencies (e.g., Veterans Affairs) to not only improve the quality of care, but also to "close the loop" in the continuum of care by making ALL of a patient's necessary information available anytime, regardless of where medical attention may be needed.

The military’s EHR is in place, but it is in different states of awareness with respect to artifacts and images (A&I). Here is how it currently works: the automation effort in a military healthcare setting begins with patient appointment scheduling, registration, check-in and discharge, and accounting and billing. The next step is computerized physician order entry (CPOE) with the implementation of modules for laboratory, radiology, pharmacy, and other ancillary orders. As a patient traverses the encounter workflow, there may be a multitude of other patient A&I generated through the continuum of care such as waveforms (i.e., EKGs), x-rays, imaging studies, photographs, scanned documents, video and audio files. This is where a major split in the patient “record” occurs; the A&I generated as a result of the encounter are typically not attached or associated with the EHR in any way. Instead, they are stored electronically and on paper in various disparate systems, including the patient’s paper medical records. This scenario describes why the MHS desires to enhance the EHR such that it becomes a complete electronic health record by incorporating A&I, free text documents, and structured data.

Currently, DoD patients may have paper medical records stored in different locations. An encounter record may be stored in one system, ancillary data in another system, and radiological images and imaging studies for various modalities (e.g., Computed Tomography, Magnetic Resonance Imaging, Cardiovascular Imaging, Nuclear Medicine, and Dental Imaging) are stored in Picture Archiving and Communication Systems (PACS) and are not easily accessible by the point-of-care providers. The healthcare provider typically has to search multiple systems and request medical records from multiple locations, if there is time, to fuse the needed information into a complete picture of the patient. Furthermore, some data are not easily accessible to DoD clinicians outside the clinical department where the A&I were acquired. You probably already have an idea of the end result, but in summary, one can only form a complete view of a patient's current health status by expending a great deal of resources manually searching for, gathering, and assimilating information that is scattered across disparate systems into a complete medical record. This reiterates the MHS' dire need to establish a single, virtual location for all patient-centric medical information. Can this be done? In short, YES!

Patient data can be captured, reformatted into an electronic version based on standards, and linked via metadata to the patient's electronic health record. Capturing, storing, and managing all of this A&I would bring the complete medical record together by combining computable and non-computable data in a single system. As a first step towards this vision, the MHS awarded Evolvent a contract to develop a scalable, maintainable, efficient, and effective technical solution that allows essential healthcare A&I generated during the healthcare delivery process to be accessible across the DoD. The MHS also wants to make the information available to Department of Veterans Affairs medical facilities and other Federal agencies by connecting it to the proposed National Health Information Network (NHIN). This objective supports an initiative started by former President George W. Bush and carried forward by President Barack Obama: to create an electronic health record for every American by 2014.

This new contract, called the Healthcare Artifacts and Images Management Solution (HAIMS), tasks Evolvent to help design and develop an enterprise-wide imaging system that links all patient A&I to their electronic health records. It is expected to offer healthcare providers a single source for all patient-centric data. Radiological images stored in PACS will be linked, and artifacts, including scanned documents, EKGs, and other file formats, will be efficiently stored and managed. The need for specialists to share images will be met by providing rapid access to image studies and workflow tools, and shared A&I will be accessible across the DoD and VA in real time.

The solution will use a sophisticated architecture designed to cost-effectively address complex issues such as a globally dispersed system, network latency, interrupted communications, large image and file sizes, redundancy, and storage. The metadata will be displayed via a web application module where providers can search for available A&I and, via a pick list, pull and display A&I on their clinical workstation. This architecture will also enable the scanning and capture of any and all other types of image data, including paper records provided by out-of-network care, data shared with the VA and other Federal agencies in the future, and certainly the historical paper medical record. HAIMS will link patient data that addresses the needs of both primary care clinicians and specialists.
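A simplified sketch of the metadata-linking concept follows; the field names and identifiers are assumptions for illustration, not the HAIMS data model.

    # Simplified sketch (hypothetical fields): artifacts and images are
    # registered with metadata that links them to a patient's record, and
    # providers query that metadata regardless of where the files reside.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ArtifactMetadata:
        patient_id: str      # links the artifact to the patient's EHR
        artifact_type: str   # e.g., "CT study", "EKG waveform", "scanned document"
        location_uri: str    # where the artifact actually resides (e.g., a PACS)

    registry: List[ArtifactMetadata] = [
        ArtifactMetadata("EDIPI-0001", "CT study", "pacs://site-a/studies/42"),
        ArtifactMetadata("EDIPI-0001", "scanned document", "file://records/intake.pdf"),
    ]

    def find_artifacts(patient_id: str) -> List[ArtifactMetadata]:
        """Return metadata for every artifact linked to the given patient."""
        return [m for m in registry if m.patient_id == patient_id]

    for item in find_artifacts("EDIPI-0001"):
        print(item.artifact_type, "->", item.location_uri)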

Specialists will be able to review and interpret A&I and generate reports that will aid providers in their treatment plans. HAIMS will help give providers a complete medical record, giving them the entire medical picture from a single source and enabling them to provide safer care at the right time, no matter where the point of care may be.


Thought Synch: Proof of Need for a Virtual Data Framework Vision

Evolvent thought leaders have written over the last couple of years about the current data-rich healthcare environment and how data continue to pile up despite organizations' best efforts to harness the power of those data for the benefit of patient care and business intelligence. We are in sync with others in the healthcare industry who have also written about this dilemma, expressing from different vantage points a need for a solution. Our vision of a Virtual Data Framework (VDF) goes beyond the current literature on this dilemma by offering a path to resolution. My Evolvent colleagues and I, along with many other industry thought leaders, have discussed the healthcare industry's adeptness at data collection, but we posit it is behind the power curve in effectively using its data.

In a recent report from the National Research Council (NRC), the modern healthcare industry is described as "…an information- and knowledge-intensive enterprise."1 But the dilemma revolves around the industry's ability—or lack thereof—to transform these data into information and knowledge. An objective of the NRC study was to determine how computer science-based methodologies and approaches might help toward a resolution. Although the researchers found some successful information technology (IT) implementations, they found even the most successful implementations fell short. Because of the transactional nature of our healthcare system, its IT systems do not effectively assimilate the data, information, and knowledge necessary to provide optimal, situationally appropriate care to me when I'm sick. Worse, these systems seem to do nothing proactive to prevent me from becoming sick. It stands to reason, then, that the NRC report would describe a healthcare system where providers get lost amid all the data, tests, and monitoring equipment trying to understand my particular health status.

So how can IT be used to provide the cognitive support healthcare providers need? The NRC report offers excellent strategic suggestions and echoes what many of healthcare's thought leaders are saying, but it does not offer any potential tactical ideas. Evolvent's VDF vision fills this tactical gap between where thought leaders are strategically trying to point organizations and the execution support organizations need to get there.

A well-planned, well-developed, and deliberately executed VDF will help give healthcare providers the transparent tools they need to develop a holistic picture of each patient under their care, whether it is a reactive or a proactive (preventative) picture.

Health Information “Liquidity”

The Federation of American Hospitals and Booz Allen Hamilton collaborated on a White Paper that provided a strong case for the free flow of information.2 Although the paper's focus was on health information exchange (HIE) between organizations, the full benefits of inter-organizational HIE cannot be realized until each organization is able to achieve free flow of information internally. I agree with the idea that implementing electronic health records (EHR) is needed to help an organization move away from a paper-based system. However, as the White Paper suggests, we must move beyond a focus on EHR adoption to a focus on information flow and communication. I agree, and would take the thought a step further by stating we must move beyond a focus on IT adoption in general.

Implementing a Virtual Data Framework across an organization allows the organization to focus on outcomes.

Implementing a VDF across an organization allows the organization to focus on outcomes. A VDF takes advantage of innovative IT products developed on approved IT standards; it will not only help an organization achieve internal business efficiencies and improved clinical quality and safety, it will also better position the organization for sharing information within the National Health Information Network and help our country achieve its goal of having an electronic health record for every American by 2014.

National Performance Measurement Data Strategy

The Joint Commission (JC) embarked on a public policy initiative in 2001 with the mission of improving the safety and quality of healthcare provided to the public. As part of this initiative, the JC published a white paper in 2008 to frame the issues associated with establishing a national performance measurement data system.3 I agree with the JC's assessment that the current national infrastructure will not support such a system. Even if a national infrastructure were in place, most organizations are not ready to exchange information inside their own institution, much less outside the institution in a standardized manner. The JC describes a situation where data are collected in fragmented ways, resulting in an incomplete view of performance quality for organizations or individual healthcare providers. The report goes on to say IT "…could alleviate much of the burden associated with data collection as long as the systems have been designed with the requisite functionality to support performance measurement activities."

A VDF would make use of the latest innovative technologies capable of pulling together and transforming these fragmented data – whether from EHR systems, PHR systems, registries, etc. – within the organization into information and knowledge to assist with measuring an organization’s own performance against national patient safety and quality standards. It will also facilitate an organization’s ability to transparently share its data with a national performance measurement data system using a scalable and adaptable framework that easily allows for adjustments when new performance measurement reporting requirements are developed and implemented across the healthcare industry.


The Value of Healthcare Information Exchange and Interoperability

I believe the topmost focus on improving the nation’s healthcare system must be patient safety and the quality of care provided. However, we cannot undervalue the impact improvements can have on the bottom line. One particular study published in a 2005 Health Affairs journal article indicated a potential savings of $77.8 billion per year across the industry if a national health information exchange network were fully implemented.4 What does fully implemented HIE mean? The article described it as, “…electronic data flow between providers (hospitals and medical group practices) and other providers, and between providers and five stakeholders with which they exchange information most commonly: independent laboratories, radiology centers, pharmacies, payers, and public health departments.”

Bringing the right information in the right format to the right action officer or decision-maker.

Because there are so many different entities with which an organization needs to share data, the effort and complexity required to establish independent interfaces are overwhelming. Implementing a VDF consisting of the right foundation – a service-oriented architecture (SOA) – and the right information management toolset would provide a practically seamless and transparent way to help an organization realize some of these reported HIE savings.
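The sketch below illustrates the service-oriented idea in miniature, with invented source names: each data source sits behind one common interface, so a consumer queries a single service instead of maintaining point-to-point interfaces.

    # Minimal sketch of the SOA idea behind a VDF: sources share one interface
    # and a single service aggregates them. Source names are invented.
    from typing import Dict, List, Protocol

    class PatientDataSource(Protocol):
        def fetch(self, patient_id: str) -> Dict:
            ...

    class EHRSource:
        def fetch(self, patient_id: str) -> Dict:
            return {"encounters": ["2009-11-02 primary care visit"]}

    class LabSource:
        def fetch(self, patient_id: str) -> Dict:
            return {"labs": [{"test": "HbA1c", "value": 6.8}]}

    class VirtualDataService:
        """Single access point that aggregates all registered sources."""
        def __init__(self, sources: List[PatientDataSource]):
            self.sources = sources

        def patient_view(self, patient_id: str) -> Dict:
            view: Dict = {}
            for source in self.sources:
                view.update(source.fetch(patient_id))
            return view

    service = VirtualDataService([EHRSource(), LabSource()])
    print(service.patient_view("patient-123"))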

Evolvent’s Vision: Virtual Data Framework (VDF)

There is no doubt multiple data collection and information management products have penetrated the commercial and federal healthcare markets, but have they really helped organizations juggle their multiple data sources to improve enterprise data reconciliation, sharing, access, and reuse?5


The recent industry reports referenced in this article have validated our thought leadership in this area by indicating the problem really does exist and urgently needs a solution. Our thought leaders are in sync with others in terms of possessing a clear understanding of the industry's information dilemma. However, we like to take things a step further to help our customers put thoughts into practice, and developing a VDF vision reflects this ethos. We have described our pursuit of a real business intelligence solution6 that provides a flexible, intuitive, and common view across multiple data sources (clinical and business), a solution that reduces the time it takes to integrate new data gathering modalities into the portfolio, a solution that provides real-time information access to meet decision support requirements, and a solution that tactically and securely answers the question of "how" within current healthcare industry information management practices.7

We have described the need for a solution that uses the right mix of information management tools to effectively transform Data into Information, into Knowledge, and into Wisdom, making the results accessible across the enterprise for truly informed clinical and business decisions and performance measurement – "…bringing the right information in the right format to the right action officer or decision-maker."8 We have also described the complexities of the tool selection process, which is further complicated by the lack of interoperability not only within organizations but also between organizations, and how this results in increased costs and the propagation of organizational inefficiencies such as Overuse of Labor, Incompatible Systems, Data Inaccessibility, and Complexity and Dysfunction.9

Consider closely our VDF vision and how it could solve these interoperability and integration challenges. Put on your “critical thinker” cap and ask us the tough questions. We are known for our innovative thinking and have been engaged by multiple clients who are tired of the information management status quo. Our nation is expecting an IT miracle as we are being rapidly propelled towards a national health information network.


Evolvent has the right team of innovative industry thought leaders and subject matter experts, including physicians and vendor partners, to make a significant difference for the common good. Our VDF vision is only one example of where we can make a difference.

1. Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions; William W. Stead and Herbert S. Lin, Editors; Committee on Engaging the Computer Science Research Community in Health Care Informatics, Computer Science and Telecommunications Board, Division on Engineering and Physical Sciences, The National Research Council of the National Academies; The National Academies Press, January 9, 2009
2. Toward Health Information Liquidity: Realization of Better, More Efficient Care From the Free Flow of Information; S. Penfield, K. M. Anderson, M. Edmund, and M. Belanger; Booz Allen Hamilton White Paper, January 2009
3. Health Care at the Crossroads: Development of a National Performance Measurement Data Strategy; The Joint Commission White Paper; 2008
4. The Value of Health Care Information Exchange and Interoperability; J. Walker, E. Pan, D. Johnston, J. Adler-Milstein, D. W. Bates, and B. Middleton; Health Affairs – Web Exclusive, The Policy Journal of the Health Sphere, January 15, 2005
5. A Virtual Data Framework Vision; J. Daniels; Advance for Health Information Executives, Vol. 12, Issue 11, p. 37, November 1, 2008
6. In Pursuit of Business Intelligence: The Virtual Data Network (VDN) Vision; Nisar Baig; Evolvent Magazine, Vol. 1, 2008
7, 8, 9. No IT solution will be the "magic bullet" until the industry agrees on interoperability standards and IT product vendors develop to those standards. Evolvent understands this limitation and is committed to helping its customers successfully navigate these waters.


An Information Protection Strategy: Beyond Perimeter Defenses

In light of recent events across the United States involving information breaches that put Protected Health Information (PHI) and Personally Identifiable Information (PII) in the hands of those who would use the information for personal gain, public and private sector agencies struggle to put protective measures in place to stop the hemorrhaging. Evolvent Technologies, Inc. has placed significant focus over the last nine years on leveraging technology-enabled business security processes to protect business intelligence.

This article demonstrates an effective approach to information protection, one that safeguards information and mitigates the likelihood of unauthorized disclosure. Through a clear understanding of the threat and a technology-enabled, process-driven approach available from Evolvent and our partner Fortify, organizations can achieve information protection. An Evolvent program termed the "Information Protection Project" (IPP) focuses on information protection well beyond the traditional perimeter approach commonly applied to secure information. Our method is a structured approach to building a layered defense for the protection of data. We understand the challenges of today's interconnected global information environment: "a risk accepted by one is shared by many." Through a collaborative effort on everyone's part – from the end user through the project sponsor – we can reach this information protection goal.

A best-practices approach to information protection should start with targeted research and the development of protection systems that are adaptive and ubiquitous. Rigid system solutions are eventually bypassed, compromised, or discarded as the environment in which they must work evolves.

The dynamic and innovative measures taken by realistic threats demand of us a better science for understanding complex systems or, at a minimum, tools that help us understand their dynamic operation.

The Threat

Technology enhancements have enabled greater efficiency in our business processes. At the same time, we have increased our dependency on technology and thus our vulnerability. The possibility of compromised information systems in critical business operations poses an even greater risk to our businesses. Technology has enabled attacks against our way of life from within our organizations and from abroad. While purely perimeter-based defenses have been somewhat effective, they have forced criminal elements to focus more on internal vulnerabilities to reach their goals. The inability of businesses to secure the very systems on which they have become wholly dependent could well be the weakness that attackers exploit. Information assurance is the application of controls to mitigate the risk of exposure of our information systems.

An Information Protection Strategy

The information protection strategy encompasses an industry best-practices layered defense. If you can envision your data as a three-dimensional ball surrounded by three rotating concentric rings of defense, the advantages of a layered defense become clear. Where no single mechanism is infallible in protecting data, multiple mechanisms leverage the benefit of synergy to protect it. This defensive posture allows organizations to present a unique picture to the attacker at each attempt, reducing the attacker's ability to penetrate organizational defenses while raising the risk that repeated attempts of extended duration will be exposed. This layered defense creates significant risk to the attacker and thus eventually renders the attack unaffordable.
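As a toy illustration of the layered idea, the sketch below grants access only when every defensive layer approves a request; the layers are generic placeholders rather than a description of any specific deployment.

    # Toy illustration of defense in depth: a request must clear every layer
    # before it reaches the data. Layer names are generic placeholders.
    def perimeter_firewall(request: dict) -> bool:
        return request.get("source") != "blocked_network"

    def authentication_layer(request: dict) -> bool:
        return request.get("credentials") == "valid"

    def data_layer_authorization(request: dict) -> bool:
        return request.get("role") in {"clinician", "records_admin"}

    LAYERS = [perimeter_firewall, authentication_layer, data_layer_authorization]

    def allow_access(request: dict) -> bool:
        """Access is granted only if every defensive layer approves the request."""
        return all(layer(request) for layer in LAYERS)

    print(allow_access({"source": "clinic_lan", "credentials": "valid", "role": "clinician"}))      # True
    print(allow_access({"source": "blocked_network", "credentials": "valid", "role": "clinician"}))  # False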

Our approach focuses on four key essentials to Information Protection:

• Security Investment Strategy
• Information Assurance
• Software Security Assurance
• Vulnerability Management

A Security Investment Strategy

If given one dollar to increase the protection of information for your enterprise, Evolvent recommends spending that dollar across five broad but important categories. Spend the first fifteen cents on defining the enterprise information protection policy. Employees and administrators alike need a consistent understanding of the organization's security policy in order to support and integrate those security goals with business strategies. We recommend spending the next forty cents on marketing to raise awareness of the information protection program: twenty cents on marketing to general users and the other twenty cents on marketing to the information technology professionals in the organization's IT shop. Evolvent's Security Awareness process helps our clients reduce overall security risk to their operations by integrating security into people's daily business decisions – motivating them to support and implement security criteria. We draw upon over thirty years of extensive experience in security to create world-class awareness campaigns designed to meet government and corporate objectives and regulatory requirements.

Our Security Awareness programs are designed for all employees, at any organizational level. Spend ten cents on defining what needs protection and what the internal and external threats to that information are. Understanding what data is critical and prioritizing its protection will lead to informed decisions about data protection investments, and understanding data relationships and dependencies is needed to ensure recovery operations can occur. A well-defined information protection program takes all of these things into account and offers a measure of resiliency which guarantees the information can be brought back on-line within the desired parameters, leaving nothing to chance.

Spend the next twenty cents on technology. The IPP is a good fit here. An information protection system that uses role-based permissions, coupled with adaptive hardware encryption, truly exemplifies the model that future network and data protection systems will follow. Last, but certainly not least, spend fifteen cents on process improvement. Streamlined security business processes, which employ a life-cycle approach to security, not only end unnecessary redundancies, they also reduce points of vulnerability. Application code reviews are a good example of this. These processes also facilitate the integration of information protection into the organization's strategic planning process, thus aligning protection initiatives in IT with the business goals of the organization.
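The one-dollar allocation described above can be tallied directly from the text; the short sketch below simply confirms the categories account for the whole dollar.

    # The one-dollar allocation described above, expressed in cents; the
    # numbers come straight from the text and sum to 100.
    allocation_cents = {
        "information protection policy": 15,
        "awareness marketing to general users": 20,
        "awareness marketing to IT professionals": 20,
        "defining what needs protection and its threats": 10,
        "technology (e.g., the IPP)": 20,
        "process improvement": 15,
    }

    assert sum(allocation_cents.values()) == 100  # the whole dollar is accounted for
    for category, cents in allocation_cents.items():
        print(f"{category}: {cents} cents")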

Information Assurance

Information Assurance involves much more than integrity, availability, and confidentiality of data and more than protecting a computer information system. Information Assurance is a systematic approach to protecting an organization’s intellectual property. Evolvent is a growth organization with an impressive track record of providing Information Assurance products and services. We are specialists in providing Information Assurance products and services to government, commercial, and military agencies. We bring an unparalleled depth of expertise and resources in Information Assurance to provide best value security products and services for our clients.

Our approach to Information Assurance follows a continuous, four-phase life cycle. This methodology defines the activities, services, technology, and project management processes required to assess the organization and provide a blueprint for success. The key activities involved in this approach align directly with the accepted standard DoD, federal, and commercial approaches (DIACAP and NIST) and include:

• Assess: Focus on understanding the design and readiness of:

  − Environment (systems, networks, applications, facilities, and personnel)
  − Processes (policies and plans)
  − Related risks (internal, external, and emerging)

• Address: Review policies, plans, evidentiary documentation, and training, and implement countermeasures and architectures that address the risk associated with the organization's security posture.

• Test: Ensure that the requirements are executed in accordance with organizational priorities (verification/validation). The team can perform verification and validation assessments to ensure that the security protective measures perform as intended.

• Monitor, Plan, and Improve: Provide critical near- and long-term guidance along with enhanced process and workflow, reducing risk, schedule, and cost. Our information assurance process leverages our strength and the strength of the organization we are supporting. This includes:

  − A solid life-cycle methodology and approach
  − Identification of risks and vulnerabilities
  − Analysis of current or planned implementations against requirements
  − The next steps that must be taken to support requirements
  − Planning, including detailed recommendations that are collaboratively derived

Software Security Assurance

According to the January 2009 edition of the CERT Common Sense Guide to Prevention and Detection of Insider Threats, 3rd Edition – Version 3.1, of the 176 cases analyzed:

• 44% involved IT sabotage
• 14% involved theft or modification of information for financial gain
• 14% involved theft or modification of information for business advantage

The report further stresses that “insider IT sabotage is a threat to any organization that relies on an IT infrastructure for its business, regardless of the size or complexity of the configuration.”

This is the face of the emerging threat. As organizations have focused mainly on perimeter defenses, the threat has focused its attention inward. Potentially one of American organizations' greatest threats is one to which we are giving little attention. The U.S., through the H-1B program, gives foreign nationals access to the most advanced components of America's technology industry. An area specifically at risk is software design. Control of compilers is a key part of a secure software development process. There are several examples where perpetrators, rogue software developers, have inserted trapdoors in financial software to later steal money undetected. This kind of carefully orchestrated and planned attack has become more commonplace, while our organizational policies and government laws have lagged behind.

A tested and proven methodology to address this risk is Software Security Assurance (SSA). SSA is a complete end-to-end approach for securing applications against the threat of cyber-attack. Where Software Quality Assurance ensures that software will function, SSA addresses both the immediate challenge of removing vulnerabilities from operational applications and the ongoing systemic challenge of producing secure software. It also ensures that all software – whether developed in-house, outsourced, or acquired from vendors or even the open source community – is in compliance with internal and external security mandates.
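As a generic illustration of an SSA-style release gate (not a Fortify-specific workflow), the sketch below rejects a build when a scan reports more high-severity findings than the mandate allows; the scan results are hard-coded stand-ins for a real analysis report.

    # Generic sketch of an SSA-style release gate; findings are hypothetical.
    from typing import Dict, List

    MAX_HIGH_SEVERITY = 0  # internal mandate: no known high-severity flaws ship

    def release_allowed(findings: List[Dict]) -> bool:
        high = [f for f in findings if f["severity"] == "high"]
        return len(high) <= MAX_HIGH_SEVERITY

    scan_report = [
        {"id": "SQLI-12", "severity": "high"},
        {"id": "XSS-03", "severity": "medium"},
    ]

    print("Release allowed:", release_allowed(scan_report))  # False: fix SQLI-12 first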

Organizations that have adopted comprehensive SSA programs have experienced a measurable reduction of risk in existing applications and significant reductions in vulnerability counts coming from new software releases and procurements. This process reduces the overall cost of ensuring information security throughout the organization. These reductions are realized from eliminating production vulnerability management costs and reducing dependency on manual external application assessments. Time to compliance is also accelerated.

The Fortify Advantage

Fortify offers customized solutions for building SSA programs based on a three-pillared approach built on Fortify's three core competencies – the Fortify 360 Product Suite, a world-class Global Services division, and the cutting-edge Security Research Group. These three pillars are rooted in Fortify's Framework*SSA, the institutional knowledge, experience, and expertise harnessed from working with over 500 customers on developing their SSA programs.

Summary of Benefits

The key benefits of developing an SSA program with Fortify are a measurable reduction of risk and cost effectiveness.

Data Protection

Our Information Protection Solution also incorporates a specially developed hardware tool called Tactically Unbreakable COMSEC (TUC) for protecting data in transit and data at rest. TUC is a hardware device based on reprogrammable logic, which allows the device to adapt fully to the changing cyber security environment. There are compelling advantages to using hardware rather than software for encryption – hardware does not introduce the latency associated with pure software encryption solutions.

Functional Requirements

Our key management and cryptographic implementation make systems far more secure than the database-driven key management procedures typical of legacy encryption implementations. Keys are randomly generated at session initialization; no keys are stored in a database where they would be subject to compromise by brute-force attack. Our protection program can (at client request) leverage existing authentication processes already in place at the client organization. Our implementation requires no infrastructure upgrade and no added cost to adjust perimeter defense mechanisms. Data-at-rest capabilities exist for all removable media, as well as encryption of the hard drive by file type, folder name, hard drive partition, or entire volume as required. The unique nature of our implementation bases the execution of this encryption scheme wholly on business rules and requires no input by the user.
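A minimal software sketch of the session-keying idea described above may help make it concrete. This is an illustration only, not the TUC implementation (which is hardware-based): it assumes the third-party Python cryptography package, and the helper names (open_session, protect, unprotect) are hypothetical.

```python
# Illustrative sketch only: ephemeral, per-session keying in software.
# TUC performs this in hardware; this merely shows the concept that keys
# are generated at session start and never stored in a database.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # third-party 'cryptography' package


def open_session() -> bytes:
    """Generate a fresh 256-bit key for this session only (hypothetical helper)."""
    return AESGCM.generate_key(bit_length=256)


def protect(session_key: bytes, plaintext: bytes, associated_data: bytes = b"") -> bytes:
    """Encrypt and authenticate one record; the random nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)                      # 96-bit nonce, unique per message
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, associated_data)
    return nonce + ciphertext


def unprotect(session_key: bytes, blob: bytes, associated_data: bytes = b"") -> bytes:
    """Verify and decrypt a record produced by protect()."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(session_key).decrypt(nonce, ciphertext, associated_data)


if __name__ == "__main__":
    key = open_session()                        # exists only for the life of the session
    sealed = protect(key, b"patient record 42")
    assert unprotect(key, sealed) == b"patient record 42"
```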


System Requirements

Tactically Unbreakable COMSEC (TUC) is a multi-form factor capability based on Field Programmable Gate Arrays (FPGA) which cannot be corrupted by attackers. This hardware nature offers the following advantages:

• TUC cannot be corrupted or accessed by hackers
• The on-FPGA PowerPC is physically and programmatically isolated
• All "Trusted Processor" programming, data, and buffers are isolated
• Digital certificates and public/private keys are stored on the Trusted Processor TUC module – totally physically isolated
• Hackers cannot access or corrupt digital certificates or keys
− Used to validate/certify all key host software and data files
− Used to validate, encrypt, and digitally sign all protected files
• Establishes a legal "Chain of Certification" which can be tested and proven
• May be used as certification for system software, data files, and downloads
• Every item is digitally signed, encrypted, and run-time verified (a minimal sign-and-verify sketch follows this list)
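The sign-and-verify sketch referenced in the last bullet is shown below. It is a software illustration of the general pattern (sign every artifact, verify before use), not the TUC design itself; in TUC the private key would remain on the isolated Trusted Processor, whereas here it is held in memory only for demonstration, using the Python cryptography package.

```python
# Conceptual sketch of sign-then-verify-at-load. In TUC the private key never
# leaves the isolated Trusted Processor; here it is held in memory purely
# for illustration.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature


def sign_artifact(private_key: Ed25519PrivateKey, artifact: bytes) -> bytes:
    """Produce a detached signature over a software or data artifact."""
    return private_key.sign(artifact)


def runtime_verify(public_key, artifact: bytes, signature: bytes) -> bool:
    """Refuse to trust anything whose signature does not check out."""
    try:
        public_key.verify(signature, artifact)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    signer = Ed25519PrivateKey.generate()
    firmware = b"trusted host software image"
    sig = sign_artifact(signer, firmware)
    assert runtime_verify(signer.public_key(), firmware, sig)
    assert not runtime_verify(signer.public_key(), firmware + b"tampered", sig)
```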

During a protected communication between TUC hardware devices, TUC automatically sets up a virtual private tunnel for passing encrypted information. This process takes place during system initialization without user intervention.

Interface Requirements

Interface issues with the TUC hardware devices are non-existent. Because TUC operates at the data link layer of the OSI model, it is agnostic to events at the application or presentation layer where interface requirements exist. TUC also operates below the operating system layers and thereby works with Mac OS X 10.x, Microsoft Windows, industry-standard internet browsers, and Microsoft Office applications.

When leveraging current authentication capabilities, TUC provides a two-factor authentication capability.


The system is also interoperable with network appliances (VPN, firewall, IPS/IDS) operating at layer 3 of the OSI model. TUC supports smart card readers and can be tailored to integrate into the boot sequence of a Microsoft workstation or server environment. The system will also integrate with a broad range of operating systems, tape drives, and back-up policy types.

Vulnerability and Compliance Management

No protection component of a secure environment is infallible over time. As threats increase in capability, our protection systems must be dynamic and capable of change to shore up the new vulnerabilities that emerge. Our engineers perform vulnerability, attack, and penetration assessments in internet, intranet, and wireless environments for our clients. We also perform discovery and scanning for open ports and services, conduct application penetration testing, and interact with the client as required throughout the engagement. Our teams provide reports documenting discoveries during the engagement and debrief these discoveries to the client, with mitigation recommendations, at the conclusion of each engagement. Our SMEs also provide FISMA program management and database maintenance and support for compliance management. For one client, this entire effort resulted in success with respect to ensuring data integrity, monitoring and fixing compliance, helping users and managers correct deficiencies, and compiling FISMA reports for submission to OMB and Congress.
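As a simple illustration of the "discovery and scanning for open ports and services" step mentioned above, the sketch below checks a short list of common TCP ports on a single host. It stands in for, and is far simpler than, the commercial scanners our engineers actually use, and should only be run against systems you are authorized to assess.

```python
# Minimal TCP connect check illustrating open-port discovery; run only
# against hosts you are authorized to assess.
import socket
from contextlib import closing

COMMON_PORTS = [21, 22, 25, 80, 139, 443, 445, 1433, 3306, 3389]


def open_ports(host: str, ports=COMMON_PORTS, timeout: float = 0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                found.append(port)
    return found


if __name__ == "__main__":
    print(open_ports("127.0.0.1"))
```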

EVASE™

Along those same regulatory lines, having recognized the need to manage vulnerabilities as part of an enterprise protection strategy while adding value by helping federal organizations with FISMA reporting, the Evolvent team developed the Evolvent Vulnerability Assessment Security Engine (EVASE™). EVASE is a FISMA-compliant reporting tool that can be used by a wide range of personnel throughout the entire agency.


EVASE is a multi-functional, secure, web-based information assurance and cyber security software package fully complemented by Evolvent's cyber security consulting services. EVASE is intuitive for users and requires minimal training to implement and use. The resulting functionality permits hierarchical user and management control of data submissions; allows users to enter data as drafts or works-in-progress awaiting management approval; and gives management the capability to review, edit, and submit FISMA-related information to a higher headquarters while segmenting data at lower echelons.
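The hierarchical draft-and-approve flow described above can be pictured as a small state machine. The sketch below is purely illustrative; the state names and transitions are assumptions for explanation and do not represent the actual EVASE data model.

```python
# Hypothetical sketch of the hierarchical submission flow: users save drafts,
# managers approve, and approved items roll up to the next echelon.
from enum import Enum, auto


class Status(Enum):
    DRAFT = auto()          # entered by a user, not yet visible upward
    SUBMITTED = auto()      # awaiting management review
    APPROVED = auto()       # management has reviewed and edited
    FORWARDED = auto()      # released to the higher headquarters

ALLOWED = {
    Status.DRAFT: {Status.SUBMITTED},
    Status.SUBMITTED: {Status.DRAFT, Status.APPROVED},   # can be returned for rework
    Status.APPROVED: {Status.FORWARDED},
    Status.FORWARDED: set(),
}


def advance(current: Status, target: Status) -> Status:
    """Move a FISMA data submission to `target` if the transition is allowed."""
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot move {current.name} -> {target.name}")
    return target
```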

EVASE Tracker, a dynamic web-based vulnerability engine, uploads ISS® and Harris Stat® vulnerability scan results for managing, controlling, and tracking security vulnerabilities. Tracker can also be configured to manage vulnerability assessments from other management tools for mitigation tracking. Tracker allows managers to review hundreds of vulnerabilities by operating system, host/IP, and severity. Managers can then assign resources to mitigate those vulnerabilities, assign completion dates, estimate labor costs, prioritize resources, and track completion rates as individual system security improves. EVASE Score adds even more.
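To illustrate the kind of roll-up Tracker performs, the sketch below models a vulnerability finding and computes a completion rate and an open-findings-by-severity summary. The field names are assumptions chosen for the example, not EVASE's schema.

```python
# Illustrative record shape and roll-up for a Tracker-style vulnerability list.
from dataclasses import dataclass
from collections import Counter


@dataclass
class Finding:
    host: str
    operating_system: str
    severity: str        # e.g. "high" / "medium" / "low"
    assigned_to: str = ""
    mitigated: bool = False


def completion_rate(findings):
    """Fraction of findings already mitigated (0.0 - 1.0)."""
    return sum(f.mitigated for f in findings) / len(findings) if findings else 1.0


def by_severity(findings):
    """Count open findings per severity so managers can prioritize resources."""
    return Counter(f.severity for f in findings if not f.mitigated)


if __name__ == "__main__":
    scan = [
        Finding("10.0.0.5", "Windows Server", "high", "j.doe", True),
        Finding("10.0.0.5", "Windows Server", "medium"),
        Finding("10.0.0.9", "Linux", "high"),
    ]
    print(f"{completion_rate(scan):.0%}", by_severity(scan))
```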

This web-based application allows clients to first create a database of their systems (system inventory) and then perform a FISMA-equivalent security self-assessment based on National Institute of Standards and Technology (NIST) standards. This is a critical capability for managing your IT environment. Our experience with clients from the public and private sectors tells us that many organizations do not have accurate inventories of their computing environments. EVASE optionally includes an integrated documentation control tool and form manager that uses any client's forms (e.g., OMB M-04-25) to route them among participants, assign input and review/approval tasks, and track progress and status at any organizational level. Reporting and output can be customized by the client to report by form or by Microsoft Office product, such as Word or Excel. This documentation control tool permits assignment of documents by individual or position, automatic email notification, and status reminders.

EVASE Scan combines Evolvent's cyber security consulting services with EVASE software technology. This integration provides clients with efficient and cost-effective information assurance services, including vulnerability scanning and mitigation, certification and accreditation, Certificate of Networthiness applications, development and staffing of Network Operations and Security Centers (NOSC), and general security advisory services.


Virtual Data Network

In Pursuit of a Single, Unified Data Integration Platform

Data collection and information management applications continue to abound in commercial and federal organizations. Many organizations struggle with multiple data sources that threaten organizational data quality, and each day that goes by the dilemma worsens as data continues to pile up. Wouldn't it be great if an organization could implement a solution that provides a flexible, intuitive, and common view across multiple data sources? Wouldn't it be great if an organization could significantly reduce the time it takes to integrate new data gathering applications into the portfolio, and provide real-time information access to meet decision support requirements? Most companies would agree that having standardized, responsive, and user-intuitive access to all of their enterprise data sources from one place would significantly enhance their business intelligence capabilities. This article explores how enterprise data can be made available by using an integration platform and an innovative collection of information management tools to more effectively and efficiently transform data into information and knowledge for faster business decisions.

Selecting the right data modeling and analysis tools is an arduous, complex effort with which many organizations struggle. As such, organizations must repeat the tool selection process multiple times just to create a set of tools to build a complex portfolio of applications that meets all of their data analysis and decision support requirements. To further complicate the process, the tools are most often unique, not easily integrated, and force the organization to implement even more applications to mitigate the disparities, or to just accept a technology "wag the dog" state of affairs. This obviously increases costs and contributes to a multitude of organizational inefficiencies. See if you can recognize some of these inefficiencies.


Mr. Jones has access to so many different applications that he has difficulty remembering his passwords. After being with the company for six months, he becomes embarrassed by having to contact the help desk a couple of times a week for a password reset. He knows the company hired him in part because of his strong organizational skills; therefore, he sets out to devise a plan that will help him keep track of his many passwords. First he considers creating a spreadsheet with all his passwords and locking the spreadsheet with a password. He then reconsiders, thinking "I'll probably forget this password, too," and remembers what he heard in newcomer's orientation about the company's policy against storing one's passwords in a file on the computer. To avoid the risk of getting caught violating this policy, he decides to try using a 3x5 index card, writes his passwords on the card in pencil because he knows the passwords change every 90 days, and keeps the card in his "brain book."1

Ms. Smith uses many of the same systems as Mr. Jones; however, her reports are either not consistent with his or are oftentimes redundant, and the differences cause much confusion among her corporate leadership. Not to be outdone by Mr. Jones, however, she refuses to share the ad hoc reports she created that senior leadership seemed to like.

Acme Hospital Corporation is proud of its multi-state reach. The CEO occasionally travels from state to state visiting the corporation's many hospitals. During an office visit with one hospital's administrator, the discussion focused on the financial statements of the hospital. The administrator displayed a report on his computer monitor to try to explain why the hospital seemed to be overexecuting its budget. However, the CEO became frustrated because the report he was looking at looked nothing like the report he was accustomed to viewing in his corporate office. The CEO quickly realized by the end of his trip that something must be done to standardize the company's data analysis processes.


Evolvent recognizes these challenges and has conceptualized a virtual data network (VDN) that supports the integration of heterogeneous data. This approach has attracted a great deal of attention from some of our clients as a mechanism for making data—structured and unstructured—more accessible and accurate at a lower cost across the enterprise. The VDN is envisioned as an extended infrastructure—a data integration platform with a robust information management toolset—that leverages metadata to support existing data warehouse and/or data mart operations. The optimal outcome of such a construct is a virtual view of all enterprise data transformed from multiple data sources into information and knowledge. In a White Paper published by the Healthcare Information and Management Systems Society (HIMSS), Wickramasinghe, et al. described it as "…embracing the techniques of data mining and the strategies of knowledge management."2

A VDN can optimize an organization's business processes and create an effective business analytics capability. For example, retrieving immediate results for a developed query from virtually any system fosters an environment in which every member of an organization can help monitor operating costs, support improved patient care decisions, or manage any number of operations using a system with a standardized look and feel. If built using existing data warehouses/marts, data does not have to be moved or copied from source systems, additional storage is normally not required, and companies do not have to keep duplicate versions in sync. An incorporated data integration platform would maintain a central metadata directory that formats, transforms, maps, and aggregates data between different sources and target(s) during a workflow session.
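A very small sketch can show what a metadata-driven virtual view looks like in practice: each source registers a mapping to the common schema, and consumers query one standardized record format without the data being copied. The source names and field mappings below are hypothetical.

```python
# Minimal sketch of a metadata-driven virtual view: each source declares how
# its fields map onto a common schema, and queries see one unified record
# format without the data being moved.
FIELD_MAP = {
    "billing_db":  {"pt_id": "patient_id", "amt_due": "balance"},
    "clinic_emr":  {"patientIdentifier": "patient_id", "outstanding": "balance"},
}


def to_common_schema(source: str, record: dict) -> dict:
    """Rename source-specific fields to the enterprise-standard names."""
    mapping = FIELD_MAP[source]
    return {mapping.get(k, k): v for k, v in record.items()}


def virtual_view(sources: dict) -> list:
    """Present rows from every source as one logical, standardized data set."""
    return [to_common_schema(name, row) for name, rows in sources.items() for row in rows]


if __name__ == "__main__":
    rows = virtual_view({
        "billing_db": [{"pt_id": "A17", "amt_due": 125.00}],
        "clinic_emr": [{"patientIdentifier": "A17", "outstanding": 125.00}],
    })
    print(rows)
```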

In mergers and acquisitions, companies are pressed to quickly leverage a new customer base to establish credibility, build loyalty, and capitalize on cross-sell and up-sell opportunities. For example, two newly merged divisions of the same company—each with its own customer base—are looking for cross-sell opportunities between the divisions. The company would be faced with the daunting task of consolidating vast stores of customer names, addresses, business history, and other information stored in incompatible and geographically dispersed systems. A data integration platform would provide the data access, cleansing, transformation, and movement services required for an optimal transition. Additionally, it would function as the nucleus of its Customer Data Integration (CDI) initiative.3 A VDN would be much more than just another way to access data.

Tools must be integrated to transform the data into actionable information and knowledge within an information and process improvement continuum. Evolvent calls this collection of tools an "Information Management Toolset." Wickramasinghe, et al. refer to it as an "Intelligence Continuum," describing it as "…a collection of key tools, techniques and processes of today's knowledge economy; i.e. including but not limited to data mining, business intelligence/analytics and knowledge management."4 The toolset addresses issues such as data integration and data quality associated with using different systems. These issues may include data fragmentation, data synchronization, data inconsistency, and data silos. The suite of tools available in a VDN could mitigate these issues with processes such as real-time data cleansing. For example, new records are evaluated upon entry into the system to determine whether the data are new or modifications of existing data and, if necessary, are reformatted into the organization's standard data formats.

Business rules integrated into the VDN are applied during this evaluation to ensure the highest possible levels of data quality. Therefore, the data are already clean and ready for the application of analytical tools within the VDN toolset by anyone in the organization from anywhere.
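The following sketch illustrates the on-entry cleansing just described: an incoming record is normalized to standard formats and then either inserted as new or merged into an existing record. The specific rules and field names are placeholders, not a client's actual business rules.

```python
# Sketch of on-entry cleansing: normalize the incoming record, then decide
# whether it is genuinely new or an update to an existing one. The rules shown
# (upper-case names, dash-free phone numbers) are illustrative only.
def normalize(record: dict) -> dict:
    """Apply the organization's standard formats before the record is stored."""
    out = dict(record)
    out["name"] = out["name"].strip().upper()
    out["phone"] = "".join(ch for ch in out.get("phone", "") if ch.isdigit())
    return out


def upsert(store: dict, record: dict) -> str:
    """Insert a new record or merge changes into an existing one, keyed by ID."""
    clean = normalize(record)
    key = clean["patient_id"]
    if key in store:
        store[key].update(clean)
        return "modified existing record"
    store[key] = clean
    return "new record"


if __name__ == "__main__":
    db = {}
    print(upsert(db, {"patient_id": "A17", "name": " Jones, Bob ", "phone": "703-555-0100"}))
    print(upsert(db, {"patient_id": "A17", "name": "JONES, BOB", "phone": "7035550100"}))
```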

The toolset resides in the data integration layer of the architecture and consists of the tools that provide data mining, business intelligence and analytics, and knowledge management services within the VDN. This toolset can be built around and scaled to the needs of the organization. The process for selecting an appropriate toolset mix for the organization can be made simple to minimize the time it takes and the resource costs associated with choosing the right VDN solution.

The Need for Integration

The need for integration is greater than ever. Organizations seek to link applications and legacy systems with newer applications in order to improve the efficiency and effectiveness of internal and external processes. Among the most common problems and challenges are tightly coupled systems, vendor lock-ins or integration dependencies, inconsistent and platform-dependent interfaces, and complicated migrations. Data access in these technologies is usually pre-built, so adding new data sources requires developer effort.

The wide range of packaged integration alternatives among the various product categories makes it difficult for enterprise architects to select the best alternative for their integration needs. Scenario-based guidance on the relative suitability of different solution types among the individual integration products and vendors can help select best-of-breed solutions.

Selecting the best solution for a particular type of integration can be a challenging task. Should you select an enterprise application integration (EAI) tool or use an extract, transform, and load (ETL) tool for moving data? Should you consider an enterprise service bus (ESB), or a business services hub (information brokered) for coordinating exchanges with external business partners? The high degree of functional overlap found among these various categories of integration solutions makes this choice less than obvious. The “right” answer depends on a wide range of variables that are normally difficult to measure accurately. In these situations, architects should refine the product evaluation process by analyzing the particular integration scenarios they face. Employing a scenario based approach that covers a range of integration situations and the typical solution approaches to each will help enterprise architects determine the best course of action for their organization.


Reviewing a range of integration scenarios and the software tools used in each will also help enterprise architects see how certain tools are better suited than others in solving specific integration problems such as data quality, integration, performance, security, and availability issues. However, if scenarios cannot be narrowed down to a single clear choice, then the best alternative should depend on the evaluation of other factors like the cost of the solution, the degree of urgency to deliver the needed functionality, the organization’s application strategy (packaged applications, custom development, or both) and the available development resources.5
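One simple way to make such a scenario-based evaluation concrete is a weighted scoring matrix: weight the criteria that matter for the scenario, score each candidate category, and rank the totals. The weights and scores below are placeholders for illustration, not a recommendation of any particular technology.

```python
# One concrete form of scenario-based evaluation: weight the criteria that
# matter for your integration scenario, score each candidate category, and
# rank the results. All weights and scores below are placeholders.
CRITERIA_WEIGHTS = {"data_volume": 3, "real_time": 2, "partner_exchange": 2, "cost": 1}

CANDIDATE_SCORES = {          # 1 (poor fit) .. 5 (strong fit) per criterion
    "ETL": {"data_volume": 5, "real_time": 2, "partner_exchange": 1, "cost": 4},
    "EAI": {"data_volume": 3, "real_time": 5, "partner_exchange": 3, "cost": 2},
    "ESB": {"data_volume": 3, "real_time": 4, "partner_exchange": 5, "cost": 2},
}


def rank(weights, candidates):
    """Return candidates ordered by weighted score, best first."""
    totals = {
        name: sum(weights[c] * score for c, score in scores.items())
        for name, scores in candidates.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    for name, total in rank(CRITERIA_WEIGHTS, CANDIDATE_SCORES):
        print(name, total)
```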

Leveraging the legacy capabilities of existing initiatives through consolidation and integration into a virtual data network brings data out of its silos to users and decision-makers.

1 A "brain book" is a small portfolio many professionals carry with them that contains a notepad and certain information normally kept in your brain but often forgotten, e.g., business contact information (one's own and others'), spouse's birthday, office address, etc.
2 Defining the Landscape: Data Warehouse and Mining: Intelligence Continuum; A Work Product of the HIMSS Enterprise Information Systems Steering Committee; N. Wickramasinghe, R. Calman, and J. Schaffer; HIMSS, 2007
3 Data Integration in a Service-Oriented Architecture: A Strategic Foundation for Maximizing the Value of Enterprise Data; Informatica, November 2005
4 Defining the Landscape: Data Warehouse and Mining: Intelligence Continuum; A Work Product of the HIMSS Enterprise Information Systems Steering Committee; N. Wickramasinghe, R. Calman, and J. Schaffer; HIMSS, 2007
5 Evaluating Integration Technology Options; Ken Vollmer and Rob Kare; The Forrester Wave™, March 2007


The National Response Framework: Where Do Hospitals Fit?

"The world changed on September 11th, 2001. We learned that a threat that gathers on the other side of the earth can strike our own cities and kill our own citizens. It's an important lesson; one we can never forget. Oceans no longer protect America from the dangers of this world. We're protected by daily vigilance at home. And we will be protected by resolute and decisive action against threats abroad." – President George W. Bush, February 2003

As I began preparing this article on the sixth anniversary of 9/11, I was reminded of the danger we face today on our homeland. Six years later, we are still working to reinforce disaster response and recovery capabilities. We have come a long way, but we still have a way to go, as evidenced by lessons learned in recent hurricane seasons. In August, the results of an emergency planning study conducted in US hospitals during 2003 and 2004 were published.1 The report indicated that although 92% of the 739 participating hospitals had revised their emergency response plans since September 11, 2001, only 63% had addressed natural disasters and biological, chemical, radiological, and explosive terrorism. Although 79% of hospitals were engaged in cooperative planning with other facilities, only 52% had written agreements in place to transfer patients during a disaster.

In the months following 9/11, we witnessed significant changes within the federal government. Congress enacted the Homeland Security Act of 2002, establishing the Department of Homeland Security (DHS). DHS is designed to minimize damage and assist in recovery from terrorist attacks that occur in the United States and to act as the lead for natural and manmade crises and emergency planning. In 2003, more than 20 federal agencies were placed under the Secretary of DHS. At the same time, President Bush signed Homeland Security Presidential Directive-5 (HSPD-5), titled "Management of Domestic Incidents," directing the Secretary of DHS to establish a National Response Plan (NRP).2 Within months, the new NRP replaced the aging Federal Response Plan.

The purpose of HSPD-5 is "to enhance the ability of the United States to manage domestic incidents by establishing a single, comprehensive national incident management system." The NRP was predicated on the new National Incident Management System (NIMS) published in 2004. The NRP is referred to as an all-hazards plan: not focused just on the traditional spectrum of manmade and natural hazards,3 but also developed to address terrorist acts involving chemical, biological, radiological, nuclear, or high-yield explosive incidents. In the event of an incident, it integrates federal, state, local, tribal, private sector, and nongovernmental organizations into a concerted national emergency response effort.

Although the NRP integrated all levels of government into an incident management architecture, it required further refinement, so DHS solicited the public for suggested changes. Respondents suggested that the initial NRP was bureaucratic, internally repetitive, and duplicative of details contained in NIMS. Respondents also indicated the NRP was insufficiently national in focus. Specifically, content was felt to be inconsistent with the promise of the word "Plan" in the title. In response, DHS changed the title from "Plan" to "Framework" so it might more accurately align with its intended purpose. The resulting National Response Framework (NRF) is a natural evolution of the national response architecture.

The new NRF was released on September 10, 2007 for a 30-day comment period. The same day, DHS Secretary Michael Chertoff testified before the Senate Committee on Homeland Security about his department's efforts in confronting the terrorist threat to the homeland six years after 9/11. He said, "...while we have successfully raised our barrier against terrorist attacks, the fact remains that we are still a nation at risk."4 After indicating that our nation faces important choices, Secretary Chertoff posed several questions to the Senate Committee members. How do we respond to the ongoing threat? What actions are necessary to protect our country? How do we build upon our success to date? DHS addresses these questions through the proposed NRF, especially in its adoption of the word "Framework" in the title. This simple change captures the Department's intention of improving the integration of community, state, tribal, and federal response efforts. While it reflects lessons learned and defines core principles for managing incidents, it broadens the focus from a purely federal plan to one that is truly national.

Response Doctrine: Key Principles

Engaged Partnerships. Leaders at all levels must communicate and actively support engaged partnerships to develop shared goals and align capabilities so that no one is overwhelmed in times of crisis.

Tiered Response. Incidents must be managed at the lowest possible jurisdictional level and supported by additional response capabilities when needed.

Scalable, Flexible and Adaptable Operational Capabilities. As incidents change in size, scope and complexity, the response must adapt to meet requirements.

Unity of Effort Through Unified Command. Effective unified command is indispensable to all response activities and requires clear understanding of the roles and responsibilities of each participating organization.

Readiness To Act. Effective incident response requires readiness to act balanced with an understanding of risk. From individuals, families and communities to local, State and Federal agencies, national response depends on the instinct and ability to act.

The NRF articulates five key principles of the nation's response doctrine. Rather than address these individually, this article focuses on the principle of readiness to act. This principle asserts that effectively acting with speed and efficiency requires clear, focused communications and processes. These processes, procedures and systems must communicate timely, accurate and accessible information on an incident's cause, size and current situation to the public, responders and others. The NRF is part of a broader homeland security strategy with imperatives to prevent and disrupt, protect, respond and recover.

"So whether it's a naturally occurring event or a terrorist event, FEMA will be the incident manager at the scene, providing funding and command and control support in a disaster, whether caused by man or by Mother Nature."5 Tom Ridge

In March 2003, the Secretary of DHS, Tom Ridge, stated that the Federal Emergency Management Agency (FEMA) would take the incident management lead under the national response plan. FEMA was aligned directly under the DHS Secretary, allowing the agency to focus on its mission of response and recovery. Recently, the current Secretary, Michael Chertoff, testified before the House Committee on Homeland Security: "While Congress mandated changes to FEMA's organizational structure, we have made modifications to create a more nimble, better equipped organization."6

Chertoff realigned the U.S. Fire Administration, the former Office of Grants and Training, the Chemical Stockpile Emergency Preparedness Program, the Radiological Emergency Preparedness Program, and the Office of National Capital Region Coordination all under FEMA. FEMA manages the National Response Coordination Center (NRCC), its primary operations management center for most response incidents. It also serves as the focal point for national resource coordination. Emergency Support Function (ESF) teams are activated through the NRCC, including Public Health and Medical Services ESF #8. The Department of Health and Human Services serves as the coordinator for ESF #8 with the National Disaster Medical System (NDMS) to manage the federal medical response to emergencies and federally declared disasters.


FEMA also manages the National Integration Center (NIC) Incident Management Systems Division, responsible for direction and oversight of NIMS. This organization oversees NIMS, including development of compliance criteria and implementation activities. It also provides guidance and support to organizations, such as hospitals, as they adopt the system. NIMS is a core element of the national preparedness architecture. It is defined by DHS as a systematic, proactive approach guiding departments and agencies at all levels of government and the private sector to work seamlessly to prepare for, prevent, respond to, recover from, and mitigate the effects of incidents. The NRP was predicated on NIMS, which was published several months before the NRP. NIMS was not meant to be an incident management or resource allocation plan. It is intended to be a strategic-level set of concepts, principles, and organizational processes that enable organizations to create effective and collaborative incident management capabilities.

NIMS states, "NIMS fosters the development of specialized technologies that facilitate emergency management and incident response activities and allows for the adoption of new approaches that will enable continuous refinement of NIMS overtime." Port Blue Corporation provides a great example of technology innovation in support of incident management. Their product called “Command Aware" incorporates the requirements of NIMS and the latest edition of the Hospital Incident Command System (HICS) into a powerful incident management system. It gives hospitals and health systems the ability to streamline emergency management programs. Combined with Evolvent's proven, successful expertise in business process reengineering, hospitals now have an innovative avenue for transformation of emergency management programs. According to the draft NIMS document, creating a common incident management framework gives emergency management and response personnel a standardized system for emergency management and response. Components were developed with consideration to adaptability so it could be used in any situation-whether it is requiring only local response or interstate and/or federal support. Examples include major

The National Response Framework: Where Do Hospitals Fit? 199

sporting or community events, hurricanes, or no-notice incidents such as earthquakes.

The NIMS framework is scalable and caters to coordination across multiagency, multijurisdictional, and multidisciplinary lines, especially during international border incidents. This same flexibility that allows coordination includes a level of standardization that facilitates integration and connectivity across jurisdictional and disciplinary lines. This common foundation fosters cohesion among the various response agencies and organizations.

The most visible aspect of incident management is the command and management component. It includes directing incident operations; acquiring, coordinating, and delivering resources to incident sites; and sharing information about the incident with the public. The Incident Command System (ICS), included under this component, lays the groundwork for the Hospital Incident Management System.

The ICS provides capability for coordinated and collaborative incident management, even at the local level where most incidents are managed. The ICS integrates facilities, equipment, personnel, procedures, and communications into a standard organizational structure. The NIMS document describes the ICS as a fundamental form of management enabling incident managers to identify key concerns associated with an incident without sacrificing attention to any command system component. It gives emergency management and response personnel the ability to make on-scene tactical decisions and take action under the organized command of appropriate authority. Organizations should use this construct to manage incidents, and guide processes for planning, building, and adapting to the ICS structure.

Regardless of the standardized nature of ICS, there may be unique situations that challenge the structure. NIMS describes these as incidents that are not site-specific, are geographically dispersed, or evolve over periods of time, such as acts of biological, chemical, radiological, and nuclear terrorism. In these cases everyone along the incident response continuum must display a remarkable level of preparation and coordination.

"Preparedness is a process of learning, adapting, and growing. We're facing new threats from terrible new weapons, and so we've got to learn how to become better prepared as we go forward. There have been difficulties along the Way. But we are dedicated to making that process better, to finding problems, and to fixing them."7 Michael Leavitt

As Secretary Leavitt noted, preparedness is a process of learning, adapting, and growing. Most would agree that we face new threats today. Most would also agree that we must learn to prepare. How can hospitals make that process better, find problems and fix them?

In some cases, hospitals are required to adopt ICS in their emergency management programs. ICS for hospitals is found in the FEMA-adopted Hospital Incident Command System (HICS), developed by a multidisciplinary group sponsored by the California Emergency Medical Services Authority.8 Originally the Hospital Emergency Incident Command System, it was developed in the 1980s to help hospitals prepare for various disaster types. The current version (HICS IV) dropped the word "emergency" from its title because the group recognized the value of having a foundation for responding to disasters and for organizing operations, preplanned events, and non-emergent situations.

The work group identified several objectives. One objective was to clarify the components of HICS and its relationship to NIMS. The HICS guidebook states that hospitals which embrace the concepts and incident command design outlined in HICS are much better positioned to meet NIMS incident command design guidelines. It also allows participation in a system that promotes national standardization in terminology, response concepts, and procedures.

Another objective was to integrate CBRNE events into the management structure. The group intended to develop a standardized incident management system to address the planning and response needs of hospitals, including rural and small facilities. Government agencies at all levels are required to adopt NIMS. Hospitals that receive federal funding, or plan to apply for federal funding, from agencies such as the Health Resources and Services Administration (HRSA) and the Agency for Healthcare Research and Quality are likewise expected to adopt NIMS.

1 Niska RW, Burt CW. Emergency response planning in hospitals, United States: 2003-2004. Advance data from vital and health statistics; no. 391. Hyattsville, MD: National Center for Health Statistics, 2007
2 Homeland Security Presidential Directive/HSPD-5, Management of Domestic Incidents, Office of the Press Secretary; February 28, 2003
3 Manmade and natural hazards include fires, floods, oil spills, hazardous materials releases, transportation accidents, earthquakes, hurricanes, tornadoes, pandemics, and disruptions to infrastructures.
4 Michael Chertoff, September 10, 2007, Testimony of the Secretary of Homeland Security before the Senate Committee on Homeland Security
5 Tom Ridge, March 3, 2003, Remarks by the Secretary of Homeland Security to the National Association of Counties
6 Michael Chertoff, September 5, 2007, Testimony of the Secretary of Homeland Security before the House Committee on Homeland Security
7 Mike Leavitt, September 25, 2006, Remarks by the Secretary of Health and Human Services at the Bio Shield Stakeholders Meeting
8 The organizational participants of the HICS IV work group included the NIC, which was mentioned earlier in this article as being managed by FEMA under the Department of Homeland Security. It also included the Health Resources and Services Administration from the Department of Health and Human Services; the American Society for Healthcare Engineering from the American Hospital Association; and the Joint Commission on Accreditation of Healthcare Organizations. These organizations were accompanied by more than 70 subject matter experts from various organizations. http://www.emsa.ca.gov/hics/hics.asp


Training—Awareness—Motivation

A business may be doomed to failure if employees are not made aware of computer security legal accountability requirements. Employees are a company's most important asset, yet we often fail to provide them with the critical information they need to be successful, which in turn helps the business succeed. The prescription for assuring success is training.

In over 30 years' involvement in the security arena, the most important element of any security program that I have observed has been the existence of a robust security training process. A benchmark security program works 24 hours a day, with or without your security team being present, something that cannot be accomplished without a robust training process. A viable training process incorporates "Awareness" and "Motivation" principles that assure total buy-in to the security program.

Why Is Security Training So Important?

Many people will say "security training is very important," and you may even be able to identify some of the major reasons that drive the need to conduct training, such as identity theft. There are a number of published surveys, such as the annual FBI Computer Crime Survey, that clearly identify viruses, hackers, and criminals as major threats to computer systems and the information they process. In each and every survey, there has always been one major "dominator" threat consistently identified…the company employee, commonly referred to as the "insider threat." Other than knowing about the virus protection software on their computer, most employees do not know what major threats exist and are probably not aware of the accountability requirements placed in corporate security policies.

A viable security training process includes an ongoing security awareness program that ensures all employees are aware of emerging threats and of changes driven by security regulations such as HIPAA, FISMA, and Sarbanes-Oxley. Through a robust lifecycle security training process, employees are more apt to buy into security criteria that they may have previously thought to be a burden.

The [survey] responses indicate an overall substantial increase in respondents’ perception of the importance of security awareness training. 2006 CSI/FBI Computer Crime and Security Survey.

Computer security is a key business component involving people, processes, and technology. Establishing a solid foundation for a security training, awareness, and motivation program is the first step toward assuring your business develops a security-aware culture by educating your staff about security risks and their responsibilities!

The Evolvent CyberSecurity team members have developed numerous highly successful security training processes for commercial and military clients for over thirty years. Our expertise will help each client implement: Training + Awareness + Motivation = A Healthy Security Program.

Awareness and Motivation

It is human nature to be averse to change of any kind. To effect a positive change, individuals must have motivation. How can you expect anyone to change or to support an idea unless they are aware of the reason for the requirement and are provided with motivation to willingly comply? To build that understanding of the security requirements, training needs to be structured to incorporate awareness and motivation – after all, training is a journey, not a final destination. Security awareness is more than guiding people to realize there is a problem; it must also address motivational aspects to persuade people to take the next step, helping to correct the problem. The training process must be designed around the work environment, thereby making it clear to employees how security requirements will deter or lessen potential security threats. The relevance of security training should be clear to both the individual and their employer. This can be done in many ways, not least through the inclusion of security within personal objectives and job descriptions that are tailored to the needs of the business. Our Evolvent Security team works closely with each client, helping them to determine the "who, what, why, when, where, and how" for their tailored security training process. Our security training process always factors in awareness and motivation, thereby assuring employee buy-in and creating a security-aware culture.

The Who, What, and How Aspects of Training

In the preceding paragraphs, we covered the reasons why training is important. During the initial setup of the training process you will need to determine, with the help of experts, where and when the training should be conducted. There are two things to keep in mind about the when and where of training: ensure that training is conducted at a time when participants will be most receptive, and in a location where your employees will feel comfortable.

Who Needs Training?

It would be a cliché to say that everyone requires training, but in the case of security training the cliché holds true. Varying degrees of security training need to be presented to all corporate personnel, from the leadership team to the general user.

Upper Management / Senior Leadership

The first group of individuals is senior leadership, also referred to as upper management. The senior leadership training process needs to be designed to educate them on the requirements, why security is important to them, and why their support is crucial to the success of the overall corporate security posture. They must be provided with a clear understanding of the legal and accountability aspects of the security requirements, possible cost factors that need to be considered, and how each employee should support the overall corporate security practice. Upper management support is crucial to the corporate security process through setting priorities and convincing staff across the organization to follow security guidelines.

Regardless of measures an organization may take to protect its systems using technical computer security measures such as the use of passwords, biometrics, antivirus software, and the like, there will be risks of financial loss that still remain. 2006 CSI/FBI Computer Crime and Security Survey.

Information Technology Staff

The next group of individuals that requires specialized security training is your IT staff. IT security training should focus on standards, educating staff about security issues and technologies and about compliance with the industry standards and guidelines established by leading organizations in their area. Security training for the IT staff should be geared toward how security can make their jobs easier rather than being a burden. The IT staff's understanding of security usually revolves around security hardware and software solutions, which they probably find most interesting because it relates well to their other technology tasking.

However, there is a multitude of security criteria that IT specialists need to become familiar with in order to assure a complete security-in-depth posture, such as a viable patch management process, understanding why the employee hiring process must assure that trustworthy employees are brought on board, and knowing what security requirements they are held accountable for. Our Evolvent Security Team is composed of technicians who are applying their skills to the security arena. Our team fully understands the complexity of each technician's job and how that job can be made easier by implementing solid security practices.

…29 percent of respondents attribute a percentage of losses greater than zero but less than 20 percent to actions of insiders. Hence, the remaining 39 percent of respondents attribute a percentage of their organization's losses greater than 20 percent to insiders. In fact, 7 percent of respondents thought that insiders account for more than 80 percent of their organization's losses. …a significant number of respondents believe that insiders still account for a substantial portion of losses. 2006 CSI/FBI Computer Crime and Security Survey.

General User

The last group is the general computer user. This will be the hardest group for which to tailor a security training process. You will be dealing with a wide variety of interests, varying degrees of computer skills and knowledge, and employees coming from different work sections within the corporation, all with different functions such as finance, human resources, facilities management, and so forth. The importance of the motivation and awareness principles truly increases within the general user security training process. Fortunately, the Evolvent Security Team has presented security briefings to audiences ranging from small to extremely large with tremendous success! The secret to this success is built upon many decades of experience, learning what will work and what will not.

When dealing with the subject of security, you need a security expert who can relate to the general user population and explain why their participation in the security program is important, not a formal instructor who has no security background and cannot answer the multitude of questions generated during a general user training session.

Summary

Security awareness is more than helping individuals realize there is a problem; it is about motivating them to take the next step in the learning cycle. The information presented during training must be relevant and clear to senior leadership, IT staff, and general users. You have probably heard the statement, "it's not what you know, but who you know that counts." For a viable security program, this statement should be modified to read: "it's who you know and what you know that counts." A successful security professional is built upon the sharing of knowledge: increasing what you know, and thereby improving your security training process, through information obtained from others who share their knowledge.

During the past 35+ years, our security professionals have built an information network of security peers. This information security network has been truly beneficial to each of our clients, as we are able to reduce their security expenditures while keeping them apprised of emerging threats and new policies.

Without the frills and fluff, our team provides a truly enlightening and highly educational security training menu. We are capable of providing training on any given security subject, at any location, and for any size of audience. Security training is a key business process for assuring corporate success and lessening the opportunity for corporate embarrassment or failure.


Building Enterprise-Scale IT Support Services

The use of computers and computer communications continues to grow across our federal and commercial clients. Nowhere is this truer than in health care – public and private. Today, regular system support is provided mostly at the local operations level, with little coordination between operations and practically no support backup at the enterprise level. IT support has also suffered from little use or development of best practices. With labor shortages, expected staffing cuts in the public sector, and reductions in outsourcing contract costs, the cost of current IT support is unsustainable and the quality is frequently sub-optimal.

In the face of planned consolidation and the reduced reliance on, or elimination of, on-site labor, it is vital for many organizations to consider the development of an Enterprise Consolidated Help Desk Program (eCHD) to become the tier 0/tier 1 first-call responder for technology support. This article introduces a methodology and process framework for building an eCHD program and testing it in a "prototype" phase to establish a framework and validated model for a new, metrics-driven, professionally supported eCHD.

The goal of an eCHD program is to define and refine a model for deployment of a finite number of consolidated help desks that will:

• Improve the level of service available to the end customer
• Reduce cost substantially
• Provide repeatable and measurable levels of support
• Be scalable and adaptable to changing missions
• Support a constantly changing set of priorities
• Be constantly improving and evolving based on internal development and sharing of best practices

Evolvent research has shown it is vital to establish a strategic capability to analyze the current IT support structure and define the performance metrics required for the eCHD. The analysis will establish a common technology tool set for the deployment of support services across the enterprise and define a common set of business rules and roles to create an evidence-based path to enterprise support. In parallel, we recommend that organizations establish a pilot eCHD facility to evaluate the direction established by the analytical phase and prove the reliability and scalability of the design. A subset of the target support audience will be selected for exclusive interaction with the pilot eCHD activity to develop performance and scalability metrics. The end result will be a clear path forward and an eCHD design that can be scaled and managed to provide improved enterprise-wide IT support at reduced cost.
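As an illustration of the kind of performance metrics a pilot eCHD might track, the sketch below computes first-contact resolution, escalation rate, and average time to resolve from a set of ticket records. The ticket fields are assumptions for the example, not a prescribed measurement set.

```python
# Illustrative roll-up of baseline KPIs a pilot eCHD could track;
# ticket fields below are assumptions for the sketch.
from dataclasses import dataclass
from statistics import mean


@dataclass
class Ticket:
    minutes_to_resolve: float
    resolved_at_tier1: bool
    escalated: bool


def echd_metrics(tickets):
    """Compute a few baseline performance metrics for the pilot help desk."""
    return {
        "first_contact_resolution": mean(t.resolved_at_tier1 for t in tickets),
        "escalation_rate": mean(t.escalated for t in tickets),
        "avg_minutes_to_resolve": mean(t.minutes_to_resolve for t in tickets),
    }


if __name__ == "__main__":
    sample = [Ticket(12, True, False), Ticket(95, False, True), Ticket(20, True, False)]
    print(echd_metrics(sample))
```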

The eCHD approach requires business process reengineering on a large scale, an understanding of business rules development, cost measurement, performance metrics development, change management, and communications to ensure the mission of providing support services excellence to the enterprise.

Evolvent brings a unique understanding of the challenges and opportunities that the eCHD presents. We are actively partnered with many customers in providing local help desk support at numerous facilities around the world. We are experts in providing health care related IT consulting and systems in both public and private sectors. We have successfully helped a large integrated healthcare delivery network evaluate and define business process enhancements and deploy business process automation. We have leveraged our refined cost analysis methodologies to help realign IT resources across a major regional subsidiary organization. Most importantly, eCHD programs are a logical continuation of the total cost of ownership studies we have performed over the last seven years.

The establishment of a strategic analytical capability and a prototype eCHD program would be supported using Evolvent's proven Process Management Framework, underpinned by Lean Six Sigma methodologies. The Framework, illustrated in Figure 1, is designed to be performance driven and can be used comprehensively or partially depending on the complexity of a process management initiative. It is based on best practices in the professions of business change and management. Once implemented, the Process Management Framework cycle becomes a part of the way the eCHD will do business.

It is important to note that the Framework is not designed to function as a linear, waterfall-like process. In practice, each Phase supports a particular set of objectives of process management and once the organization has reached a core process management capability, the Phases will cycle continuously and simultaneously. For example, the Strategic Phase is first executed at the initiation of an organization’s process management initiative and then usually only repeats during the organization’s strategic planning cycle (typically once a year). The Analyze and Execute Phases are executed the first time to establish the baseline process management capability but then, these Phases are ongoing cycles as opportunities are identified for process improvements and/or new process design. Once an organization has established a core process management capability and can successfully function in the Operate Phase, it is a continuous cycle of process execution, monitoring, controlling and reporting. When a process gets out of control or performance objectives change, that process is pushed to the Analyze Phase, through the Execute Phase and then back to the Operate Phase once the improvement has been implemented. Knowledge gained in the Operate Phase is used to support goal setting during each Strategic Phase cycle.
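The phase cycling described above can be sketched as a simple transition rule: a process normally remains in the Operate Phase and is pushed back through Analyze and Execute when it drifts out of control or its objectives change. The code below is an illustrative model of that cycle, not part of any Evolvent tooling.

```python
# Sketch of the phase cycling described above: a process normally sits in
# Operate and returns through Analyze and Execute when it drifts out of
# control or its performance objectives change. Names mirror the text.
from enum import Enum, auto


class Phase(Enum):
    STRATEGIC = auto()
    ANALYZE = auto()
    EXECUTE = auto()
    OPERATE = auto()


def next_phase(current: Phase, out_of_control: bool = False) -> Phase:
    """Advance a managed process through the framework's continuous cycle."""
    if current is Phase.OPERATE:
        return Phase.ANALYZE if out_of_control else Phase.OPERATE
    order = [Phase.STRATEGIC, Phase.ANALYZE, Phase.EXECUTE, Phase.OPERATE]
    return order[order.index(current) + 1]


if __name__ == "__main__":
    p = Phase.OPERATE
    p = next_phase(p, out_of_control=True)   # -> ANALYZE
    p = next_phase(p)                        # -> EXECUTE
    p = next_phase(p)                        # -> OPERATE again
    print(p)
```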

The effort described in this framework is designed to establish baseline process management for the eCHD. Planning for the Execute Phase will be conducted during the same timeframe as the execution of the Strategic and Analyze Phases and will address the development of a reusable, accepted approach for process development and improvement for the eCHD. During the Execute Phase, the solution includes building a fully functioning pilot eCHD facility to test our approach for ongoing process design and improvement (i.e., executing the Execute Phase at least once using the selected approach). Once the approach has been adopted, the eCHD will have a core process management capability in place and be able to scale up to the full Operate Phase. As the eCHD continues to follow the Framework, the organization will climb to higher and higher levels of process performance, resulting in a continuous increase in value for its customers and stakeholders.

Through incorporating the best practices of process management, change management, and our experience in consolidating IT support services across major health care delivery organizations, we believe our framework can provide a secure, knowledge-driven approach to delivering more successful, cost-effective IT support.


Faster Help for the Recovering Wounded

Solving the Medical Evaluation Board Backlog and Business Process Issues in Military Medicine

Objective

Help America's recovering wounded war fighters by deploying clinical expertise and automation tools to resolve the Medical Evaluation Board case backlog in the Military Healthcare System, including:

• A senior clinical resource team to aggressively support backlog reduction
• Web-based process automation tools utilizing emerging technologies to make task management more accessible, more accurate, and much faster at a lower cost

Basis

The issues within the Military Healthcare System (MHS) surrounding the current backlog of wounded soldiers trying to get through the Medical Evaluation Board (MEB) process are many and varied. Among the most serious issues are:

• Process inefficiencies leading to long lag times in expediting cases
• Shortages of clinical resources and clinician time dedicated to processing MEB claims
• Limited management oversight and visibility of status/tracking
• Primary care provider responsibility assigned as an additional duty, which limits case management capability
• Tracking of status and delivery of information to VA systems is not streamlined; many activities must be repeated because information or systems do not work together

As change agents in a variety of military medical environments, ranging from the largest facility CIO roles to leadership of Navy, Air Force, and Army Medicine information technology organizations, we can form a team with the understanding, agility, and integrity to rapidly resolve the MEB problem. This solution is summarized in two parts: functional and technical.


America’s Wounded Require Our Best Efforts in Treatment and in Transition

• Clinically, we need to clear the case backlog and clearly support the wounded through identifying their options, treatment plans, and rights under the law

• Technically, our programs should have the best information readily available to providers and patients alike – no one should be in the dark when they’ve served so nobly.

Functional Solution

A full functional solution to the ongoing MEB problem set is a long-term objective, but much can be done in the short term with adequate specialty clinical and technical resources. Functionally, Clinical Case Management, Patient Advocacy, MEB Assessment Services, Reporting, Coding and Transcription Services, and a summary reporting capability could be put in place as a “pilot” at certain key facilities to develop, test, and prove the concept of a specialist force. This specialist force would clear the immediate backlog and establish a new, improved standard of care for our soldiers.

Functionally, clinical case managers familiar with the MEB process and military medicine would be provided to clear the backlog and define parameters for a long-term solution to the problem. Functional proponents would work to establish:

• Case Management Standards and Interventions
• Rapid Assessment and Patient Classification Protocols, including the development of an effective transition process and capability from in-theater to military treatment facilities
• Patient Advocacy Protocols and Standards
• Patient Communications Program to ensure beneficiary and family awareness
• Ancillary Medical Services Support, such as transcription and coding, to enable provider efficiency
• Adequate Reporting Standards to ensure effective leadership oversight


A pilot study would also implement a series of information technology improvements, including a trial MEB automation program that could include a variety of technical services:

• Business Process Automation Tool already proven in the DoD
• Integrated Clinical Database program already proven to enhance processes and clinical case management
• Executive/Leadership Dashboards already proven in the MHS
• Knowledge Management platform already proven in the DoD
• Collaboration/Records Management tools to allow effective case management
• Classification Technologies proven in After Action Report environments to enable faster coding and clinical analysis

Additionally, the functional improvements following the pilot study would allow the MHS to provide a consolidated, complete, and professional transition of each case to the Department of Veterans Affairs – establishing a far more effective and Soldier-friendly transition from the MHS to the VA system.

Technical Solution

The information technology improvements described above include a variety of technical services already proven in either commercial or military healthcare – just not yet integrated in support of a unified solution to a problem as critical as the timely, proper care of wounded soldiers. A team of clinical and clinical information technology leaders can deliver the following solutions from a pilot program:

• Business Process Automation Tool: Already proven in the DoD, this tool allows for effective assignment, tracking, and reporting on individual tasks or cases and enables providers and leaders to know the performance metrics of the organization at a glance – thus resolving the visibility issue for cases lost, delayed, or, in the worst cases, simply forgotten.

• Integrated Clinical Database: This program is already proven to enhance processes and clinical case management across all three medical services and could be utilized for this pilot as a cost-effective means of supporting the clinical system needs of the provider.


• Executive/Leadership Dashboards: Already proven in the MHS community via an operational Knowledge Exchange, performance metric dashboards can be deployed very quickly to showcase exactly how the system is serving the critical needs of wounded soldiers.

• Knowledge Management (KM): Use a KM platform already proven in the DoD. This platform provides a more rapid and collaborative environment for sharing lessons learned and supports the provider community with better training and treatment information.

• Collaboration/Records Management: Secure, HIPAA compliant collaboration and Defense Department-certified e-Records tools support effective case management and electronic utilities for the provider community to discuss cases, share treatment information, and perform assessments critical to the speedy resolution of the MEB.

• Classification Technologies: Already proven in After Action Report environments to enable faster coding and clinical analysis, these technologies may also be useful in supporting a streamlined MEB process; a minimal sketch of the idea follows.
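To make the Classification Technologies item concrete, here is a minimal sketch assuming a simple keyword-matching approach in Python. The categories, keywords, and function name are hypothetical placeholders rather than an actual MEB, ICD, or After Action Report coding scheme; a production tool would use far richer classification methods.

    # Illustrative only: the categories and keywords below are hypothetical placeholders,
    # not an actual MEB, ICD, or After Action Report coding scheme.
    CATEGORY_KEYWORDS = {
        "musculoskeletal": ["fracture", "lumbar", "knee", "shoulder"],
        "behavioral_health": ["ptsd", "anxiety", "depression", "tbi"],
        "hearing": ["tinnitus", "hearing loss"],
    }

    def classify_narrative(narrative):
        """Suggest candidate categories for a clinical narrative to speed up manual coding."""
        text = narrative.lower()
        hits = [category for category, keywords in CATEGORY_KEYWORDS.items()
                if any(keyword in text for keyword in keywords)]
        return hits or ["unclassified"]     # unmatched narratives are still routed to a coder

    # Example: the coder starts from suggested categories rather than a blank record.
    print(classify_narrative("Member reports chronic lumbar pain and intermittent tinnitus."))
    # -> ['musculoskeletal', 'hearing']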

By leveraging a set of automation tools and a Business Process Automation framework through the capabilities referenced above, many key features can be delivered quickly (a simplified sketch of such a correspondence workflow follows the list):

• Log correspondence received from internal and external sources
• Assign and route the correspondence to the responsible provider, or a specific user, for follow-up action
• Assign due dates for various stages
• Route the correspondence response to one or more office(s) with collateral responsibility, or to customer leadership, for review and feedback
• Provide coordination input
• Approve and sign the prepared response documents
• Add comments
• Attach electronic documents
• Access comments and documents attached by other users
• Track the status of the correspondence items using both audit trails and an intuitive graphical “map” updated in real time for each process
• View the audit trail of active and completed (archived) process instances
• Report on correspondence processed with the application
• Provide regular updates and correspondence (including care or rehabilitation plans if necessary) to the beneficiary or their family
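As a purely illustrative sketch of how these features might fit together, the following Python fragment models a tracked correspondence item with assignment, routing, an audit trail, and a simple overdue report. All class, field, stage, and office names are assumptions made for illustration, not an existing DoD or MHS system.

    from datetime import date, timedelta

    class CorrespondenceItem:
        """Minimal model of one tracked correspondence item (hypothetical fields and stages)."""

        def __init__(self, item_id, source, received=None):
            self.item_id = item_id
            self.source = source
            self.received = received or date.today()
            self.assignee = None
            self.due_date = None
            self.status = "Logged"
            self.audit_trail = []        # every action is recorded for later review
            self.comments = []
            self._log(f"Logged from {source}")

        def _log(self, action):
            self.audit_trail.append((date.today().isoformat(), action))

        def assign(self, assignee, days_allowed):
            self.assignee = assignee
            self.due_date = self.received + timedelta(days=days_allowed)
            self.status = "Assigned"
            self._log(f"Assigned to {assignee}, due {self.due_date.isoformat()}")

        def route(self, office):
            self.status = f"With {office}"
            self._log(f"Routed to {office} for coordination")

        def add_comment(self, author, text):
            self.comments.append((author, text))
            self._log(f"Comment added by {author}")

        def approve(self, approver):
            self.status = "Approved"
            self._log(f"Response approved and signed by {approver}")

    def overdue(items):
        """Simple reporting hook: items past their due date and not yet approved."""
        today = date.today()
        return [i.item_id for i in items
                if i.due_date and i.due_date < today and i.status != "Approved"]

    # Example flow mirroring the feature list: log, assign, route, comment, approve, report.
    item = CorrespondenceItem("MEB-0001", "Patient advocate")
    item.assign("Primary care provider", days_allowed=10)
    item.route("Physical Evaluation Board liaison office")
    item.add_comment("Case manager", "Narrative summary attached for review.")
    item.approve("Clinic commander")
    print(item.audit_trail)
    print(overdue([item]))

The audit trail here is an in-memory list; a real system would persist these events and drive both the leadership dashboards and the graphical status “map” from the same data.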

Additionally, the functional and technology improvements expected from the pilot study would allow the MHS to provide a consolidated, complete, and professional transition of each case from the operating theater to the Military Treatment Facility and where necessary to the Department of Veterans Affairs – establishing a far more effective and Soldier-friendly transition from the MHS to the VA system.

It is our duty to speedily assist America’s Wounded Veterans and their families in receiving Treatment and Transition services.

Functional improvements should be clear, long-term, and systemic – not quick fixes. The program recommendations in this white paper establish a strong baseline for quality care – we should expect no less in service to our wounded.

Technically, patients and providers would be fully supported with information resources if this paper’s recommendations were implemented – a true information platform for world-class care.

Conclusion

A combined functional and technical solution to the current MEB problem within the MHS is an approach that will lead to lasting positive results for wounded soldiers and their families. This approach can be of great help to the patient and the provider as the military medical community seeks to resolve a critical, time-sensitive issue.


Author and Contributor Biographies

All authors and contributors to this work are officers or associates of Evolvent Technologies.

Primary Editor

Paul Ramsaroop

Primary Writers

Bill Oldham

Geoff Howard

John Daniels

J.D. Whitlock

Kim Bahrami

Primary Contributors

Al Hamilton

Monty Nanton

David Parker, M.D.


Paul A. Ramsaroop, President and Chief Operating Officer

Mr. Ramsaroop has served in positions of increasing responsibility within Evolvent after a long tour of duty in the IT field with several well-known companies, including McDonnell-Douglas, Boeing, Intellidyne, and Maryville Technologies. He also served as CTO of a dot-com startup, HealthCPR, which championed a personal health bank a few years before its time.

Paul’s career began with acquiring an intimate hands-on knowledge of application development, programming in various languages from Assembler to Visual Basic, and advancing from developer to a founding Executive of Evolvent. Within Evolvent, Paul is currently responsible for daily operations and program delivery, along with employee and client relations.

Paul’s well-known, reliable service attitude and determination to do what it takes to accomplish a task make him the person to turn to when seeking the tools and guidance to be successful. Paul is highly skillful at managing programs and the people who run them.

A few of Paul’s accomplishments prior to his executive duties within Evolvent are:

• Leading a non-profit humanitarian effort in Panama partnering with the US Military

• Senior Web Operations Manager for DoD TRICARE enterprise Website

• Technical lead within the McDonnell-Douglas CADD system development team

• Technical Program lead launching the AFMS Knowledge Exchange (Kx 1.0) (https://kx.afms.mil/)


Geoff Howard, Chief Technology Officer

Mr. Howard is a proven technology leader with a diverse background covering technology, management, and science over the past 20 years.

Mr. Howard’s technical experience covers all aspects of software architecture and development, with a focus on Internet technologies and Service Oriented Architectures, as well as security, hardware, and networking. Within the medical space, Mr. Howard’s experience covers Electronic Health Records, Health Information Exchange, and Healthcare Imaging.

He has significant involvement in both the legacy and future of the Military Health Service Electronic Health Record, and with the Health Information Exchange between the DoD and the Veterans Administration. His experience on varied complex projects has led to an ability to apply technology and best practices effectively to accomplish business and mission objectives.

Since joining Evolvent in 2003, he has led or directed engagements for the MHS, Veterans Administration, Air Force, Navy, Department of Energy, Department of Agriculture, Department of Homeland Security, and Department of State, and brings insight into architecture and best practices from these engagements across the federal space.


John Daniels, Chief Information Officer

John’s military and civilian healthcare experience spans 24 years and includes national recognition as a Modern Healthcare Up & Comer Award recipient. He last served as CIO for the USAF Academy Hospital, during which time the hospital was the first and only Air Force hospital named to the 25 Most Wired Small and Rural list by Hospitals & Health Networks (2005 and 2006). He was Chair of the USAF Medical Information Services Working Group that drove the IT direction, standards, and policies for 74 facilities worldwide.

John served as Senior Fellow and Vice President of IT Operations at the Harris County Hospital District (Houston, Texas) providing IT support for 3 hospitals and 24 ambulatory sites, and was CIO for DoD’s former seven-state Health Services Region 5 ensuring successful execution of the IM/IT strategic plan for a $3.4B managed care support contract serving over 650,000 beneficiaries.

Mr. Daniels is a Fellow of both the Healthcare Information and Management Systems Society (HIMSS) and the American College of Healthcare Executives (ACHE).


J. D. Whitlock, VP, Technology Programs

Mr. Whitlock has 14 years of experience in healthcare administration, including assignments in group practice management and patient administration prior to specializing in healthcare information technology (HIT). A retired Air Force Lieutenant Colonel, J.D. spent the last 5 years of his Medical Service Corps career managing a portfolio of HIT projects at the Air Force Medical Service Office of the CIO. These projects integrated Information Management and Knowledge Management disciplines to improve operational efficiency, promote transparency, and enable decision support for expeditionary, clinical, and business operations throughout the Air Force Medical Service.

In 2007, J.D. deployed to Craig Joint Theater Hospital, Bagram Airfield, Afghanistan, as Chief, Patient Administration and Aeromedical Evacuation. This combination of operational experience with in-depth HIT knowledge permits him to quickly understand the needs of his customers and deliver results that meet mission requirements.

J.D. began his military career in the U.S. Navy as a Surface Warfare Officer. He holds a Master of Public Health from UCLA, and an MBA in Management of Information Systems from the University of Georgia. He is certified as a Project Management Professional (PMP) and Certified Professional in Healthcare Information and Management Systems (CPHIMS).


Kim Bahrami, Senior Vice President

Kimberly Bahrami, Senior Vice President, has 17 years of experience with DoD Military Health Systems and is knowledgeable and experienced in MHS, TMA, and Service Medical operational, business, and technology environments, architectures, processes, and operations. From 2001 to 2004, she served as the State Chief Information Officer for the State of Florida, where she interfaced closely with DoD and other federal organizations and businesses engaged in national, state, and local public healthcare initiatives and post-9/11 planning and recovery efforts.

Kim led a wide range of development, integration, and managed service outsourcing initiatives, including enterprise portal transformations; web application design, development, and customization efforts; ERP implementations; enterprise technical service desk consolidations; network convergence projects; and enterprise security programs.

Ms. Bahrami offers special expertise in healthcare information management solution and services development, with extensive experience in multi-customer, multi-vendor environments, aligning IT resources with business goals and developing and implementing service architectures for extended government enterprises.


Al Hamilton, Vice President, Strategic Federal Healthcare Operations

Mr. Hamilton served in the U.S. Army for over 20 years as a Medical Service Corps Officer. He has held challenging positions such as Chief Information Officer, Director of Human Resources, Operations Manager, professor teaching in Master of Business Administration and Master of Healthcare Administration programs through Baylor University, and advisor to the Executive Staff of United States Central Command Headquarters and the Army Chief of Staff Headquarters.

Al was engaged in the Joint Telemedicine Medical Network initiative to allocate dedicated bandwidth in support of telemedicine activities in deployed and mobile USCENTCOM environments.

As a key member of Evolvent’s management team, he will assist in strategically developing and executing technical and business solutions for Evolvent’s growing client base in Federal Healthcare.

Al provides thought leadership internally and externally to continually enhance Evolvent’s practice of bringing real value to its customers and working collaboratively to ensure efficient and effective solutions are implemented.

Hamilton holds a BS in Computer Science from Augusta State University, MSA in Software Engineering from Central Michigan University, and a Doctorate of Philosophy in Information Decision Systems/Decision Science from George Washington University.

Al was recently selected as a member of the military medical society, the Order of Military Medical Merit; selected to attend Nobel Peace Prize Week; and selected to serve as a Fellow with the Commission on Accreditation of Healthcare Management Education.


Monty Nanton, Senior Vice President

Monty Nanton, Senior Vice President, brings more than 24 years of experience delivering healthcare and health information technology project management to Team Evolvent. He is a proven leader in military medicine with extensive experience coordinating human, financial, and materiel resources.

Mr. Nanton’s military experience encompassed responsibilities ranging from five commands in both tactical and TDA environments to executive staff positions managing medical information technology for the peninsula of South Korea, Medical Center CIO roles including a tour at BAMC, and joint initiatives in medical surveillance throughout CONUS and deployed operations.

As a Senior Aviator with exposure to the full spectrum of military healthcare, from evacuation at the point of injury to Level 3 care, Mr. Nanton is well respected and a recognized SME in military medicine. He has a proven track record of technology management and of implementing solutions that support organizational goals.


David Parker, M.D., Chief Medical Officer

Dr. Parker is a board-certified Family Physician who has taken that knowledge, along with 8 years of practice in the Military Health System, and put it to work over the past 17 years in the clinical information technology arena. He has a demonstrated ability to bridge these three domains (clinical, military, and IT) to form innovative designs and solutions.

He currently serves as the Chief Medical Officer for Evolvent. He is widely known and respected in the health IT arena within the Department of Defense’s Health Affairs organization and is constantly consulted as the person “who really knows what is what” with regard to the DoD’s electronic health record and associated systems. He is now taking the lead in bringing this experience to the broader set of Health IT Programs.

Dr. Parker has been at the forefront of the electronic health record (EHR) industry and a leader in the development of DoD health IT systems since the beginning of his career. He was the chief functional proponent and SME for the Clinical Integrated Workstation (CIW) project, the DoD’s first EHR effort/prototype. He led the functional design and requirements effort for DesertCare, the DoD’s first web-based EHR and one of the earliest to be utilized in “theater”. This application, discussed in Bill Gates’ book Business @ The Speed of Thought, was deployed personally by Dr. Parker to a remote base in Saudi Arabia, where he installed the server and software and trained the providers and technicians. This experience has enabled him to be productively conversant in web, virtualization, database, SOA, and other technologies and architectures.

Dr. Parker is a graduate of The University of Texas at Austin and The University of Texas Medical Branch, Galveston, and completed a Family Practice Residency program at Scott AFB, IL.


Evolvent Technologies Inc. Capabilities

Evolvent is a focused systems integrator with particular strength in federal healthcare systems. Our areas of focus include: EMR/PHR Implementation; Imaging Management Solutions Development, Implementation and Support; Clinical Data Repository Development, Implementation and Support; Health Data Exchange Application and Network Development Support; and CyberSecurity Engineering, Consulting and Support. Our services in support of these business areas are usually delivered by integrated product teams with experts in the following disciplines: Business Process Reengineering, System Integration, Application Development, and CyberSecurity.


NAICS Codes:

518210 Data Processing, Hosting, and Related Services
519190 All Other Information Services
519310 Internet Publishing & Broadcasting & Web Search Portals
523920 Portfolio Management
524114 Direct Health and Medical Insurance Carriers
524298 All Other Insurance Related Activities
541330 Engineering Services
541511 Custom Computer Programming Services
541512 Computer Systems Design Services
541513 Computer Facilities Management Services
541519 Other Computer Related Services
541611 Administrative Management and General Management Consulting Services
541614 Process, Physical Distribution, and Logistics Consulting Services
541618 Other Management Consulting Services
541690 Other Scientific and Technical Consulting Services
541712 Research and Development in the Physical, Engineering, and Life Sciences (except Biotechnology)
541990 All Other Professional, Scientific, and Technical Services
561110 Office Administrative Services
561210 Facilities Support Services
561320 Temporary Help Services
561440 Collection Agencies
561990 All Other Support Services
611420 Computer Training
611430 Professional and Management Development Training
611710 Educational Support Services
621999 All Other Miscellaneous Ambulatory Health Care Services