
HTM Benchmarking Guide

Why Benchmarking Matters, and How You Can Do It

Published by

Association for the Advancement of Medical Instrumentation
4301 N. Fairfax Drive, Suite 301, Arlington, VA 22203-1633
www.aami.org

© 2015 by the Association for the Advancement of Medical Instrumentation

Permission is granted to distribute or reproduce this report in its entirety for noncommercial and educational purposes. For other uses, or to order reprints of this report, complete the reprint request form at www.aami.org or contact AAMI, 4301 N. Fairfax Dr., Suite 301, Arlington, VA 22203-1633. Phone: +1-703-525-4890; Fax: +1-703-525-1067.


By: Ted Cohen, UC Davis Medical Center
Frank Painter, University of Connecticut
Matt Baretich, Baretich Engineering


Benchmarking has been around nearly as long as the clinical engineering profession itself. From the beginning, we have tried to measure our current performance against standards of practice, against our own past performance, and against other healthcare technology management (HTM) programs. It has not been easy.

One of the fundamental problems has been that we have measured our performance in ways that differ from those of our colleagues. We also have long recognized that we are dealing with “apples and oranges” when trying to compare our HTM programs, which we typically regard as unique and special, with other programs. If our numbers look bad, we can say we are different and they do not measure things the way we do.

But those days are gone. As HTM professionals, we need to measure our performance and be able to demonstrate that it is competitive with performance in other healthcare facilities. This HTM Benchmarking Guide, published by AAMI, is an effort to take a step in the right direction. We look at the history and current need for benchmarking. We summarize the literature and identify available tools. We provide examples from AAMI’s web-based Benchmarking Solutions platform as food for thought and a starting point for discussion. Time to get moving!

— Ted Cohen, Frank Painter, Matt Baretich
Subject Matter Experts for AAMI’s Benchmarking Solutions


Introduction

The U.S. healthcare system, and the healthcare technology on which it depends, is going through complex changes, with sharp contrasts, in a variety of ways. The Affordable Care Act and other federal actions, such as the HITECH (Health Information Technology for Economic and Clinical Health) Act and the ARRA (American Recovery and Reinvestment Act), have created many regulations that include financial incentives to computerize healthcare data and to expand the use of electronic medical records (EMRs)/electronic health records (EHRs).

At the same time, failure to abide by some of these regulations will in the near future result in financial penalties. These financial and regulatory pressures, as well as the need to improve U.S. healthcare, are driving healthcare organizations to hasten their computerization of many functions. In addition, these pressures are driving further consolidation of hospitals and medical providers into larger multihospital systems with large associated group practices. Meanwhile, the widespread use of technology continues to increase in healthcare and in all aspects of our lives, including wireless communication technologies, wearable activity monitoring devices, video streaming, and new automobile technologies.

How do these changes affect HTM? The consolidation of healthcare organizations has allowed many organizations to create and grow their in-house technical abilities. Most healthcare organizations are now large enough to have a well-trained and active HTM or Clinical Engineering department, either "in-house" or available from the corporate or system level. The HTM program may not directly service every technology (it’s rare that they can), but most can support many of the more common technologies, and many are capable of managing the service of, and supporting, all of the medical technologies (albeit with some help from the manufacturer and other service providers).

One important result: the U.S. Bureau of Labor Statistics projects that the Biomedical Equipment Technician occupational category (which the BLS refers to as Medical Equipment Repairers) will grow by 30% over the next 10 years, and the category is listed as one of the 10 fastest-growing professions in the United States (www.bls.gov/ooh/installation-maintenance-and-repair/medical-equipment-repairers.htm).

Another impact on HTM is that computerization in the healthcare delivery system has resulted in tremendous growth in the number of medical devices connected to the information technology (IT) network. Not long ago, the majority of connected devices were a few large, expensive medical imaging systems connected to an image repository system or PACS (picture archiving and communication system). Now the average hospital has hundreds or thousands of other devices—including continuous patient monitoring devices, “smart” infusion pumps, glucometers, and other products—that send and receive data over the IT network.

Another impact, and contrast, is in clinical alarm management. The Joint Commission issued a National Patient Safety Goal for 2014 and 2015 calling on healthcare facilities to better manage clinical alarms. With the increase in continuous monitoring in acute care areas, and with further requirements for continuously monitoring patients on opioid medications likely forthcoming, alarms are going to become increasingly prevalent. At the same time, there is pressure to decrease the number of “nuisance” alarms in order to avoid “alarm fatigue,” which is the desensitization to meaningless or low-priority alarms (aamiblog.org/2015/02/12/frank-e-block-jr-will-surveillance-alarm-systems-for-opioid-induced-respiratory-depression-make-things-better-or-worse). All of these changes drive a need for more engineers, technicians, and IT professionals to work in healthcare to make these new, complex “systems of systems” work effectively.

How do all these changes affect HTM benchmarking as a management tool? HTM programs are growing and need to do so efficiently. Benchmarking allows guidelines to be drawn and relevant comparisons to be made with similar organizations (peers) regarding staffing and other factors in a growing HTM or Clinical Engineering program. Multiple metrics are needed to accurately measure how well an HTM program operates: financial performance, customer service, equipment uptime, and other indicators are all needed. For example, the cost-of-service ratio (COSR)—the total of all internal and external service costs and repair parts costs divided by the total of all medical device acquisition costs—continues to be the most well-established financial performance metric. Device count has been shown to be a poor workload indicator unless the equipment in the workload is all of similar cost and complexity.
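To make the definition concrete, here is a minimal sketch of the COSR calculation in Python. The function name and the cost figures are ours for illustration, not data from AAMI’s Benchmarking Solutions.

```python
# Minimal COSR sketch: (internal + external service costs + repair parts costs)
# divided by total medical device acquisition cost. Figures are hypothetical.
def cost_of_service_ratio(service_costs, parts_costs, total_acquisition_cost):
    return (sum(service_costs) + sum(parts_costs)) / total_acquisition_cost

cosr = cost_of_service_ratio(
    service_costs=[1_200_000, 1_800_000],  # in-house labor, contracts/vendor fees
    parts_costs=[500_000],                 # repair parts
    total_acquisition_cost=70_000_000,     # value of the supported inventory
)
print(f"COSR = {cosr:.1%}")  # -> COSR = 5.0%
```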

Other contrasts on the HTM front include inefficiencies caused by the various, sometimes contradictory, regulations from federal, state, local, and other regulators. Examples include recent controversies regarding CMS (Centers for Medicare & Medicaid Services) regulatory changes mandating that, under most circumstances, preventive maintenance procedures and intervals be completed according to manufacturer recommendations (www.aami.org/publications/BIT/2014/ND_Two_steps.pdf) and the continued mandate by many jurisdictions for periodic electrical safety leakage current inspections (despite recognition in NFPA 99 [2012], Health Care Facilities Code, that such testing is unnecessary).

With the exception of battery management, most medical equipment, particularly low- and moderate-risk devices, no longer requires periodic calibration or parts replacement and therefore does not need to be on a routine inspection and maintenance program. Comprehensive benchmarking includes metrics that, for example, calculate the ratio of devices on an inspection and maintenance program to the total number of medical devices inventoried, as an indicator of whether an HTM program is moving away from the inefficient, but prevalent, “test everything periodically” mentality.
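As a hypothetical sketch of that coverage metric (the device records and flag name are invented, not a CMMS schema):

```python
# Ratio of devices on an inspection/maintenance program to total inventoried
# devices; a falling value suggests movement away from "test everything
# periodically." All records here are invented.
def pm_coverage(devices):
    return sum(1 for d in devices if d["on_pm_program"]) / len(devices)

inventory = [
    {"id": "VENT-01", "on_pm_program": True},    # high-risk: keep on program
    {"id": "PUMP-17", "on_pm_program": True},
    {"id": "MON-42", "on_pm_program": False},    # low-risk: run to failure
    {"id": "SCALE-03", "on_pm_program": False},
]
print(f"{pm_coverage(inventory):.0%} of inventoried devices on scheduled maintenance")
```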

Why Benchmark?

It’s all about performance improvement! As HTM program managers, we are continually challenged to do better, to do more with less, to improve in terms of cost, quality, regulatory compliance, patient safety, and every aspect of our work. Here’s how benchmarking can help:

1. Measure our performance. We have to know where we are before we can plan to move forward. First, collect data using definitions that are standard across the HTM profession. Next, calculate standard performance metrics. Basic metrics provide detailed assessments within the broad areas of equipment, staffing, and costs. Examples include the various costs associated with equipment maintenance and the value (acquisition cost, typically) of the equipment maintained. Calculated metrics usually are ratios that normalize the data across different institutions. For example, the COSR is the annual maintenance cost divided by the equipment value.

2. Benchmark our performance. Ask two questions: 1) Where do we measure up? 2) Where do we fall short? To answer these questions, we have to compare our performance metrics with those of our peers—however we choose to define our peer group. We should use benchmarking tools that give us flexibility in selecting a peer group and that have performance data for a wide variety of HTM programs. We might be interested in comparing the performance of our HTM program with that of HTM programs in facilities of similar size or geographic location. If we are a specialty facility, such as a pediatric hospital, we might want to compare our performance with that of other such facilities. If we provide extensive in-house maintenance services for imaging and laboratory equipment, we might not want to focus on HTM programs that handle only general biomedical equipment.


3. Improve our performance. When we have identified opportunities for improvement—opportunities we know are achievable because our peers have achieved them—we need to drill down into the data and see what they are doing better. We should search the HTM literature. We should talk to other HTM professionals. When we have done our research, we can create a performance improvement project and monitor our progress. Suppose we find that our peers are doing better than we are in terms of COSR. After further research, it appears that the key factor is how they manage their imaging equipment service. That gives us a target for our performance improvement efforts. Do we need to adjust the service levels for our service contracts? Should we invest in more staff training so we can bring additional services in house? Asking the right questions is the best way to get the right answers.

Performance Management: Measure, Benchmark, Improve

Case Study: HTM Staffing at UC Davis Medical Center

One use for benchmarking and metrics is to determine whether your staffing levels are appropriate for your workload and comparable with those of like institutions (peers). In these times of budget constraints, and with personnel costs often the largest portion of the budget, metrics that quantify staffing needs based on workload are very important.

At the University of California (UC) Davis Medical Center, we used AAMI’s Benchmarking Solutions – Healthcare Technology Management to help define staffing metrics based on several quantitative metrics and qualitative survey responses. These included the value of equipment managed or maintained, full-time equivalent (FTE) counts, device counts per FTE, derived hourly cost, and COSR.

The following is an example of the steps one might take to use AAMI’s Benchmarking Solutions to establish and apply staffing metrics (a simplified numeric sketch follows the list):

• Within AAMI’s Benchmarking Solutions, use the demographics selection (e.g., hospital size, location) to define a peer group, or use all respondents for a particular time period (e.g., 2013).

• Measure and report your current and historic COSR (see Cohen 2010 in the references). A COSR of less than about 6% validates that the overall expenses for the current workload are within reason (i.e., average or better compared with all respondents in AAMI’s Benchmarking Solutions).


• Measure and report current and historic staffing ratios based on the amount of equipment supported. Several measurements can be made, including the amount (acquisition cost) of equipment supported by one FTE, equipment counts supported by one FTE (only useful for low-cost equipment), and workload estimates by type of equipment based on historical data, manufacturer information, and other published workload data (e.g., ECRI Institute data).

• If the measurements are being used to justify additional staff needed to support a new or expanded clinical service or a new healthcare facility, measure and report the net increase in workload estimated for the new project’s equipment.

• Split the net increase in workload into “one-time startup workload” (e.g., planning, installation, incoming inspection) and long-term, continuing repair and maintenance workload. Make sure to remove from the long-term workload analysis any replaced equipment that will be leaving.

• From the long-term workload increase, and the metrics listed above, calculate estimates of the FTEs required to support the additional equipment.

• For any “big-ticket” items, develop a specific draft support plan in collaboration with the customer department and refine the specifics for these more complex and expensive systems. This may include obtaining quotations for parts agreements or other “shared” support arrangements with the manufacturer or a third-party vendor.

• Other metrics to consider analyzing include the ratio of external repair and maintenance costs to overall costs (sometimes called “penetration”) and your hourly service costs compared with your peers.

• Conduct an internal review and “sanity check” on the preliminary results.

• Review the analysis results with your administrator and key stakeholders.
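The simplified numeric sketch below, in Python, illustrates the value-based FTE estimate from these steps. All project figures are invented; the $5.5 million of equipment per FTE is the peer-group benchmark reported in the case study that follows.

```python
# Value-based staffing estimate: net long-term workload increase, expressed as
# acquisition cost, divided by equipment value supported per FTE (peer benchmark).
ACQ_COST_PER_FTE = 5_500_000        # peer-group benchmark (case study below)

new_equipment_value = 16_000_000    # acquisition cost added by the project
retired_equipment_value = 3_000_000 # replaced equipment leaving the inventory

net_value = new_equipment_value - retired_equipment_value
ftes_needed = net_value / ACQ_COST_PER_FTE
print(f"Estimated long-term staffing need: {ftes_needed:.1f} FTEs")  # -> 2.4

# Device count per FTE can serve as a cross-check, but only for inventories of
# similar cost and complexity: one CT scanner does not equal one infusion pump.
```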

The Clinical Engineering department at UC Davis Medical Center used a similar process in developing plans to add staff for a new hospital wing that opened in 2010. We used benchmarking data from a peer group of 17 university hospitals that all maintained at least some imaging equipment in-house. The peer group data showed that one FTE technician or specialist maintained about $5.5 million (acquisition cost) of equipment.

Evaluation of the additional workload details, based on the types of equipment involved, revealed that two BMET 3 specialists would be needed: one for operating room (OR) integration and other OR equipment support, and one for the automated clinical laboratory. We also would need a Radiology Equipment Specialist for the seven new or upgraded cardiac catheterization laboratories and other new radiology equipment. Other positions requested were designated as BMET 2 generalists.

A draft report was presented to hospital administration. After that presentation, and further discussion with Clinical Laboratory management and the vendor of the lab automation lines, it was determined that the FTE request for the lab automation line would be dropped: the line was going to be placed on a full-service contract, and the vendor was not amenable to any workload sharing with consequent cost savings. Subsequently, a “final” report was presented to the group of C-suite leaders who reviewed all staffing requests for the new building. Ultimately, four FTEs were approved for Clinical Engineering. Benchmark data analysis definitely helped “sell” this proposal.

Depending on the project, another AAMI Benchmarking Solutions metric that has been widely used to further validate or adjust staffing levels is device count per technician (1,087 devices per technician in this case study). However, with an average device cost around $11,000, one should be very careful using this metric for very expensive or very low-cost equipment. One CT scanner does not equal one infusion pump.


Benchmarking in Context

Several HTM-related benchmarking tools are available today:

• AAMI: Benchmarking Solutions – Healthcare Technology Management (www.aami.org/productspublications/ProductDetail.aspx?ItemNumber=1062)
• ECRI Institute: BiomedicalBenchmark (www.ecri.org/components/biomedicalbenchmark)



• Truven Health Analytics: ActionOI (truvenhealth.com/your-healthcare-focus/hospital-management-decisions/actionoi)
• Veterans Health Administration: Enhanced Biomedical Engineering Resources Survey (EBERS) (www.va.gov/health)
• Children’s Hospital Association: Benchmarking Tools (benchmark.childrenshospitals.net)
• Internal and proprietary benchmarking tools

“Without data you’re just another person with an opinion.” —W. Edwards Deming

Some of these tools are commercially available to any healthcare organization (AAMI’s Benchmarking Solutions, BiomedicalBenchmark, and ActionOI), while others are proprietary and customized for the internal objectives of their particular organizations. These tools also differ in scope and focus. ActionOI benchmarks a wide range of activities for healthcare organizations, and its HTM benchmarking module is relatively limited in scope. BiomedicalBenchmark focuses on “inspection and maintenance benchmarking,” with detailed metrics and analytical tools for inspection and maintenance activities, service contracts, and related aspects of HTM practice. AAMI’s Benchmarking Solutions seeks to support HTM department management with metrics related to equipment inventory, HTM-related costs, and HTM staffing.

Another important dimension for considering benchmarking tools is the level of a particular HTM program. AAMI’s HTM Levels Guide (www.aami.org/productspublications/ProductDetail.aspx?ItemNumber=1391) defines three HTM program levels:

• Level 1 – Fundamental: These programs provide essential technology services and compliance with applicable standards and regulations.
• Level 2 – Progressive: Programs at this level have moved beyond the basics to provide additional services and focus on cost-effectiveness.
• Level 3 – Advanced: These programs are on the leading edge, demonstrating the full range of potential for HTM contributions to patient care.

Level 1 HTM programs typically engage in only the most basic types of benchmarking. They will, for example, measure their financial performance in terms of budget compliance. They also will measure Joint Commission–mandated metrics for on-time completion of scheduled maintenance.
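As an illustration (not from the guide), the on-time completion metric can be computed from work order records like these; the record layout is hypothetical, not an actual CMMS schema.

```python
# Share of scheduled maintenance work orders completed by their due date.
from datetime import date

work_orders = [
    {"id": 101, "due": date(2015, 3, 31), "completed": date(2015, 3, 20)},
    {"id": 102, "due": date(2015, 3, 31), "completed": date(2015, 4, 5)},  # late
    {"id": 103, "due": date(2015, 3, 31), "completed": None},              # open
]

on_time = sum(1 for wo in work_orders
              if wo["completed"] is not None and wo["completed"] <= wo["due"])
print(f"On-time scheduled maintenance completion: {on_time / len(work_orders):.0%}")
```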

Level 2 HTM programs carry out internal benchmarking activities, monitoring multiple metrics within the program over time. Examples of additional metrics at this level are detailed maintenance costs, use error, productivity, customer satisfaction, and the COSR. Performance improvement activities focus on improving these metrics year over year.

Level 2 HTM programs also consider external benchmarking relative to HTM programs in similar healthcare organizations. Some of the internal metrics may be compared with peers and competitors when those external metrics are available.

Level 3 HTM programs regularly perform internal and external benchmarking. These efforts are an integral part of their ongoing identification of best practices and pursuit of improved performance. This level of benchmarking requires access to external performance metrics, typically through the use of a benchmarking tool like those listed above. It also typically requires generation and reporting of detailed metrics to a shared database associated with the selected benchmarking tool.

Analysis of Acquisition Cost Data

[Figure: Cost-of-Service Ratio (COSR) and Acquisition Cost Data for 38 Clinical Engineering Programs]

Benchmarking Roadmap


Foundation: Before embarking on a benchmarking initiative, it is important to have a functional HTM operation. This can be in-house or contracted, large or small, but basic medical equipment repair and maintenance services should be operational, including a reasonably accurate medical equipment inventory, qualified staff, and a work order system of some type. In a small operation, these functions might be shared with other departments (e.g., Facilities, IT). The AAMI HTM Levels Guide mentioned above offers an easy self-assessment to determine whether the basics are being met. Before benchmarking, most of the Level 1 HTM program functions should be operational.

Needs assessment: Determine the areas that need to be improved by using the HTM Levels Guide assessment tool, your institution’s leadership goals, or your own departmental objectives. Prioritize the one or two improvement initiatives on which you want to focus.

Design an improvement plan: Share your needs assessment and prioritization with your leadership and staff, then design a plan to accomplish your improvement objectives and “kick off” the improvement initiative(s).

Set goals: Set specific quantitative goals and a completion timeline. If this is a long-term project, set intermediate milestones to help stay on track.

Data requirements: Determine what data need to be collected in order to calculate the metrics you will monitor.

Data sources: Determine the sources you will use to collect the data. Typical data sources include your computerized maintenance management system (CMMS), financial reports, and purchasing records. We have found that the most difficult information for HTM programs to access is the value of the medical equipment they manage or maintain. Historically, this information has not flowed to HTM; however, without equipment value data (e.g., acquisition cost), calculating the COSR—the key financial indicator of HTM program cost-effectiveness—is impossible. In such cases, one of the first performance improvement efforts of the HTM program may simply be getting access to this information.

Data collection: Collect and record the data periodically (e.g., quarterly for internal benchmarking, annually for external benchmarking).

Internal benchmarking: Frequently compare the newly collected data against your prior data sets to determine whether you are trending in the direction desired for your objectives.


External benchmarking: Determine your appropriate comparison (peer) group. Then, usually less frequently than for the internal comparison, compare the newly collected data against the peer group’s data sets to determine whether you are trending in the direction desired for your objectives.

Goals met? Are the metrics trending in the right direction and at the expected pace? If yes, continue until the goals are met. When goals are met, ensure they are sustainable before moving on to another performance improvement project. If the project is not trending in the right direction or is moving too slowly, assess whether the improvement needs to be redesigned and reimplemented. Collect the next set of data after allowing enough time for improvements to show progress.
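A small sketch of that internal trend check, with invented quarterly COSR values and an assumed 6% goal:

```python
# Compare each newly collected quarterly value with the prior one, then check
# progress against the goal. All values are invented for illustration.
quarterly_cosr = [0.072, 0.068, 0.066, 0.061]
goal = 0.060

for prev, curr in zip(quarterly_cosr, quarterly_cosr[1:]):
    trend = "improving" if curr < prev else "not improving"
    print(f"{prev:.1%} -> {curr:.1%}: {trend}")

if quarterly_cosr[-1] <= goal:
    print("Goal met; confirm the gain is sustainable before starting a new project.")
else:
    print("Goal not yet met; check the pace against your milestones.")
```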

Future Directions

The various benchmarking tools have substantial overlap in terms of data fields and calculated metrics; however, important differences result from variations in the scope and focus of the tools. Moreover, two HTM programs using the same benchmarking tool may differ in how they record their data. Do personnel costs include benefits? Do equipment inventories count each module of a patient monitoring system, or the system as a whole? Do personnel hours include “nonproductive” time, such as meetings and vacations? If we don’t collect data consistently, we can’t benchmark our performance against our peers.

This “apples and oranges” situation arises primarily from two factors. First, HTM programs have tended to evolve independently. Pioneering HTM professionals invented the field and managed their programs as they saw fit. There was little need (or opportunity) for benchmarking when the emphasis was on just getting the job done. But consolidation in the healthcare system and the need for continual performance improvement have made variations in HTM practice untenable.

The second factor, perhaps a consequence of the first, is that the HTM profession has not made standardization a priority. We all tend to think of our HTM programs as unique and special, but realistically, many of our core processes are quite similar: equipment inspection, maintenance, and repair. We now are seeing the development of standards, both formal and de facto, in many aspects of HTM practice. Additional standardization may be warranted to support our performance improvement objectives.

To help move this process along, the HTM Benchmarking Guide includes a summary of all the metrics used in AAMI’s Benchmarking Solutions. This list has been evolving since 2009 and has been improved over the years by feedback from AAMI’s Benchmarking Solutions subscribers. Although it is based on the HTM literature and the experience of many HTM professionals, we do not present it as a final product. Instead, it is intended to start a conversation among HTM stakeholders about how to move forward.

The HTM stakeholders we have in mind include, of course, the many HTM professionals practicing in healthcare facilities—large and small, independent and members of healthcare systems, with HTM programs of varying levels—and the healthcare systems and organizations of which they are part. Other stakeholders include professional associations like AAMI, accrediting organizations like the Joint Commission, and recognized professional resources like ECRI Institute. Also included are companies that provide HTM services to healthcare facilities.

Other key stakeholders are companies that provide CMMS software. A large portion of the benchmarking data is contained in CMMS databases, so having relevant CMMS data flow easily into benchmarking software would be beneficial.

We also see an important trend among leading CMMS software providers: moving beyond the basic inventory, scheduling, and maintenance history functions to providing comprehensive HTM management tools. As that trend continues, more and more CMMS data fields will become relevant to HTM benchmarking. That could mean more data flowing (with some degree of automation) from CMMS software to benchmarking software. It also might mean eventually incorporating benchmarking functions into CMMS software.

But it all depends on standardization of data fields and definitions, data collection methods, and calculated performance metrics. We hope you will use this benchmarking guide as a reference for future benchmarking projects. If we can all move the HTM field toward standardization of key definitions and data fields, and toward easier data collection, we will make true progress toward widespread HTM benchmarking.


References

• Dickey DM. A Numbers Game – Benchmarker Beware. Journal of Clinical Engineering. 2008;33(3):140–2.
• Cohen T. AAMI's Benchmarking Solution: Analysis of Cost of Service Ratio and Other Metrics. Biomedical Instrumentation & Technology. 2010;44(4):346–9.
• Lynch PK. Are You Ready For the Future? Biomedical Instrumentation & Technology. 2012;46(2):150–1.
• Dinsmore JD Jr. Benchmarking 101. Biomedical Instrumentation & Technology. 2006;40(1):29–31.
• Williams JS. Benchmarking Best Practice: Using Quality Measures to Strengthen Equipment Services. Biomedical Instrumentation & Technology. 2009;43(3):209–10.
• Dickey DM, Wylie M, Gresch A. Benchmarking That Matters: A Proposed New Direction for Clinical Engineering. Biomedical Instrumentation & Technology. 2011;45(2):115–20.
• Netwal R. Budget Model Can Help Department Take Control of Its Financial Future. Biomedical Instrumentation & Technology. 2010;44(5):389–90.
• Stiefel RH. Cautions and Precautions. Biomedical Instrumentation & Technology. 2011;45(3):231.
• Wang B, Eliason RW, Richards SM. Clinical Engineering Benchmarking: An Analysis of American Acute Care Hospitals. Journal of Clinical Engineering. 2008;33(1):24–7.
• Wang B, Rui T, Fedele J. Clinical Engineering Productivity and Staffing Revisited: How Should It Be Measured and Used? Journal of Clinical Engineering. 2012;37(4):135–45.
• Communication Tips for Clinical Engineering. Health Devices. 2013;42(6):188–93.
• Barta RA. Dashboards: A Required Business Management Tool. Biomedical Instrumentation & Technology. 2010;44(3):228–30.
• Painter F. Designing and Developing A CE Department. Biomedical Instrumentation & Technology. 2011;45(6):488–90.
• Gaev JA. Developing Benchmarks for Clinical Engineering Activities: A Methodology. Biomedical Instrumentation & Technology. 2007;41(4):267–77.
• Wang B, Fedele J, Pridgen B. Evidence-Based Maintenance: Part I: Measuring Maintenance Effectiveness With Failure Codes. Journal of Clinical Engineering. 2010;35(3):132–44.
• Wang B, Fedele J, Pridgen B. Evidence-Based Maintenance: Part II: Comparing Maintenance Strategies Using Failure Codes. Journal of Clinical Engineering. 2010;35(4):223–30.
• Wang B, Fedele J, Pridgen B. Evidence-Based Maintenance: Part III: Enhancing Patient Safety Using Failure Code Analysis. Journal of Clinical Engineering. 2011;36(2):72–84.
• Wang B, Rui T, Koslosky J. Evidence-Based Maintenance: Part IV: Comparison of Scheduled Inspection Procedures. Journal of Clinical Engineering. 2013;38(3):108–16.
• Williams JS. For the Kids: Managing Medical Equipment in Children's Hospitals. Biomedical Instrumentation & Technology. 2009;43(5):360–7.
• Wang B, Eliason RW, Vanderzee SC. Global Failure Rate: A Promising Medical Equipment Management Outcome Benchmark. Journal of Clinical Engineering. 2006;31(3):145–51.
• Baretich MF. How to Use Financial Benchmarks. Biomedical Instrumentation & Technology. 2011;45(5):405–7.
• Measuring Up. Health Devices. 2009;38(9):296–301.
• Cruz AM, Barr C, Denis ER. Offering Integrated Medical Equipment Management in an Application Service Provider Model. Biomedical Instrumentation & Technology. 2007;41(6):479–90.
• Maddock K, Hertzler L. On Sculpture, Baseball, and Benchmarking. Biomedical Instrumentation & Technology. 2007;41(4):332.
• Vockley M. Opening Doors: Get Noticed by the C-Suite. Biomedical Instrumentation & Technology. 2010;44(3):192–7.
• Stiefel RH. Performance Improvement: A Topic Whose Time Has Come—Again. Biomedical Instrumentation & Technology. 2010;44(2):147–9.
• Perspectives From ECRI Institute: Benchmarking for Clinical Engineering Departments. Journal of Clinical Engineering. 2009;34(1):39–40.
• Cohen T. Staffing Metrics: A Case Study. Biomedical Instrumentation & Technology. 2011;45(4):321–3.
• Dummert M. Staffing Models: How to Do It Right. Biomedical Instrumentation & Technology. 2012;46(1):44–7.
• Williams JS. Steady Performance Gives Rural Department Edge to Respond to Challenges. Biomedical Instrumentation & Technology. 2008;42(4):291–3.
• Maddock KE. The (Benchmarking) Glass is Half Full. Biomedical Instrumentation & Technology. 2006;40(4):328.
• Baretich MF. The Value of Certification. Biomedical Instrumentation & Technology. 2012;46(1):68–71.
• Kobernick T. Using Data to Help Evaluate and Justify Your Staffing Needs. Biomedical Instrumentation & Technology. 2010;44(2):118–9.
• Lewis D. When Benchmarking Goes Bad: What Recourse Do CE Departments Have? Biomedical Instrumentation & Technology. 2008;42(5):364–6.
• Wang B. Who Drank My Wine? Biomedical Instrumentation & Technology. 2006;40(6):418.
• Lynch PK. Why Can't We Learn? Biomedical Instrumentation & Technology. 2014;48(5):398–9.


Metrics

The following list outlines the metrics used in AAMI’s Benchmarking Solutions – Healthcare Technology Management. This material is presented as a basis for developing a standardized set of HTM program management metrics. Each entry gives the metric number and name, the help text shown to survey respondents, and the metric’s location in the survey (Section and Page) along with its data source (Input or Calculated).


Metric 2. HCO Beds (Section: Health Care Organization (HCO); Page: Size & Location; Source: Input)
Enter the licensed inpatient bed capacity for the HCO.

Metric 3. HCO Location (Section: Health Care Organization (HCO); Page: Size & Location; Source: Input)
Enter the region where the HCO is located. Select only one.
• Northeast = DC, DE, MD, PA, NY, NJ, CT, MA, VT, RI, NH, ME
• Southeast = AR, LA, MS, AL, GA, FL, SC, NC, VA, WV, KY, TN
• Midwest = MN, WI, MI, IN, OH, IL, IA, MO, KS, NE, SD, ND
• West = CO, MT, WY, UT, NV, ID, CA, OR, WA, HI, AK
• Southwest = TX, NM, AZ, OK
• Canada
• International = Any non-US or non-Canadian country

Metric 4. HCO Expenses (Section: Health Care Organization (HCO); Page: Volume; Source: Input)
Enter the total annual expenses for the HCO. For purposes of this survey, the Health Care Organization (HCO) is defined as the single facility for which you will be entering Clinical Engineering (CE) data.

Metric 5. HCO Adjusted Discharges (Section: Health Care Organization (HCO); Page: Volume; Source: Input)
Enter the total annual Adjusted Discharges (CMI Weighted) value for the HCO. This is a measure of service volume for the HCO. It represents the total number of inpatient discharges, adjusted for the number of outpatient visits, then multiplied by the CMI (Case Mix Index). CMI is a measure of patient complexity that typically ranges from 1.0 to 2.0, where higher values indicate that on average the HCO treats patients of greater complexity. You should be able to get the Adjusted Discharges (CMI Weighted) value from Finance (or get the annual Adjusted Discharges value and the average CMI value and multiply them together).

Metric 6. CE Program Model (Section: Clinical Engineering (CE) Program; Page: Structure; Source: Input)
CE program services are primarily provided by which one of these methods? Select only one.
• By in-house HCO employees
• By HCO system-level employees
• By contract with an independent service organization (ISO)
• By contract with a manufacturer providing multi-vendor service
• Other

Metric 7. CE Program Reporting (Section: Clinical Engineering (CE) Program; Page: Structure; Source: Input)
The CE program manager reports to which one of these HCO services? Select only one.
• Facility Services
• Information Technology (IT)
• Supply Chain Management
• Nursing
• Support Services
• Finance
• Administration
• Other

Metric 8. Committee Participation (Section: Clinical Engineering (CE) Program; Page: Responsibilities; Source: Input)
A representative of CE is an active member of which of these committees? Check all that apply.
• Safety/Environment of Care
• Capital Acquisition
• Facility Design/Renovation
• Risk Management
• Patient Safety
• Strategic Planning
• Nursing Operations or Education
• IT Networking or Security
• Laser Safety
• Product Evaluation
• Quality Assurance
• Performance Improvement
• Radiation Safety
• None of the above

Metric 9. Equipment Acquisition Responsibilities (Section: Clinical Engineering (CE) Program; Page: Responsibilities; Source: Input)
The CE program has significant responsibility and accountability for which of these medical equipment acquisition activities? Check all that apply.
• Participation in equipment selection
• Arranging or managing evaluations by clinicians
• Specifying at least minimum requirements (e.g., standards compliance) for equipment acquisition
• Recommending maintenance and warranty requirements
• Incoming inspection
• Installation of equipment that does not require manufacturer or vendor installation
• Management of the installation when accomplished by others
• Authority to withhold all or partial payment if all purchase conditions are not met
• None of the above

Metric 10. Equipment Acquisition - Technologies (Section: Clinical Engineering (CE) Program; Page: Responsibilities; Source: Input)
The CE program plays a significant role in the selection and acquisition of which of these technologies? Check all that apply.
• Imaging (e.g., x-ray, ultrasound, CT, MRI)
• Therapeutic Radiology (e.g., gamma knife, linear accelerators, radiation treatment equipment)
• Clinical Laboratory (including devices used for the preparation, storage, and analysis of patient specimens and pharmaceuticals)
• Specialized surgical equipment (e.g., robotics)
• General biomedical equipment (devices for monitoring, diagnosis, treatment, or life support, but not included in the categories above)
• Non-patient care devices (e.g., patient transportation, patient entertainment systems, general purpose computers, communications equipment, and other systems not included in the categories above)
• None of the above

Metric 11. Equipment Maintenance Responsibilities (Section: Clinical Engineering (CE) Program; Page: Responsibilities; Source: Input)
The CE program has significant responsibility and accountability for which of these medical equipment maintenance activities? Check all that apply.
• Selection of vendors and negotiation of maintenance contracts
• Scheduled maintenance by CE
• Repairs by CE
• Documenting in the CMMS all maintenance performed by all providers
• Overseeing repairs performed by vendors or manufacturers
• Overseeing scheduled maintenance performed by vendors or manufacturers
• Including all costs in the CMMS documentation regardless of provider
• Deinstallation and disposal of medical equipment
• None of the above

Metric 12. Equipment Maintenance - Technologies (Section: Clinical Engineering (CE) Program; Page: Responsibilities; Source: Input)
The CE program is responsible for managing or providing maintenance for which of these technologies? Check all that apply.
• Imaging (e.g., x-ray, ultrasound, CT, MRI)
• Therapeutic Radiology (e.g., gamma knife, linear accelerator, radiation treatment equipment)
• Clinical Laboratory (including devices used for the preparation, storage, and analysis of patient specimens and pharmaceuticals)
• Specialized surgical equipment (e.g., robotics)
• General biomedical equipment (devices for monitoring, diagnosis, treatment, or life support, but not included in the categories above)
• Non-patient care devices (e.g., patient transportation, patient entertainment systems, general purpose computers, communications equipment, and other systems not included in the categories above)
• None of the above

Metric 13. CE Space - Shop (Section: Clinical Engineering (CE) Program; Page: Space; Source: Input)
Square feet of shop space for the CE program.

Metric 14. CE Space - Office (Section: Clinical Engineering (CE) Program; Page: Space; Source: Input)
Square feet of office space for the CE program.

Metric 15. CE Space - Other (Section: Clinical Engineering (CE) Program; Page: Space; Source: Input)
Square feet of other space for the CE program.

Metric 16. CE Space - Total (Section: Clinical Engineering (CE) Program; Page: Space; Source: Calculated)
This answer is the sum of CE space for shop, office, and other.

Metric 17. Devices - Total (Section: Devices; Page: Number of Devices; Source: Input)
Total number of all devices managed or maintained by CE.
• Managed by the CE program means that CE is responsible for maintenance (inspection, scheduled maintenance, and repair) and the cost of maintenance using an external service provider (by contract or by non-contract labor and materials).
• Maintained by the CE program means that CE is responsible for maintenance (inspection, scheduled maintenance, and repair) using CE staff.

Metric 18. Devices - Imaging and Therapeutic Radiology (Section: Devices; Page: Number of Devices; Source: Input)
Total number of all imaging equipment and therapeutic radiology equipment managed or maintained by CE. Examples are x-ray equipment, ultrasound equipment, CT systems, nuclear medicine systems, MR systems, linear accelerators, and other radiation treatment equipment.

Metric 19. Devices - Laboratory Equipment (Section: Devices; Page: Number of Devices; Source: Input)
Total number of all clinical laboratory equipment managed or maintained by CE. This category includes devices used for the preparation, storage, or analysis of patient specimens and pharmaceuticals.

Metric 20. Devices - General Biomedical Equipment (Section: Devices; Page: Number of Devices; Source: Input)
Total number of all the general biomedical equipment (devices not included in the two categories above) managed or maintained by CE. These are devices that provide monitoring, diagnosis, treatment, or life support. Examples: physiological monitors, infusion pumps, dialysis equipment, surgical equipment, scales, clinic equipment, ventilators, etc.

Metric 21. Devices - Other (Section: Devices; Page: Number of Devices; Source: Input)
Total number of all other devices (patient care and non-patient care devices not included in the three categories above) managed or maintained by CE. Examples: beds, stretchers, wheelchairs, nurse call systems, patient entertainment systems, general purpose computers, communications equipment, TVs, etc.

Metric 23. CT Scanners (Section: Devices; Page: Device Count Details; Source: Input)
The number of CT scanners that are managed or maintained by the CE program.

Metric 24. MRI Scanners (Section: Devices; Page: Device Count Details; Source: Input)
The number of MRI scanners that are managed or maintained by the CE program.

Metric 25. Hybrid Imaging Systems (Section: Devices; Page: Device Count Details; Source: Input)
The number of hybrid imaging systems (e.g., PET/CT, SPECT/CT) that are managed or maintained by the CE program.

Metric 27. Radiation Therapy Systems (Section: Devices; Page: Device Count Details; Source: Input)
The number of linear accelerators, gamma knife systems, stereotactic radiosurgery systems, cyberknife systems, and similar systems managed or maintained by the CE program.

Metric 28. Interventional Radiology Systems (Section: Devices; Page: Device Count Details; Source: Input)
The number of cardiac catheterization labs, interventional radiology suites, hybrid ORs, and similar systems managed or maintained by the CE program.

Metric 29. Other High Cost Devices and Systems (Section: Devices; Page: Device Count Details; Source: Input)
The number of other individual devices and single systems (not included in the five categories above) with acquisition costs greater than $500,000 managed or maintained by the CE program. Include all such systems (e.g., hyperbaric chamber systems, surgical robotic systems), not only imaging systems.

Metric 30. Life Support Devices (Section: Devices; Page: Device Count Details; Source: Input)
Of the entire inventory of devices managed or maintained by the CE program, how many have been classified as "Life Support" devices per Joint Commission standards?

Metric 112. Endoscopes (Section: Devices; Page: Device Count Details; Source: Input)
Number of endoscopes managed or maintained by the CE program.

Metric 31. Networked Medical Devices (Section: Devices; Page: Device Count Details; Source: Input)
Of the entire inventory of medical devices managed or maintained by the CE program, how many are connected to an IT network? Include all ethernet-connected medical devices (e.g., physiological monitors connected to EHR, imaging modalities connected to PACS, infusion pumps connected to a DERS system, etc.).

Metric 32. Devices Receiving Scheduled Maintenance (Section: Devices; Page: Device Count Details; Source: Input)
Of the entire inventory of medical devices that are managed or maintained by the CE program, how many are included in the scheduled maintenance program? Count any device that receives scheduled inspection or maintenance (including electrical safety inspection) regardless of the source of the service (in-house or external service provider).

Metric 33. Percentage of Devices Receiving Scheduled Maintenance (Section: Devices; Page: Device Count Details; Source: Calculated)
Percentage of devices in the scheduled maintenance program.

Metric 34. Acquisition Cost - Total (Section: Devices; Page: Acquisition Cost of Devices; Source: Input)
Total acquisition cost of all devices managed or maintained by CE.
• Managed by the CE program means that CE is responsible for maintenance (inspection, scheduled maintenance, and repair) and the cost of maintenance using an external service provider (by contract or by non-contract labor and materials).
• Maintained by the CE program means that CE is responsible for maintenance (inspection, scheduled maintenance, and repair) using CE staff.

Metric 35. Acquisition Cost - Imaging and Therapeutic Radiology (Section: Devices; Page: Acquisition Cost of Devices; Source: Input)
The total acquisition cost of all imaging equipment and therapeutic radiology equipment managed or maintained by CE. Examples are x-ray equipment, ultrasound equipment, CT systems, nuclear medicine systems, MR systems, linear accelerators, and other radiation treatment equipment.

Metric 36. Acquisition Cost - Laboratory Equipment (Section: Devices; Page: Acquisition Cost of Devices; Source: Input)
The total acquisition cost of all clinical laboratory equipment managed or maintained by CE. This category includes devices used for the preparation, storage, or analysis of patient specimens and pharmaceuticals.

Metric 37. Acquisition Cost - General Biomedical Equipment (Section: Devices; Page: Acquisition Cost of Devices; Source: Input)
The total acquisition cost of all the general biomedical equipment (devices not included in the two categories above) managed or maintained by CE. These are devices that provide monitoring, diagnosis, treatment, or life support. Examples: physiological monitors, infusion pumps, dialysis equipment, surgical equipment, scales, clinic equipment, ventilators, etc.

Metric 38. Acquisition Cost - Other (Section: Devices; Page: Acquisition Cost of Devices; Source: Input)
The total acquisition cost of all other devices (patient care and non-patient care devices not included in the three categories above) managed or maintained by CE. Examples: beds, stretchers, wheelchairs, nurse call systems, patient entertainment systems, general purpose computers, communications equipment, TVs, etc.

Metric 40. Average Cost per Device (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
Total Acquisition Cost divided by Total Devices.

Metric 41. Percent Imaging and Therapeutic Radiology Equipment Acquisition Cost (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
The percentage of Total Acquisition Cost represented by imaging and therapeutic radiology equipment.

Metric 42. Percent Laboratory Equipment Acquisition Cost (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
The percentage of Total Acquisition Cost represented by laboratory equipment.

Metric 109. Total Acquisition Cost per Bed (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
Total Acquisition Cost divided by HCO Beds.

Metric 110. Total Number of Devices per Bed (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
Total Devices divided by HCO Beds.

Metric 111. Total Acquisition Cost per Maintenance Technician (Section: Devices; Page: Acquisition Cost of Devices; Source: Calculated)
Total Acquisition Cost divided by Number of Maintenance Personnel.

Metric 43. Total FTEs (Section: Staffing; Page: Professional Titles; Source: Input)
Total number of FTEs in the CE program.

Metric 44. Number of Clinical Engineers (Section: Staffing; Page: Professional Titles; Source: Input)
Total number of clinical engineers (CEs) in the CE program. Include only those personnel with actual engineering credentials. Examples: Professional Engineer (PE) license, BS or higher degree in engineering (not engineering technology), Certified Clinical Engineer (CCE) credential.

Metric 45. Number of BMETs (Section: Staffing; Page: Professional Titles; Source: Input)
Total number of biomedical equipment technicians (BMETs) in the CE program (FTEs).

Metric 46. Number of Other Personnel (Section: Staffing; Page: Professional Titles; Source: Input)
Total number of other personnel (not CEs or BMETs) in the CE program (FTEs). Examples: clerical staff, contracts administrator, etc.

Metric 48. FTEs - CE Percent (Section: Staffing; Page: Professional Titles; Source: Calculated)
Percentage of Total FTEs who are Clinical Engineers.

Metric 49. FTEs - BMET Percent (Section: Staffing; Page: Professional Titles; Source: Calculated)
Percentage of Total FTEs who are BMETs.

Metric 50. Number of Managers and Supervisors (Section: Staffing; Page: Professional Roles; Source: Input)
Number of FTEs in the CE department that are primarily involved in management and supervision. Allocate fractional FTEs for working supervisors (e.g., a full-time employee who does supervisory work 40% of the time and maintenance work 60% of the time would have 0.4 FTE here and 0.6 FTE in the next question).

Metric 51. Number of Maintenance Personnel (Section: Staffing; Page: Professional Roles; Source: Input)
The number of FTEs in the CE department that are primarily involved in maintenance (inspection, scheduled maintenance, repair) and related activities.

Metric 52. Number of Personnel in Other Roles (Section: Staffing; Page: Professional Roles; Source: Input)
The number of FTEs in the CE department that are primarily involved in other functions (not management/supervision and not maintenance).

Metric 54. Percent Management Staff (Section: Staffing; Page: Professional Roles; Source: Calculated)
Percentage of CE staff in management or supervision.

Metric 55. Percent Maintenance Staff (Section: Staffing; Page: Professional Roles; Source: Calculated)
Percentage of CE staff in maintenance activities.

Metric 57. Span of Control (Section: Staffing; Page: Professional Roles; Source: Calculated)
Number of employees per supervisor.

Metric 58. Certification Support (Section: Staffing; Page: Professional Qualifications; Source: Input)
Select all of the certification-related policies that have been adopted by the CE program.
• Reimbursement of the application or testing fee
• Reimbursement for study materials
• Reimbursement for review classes
• Reimbursement for travel costs of testing
• Formal recognition of certification achievements
• Promotion or increase in pay for certification
• None of the above

Metric 59. Certification - CCE (Section: Staffing; Page: Professional Qualifications; Source: Input)
Number of CE program staff who hold the Certified in Clinical Engineering (CCE) credential.

Metric 60. Certification - CBET (Section: Staffing; Page: Professional Qualifications; Source: Input)
Number of CE program staff who hold one or more BMET certification (CBET, CLES, CRES) credentials.

Metric 61. Certification - IT (Section: Staffing; Page: Professional Qualifications; Source: Input)
Number of CE program staff who hold one or more IT certification credentials. Examples: Cisco and Microsoft networking and computer certifications, A+, Network+, etc.

Metric 62. Certification - CCE Percent (Section: Staffing; Page: Professional Qualifications; Source: Calculated)
Percentage of Total FTEs with CE certification credential.

Metric 63. Certification - CBET Percent (Section: Staffing; Page: Professional Qualifications; Source: Calculated)
Percentage of Total FTEs with BMET certification credentials.

Metric 64. Certification - IT Percent (Section: Staffing; Page: Professional Qualifications; Source: Calculated)
Percentage of Total FTEs with IT certification credentials.


66. CE Manager Qualifications (Section: Staffing | Page: Professional Qualifications | Source: Input)
Check all that apply:
• At least 5 years of experience in CE program management or supervision
• At least 10 years of experience in the CE field
• BS or higher degree in engineering (not engineering technology)
• Certification (CCE, CBET, CLES, CRES)
• None of the above

67. CE Payroll Expenses (Section: Expenses | Page: Staff Expenses | Source: Input)
Total annual payroll cost (including overtime, on-call, and paid time off) for all FTEs in the CE program.

68. Benefits Expenses (Section: Expenses | Page: Staff Expenses | Source: Input)
Total annual cost of benefits for all FTEs in the CE program. If benefits are not included in the CE program budget, contact Human Resources for an estimate (e.g., 25% of salary).

70. Maintenance Contracts Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of maintenance contracts for all equipment managed or maintained by the CE program.

71. Vendor Maintenance Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of vendor maintenance (fee-for-service, non-contract labor and materials) for equipment managed or maintained by the CE program.

73. Non-stock Parts Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of parts not stocked by the CE program, plus parts purchased to replace stock. These are parts used by external service providers: either parts the providers supply (and bill to the CE program) or parts they draw from CE program stock (which the CE program must replenish).

72. Technical Supplies Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of parts and technical supplies stocked by the CE program. These are parts and technical supplies used by in-house staff in their maintenance and repair work. Examples: low-cost parts kept in stock, miscellaneous electronic components, miscellaneous hardware, wire, batteries, and other commonly used materials.

74. Test Equipment Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of purchasing, calibrating, and repairing test equipment.

75. Training Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of technical training and professional education. Includes service schools, conferences, travel-related expenses, continuing education, online teleconference fees, and related training and education expenses.

76. Other CE Department Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Input)
Total annual cost of all other CE program expenses. Include phone, computer, and telecommunication-related expenses, reference materials, and similar expenses. Do not include building rental, electricity, or heating and cooling expenses.

69. Total Staff Expenses (Section: Expenses | Page: Staff Expenses | Source: Calculated)
CE Payroll Expenses plus Benefits Expenses.

77. Total Non-staff Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Calculated)
The total of all non-staff expenses entered above.

78. Total CE Expenses (Section: Expenses | Page: Non-staff Expenses | Source: Calculated)
Total Staff Expenses plus Total Non-staff Expenses.
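As a quick sanity check on the expense roll-ups, the Python sketch below shows how metrics 69, 77, and 78 combine the inputs above. The dollar figures and variable names are invented for illustration.

    # Illustrative annual expense inputs (metrics 67-76); all figures invented
    payroll = 900_000          # Metric 67: CE Payroll Expenses
    benefits = 225_000         # Metric 68: Benefits Expenses (e.g., 25% of salary)
    contracts = 300_000        # Metric 70: Maintenance Contracts Expenses
    vendor = 120_000           # Metric 71: Vendor Maintenance Expenses
    non_stock_parts = 80_000   # Metric 73: Non-stock Parts Expenses
    supplies = 60_000          # Metric 72: Technical Supplies Expenses
    test_equipment = 25_000    # Metric 74: Test Equipment Expenses
    training = 30_000          # Metric 75: Training Expenses
    other = 15_000             # Metric 76: Other CE Department Expenses

    total_staff = payroll + benefits                      # Metric 69: 1,125,000
    total_non_staff = (contracts + vendor + non_stock_parts
                       + supplies + test_equipment
                       + training + other)                # Metric 77: 630,000
    total_ce_expenses = total_staff + total_non_staff     # Metric 78: 1,755,000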


81. In-house Maintenance Costs (Section: Operations | Page: Maintenance/Repair | Source: Calculated)
Total annual cost of maintenance provided by CE program personnel: Total Staff Expenses plus Total Non-staff Expenses, minus Non-stock Parts Expenses, Maintenance Contracts Expenses, and Vendor Maintenance Expenses (i.e., Total CE Expenses less External Maintenance Costs).

82. External Maintenance Costs (Section: Operations | Page: Maintenance/Repair | Source: Calculated)
Total annual cost of maintenance provided by external sources: Non-stock Parts Expenses plus Maintenance Contracts Expenses plus Vendor Maintenance Expenses.

83. External Maintenance Cost to Total Maintenance Cost Ratio (Section: Operations | Page: Maintenance/Repair | Source: Calculated)
External Maintenance Costs divided by Total CE Expenses, expressed as a percentage.
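Carrying forward the illustrative figures from the expense sketch above, the in-house/external split works out as follows. Note that the guide defines metric 83 against Total CE Expenses, so the sketch follows that definition.

    # Illustrative annual figures (same invented values as the expense sketch)
    total_ce_expenses = 1_755_000   # Metric 78: Total CE Expenses
    contracts = 300_000             # Metric 70
    vendor = 120_000                # Metric 71
    non_stock_parts = 80_000        # Metric 73

    external = contracts + vendor + non_stock_parts       # Metric 82: 500,000
    in_house = total_ce_expenses - external               # Metric 81: 1,255,000
    external_ratio = 100 * external / total_ce_expenses   # Metric 83: ~28.5%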

79. Travel Time (Section: Operations | Page: Maintenance/Repair | Source: Input)
Total hours that CE program personnel spent traveling to off-campus work locations (e.g., clinics, outpatient surgery centers). "Off-campus" means beyond walking distance (greater than one mile), requiring travel by car, plane, etc.

80. In-house Maintenance Hours (Section: Operations | Page: Maintenance/Repair | Source: Input)
Total annual hours expended by CE program staff on equipment maintenance (inspection, scheduled maintenance, and repair). Use actual data recorded in the CMMS.

65. Training Received Hours (Section: Staffing | Page: Professional Qualifications | Source: Input)
Total paid hours of training received by CE program staff.

87. Product Alerts/Recalls Hours (Section: Operations | Page: Technology Management | Source: Input)
Number of hours CE program staff spend handling product alerts and recalls.

84. Incident Investigation Hours (Section: Operations | Page: Technology Management | Source: Input)
Number of hours CE program staff spend on incident investigation. Include all hours spent on incident investigations and follow-up, including Failure Modes and Effects Analysis (FMEA) and Root Cause Analysis (RCA).

85. Strategic Planning and Pre-Procurement Technology Evaluation Hours (Section: Operations | Page: Technology Management | Source: Input)
Number of hours CE program staff spend on strategic planning and pre-purchase/pre-procurement evaluations. Include strategic planning, market research, technology assessment, replacement planning, evaluation, specification and RFP development, and bid evaluation (e.g., for projects, include time spent up until vendor selection and contract award).

86. Technology Implementation Hours (Section: Operations | Page: Technology Management | Source: Input)
Number of hours CE program staff spend on technology implementation. Include pre-implementation planning, implementation planning, clinician/facility/vendor/IT coordination, and installation. Do not include time counted in the pre-procurement question immediately above, or time for incoming inspection of equipment.


88. Teaching Hours (Section: Operations | Page: Technology Management | Source: Input)
Number of hours CE program staff spend teaching other staff. Include time for developing training materials as well as actual training time. Examples: clinical staff in-services, training provided to other staff (including CE program staff, clinicians, and any other HCO personnel), and one-on-one teaching.

89. Hours for Other Projects and Tasks Not Included Above (Section: Operations | Page: Technology Management | Source: Input)
Hours for other projects and tasks not included above. Example: medical device research and development. Do not include maintenance activities (inspection, scheduled maintenance, and repair) here.

90. Total Technology Management Hours (Section: Operations | Page: Technology Management | Source: Calculated)
Total number of hours spent by CE program staff on these professional activities.
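A minimal sketch of this roll-up, assuming metric 90 sums the technology management hour inputs above (metrics 84 through 89); the hour values are invented.

    # Illustrative annual hours (metrics 84-89); all figures invented
    incident_investigation = 120   # Metric 84
    strategic_planning = 400       # Metric 85
    implementation = 600           # Metric 86
    alerts_recalls = 150           # Metric 87
    teaching = 200                 # Metric 88
    other_projects = 80            # Metric 89

    # Metric 90 (assumed composition): total technology management hours
    total_tech_mgmt_hours = (incident_investigation + strategic_planning
                             + implementation + alerts_recalls
                             + teaching + other_projects)   # 1,550 hours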

91. Special Tasks: IT-related Hours (Section: Operations | Page: Technology Management | Source: Input)
Of the total immediately above, how many hours were related to IT/medical device interfacing, wireless medical device networking, and other IT-related activities? Include all medical device/IT interface projects, IT security, IT networking, software patching, spectrum management, and wireless projects and issues.

92. Replacement Planning Factors (Section: Operations | Page: Policies/Procedures | Source: Input)
Which of these factors does the CE program use in making equipment replacement recommendations? (Check all that apply.)
• Age compared to expected life
• Current condition of the device/system
• Repair history
• Support capability
• Patient safety
• Standardization
• Obsolescence and changes in existing technology
• None of the above

93. External Repair Documentation (Section: Operations | Page: Policies/Procedures | Source: Input)
Are all service activities (inspection, scheduled maintenance, and repair) performed by external service providers documented in the CMMS? (Check only one.)
• Yes, both the service information and at least 95% of cost data
• All service information, but missing some cost data
• All service information, but missing all cost data
• Not all of either service information or cost data
• Vendor service information is not documented in the CMMS

94. Operational Performance Monitoring (Section: Operations | Page: Policies/Procedures | Source: Input)
The CE program continuously monitors its operational performance in which of these areas? (Check all that apply.)
• Response time for equipment service requests
• Effectiveness of equipment repair activities
• Medical equipment downtime
• Effectiveness of inspection and preventive maintenance activities (e.g., PM yield)
• Adherence to the inspection and preventive maintenance schedule (e.g., PM completion rate)
• Compliance of vendors with service contract provisions
• None of the above


95. IT-Related Planning (Section: Operations | Page: Policies/Procedures | Source: Input)
The CE program is actively involved in planning which of these IT-related activities? (Check all that apply.)
• Remote monitoring of medical device network performance
• Management of backup and system administrative functions in clinical equipment networks
• Integration of medical devices and IT systems (e.g., EMR)
• Computer and network security
• Wireless spectrum management
• Clinical alarm and alarm secondary communication management
• Network design for networked clinical systems
• None of the above

96. Scheduled Maintenance Outcomes (Section: Operations | Page: Policies/Procedures | Source: Input)
What percentage of scheduled inspection and maintenance work orders identify a need for corrective maintenance? This is sometimes called "PM yield." Note that ten percent, for example, should be entered as "10" rather than "0.10" for this item.

97. NPF Outcomes (Section: Operations | Page: Policies/Procedures | Source: Input)
What percentage of work orders are identified as "No Problem Found," "Could Not Duplicate," "Use Error," "Abuse," or a similar designation? Note that ten percent, for example, should be entered as "10" rather than "0.10" for this item.
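To illustrate how these two percentages can be derived from CMMS counts, here is a minimal sketch; the counts are invented, and the exact queries will depend on your CMMS.

    # Illustrative CMMS work-order counts for one year; all figures invented
    pm_work_orders = 4_000      # completed scheduled inspection/PM work orders
    pm_found_corrective = 360   # PMs that identified a need for corrective maintenance
    all_work_orders = 10_000    # all work orders closed during the year
    npf_work_orders = 850       # closed as "No Problem Found," "Use Error," etc.

    pm_yield = 100 * pm_found_corrective / pm_work_orders   # Metric 96: 9.0 (enter "9")
    npf_rate = 100 * npf_work_orders / all_work_orders      # Metric 97: 8.5 (enter "8.5")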

98. Shop Space per Maintenance FTE (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
CE Shop Space divided by Total Maintenance Personnel.

99. Devices per Technician (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
Total Devices divided by Total Maintenance Personnel.

100. In-house Maintenance Hours to Paid Hours Ratio (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
In-house Maintenance Hours divided by paid hours (Maintenance FTEs × 2,080 hours), expressed as a percentage. This can be used as an approximate measure of department productivity.

101. Hourly Cost of In-house Maintenance (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
The cost per hour for in-house staff to provide services, calculated by dividing In-house Maintenance Costs by In-house Maintenance Hours.
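These four metrics are simple ratios; the sketch below works them through with invented figures. CE Shop Space and Total Devices are inputs collected earlier in the survey, and the cost and hour values carry over the assumptions of the earlier expense sketches.

    # Illustrative inputs; all figures invented
    shop_space_sqft = 1_800     # CE Shop Space (square feet)
    maintenance_ftes = 10.0     # Total Maintenance Personnel (metric 51)
    total_devices = 9_500       # Total Devices in the managed inventory
    in_house_hours = 12_500     # Metric 80: In-house Maintenance Hours
    in_house_costs = 1_255_000  # Metric 81: In-house Maintenance Costs

    space_per_fte = shop_space_sqft / maintenance_ftes     # Metric 98: 180 sq ft per FTE
    devices_per_tech = total_devices / maintenance_ftes    # Metric 99: 950 devices
    productivity = 100 * in_house_hours / (maintenance_ftes * 2080)  # Metric 100: ~60.1%
    hourly_cost = in_house_costs / in_house_hours          # Metric 101: ~$100 per hour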

102. Technology Management % (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
The percentage of CE program work spent on technology management activities (as distinct from maintenance and repair activities).
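The guide does not spell out the denominator for this percentage. One reasonable reading, sketched below, treats it as technology management hours as a share of combined technology management and in-house maintenance hours; confirm the definition in the benchmarking tool before relying on it.

    # Illustrative hours, carried over from the sketches above; assumed formula
    tech_mgmt_hours = 1_550     # Metric 90: Total Technology Management Hours
    maintenance_hours = 12_500  # Metric 80: In-house Maintenance Hours

    # Metric 102 (assumed): tech management share of total recorded work
    tech_mgmt_percent = 100 * tech_mgmt_hours / (tech_mgmt_hours + maintenance_hours)  # ~11.0%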

103. Total CE Program Expense Relative to Total HCO Expense (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
The cost to run the CE program as a percentage of the HCO's operating cost, calculated by dividing Total CE Expenses by HCO Expenses and expressing the result as a percentage.


104. Total CE Program Expense per Adjusted Discharge (CMI Weighted) (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
The cost to run the CE program per CMI-weighted Adjusted Discharge, calculated by dividing Total CE Expenses by HCO Adjusted Discharges.

105. Cost of Service Ratio (COSR) (Section: Operations | Page: Calculated Performance Metrics | Source: Calculated)
The ratio of total maintenance cost to total acquisition cost, calculated by dividing Total CE Expenses by Total Acquisition Cost and expressing the result as a percentage.
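To close out the calculated metrics, here is a sketch of the three organization-level ratios. HCO Expenses, HCO Adjusted Discharges, and Total Acquisition Cost are entered earlier in the survey; the figures below are invented for illustration.

    # Illustrative organization-level inputs; all figures invented
    total_ce_expenses = 1_755_000        # Metric 78 (from the expense sketch above)
    hco_expenses = 400_000_000           # HCO annual operating expenses
    adjusted_discharges = 30_000         # HCO Adjusted Discharges (CMI weighted)
    total_acquisition_cost = 45_000_000  # acquisition cost of the managed inventory

    ce_share_of_hco = 100 * total_ce_expenses / hco_expenses      # Metric 103: ~0.44%
    cost_per_discharge = total_ce_expenses / adjusted_discharges  # Metric 104: $58.50
    cosr = 100 * total_ce_expenses / total_acquisition_cost       # Metric 105: ~3.9%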