
Radiography Activity for 2020

Activity No: A24 (20) 2020

Topic: Quality improvement

Article: Practical Approaches to Quality Improvement for Radiologists

Speciality: Diagnostic / US / Mammography

Approved for THREE (3) Clinical Continuing Education Units (CEUs)


Note: This copy is for your personal non-commercial use only. To order presentation-ready copies for distribution to your colleagues or clients, contact us at www.rsna.org/rsnarights.


Practical Approaches to Quality Improvement for Radiologists

Continuous quality improvement is a fundamental attribute of high-performing health care systems. Quality improvement is an essential component of health care, with the current emphasis on adding value. It is also a regulatory requirement, with reimbursements increasingly being linked to practice performance metrics. Practice quality improvement efforts must be demonstrated for credentialing purposes and for certification of radiologists in practice. Continuous quality improvement must occur for radiologists to remain competitive in an increasingly diverse health care market. This review provides an introduction to the main approaches available to undertake practice quality improvement, which will be useful for busy radiologists. Quality improvement plays multiple roles in radiology services, including ensuring and improving patient safety, providing a framework for implementing and improving processes to increase efficiency and reduce waste, analyzing and depicting performance data, monitoring performance and implementing change, enabling personnel assessment and development through continued education, and optimizing customer service and patient outcomes. The quality improvement approaches and underlying principles overlap, which is not surprising given that they all align with good patient care. The application of these principles to radiology practices not only benefits patients but also enhances practice performance through promotion of teamwork and achievement of goals.

©RSNA, 2015 • radiographics.rsna.org

Aine Marie Kelly, MD, MS, MA Paul Cronin, MD, MS

Abbreviations: ABR = American Board of Radiology, ACR = American College of Radiology, FMEA = failure mode and effect analysis, PDSA = plan, do, study, act

RadioGraphics 2015; 35:1630–1642

Published online 10.1148/rg.2015150057

From the Department of Radiology, Division of Cardiothoracic Radiology, University of Michigan Medical Center, 1500 E Medical Center Dr, Ann Arbor, MI 48109. Received March 9, 2015; revision requested April 27 and received May 27; accepted June 2. For this journal-based SA-CME activity, the authors, editor, and reviewers have disclosed no relevant relationships. Address correspondence to A.M.K. (e-mail: [email protected]).

©RSNA, 2015

SA-CME LEARNING OBJECTIVES

After completing this journal-based SA-CME activity, participants will be able to:

■ Discuss the rationale behind and the need for departmental and institutional quality improvement programs.

■ Describe the most commonly used approaches to quality improvement that can be applied in radiology departments.

■ List the steps involved in designing, implementing, and continuing a departmental or institutional quality improvement program.

See www.rsna.org/education/search/RG.

Introduction

The current health care quality crisis is not new, with quality lapses reported over at least 100 years. The Institute of Medicine's report "To Err is Human" (1) highlighted nearly 100,000 deaths annually in the United States as a result of medical error. In a subsequent report, "Crossing the Quality Chasm," the Institute of Medicine proposed closing the performance gap by promoting efficiency, effectiveness, patient safety, patient centeredness, timeliness, and equity (2). The report also suggested 10 rules for redesign, as follows: care based on continuous healing relationships, care customized according to patient needs and values, care controlled by the patient, knowledge sharing and free flow of information, evidence-based decision making, safety as a system property, transparency as a necessity, anticipation of needs, continual decrease in waste, and cooperation among clinicians as a priority. Many of these aims and rules for patient care are reflected in quality improvement performance indicators and programs (3).


RG • Volume 35 Number 6 Kelly and Cronin 1631

Health care reforms (and the great recession) have altered insurance plans, with a switch from fee per procedure to capitated payments for hospital services, preauthorization, reduced technical and professional fees, bundling, and less coverage for new services—all of which affect diagnostic imaging services. Accountable care organizations and shared savings programs will encourage health care systems to be more competitive. The Centers for Medicare and Medicaid Services will begin applying payment adjustments to eligible physicians who do not satisfactorily report data on quality measures for covered professional services later this year (4). All of these factors and public reporting of health performance data will result in a greater incentive to focus on quality, rather than quantity (5,6).

The Joint Commission requires that active managed quality programs comply with national patient safety goals, and American College of Radiology (ACR) accreditation requires that a peer review process be in place (7,8). For trainees, the Accreditation Council for Graduate Medical Education requires that residents be able to apply principles of quality improvement to processes (9). For practicing radiologists, practice quality improvement is part IV of the American Board of Radiology (ABR) requirement for maintenance of certification (10). A business case can also be made for quality, and this is highlighted by successful efforts of companies like General Electric, which uses quality improvement specialists and earns large returns on investments (11).

Quality improvement efforts in health care delivery are essential for ensuring and improving patient safety, implementing and continuously improving processes, analyzing and depicting data, monitoring performance and implementing change, enabling professional staff assessment and development through continued education, and optimizing customer service and patient outcomes (3).

Basic Definitions

Quality control is carried out retrospectively to establish when measurements fall outside ranges of acceptability; when this occurs, action is taken (12,13). Quality control in radiology involves technical testing of imaging equipment and evaluation of imaging quality to ensure that they conform to standards. This includes evaluation of shielding around radiography facilities, measurement of processing parameters (eg, developer temperature, pH level, film base and fog, speed, and contrast), and assessment of radiation dose (eg, kilovolt peak, milliampere-seconds, imager output, and effective dose) and imaging parameters (image contrast, resolution, artifacts, appropriate labeling, and consistency).

Quality assurance is more comprehensive and aims to ensure excellent (better than acceptable) standards by means of systematic collection and evaluation of data. Although quality assurance involves quality control, it focuses on specific performance indicators to ensure that more services are of high quality and fewer fall below the acceptable range. Performance indicators in radiology include safety, processes or procedures, and professional or patient outcomes or satisfaction. Examples of indicators are access to services, appropriateness of utilization, patient selection parameters, timeliness of scheduling, the scheduling process, prescreening of patients, selection of imaging modality and protocol, waiting times, waiting room amenities, technical effectiveness and efficiency, patient safety, image interpretation, availability of previously obtained images, repeat rate, correlation with pathologic findings, missed diagnosis rate, timeliness of reporting, finalization of reports, reporting of critical findings, and patient or professional in-service education. By identifying the performance indicators that need the most improvement, quality assurance helps radiology practices make decisions about their clinical practice and operational functions.

TEACHING POINTS

■ Quality improvement efforts in health care delivery are essential for ensuring and improving patient safety, implementing and continuously improving processes, analyzing and depicting data, monitoring performance and implementing change, enabling professional staff assessment and development through continued education, and optimizing customer service and patient outcomes.

■ In radiology, the focus of quality improvement is to improve the performance and effectiveness of diagnostic and therapeutic procedure processes, the appropriateness of imaging and procedures, quality and safety, and the management of all imaging services.

■ With total quality management or continuous quality improvement, a uniform commitment to quality across the whole organization is required to focus on establishing collaboration among various departments within the organization to continuously improve processes associated with providing services that meet or exceed expectations.

■ Kaizen (and lean) tools add an additional "human element" to the traditional PDSA cycles in that potential stakeholders must agree on what adds value and what constitutes waste before implementing any changes. Kaizen events are well suited to addressing focused problems in single work areas that can be completed quickly and for which rapid results can be apparent. Selection of the initial kaizen project should focus on a conspicuous problem that most of the team find annoying.

■ To improve overall efficiency and produce sustained improvement, a gradual, continuous, and comprehensive "lean transformation" of work philosophy and workplace culture should take place by applying lean methods to all processes and areas within a department.


1632 October Special Issue 2015 radiographics.rsna.org

Quality improvement and continuous quality improvement focus on proactively improving and continually enhancing the quality of care and services by combining professional knowledge with knowledge about making improvements (14,15). Their philosophy is focused on continuous improvement of processes associated with services that meet or exceed the expectations of the patient or referring clinician. For quality improvement and continuous quality improvement, clear standards should be identified for every activity or process in an imaging facility. These standards should be measurable to allow processes to be continually improved. In radiology, quality improvement and continuous quality improvement also include highlighting errors and improving them after they occur and analyzing, understanding, and improving work processes. In radiology, the focus of quality improvement is to improve the performance and effectiveness of diagnostic and therapeutic procedure processes, the appropriateness of imaging and procedures, quality and safety, and the management of all imaging services (16).

Quality management and total quality management are other terms used to describe organizational processes that use strategies that focus on maintaining existing quality standards and making incremental improvements and are sometimes used interchangeably with continuous quality improvement. With total quality management or continuous quality improvement, a uniform commitment to quality across the whole organization is required to focus on establishing collaboration among various departments within the organization to continuously improve processes associated with providing services that meet or exceed expectations. In today's competitive economic environment, quality management processes are commonplace in hospitals and are proving to be successful for improving quality while controlling costs.

Quality Metrics

To assess for opportunities to improve quality, one must be able to measure performance or effectiveness before and after the quality improvement initiative. Many measures or metrics can be used, including safety, process performance and/or efficiency, patient and professional outcomes, and satisfaction (service). Safety is the foundation of any quality program, and safety metrics most relevant to a radiology department include radiology-generated infections, medication error rates, patient falls, contrast material–induced nephropathy, critical test result reporting, specimen labeling errors, hand hygiene, medication reconciliation, and correct image labeling (11).

Key process metrics that can be readily measured include access times, wait times, standardized protocols, and report finalization times (11). Professional and patient outcome metrics include peer review, chart review of reports compared with standards of reference, and procedural outcomes (success rate, complication rate, radiation dose, and procedural time). Metrics of particular importance to an organization are known as key performance indicators, and selection of these indicators will depend on the department's or institution's priorities for safety and quality (17). Another way to depict radiology quality metrics involves mapping a patient's journey through the radiology department with a quality value map (Fig 1), which reveals many opportunities for quality improvement in radiology (18).

Approaches and Graphical Tools Used to Identify and Depict Quality Issues

With any quality improvement effort, one should be familiar with and correctly use appropriate graphical tools and approaches because one can manage only that which one can accurately measure (16).

To determine what processes need improvement, all safety breaches or quality issues must be identified. These are then prioritized for quality improvement. There are many ways to gather and depict information about improvement opportunities, including surveys, safety and environment of care walkabouts, flowcharts or process maps for identifying bottlenecks, peer review and error reporting, using chart review to monitor compliance with national patient safety goals, brainstorming sessions, and strengths-weaknesses-opportunities-threats analysis (16).

All potential stakeholders (eg, patients and employees) should feel empowered to raise suggestions because this generates more ideas and ensures greater participation ("buy in") in the quality improvement effort. Any quality improvement initiative should align with departmental or institutional goals and have full leadership support. One must also engage the leadership of other departments that may be affected by the changes. Quality improvement efforts are not finite, and quality improvement is often an iterative, repetitive process, with small tweaks made along the way to ensure constant improvement (3,19).

There are many graphical tools available for depicting data regarding safety, process efficiency, and outcome measures (professional and patient). These tools include flowcharts, cause-and-effect (Ishikawa or fishbone) diagrams, Pareto charts, check sheets (tally sheets),


control charts (Shewhart or statistical process charts), histograms, and scatter diagrams. These are known collectively as the seven basic tools of quality (12). Scorecards and dashboards placed in a prominent location can be used to display gathered data and monitor and track results in an organization (11,20,21).
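Several of these tools reduce to simple tallies. As an illustrative sketch (not part of the article), the calculation behind a Pareto chart — sorting issue categories by frequency and accumulating percentages to find the "vital few" — can be done in a few lines of Python. The incident categories in the log below are hypothetical examples.

```python
from collections import Counter

def pareto_table(events):
    """Tally categories, sort by frequency, and add cumulative percentages."""
    counts = Counter(events).most_common()  # sorted, most frequent first
    total = sum(n for _, n in counts)
    rows, running = [], 0
    for category, n in counts:
        running += n
        rows.append((category, n, round(100 * running / total, 1)))
    return rows

# Hypothetical entries from an incident-reporting log
log = (["mislabeled image"] * 9 + ["scheduling error"] * 6 +
       ["illegible request"] * 3 + ["contrast reaction"] * 2)

for category, n, cum_pct in pareto_table(log):
    print(f"{category:<20} {n:>3} {cum_pct:>6}%")
```

The first row alone accounts for 45% of incidents, which is exactly the prioritization signal a Pareto chart is meant to surface.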

Commonly Used Approaches to Quality Improvement

Root Cause Analysis

Errors, sentinel events, and near misses occur every day in hospitals and departments and can present opportunities for improvement. The Joint Commission requires that accredited hospitals identify and respond appropriately to all sentinel or major adverse events (22). Identification and elimination of causes can establish circumstances that will decrease the chances of recurrence (23). Reactive identification of causes is the first step to solving problems, and this is the basis of root cause analysis—a retrospective process. After an error occurs, root cause analysis can be performed to investigate what, how, and why it occurred and help determine actions to minimize or prevent its recurrence (24). Root cause analysis should be performed promptly after an event occurs, by a team of four to 10 people with various roles and backgrounds in the department. A useful graphical tool is the fishbone diagram, and a useful method to get from the obvious (proximate) cause to the root cause is to use the "five whys" (25). For each apparent cause, the team asks, "Why did this happen?" Once the first answer is determined, the question is repeated at least four more times. For example, if a patient undergoes the wrong radiologic procedure, the first question asked is "Why did the patient undergo the wrong procedure?" The answer might be because the procedure was entered incorrectly by the scheduler. The second question is "Why did the scheduler enter the procedure incorrectly?" The answer might be because it was prescribed incorrectly. The third question becomes "Why was the incorrect examination prescribed?" The answer might be that the radiologist could not read the handwriting on the request form. The fourth question is "Why was the form filled out in illegible handwriting?" The answer may be that there is no electronic form (and because, in many cases, the doctor's handwriting is not legible). The final question is "Why don't we have an electronic request form for radiology procedures with a drop-down menu of options and alerts in cases of allergies, other contraindications, and appropriateness?"
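The drill-down above is just a chain of question-answer pairs, each answer prompting the next question. The sketch below (a hypothetical illustration, not from the article) encodes the wrong-procedure example that way and treats the last answer in the chain as the root cause.

```python
# "Five whys" chain from the wrong-procedure example, as (question, answer)
# pairs. In a real root cause analysis the team captures these live.
five_whys = [
    ("Why did the patient undergo the wrong procedure?",
     "The procedure was entered incorrectly by the scheduler."),
    ("Why did the scheduler enter the procedure incorrectly?",
     "It was prescribed incorrectly."),
    ("Why was the incorrect examination prescribed?",
     "The radiologist could not read the handwriting on the request form."),
    ("Why was the form filled out in illegible handwriting?",
     "There is no electronic request form."),
    ("Why is there no electronic request form with drop-down options and alerts?",
     "Root cause: an electronic ordering system has not been implemented."),
]

def root_cause(chain):
    """The answer to the final 'why' is treated as the root cause."""
    return chain[-1][1]

print(root_cause(five_whys))
```

Storing the chain explicitly makes the reasoning auditable: anyone reviewing the analysis can challenge a specific link rather than the conclusion alone.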

The Joint Commission suggests 11 steps for performing root cause analysis for sentinel events, as follows: (a) organize a team, (b) define what happened, (c) identify and define processes related to the event, (d) identify proximate (closest to the error) causes, (e) design and implement any necessary "quick fix" interim changes, (f) identify root causes, (g) identify potential risk-reduction strategies, (h) formulate and evaluate proposed improvement actions and identify measures of success, (i) develop and implement an improvement action plan, (j) evaluate and fine-tune improvement efforts, and (k) communicate the results (23,26). A cause-and-effect (fishbone) diagram for root cause analysis of why an incorrect imaging test was performed is illustrated in Figure 2.

Failure Mode and Effect Analysis

With failure mode and effect analysis (FMEA), a prospective systematic approach is used to identify and understand causes, contributing factors, and effects of potential failures on a process, system, or practice. Proactive identification of potential problems (latent and active predisposing contributing factors) may prevent problems from happening and is more effective than root cause analysis (27). In addition, clinicians are more likely to accept a prospective process like FMEA because, unlike error reporting, there is no allocation of blame. Prospective analysis is a more positive approach to problems because it harnesses people's knowledge and competencies rather than highlighting human weaknesses (27). Radiology departments with equipment and many discrete processes lend themselves to FMEA.

Figure 2. Fishbone (cause-and-effect) diagram shows root cause analysis of an error involving use of the wrong imaging test.

Improvement processes like FMEA are now mandatory for hospitals that admit patients, and Joint Commission standards require annual FMEA and suggest five steps for performing it (28). The steps (and substeps) involved in FMEA depend on the complexity of the process and the severity of the consequences of failure (27,29). Fishbone diagrams are useful for depicting each specific failure mode. First, a potential problem is chosen and a team is assembled to map out and evaluate the process leading up to the problem. Each failure mode is assessed and ranked for severity, probability of occurrence, and probability of detection to generate a criticality index (risk priority number), a numeric score used to prioritize failure modes and determine whether any exceed acceptable limits. An action plan is developed, and the team follows up after the action plan is put in place to reassess risk (Fig 3). Table 1 illustrates a worked example of the FMEA process for patients referred for cardiac computed tomography (CT) who did not undergo adequate workup, resulting in missing information about contrast material allergy, contraindications, or history, which can lead to time delays. The National Center for Patient Safety at the U.S. Department of Veterans Affairs has developed the health care FMEA to proactively identify risks to patient safety and reduce errors in health care (23). There are many similarities to FMEA, but severity and probability definitions are modified within a hazard scoring matrix, and a decision tree is used in the health care FMEA (30). The health care FMEA generates a more simplified hazard score (from 1 to 16) than the traditional risk priority number. The Joint Commission and the National Center for Patient Safety have produced tool kits for FMEA team facilitators and other interested team members. These tool kits, along with the FMEA Handbook, are useful resources (28,31).
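As a rough sketch of how the three scores combine, the snippet below computes and ranks risk priority numbers for a few of the cardiac CT failure modes. The scores come from Table 1 of the article; the function name and the abbreviated failure-mode labels are my own.

```python
def risk_priority_number(detection, occurrence, severity):
    """Criticality index: the product of detection, occurrence, and severity."""
    return detection * occurrence * severity

# (failure mode, detection, occurrence, severity) — scores from Table 1
failure_modes = [
    ("Insufficient or incorrect clinical data on request", 6, 4, 5),
    ("Incorrect cardiac CT protocol prescribed",           3, 3, 5),
    ("Delay while clinical information is sought",         2, 6, 3),
    ("Contrast material allergy or contraindication",      6, 2, 6),
]

# Rank failure modes by RPN so the team addresses the riskiest first.
ranked = sorted(failure_modes,
                key=lambda fm: risk_priority_number(*fm[1:]),
                reverse=True)
for name, d, o, s in ranked:
    print(f"RPN {risk_priority_number(d, o, s):>3}  {name}")
```

The ranking, not the absolute score, is what drives the action plan: the highest-RPN mode (insufficient clinical data, RPN 120) is addressed before modes with equal severity but better detectability.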

Figure 1. Radiology quality value map. Diagram shows the patient's journey through the radiology department. (Adapted and reprinted, with permission, from references 11 and 18.)

Figure 3. Diagram illustrates the steps involved in FMEA.

Table 1: Example of FMEA Process for Patients Referred for Cardiac CT Who Did Not Undergo Adequate Workup

Process | Potential Risks | Detection Score | Occurrence Score | Severity Score | Risk Priority No. or Criticality Score*

Process 1: examination requested
Request completed and faxed to radiology | Insufficient or incorrect clinical data on request; incorrect cardiac CT requested | 6 | 4 | 5 | 120
Protocol completed | Incorrect type of cardiac CT prescribed; insufficient or incorrect clinical information available; requesting clinician's handwriting not legible | 3 | 3 | 5 | 45

Process 2: cardiac CT is performed
Acquire images | Delay in performing examination as clinical information is sought | 2 | 6 | 3 | 36
Acquire images | Wrong type of cardiac CT performed because clinical information is missing | 2 | 3 | 6 | 36
Acquire images | Examination is canceled because patient has a contraindication or allergy | 5 | 2 | 2 | 20
Administer contrast material and other drugs | Patient has contrast material allergy or contraindication | 6 | 2 | 6 | 72
Administer contrast material and other drugs | Patient has β-blocker allergy or contraindication | 6 | 2 | 6 | 72
Administer contrast material and other drugs | Patient has nitroglycerin allergy or contraindication | 6 | 2 | 6 | 72

Process 3: notification of significant or urgent findings at examination
Call clinician to convey results | Referring clinician is outside hospital system and cannot be located | 1 | 4 | 6 | 24
Call clinician to convey results | Referring clinician out of office and not contactable by any other means | 1 | 4 | 6 | 24

Note.—Inadequate workup can result in missing information about contrast material allergy, contraindications, or history, which can lead to delays.
*The risk priority number or criticality index is the product of the detection, occurrence, and severity scores.

Plan, Do, Study, Act

The "plan, do, study, act" (PDSA) (Deming) cycle involves a trial and learning approach in which a hypothesis or solution is generated and then tested on a small scale before making changes to the whole system (12,25). First, a specific process or path is identified, and work teams, including all parties that interact with the system being studied, are assembled. A system flowchart (graphical tool) is created, and observations are made about problems that exist or areas to be improved. Next, testing is initiated and a course of action to improve the process is planned. Finally, the effects of changes to the system are monitored or measured and feedback is given. Small changes can be incorporated quickly at the end of each cycle, with additional changes added for subsequent cycles (Fig 4). An example of a PDSA cycle in radiology would be to look at the appropriate use of CT for pulmonary embolism in the emergency department (Fig 5). In the first PDSA cycle, one would plan what data to collect with CT for pulmonary embolism (eg, age, sex, whether a Wells score was obtained, whether a d-dimer assay was performed, whether ultrasonography or ventilation-perfusion scanning was used) and then collect the data ("do"). The data are then studied to assess the appropriateness rate. The final step in the PDSA cycle is to act (eg, deliver an educational intervention). In the second cycle, following the educational intervention, one would carry out the same steps of the PDSA approach and determine whether the educational intervention improved appropriateness. If not, then one would plan to survey referring clinicians to assess their perceived indications for use of CT in pulmonary embolism and their perceptions of its risks and benefits. Cycle 3 would be a PDSA cycle carried out after a revised targeted educational intervention on the basis of the results of the survey.

Figure 4. Diagram illustrates the steps involved in the PDSA cycle.

Figure 5. Diagram illustrates PDSA cycles with regard to appropriateness for CT pulmonary angiography through the emergency department. PE = pulmonary embolism, VQ = ventilation-perfusion.
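The "study" step of this pulmonary embolism example can be sketched in code. The snippet below uses hypothetical records and a deliberately simplified appropriateness rule (a documented Wells score or d-dimer assay before CT counts as appropriate) to compare the rate across two cycles; real criteria would follow local or ACR guidance.

```python
def appropriateness_rate(records):
    """Fraction of CT orders preceded by a Wells score or d-dimer assay."""
    appropriate = sum(1 for r in records if r["wells_score"] or r["d_dimer"])
    return appropriate / len(records)

cycle1 = [  # before the educational intervention (hypothetical data)
    {"wells_score": True,  "d_dimer": False},
    {"wells_score": False, "d_dimer": False},
    {"wells_score": False, "d_dimer": True},
    {"wells_score": False, "d_dimer": False},
]
cycle2 = [  # after the intervention (hypothetical data)
    {"wells_score": True,  "d_dimer": True},
    {"wells_score": True,  "d_dimer": False},
    {"wells_score": False, "d_dimer": True},
    {"wells_score": False, "d_dimer": False},
]

# "Act": plan the clinician survey only if the rate did not improve.
improved = appropriateness_rate(cycle2) > appropriateness_rate(cycle1)
print(f"cycle 1: {appropriateness_rate(cycle1):.0%}, "
      f"cycle 2: {appropriateness_rate(cycle2):.0%}, improved: {improved}")
```

The same comparison would be repeated at the end of each subsequent cycle, which is what makes PDSA iterative rather than a one-off audit.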

Lean Management

Lean management, or simply "lean," derives from the Toyota Production System management and manufacturing policies designed to allow personnel and organizations to become more efficient and eliminate waste (32). Continuous incremental improvements in performance are made, with the goal of adding value to services provided while maintaining the highest possible customer satisfaction. Expenditure of resources for reasons other than creating value for the customer is regarded as wasteful and targeted for elimination. As waste is eliminated, production times and costs will decrease. Lean management, which emphasizes process analysis, is particularly relevant to radiology departments, which depend on a smooth flow of patients and uninterrupted equipment function for efficient operation (33). To improve overall efficiency and produce a sustained improvement, a gradual, continuous, and comprehensive "lean transformation" of work philosophy and workplace culture should take place by applying lean methods to all processes and areas within a department. Lean transformation differs from other approaches in that it is both a philosophy and an organizational way of life that continuously transforms staff into lean experts and keeps everyone on the path of continuous improvement (33).

The principles of lean management include having equal involvement and respect for all staff, observing and analyzing processes where they occur (by “going to the gemba,” or “real place”), eliminating all forms of waste or steps in the process that do not add value, standardizing work processes to minimize variation, improving flow of all processes in the system, using visual cues to communicate and update, adding value for the customer, and using lean (graphical) tools.

Lean management starts with equal involvement of all staff members, with the understanding that frontline workers best understand any problems and must be involved in quality improvement efforts at all stages of the process. The opinions of all staff must be respected and valued, and staff should move from compliance (doing things because they have to) to commitment (doing things because they believe it is adding value) to ensure sustainable change (33). For any lean initiative to achieve sustainable change, all stakeholders (supervisors, managers, and frontline staff members) must be engaged in dialogue and brainstorming at all stages of the process.

Visiting the workplace (ie, going to the gemba) is encouraged for senior staff to view the workflow and process environment, identify safety hazards, see the cause of complaints, and better understand inefficiencies. Workers can demonstrate processes and illustrate inefficiencies and then be allowed to make suggestions for improvements in their area.


Eliminating waste or steps that do not add value is an essential principle of lean management. The eight wastes of the lean approach are applicable to radiology and include overproduction (imaging more anatomy than needed), transportation (unnecessarily transferring patients, personnel, or equipment), inventory (stock occupies physical space that costs money to rent), motion (unnecessary movement of personnel within a work area), defects (field of view is too small), overprocessing (too many reformatted images are produced), waiting (imaging unit downtime), and skills (underutilizing capabilities of faculty) (33,34).

The standardization of work reduces variation and increases efficiency and, in radiology, involves the use of flowcharts (eg, determination of an appropriate imaging modality for patients suspected of having pulmonary embolus) and preprocedure checklists.

To identify variations in flow and order (bottlenecks) in complex clinical environments, process management tools like value stream mapping can be used to map the process from concept to product (35). Value stream maps differ from traditional flow diagrams or process maps in that they enable capture of both process and material flow and allow clear identification of waste and value-added steps. Mapping out an entire process, such as an outpatient diagnostic test encounter (from request initiation through report finalization), involves a multidisciplinary team of all involved in the process steps. Before mapping, it is important to clearly understand the customers' (patient, referring clinician) needs and expectations in relation to the product to be mapped (35). There are six steps in value stream mapping, but maps will vary in size and complexity depending on the process.

The first step in value stream mapping is to chart suppliers (personnel involved in starting the process, including referring clinicians and patients), inputs (information entered to start the process, including request forms), processes (including examination requests, prescreening, scheduling, protocoling, patient registration, patient preparation, and the imaging procedure), outputs (effects of the test, including test results and disposition), and customers (personnel affected by the information, including patients, admitting ward personnel, clinicians, and radiologists). These steps are abbreviated as "SIPOC" (Table 2) (36).
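A SIPOC chart can be represented as one simple record per row; a sketch in Python (the field values mirror the MR imaging example in Table 2, but the class name and structure are ours, not a prescribed format):

```python
from dataclasses import dataclass, asdict

@dataclass
class SipocRow:
    """One row of a SIPOC chart; field order matches the SIPOC acronym."""
    supplier: str
    input: str
    process: str
    output: str
    customer: str

# Illustrative row, paraphrasing Table 2.
row = SipocRow(
    supplier="Referring clinician",
    input="History and physical",
    process="Prescreening questionnaire",
    output="Patient disposition",
    customer="Referring clinician",
)

# Column order is fixed by the method: Supplier, Input, Process, Output, Customer.
COLUMNS = list(asdict(row).keys())
```

Keeping the rows as structured records (rather than free text) makes it easy to export the chart to a spreadsheet or review it programmatically as the map grows.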

Next, the process is observed in the workplace and then mapped in at least four to eight steps to identify queues and staging areas. Information systems (electronic or manual) should be added to the map. Data and time stamps should be identified (wait time, cycle time, number of people in the queue, delays), and sources of waste should be sought (information, process, physical, environment, and people). Finally, the map is completed by validating times against baseline data (how long steps should take, with use of benchmarks from the literature) (Fig 6) (37).
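The wait-time and cycle-time stamps collected during mapping support a quick lead-time calculation; a sketch with hypothetical step names and numbers:

```python
# Value stream timing sketch: each step has a value-added cycle time and a
# preceding wait (queue) time, both in minutes. All numbers are illustrative.
steps = [
    {"name": "request submitted", "cycle": 3,  "wait": 0},
    {"name": "protocoling",       "cycle": 2,  "wait": 45},
    {"name": "patient scheduled", "cycle": 5,  "wait": 120},
    {"name": "imaging performed", "cycle": 30, "wait": 25},
    {"name": "report finalized",  "cycle": 15, "wait": 60},
]

lead_time = sum(s["cycle"] + s["wait"] for s in steps)   # total elapsed time
value_added = sum(s["cycle"] for s in steps)             # time spent on useful work
efficiency = value_added / lead_time                     # process-cycle efficiency
bottleneck = max(steps, key=lambda s: s["wait"])["name"] # longest queue
```

In this toy example, only about 18% of the total lead time is value-added work, and the longest queue sits in front of scheduling, which is exactly the kind of bottleneck a value stream map is meant to expose.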

Other tools that can be used to improve order include the five S's (sorting, straightening, scrubbing, systematizing, and standardizing); production leveling and/or level scheduling (keeping small batches rather than bulk); "push" (making required actions mandatory) and "pull" (making required actions easy) systems; and mistake proofing (33).

Table 2: First Step in Value Stream Mapping: Charting Supplier, Input, Process, Output, and Customer

Supplier | Input | Process | Output | Customer
Patient | Patient information | Request submitted | Imaging results | Patient
Referring clinician | History and physical | Prescreening questionnaire | Patient disposition | Referring clinician
Scheduler | Schedule | Radiologist protocols | … | Radiologist
Radiologist | Information systems | Technologist reviews | … | …
Lead technologist | Available imaging time | Patient scheduled | … | …
Radiology nurses | Magnetic resonance (MR) imaging safety information | Patient registers | … | …
MR imaging unit | MR imaging request | Perform MR imaging | … | …

Figure 6. (a) Value stream map. Diagram illustrates the patient pathway through the MR imaging department and reveals that the longest wait time was for the nurse to insert the intravenous line. IV = intravenous, MiChart = electronic medical record system, PACS = picture archiving and communication system, RIS = radiology information system. (b) Value stream mapping icons.

Visual tools can be used to communicate and inform; examples include color coding of different devices to enable easier stocking and retrieval, and the use of reminder (restocking) cards inserted near the end of a batch of stock. An A3 report (named after the A3 paper size) documents the background (problem), current condition, and goals on one side and depicts the analysis, proposed solutions (countermeasures), implementation, and status reviews on the other (38). An A3 report posted in a prominent location documents the learning, decision making, and planning steps undertaken to solve problems after a root cause analysis. Displaying the A3 report is useful to communicate lean project goals, foster effective and efficient dialogue in an organization or group, and communicate progress to staff (Fig 7).

In any part of the lean process, one must not lose focus on the customers and should ensure that value is being added from their perspective. In radiology, customers include referring physicians and patients. One must solicit feedback from these groups because our perception of their values may not be accurate. Referring physicians rely heavily on imaging and often value timeliness of report finalization, whereas patients value aspects like ease and speed of procedure scheduling or comfort in the reception area.

Kaizen

Kaizen, which means "good change" or "improvement," is a lean management tool; other such tools include hoshin or balanced planning, balanced scorecards, annual operating plans, and management dashboards (33). A kaizen event (or blitz) involves the quick (eg, within a week) analysis of small, manageable components of problems, with rapid implementation of solutions and continuous real-time reassessment (19). Kaizen (and lean) tools add a human element to the traditional PDSA cycles in that potential stakeholders must agree on what adds value and what constitutes waste before any changes are implemented. Kaizen events are well suited to addressing focused problems in single work areas that can be completed quickly and for which rapid results can be apparent. Selection of the initial kaizen project should focus on a conspicuous problem that most members of the team find annoying. The problem should be defined and measured if possible. Next, the problem is analyzed, and a single waste problem is selected as the focus. A kaizen team should be assembled to include a sponsor (ideally a knowledgeable individual, such as a lean or kaizen coach or a member of the quality assurance team), a leader (possibly the person who initially identified the problem), and members (any involved or interested team members). The team can hold formal (kaizen event or blitz) or informal (for problems that everyone agrees have a simple, quick solution) meetings. Next, the process change is agreed on and solutions are implemented. Because kaizen is often a local process involving a single department, the effect of the change can be assessed or tested quickly. The steps involved in kaizen are as follows: define, measure, and analyze the problem; select a single waste step to focus on; assemble a kaizen team with a sponsor, leader, and members; hold a kaizen event to discuss strategies for addressing the waste step; implement the process change; quickly test the effect of the change; and identify the next problem and repeat the process.

Lean or kaizen efforts must be aligned with institutional goals to decrease the risk of changes being reversed. All potential stakeholders must be included, from institutional leaders and senior administration, through all management levels, to frontline workers. Stakeholders must feel continually engaged and be promoted and rewarded for reporting events, including near misses. A "just culture" of quality and safety should be promoted by building consensus and momentum. Initially, the lean or kaizen team will need help in assembling and guidance on the selection of clearly definable problems. Most health systems now have lean or kaizen coaches available to guide teams through the PDSA process. After a few cycles, team leaders will evolve to become coaches who apply these principles to optimize engagement and achieve improvements. The team should meet regularly and be committed to the PDSA cycles. The use of visual tools can help keep meetings short and focused (19). A surveillance system should be established to monitor quality indicators, and a systematic process should be put in place to analyze and manage reported events. Finally, processes must be implemented to prevent errors, improve safety, and manage customer relations, including continued educational programs.
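A surveillance system for quality indicators can start as simply as Shewhart-style control limits around baseline performance; a sketch (hypothetical turnaround data, three-sigma limits, not a prescribed implementation):

```python
from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Shewhart-style limits: mean +/- k standard deviations of baseline data."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def flag_out_of_control(baseline, new_values, k=3.0):
    """Return new observations falling outside the baseline control limits."""
    lo, hi = control_limits(baseline, k)
    return [v for v in new_values if v < lo or v > hi]

# Example: weekly mean report-turnaround times in hours (hypothetical data).
baseline = [24, 26, 23, 25, 27, 24, 26, 25]
flags = flag_out_of_control(baseline, [25, 26, 40])
```

An indicator that drifts outside its limits (the 40-hour week here) is the trigger for the systematic analysis of reported events described above.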

Six Sigma

Six Sigma has been around for about a century and was popularized by Motorola in the 1980s. In manufacturing, Six Sigma is a quality standard based on reducing variability within processes or products to move the mean toward a standard of reference (39). Six Sigma indicates six standard deviations from the arithmetic mean, allowing only 3.4 defects per million. By measuring the number of defects, identifying the sources of error, and systematically determining how to avoid them, one aims to reach zero defects. Six Sigma involves statistical methods and starts with identification of the process and key customers, which requires a team of individuals who understand the problem and are familiar with the institution. The five steps of the Six Sigma process are as follows: define customer needs, measure performance, analyze data, set priorities and launch improvements, and check for change compared with baseline to control the future process (define, measure, analyze, improve, and control, or DMAIC) (13). The elements of Six Sigma are similar to those of PDSA and other quality improvement tools and align with good medical practice (Fig 8). A perceived weakness of Six Sigma is its complexity, with rigorous adherence to problem solving potentially resulting in overworking simple problems with obvious solutions (34).
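The defect arithmetic behind the 3.4-defects-per-million figure can be checked directly; a sketch (the example counts are hypothetical, and the sigma level uses the conventional 1.5-sigma shift under which "six sigma" corresponds to 3.4 defects per million opportunities):

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit=1):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(d):
    """Short-term sigma level for a given DPMO, including the conventional
    1.5-sigma shift (so 3.4 DPMO maps to a sigma level of 6)."""
    return NormalDist().inv_cdf(1 - d / 1_000_000) + 1.5

# Example: 12 mislabeled studies in 80,000 examinations (hypothetical counts).
d = dpmo(defects=12, units=80_000)
```

With these hypothetical counts the process runs at 150 defects per million, a sigma level of roughly 5.1; measuring the gap to 3.4 per million is what the measure and analyze steps of DMAIC quantify.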

Figure 7. Sample A3 report for patients who arrive at the CT department without information about creatinine level and allergies.


Figure 8. Diagram illustrates steps involved in the Six Sigma approach.

Figure 9. Diagram illustrates the ABR three-phase model for practice quality improvement projects.

Figure 10. Diagram illustrates the Institute for Healthcare Improvement model for health care improvement.

The ABR

In 2007, the ABR established the practice quality improvement program as part of its maintenance of certification program, with the goal "to improve the quality of healthcare through diplomate-initiated learning and quality improvement" (40). A satisfactory practice quality improvement project must be relevant to one's practice, be achievable in one's clinical setting, produce results suited for repeat measurements during the maintenance of certification cycle, and be reasonably expected to improve quality (41). The ABR's prescribed method is a version of PDSA and has at least three phases: a baseline PDSA cycle, implementation of an improvement plan, and a post-improvement-effort PDSA cycle (42). This practice quality improvement model is incorporated into the ABR's maintenance of certification: diplomates with time-limited ABR certificates (from 2003 on) are required to complete three practice quality improvement projects within a 10-year cycle, and diplomates with continuous certificates (from 2012 on) are required, on audit, to have performed at least one practice quality improvement project in the previous 3 years. The ABR allows individual, group, or institutional projects, and groups can include radiation oncologists and physicists. The ABR Web site gives useful examples and information for carrying out projects (41). With this approach, changes are tested and implemented one at a time, with periods of measurement in between (Fig 9).

Model for Health Care Improvement

The Institute for Healthcare Improvement is committed to redesigning health care into a system without errors, waste, delay, and unsustainable costs (43). Its model for health care improvement is one of the most popular quality improvement methods used in health care and an alternative to the ABR's approach. It involves framing the proposed project with three questions: "What are we trying to accomplish?" "How will we know that the change is an improvement?" and "What changes can we make that will result in improvement?" (Fig 10) (42). Suggested process changes are then tested to obtain knowledge about the process or the potential effectiveness of the proposed changes, refined, and tested again in an iterative process, with changes implemented gradually. Each round of testing is conducted by using PDSA methods. Essentially, each test is a focused scientific experiment, with a hypothesis, planning and conducting of the test, analysis of results, and actions taken on the basis of the results, which could include making a change, modifying the planned changes, retesting, or rejecting the change (42). An advantage of the model for health care improvement is that it enables rapid iteration, with measures obtained continuously, allowing an almost immediate effect on outcome.
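The iterative test-refine-implement loop can be sketched as repeated PDSA decisions against a baseline metric; everything below (change names, measurements, the keep/reject rule) is illustrative, not a prescribed implementation:

```python
# Iterative PDSA sketch for the model for improvement: each candidate change
# is a small test; keep it only if the measured metric improves on baseline.

def pdsa_cycle(baseline_metric, run_test):
    """Plan/do: run the test; study: compare to baseline; act: keep or reject."""
    result = run_test()                  # "do": measure under the change
    improved = result < baseline_metric  # "study": lower turnaround is better
    return ("adopt" if improved else "reject", result)

# Example: mean report-turnaround hours under three proposed changes
# (hypothetical measurements standing in for real tests).
baseline = 26.0
proposed = {"template reports": 22.5, "extra reader": 27.1, "autorouting": 24.0}

decisions = {name: pdsa_cycle(baseline, lambda v=v: v)[0]
             for name, v in proposed.items()}
```

Each cycle answers the model's second question ("How will we know that the change is an improvement?") with a concrete measurement before the change is adopted, modified, or rejected.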


The ACR

ACR resources available to enable radiologists to assess, monitor, and improve radiology quality include the ACR Physician Consortium for Performance Improvement and National Committee for Quality Assurance clinical performance measures, which can be used as a basis for quality improvement efforts (44). By registering with the ACR National Radiology Data Registry, radiology practices can benchmark outcomes and process-of-care measures and develop quality improvement programs that qualify for ABR maintenance of certification (45). Subscribing to the ACR's RADPEER program allows practices to carry out periodic peer review, which counts as a group practice quality improvement project for ABR maintenance of certification (46). Finally, freely available ACR practice parameters and technical standards provide standards for safe and effective diagnostic imaging, and the ACR Appropriateness Criteria provide guidance on appropriate imaging in specific patient clinical scenarios (47,48).

Conclusion

Herein, we described practical approaches to quality improvement for practicing radiologists and outlined available resources. Continuous quality improvement programs will ensure that radiology practice is optimized to deliver the highest quality patient care while satisfying regulatory and health care policy requirements and containing costs.

References

1. Institute of Medicine. To err is human: building a safer health system. https://www.iom.edu/~/media/Files/Report%20Files/1999/To-Err-is-Human/To%20Err%20is%20Human%201999%20%20report%20brief.pdf. Published November 1999. Accessed February 22, 2015.
2. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. https://www.iom.edu/Reports/2001/Crossing-the-Quality-Chasm-A-New-Health-System-for-the-21st-Century.aspx. Published March 2001. Accessed February 22, 2015.
3. Kruskal JB, Anderson S, Yam CS, Sosna J. Strategies for establishing a comprehensive quality and performance improvement program in a radiology department. RadioGraphics 2009;29(2):315–329.
4. Centers for Medicare and Medicaid Services (CMS). Physician quality reporting system (PQRS). http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/index.html?redirect=/PQRS/. Published April 2014. Accessed February 22, 2015.
5. Seltzer SE, Lee TH. The transformation of diagnostic radiology in the ACO era. JAMA 2014;312(3):227–228.
6. Toussaint JS, Berry LL. The promise of lean in health care. Mayo Clin Proc 2013;88(1):74–82.
7. Joint Commission. National patient safety goals (NPSG). http://www.jointcommission.org/standards_information/npsgs.aspx. Published 2015. Accessed February 22, 2015.
8. American College of Radiology. Accreditation. http://www.acr.org/quality-safety/accreditation. Accessed February 22, 2015.
9. Accreditation Council for Graduate Medical Education (ACGME). Diagnostic radiology milestones. http://www.acgme.org/acgmeweb/tabid/148/ProgramandInstitutionalAccreditation/Hospital-BasedSpecialties/DiagnosticRadiology.aspx. Published 2012. Accessed February 22, 2015.
10. American Board of Radiology. Maintenance of certification part IV: practice quality improvement (PQI). http://www.theabr.org/moc-dr-comp4. Published 2007. Accessed February 22, 2015.
11. Johnson CD, Krecke KN, Miranda R, Roberts CC, Denham C. Quality initiatives: developing a radiology quality and safety program—a primer. RadioGraphics 2009;29(4):951–959.
12. Abujudeh HH, Bruno MA. Basic definitions. In: Quality and safety in radiology. Oxford, England: Oxford University Press, 2012.
13. Erturk SM, Ondategui-Parra S, Ros PR. Quality management in radiology: historical aspects and basic definitions. J Am Coll Radiol 2005;2(12):985–991.
14. Applegate KE. Continuous quality improvement for radiologists. Acad Radiol 2004;11(2):155–161.
15. Amaral CS, Rozenfeld H, Costa JM, Magon MdeF, Mascarenhas YM. Improvement of radiology services based on the process management approach. Eur J Radiol 2011;78(3):377–383.
16. Kruskal JB, Eisenberg R, Sosna J, Yam CS, Kruskal JD, Boiselle PM. Quality initiatives: quality improvement in radiology—basic principles and tools required to achieve success. RadioGraphics 2011;31(6):1499–1509.
17. Abujudeh HH, Kaewlai R, Asfaw BA, Thrall JH. Quality initiatives: key performance indicators for measuring and improving radiology department performance. RadioGraphics 2010;30(3):571–580.
18. Swensen SJ, Johnson CD. Radiologic quality and safety: mapping value into radiology. J Am Coll Radiol 2005;2(12):992–1000.
19. Knechtges P, Decker MC. Application of kaizen methodology to foster departmental engagement in quality improvement. J Am Coll Radiol 2014;11(12 Pt A):1126–1130.
20. Donnelly LF, Gessner KE, Dickerson JM, et al. Quality initiatives: department scorecard—a tool to help drive imaging care delivery performance. RadioGraphics 2010;30(7):2029–2038.
21. Abujudeh HH, Bruno MA. Control charts and dashboards. In: Quality and safety in radiology. Oxford, England: Oxford University Press, 2012.
22. Joint Commission. Sentinel events. http://www.jointcommission.org/assets/1/6/CAMH_24_SE_all_CURRENT.pdf. Updated January 2015. Accessed February 22, 2015.
23. Abujudeh HH, Bruno MA. Root cause analysis (RCA) and health care failure mode and effect analysis (HFMEA). In: Quality and safety in radiology. Oxford, England: Oxford University Press, 2012.
24. Kruskal JB, Siewert B, Anderson SW, Eisenberg RL, Sosna J. Managing an acute adverse event in a radiology department. RadioGraphics 2008;28(5):1237–1250.
25. Bruno MA, Nagy P. Fundamentals of quality and safety in diagnostic radiology. J Am Coll Radiol 2014;11(12 Pt A):1115–1120.
26. Joint Commission. Framework for conducting a root cause analysis and action plan. http://www.jointcommission.org/Framework_for_Conducting_a_Root_Cause_Analysis_and_Action_Plan/. Updated March 2013. Accessed February 22, 2015.
27. Thornton E, Brook OR, Mendiratta-Lala M, Hallett DT, Kruskal JB. Application of failure mode and effect analysis in a radiology department. RadioGraphics 2011;31(1):281–293.
28. Joint Commission. Failure mode and effect analysis (FMEA). http://www.jointcommission.org/Failure_Mode_Effect_and_Criticality_Analysis_FMECA_Worksheet/. Accessed July 24, 2015.
29. Abujudeh HH, Kaewlai R. Radiology failure mode and effect analysis: what is it? Radiology 2009;252(2):544–550.
30. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using health care failure mode and effect analysis: the VA National Center for Patient Safety's prospective risk analysis system. Jt Comm J Qual Improv 2002;28(5):248–267, 209.
31. Dailey KW. The FMEA pocket handbook. New York, NY: DW Publishing, 2004.


32. Toyota Production System. http://www.toyota-global.com/company/vision_philosophy/toyota_production_system/. Accessed July 24, 2015.
33. Kruskal JB, Reedy A, Pascal L, Rosen MP, Boiselle PM. Quality initiatives: lean approach to improving performance and efficiency in a radiology department. RadioGraphics 2012;32(2):573–587.
34. Vallejo B. Tools of the trade: lean and six sigma. J Healthc Qual 2009;31(3):3–4.
35. Lee E, Grooms R, Mamidala S, Nagy P. Six easy steps on how to create a lean sigma value stream map for a multidisciplinary clinical operation. J Am Coll Radiol 2014;11(12 Pt A):1144–1149.
36. iSixSigma. SIPOC diagram. http://www.isixsigma.com/tools-templates/sipoc-copis/sipoc-diagram/. Accessed February 22, 2015.
37. iSixSigma. Value stream mapping. http://www.isixsigma.com/tools-templates/value-stream-mapping/. Accessed February 22, 2015.
38. A3 Thinking. http://a3thinking.com/index.html. Published 2013. Accessed February 22, 2015.
39. Abujudeh HH, Bruno MA. Six sigma and lean: opportunities for health care to do more and better with less. In: Quality and safety in radiology. Oxford, England: Oxford University Press, 2012.
40. Strife JL, Kun LE, Becker GJ, Dunnick NR, Bosma J, Hattery RR. American Board of Radiology perspective on maintenance of certification. Part IV. Practice quality improvement for diagnostic radiology. RadioGraphics 2007;27(3):769–774.
41. American Board of Radiology. Maintenance of certification: PQI projects and templates. http://www.theabr.org/moc-dr-pqi-projects. Accessed February 22, 2015.
42. Lee CS, Larson DB. Beginner's guide to practice quality improvement using the model for improvement. J Am Coll Radiol 2014;11(12 Pt A):1131–1136.
43. Institute for Healthcare Improvement Web site. http://www.ihi.org/Pages/default.aspx. Accessed February 22, 2015.
44. American College of Radiology. Performance measures. http://www.acr.org/Quality-Safety/Quality-Measurement/Performance-Measures. Updated 2015. Accessed February 22, 2015.
45. American College of Radiology. National radiology data registry. http://www.acr.org/Quality-Safety/National-Radiology-Data-Registry. Updated 2015. Accessed February 22, 2015.
46. American College of Radiology. RADPEER. http://www.acr.org/Quality-Safety/RADPEER. Accessed February 22, 2015.
47. American College of Radiology. Practice parameters and technical standards. http://www.acr.org/Quality-Safety/Standards-Guidelines. Updated 2015. Accessed February 22, 2015.
48. American College of Radiology. ACR appropriateness criteria. http://www.acr.org/Quality-Safety/Appropriateness-Criteria. Updated 2015. Accessed February 22, 2015.

This journal-based SA-CME activity has been approved for AMA PRA Category 1 Credit™. See www.rsna.org/education/search/RG.


QUESTIONNAIRE A24(20)

Practical Approaches to Quality Improvement for Radiologists

INSTRUCTIONS

• Read through the article and answer the multiple choice questions provided below

• Some questions may have more than one correct answer, in which case you must mark all of the correct answers

Question 1: Which one of the following is NOT a rule for redesign, as suggested by the “Crossing the Quality Chasm” report?

A: Care based on continuous healing relationships
B: Care customized according to practitioner needs and values
C: Evidence-based decision making
D: The prioritization of cooperation among clinicians

Question 2: Is it TRUE or FALSE that quality improvement efforts in health care delivery are essential for a number of reasons, including ensuring patient safety, continuously improving processes and enabling staff development through continuing education?

A: TRUE B: FALSE

Question 3: Which one of the following is TRUE regarding “Quality assurance?”

A: It is carried out retrospectively to establish when measurements fall outside ranges of acceptability

B: It involves technical testing of imaging equipment and evaluation of imaging quality

C: It includes evaluation of shielding around radiography facilities

D: It aims to ensure excellent standards by means of systematic collection and evaluation of data

Question 4: Which one of the following is NOT TRUE regarding “quality improvement and continuous quality improvement?”

A: Clear standards should be identified for every activity or process in an imaging facility

B: The focus is to improve the effectiveness of diagnostic/therapeutic procedure processes, the appropriateness of imaging and procedures and the management of all imaging services

C: It is superior to “Quality management” because a uniform commitment to quality across the whole organization is required

D: None of the above

Question 5: Which one of the following is a "key process metric"?

A: Critical test result reporting B: Correct image labeling C: Procedural outcomes D: Report finalization times

Question 6: Which one of the following is the first step in root cause analysis of sentinel events?

A: To identify root causes B: To identify proximate causes C: To organize a team D: To design and implement necessary interim changes

Question 7: According to figure 2, which one of the following reasons is related to the “Man” for the wrong imaging test being performed in the FMEA process?

A: Requesting physician's handwriting is not legible
B: Requesting physician not educated in calibration testing
C: Physician did not notice wrong exam was done
D: Physician exposed patient to too much radiation

Question 8: Which one of the following is NOT a step involved in the “plan, do, study, act” (PDSA) cycle?

A: Identify process to study B: Create grading tool C: Initiate testing D: Test course of action E: Give feedback

Question 9: Which of the following is one of the eight wastes of the lean management approach?

A: Underproduction B: Transparency C: Overutilizing facilities D: Inventory

Question 10: Is it TRUE or FALSE that the Kaizen tools add a human element to the traditional PDSA cycles in that one stakeholder can decide what constitutes waste, thereby ensuring more efficacy in the process of implementing changes?

A: TRUE B: FALSE

Page 16: Radiography - Focus on Health20).pdf · RadioGraphics 2015; 35:1630–1642 Published online 10.1148/rg.2015150057 Content Codes: 1From the Department of Radiology, Division of Cardiothoracic

Question 11: Which one of the following is most relevant regarding Six Sigma?

A: It indicates six standard deviations from the arithmetic mean and allows six defects per million
B: It involves analysis of small components of problems with rapid implementation of solutions
C: It focuses on measuring the number of defects, identifying the sources of error and systematically determining how to avoid them
D: None of the above

Question 12: Which one of the following is a perceived weakness of Six Sigma?

A: Its complexity could result in overworking simple problems with obvious solutions

B: Its complexity could result in a perceived unacceptable number of defects in practice

C: It involves too many team members, potentially adding to problems not being solved

D: None of the above

Question 13: Which one of the following is TRUE regarding the ABR model?

A: It was established in the 1980s and popularized by the company Motorola

B: It asks what we are trying to accomplish and how we will know that the change is an improvement

C: It consists of a baseline PDSA cycle, implementing change and post-improvement PDSA cycle

D: It has an implementation of improvement plan as one of its three steps

Question 14: Which one of the following is an advantage of the model for health care improvement?

A: Changes are tested and implemented simultaneously with periods of measurement in between

B: It enables rapid iteration allowing an almost immediate effect on outcome

C: The various steps in the model ensure that all aspects of a problem are addressed

D: Results are only available in the medium term but have high reliability

Question 15: Is it TRUE or FALSE that ACR practice parameters are freely available and provide standards for effective and safe diagnostic imaging, and that the ACR Appropriateness Criteria provide guidance on appropriate imaging in specific clinical scenarios?

A: TRUE B: FALSE

END

Page 17: Radiography - Focus on Health20).pdf · RadioGraphics 2015; 35:1630–1642 Published online 10.1148/rg.2015150057 Content Codes: 1From the Department of Radiology, Division of Cardiothoracic

For office use MARK: /15 = _______%

(70% PASS RATE)

FAILED (R50 to resubmit)

PASSED (IAR will be sent)

MODERATED BY: CAPTURED: DATE:

PO Box 71 Wierda Park 0149

400 Theuns van Niekerk Street Wierda Park 0157

http://foh-cpd.co.za Whatsapp: 074 230 3874 Tel: 012 653-0133 /2373 Mon-Fri: 07:30-16:30

Activity is accredited for THREE (3) CLINICAL CEU’s

PERSONAL INFORMATION

(If your personal details have not changed, only complete the sections marked with an asterisk *)

Circle your speciality: Diagnostic Radiography | CT | MRI | X-Ray | Mammography | Ultrasound | Radiation Oncology | Nuclear Medicine

ANSWER SHEET A24 (20)

Practical approaches to quality improvement for Radiologists

SEND ANSWER SHEET TO:

FAX: 086 614 4200 / 012 653 2073 OR WHATSAPP: 074 230 3874 OR EMAIL: [email protected]

YOU WILL RECEIVE A CONFIRMATION OF RECEIPT SMS WITHIN 12-24 HOURS, IF NOT RECEIVED PLEASE SEND AGAIN

Please rate the article:

HPCSA No *FOH Number

*Initials & Surname *Cell Number (needed for confirmation SMS)

Employer Email address

*Time spent on activity _____Hour _____Min

A B C D E A B C D E

1 9

2 10

3 11

4 12

5 13

6 14

7 15

8

I hereby declare that the completion of this document is my own effort without any assistance.

Signed:

Date:

POOR 1 | FAIR 2 | AVERAGE 3 | GOOD 4 | EXCELLENT 5