TRANSCRIPT
What Does the Evidence Tell Us About UX? March 3, 2016
Raj Ratwani, PhD, Scientific Director (1)
Zach Hettinger, MS, MD, Medical Director (1); Emergency Physician (2)
Erica Savage, Project Manager and Policy Researcher (1)
(1) National Center for Human Factors in Healthcare, MedStar Health
(2) MedStar Union Memorial Hospital
Conflict of Interest
Raj Ratwani, PhD and Aaron Zachary Hettinger, MD, MS have received research funding from AHRQ, ONC, and the AMA. Erica Savage has no conflicts to report.
Agenda
• EHR Usability Policy and Practices
• Current Gaps and Solutions
• Usability Policy Comparisons Across Industries
Learning Objectives
• Discuss the current state of UX understanding
• Identify resources for UX research and evaluation studies
• Describe methods used to improve UX
EHR Usability: From Practice to Policy Raj Ratwani, PhD Scientific Director, National Center for Human Factors in Healthcare, MedStar Health Assistant Professor of Emergency Medicine, Georgetown University School of Medicine
Usability and User-Centered Design
• Usability is the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use
• Usability is measurable
• User-centered design is a process for developing usable systems
ISO 9241-11; ISO 13407
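Since the slide stresses that usability is measurable, here is a minimal sketch of how the three ISO 9241-11 dimensions are commonly quantified from usability-test data. The participant results below are invented for illustration, not from the presentation.

```python
# Illustrative only: per-participant results from a hypothetical usability test.
# Each entry: (completed the task?, seconds on task, satisfaction score 0-100,
# e.g. a System Usability Scale score).
results = [
    (True, 95, 72.5),
    (True, 130, 65.0),
    (False, 240, 42.5),
    (True, 110, 80.0),
]

successes = sum(ok for ok, _, _ in results)

# Effectiveness: task completion rate
effectiveness = successes / len(results)

# Efficiency: mean time on task for successful attempts
efficiency = sum(t for ok, t, _ in results if ok) / successes

# Satisfaction: mean satisfaction score across participants
satisfaction = sum(s for _, _, s in results) / len(results)

print(f"effectiveness={effectiveness:.0%}, "
      f"efficiency={efficiency:.1f}s, satisfaction={satisfaction:.1f}")
```

The point of the sketch is simply that each dimension reduces to a number that can be compared across products or releases.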
The Two Bins of Usability
1: User Interface Design
• Displays and controls
• Screen design
• Clicks and drags
• Colors and navigation
2: Cognitive Task Support ("Workflow Design")
• Smart data visualization
• Support cognitive work
• Usefulness
Photo credit to Bob Wears, MD, PhD
Frontline Impact
• How does use of a new EHR with CPOE impact physician performance in the ED?
– Three study periods: pre, go-live, post
– 2-hour observation periods
– 14 EM physicians during each phase
• Observers record minute-by-minute allocation to different tasks
– Computer, verbal communication, patient time, paper
Benda, N., Meadors, M.A., Hettinger, A.Z., & Ratwani, R.M. (in press). Emergency Physician Task Switching Increases with the Introduction of a Commercial EHR. Annals of Emergency Medicine.
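The study's two core measures, time allocation and task switching, can be sketched from a minute-by-minute observation log. The log below is invented for illustration and is not the study's data.

```python
# Hypothetical sketch: observers record which task occupies each observed
# minute; a task switch is any change between consecutive entries.
from collections import Counter

log = ["computer", "computer", "verbal", "patient", "patient",
       "computer", "paper", "computer", "verbal", "verbal"]

# Time allocation: minutes spent on each task
allocation = Counter(log)

# Task switching: count transitions where the task changes
switches = sum(1 for a, b in zip(log, log[1:]) if a != b)

# "Tasks per minute": distinct task episodes per observed minute
tasks_per_minute = (switches + 1) / len(log)

print(dict(allocation))
print(f"{switches} switches, {tasks_per_minute:.2f} task episodes/minute")
```

Comparing these two numbers across the pre, go-live, and post periods is what lets the study detect the increase in task switching.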
Task Allocation Time
Tasks Per Minute
Significant Increase in Task Switching
• The cognitive cost of the EHR is increased task switching:
– Increased stress and frustration
– Increased likelihood of error
• Users are forced to adapt their work practices to the EHR
Policies to Promote Usability
• The Office of the National Coordinator (ONC) has certification requirements in place to promote usability:
– Vendors must attest to a UCD process
– Vendors must conduct a summative (final) usability test on a subset of functions and report their method and results
• The information for certified vendor products is publicly available
Snapshot of Vendor UCD Practices
• Visited 11 different EHR vendors to learn about their processes (2012)
• Analyzed the vendor usability reports for 50 products with the highest attestation frequencies
– Focused on CPOE
– Reported UCD process
– Number of participants and participant demographics

Ratwani, R.M., Fairbanks, R.J., Hettinger, A.Z., & Benda, N. (2015). Electronic Health Record Usability: Analysis of the User Centered Design Processes of Eleven Electronic Health Record Vendors. Journal of the American Medical Informatics Association.
Ratwani, R.M., Benda, N., Hettinger, A.Z., & Fairbanks, R.J. (2015). Electronic Health Record Vendor Adherence to Usability Certification Requirements and Testing Standards. JAMA 314(10):1070-1071.
Reported UCD Process
Test Participant Demographics
Comparing Vendor Usability Processes
• Using the certified health IT product list documents, we compared 20 vendors on:
– User-centered design process
– Summative testing: methodology and results
• Many vendors did not:
– Use rigorous use cases
– Have appropriate metrics
– Discuss improvements based on findings
www.healthITusability.org
Usability Is Not a Vendor-Centric Problem
• UCD and Development
• Customization & Implementation
• Safety and Usability Monitoring
• Certification
EHR Usability: Current Gaps and Solutions A. Zach Hettinger, MD, MS Medical Director, National Center for Human Factors in Healthcare, MedStar Health Assistant Professor of Emergency Medicine, Georgetown University School of Medicine Emergency Physician, MedStar Union Memorial Hospital
Poor Health IT Usability
• Often unrecognized as a systems issue
• Frequently blamed on the user
• Rarely reported as safety events
Prevention Model: Heart Disease
• Healthy: primary prevention (healthy lifestyle, smoking cessation)
• Has heart disease: secondary prevention (screening for risk factors, control of risk factors)
• Had heart attack: tertiary prevention (CAD management after heart attack, optimizing management of heart failure)
Integrated Patient Safety Transformational Model
• Primary prevention (pre-hazard): user-centered design; implementation "best practices"
• Secondary prevention (post-hazard, pre-harm): hazard reporting & analysis; clinical data risk assessment; user analytics & help desk tickets
• Tertiary prevention (post-harm): adverse event reporting; claims analysis
Wait for the Harm (Tertiary)
• Claims Analysis
– 1% (n=238) of cases identified as containing a health IT contribution to error
– 80% moderate to severe harm
– Class of error not related to harm
Graber ML, Siegal D, Riah H, Johnston D, Kenyon K. Electronic Health Record-Related Events in Medical Malpractice Claims. J Patient Saf. 2015 Nov. PMID: 26558652.
Accident Causation Pyramid: Tip of the Iceberg
For every:
• 1 serious or major injury
• 10 minor injuries
• 30 property damage accidents
• 600 incidents with no visible damage or injury
Based on 1,753,498 accidents reported by 297 companies across 21 different industries (Bird, 1969)
Slide acknowledgment: Robert Panzer, MD
Wait for the Harm (Tertiary)
• Adverse Event Analysis
– Similar to claims analysis
– Health IT usability often goes unrecognized
– Example: drug database conversion (phenytoin ER/IR)
Avoid Harm (Secondary)
• Patient Safety Event Reporting
• Clinical Data Risk Analysis
• User Analytics & Help Desk Tickets
Resilience Engineering
• Consequences driven
– Focus on the <1% of actions/events that lead to harm
– The only thing that separates catastrophe from serendipity
• Process driven
– Examine the 99%+ of cases/events in which no harm occurs
– "Good luck"
– "I could have told you that was going to happen"
Hollnagel, E. (2011). Prologue: The scope of resilience engineering. In: Hollnagel, E., Pariès, J., Woods, D.D., & Wreathall, J. (Eds.), Resilience Engineering in Practice: A Guidebook.
Patient Safety Event Reporting
• Review of identified hazards and near misses
• Example: look-alike/sound-alike medication errors
– ISMP "dangerous dyads"
– 70,000+ event reports
– 84/130 identified cases of errors
– ~10% occurred during the administration phase
If you miss the difference between "O" and "R," the patient will remind you
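One way to surface look-alike pairs proactively, before an event report arrives, is simple string similarity over the formulary. A minimal sketch follows; the formulary list, the 0.7 threshold, and the `similarity` helper are illustrative assumptions, not ISMP's method.

```python
# Hedged sketch: flag visually similar ("look-alike") drug-name pairs with
# stdlib string similarity. Threshold and name list are illustrative.
from difflib import SequenceMatcher
from itertools import combinations

formulary = ["hydroxyzine", "hydralazine", "metformin",
             "metronidazole", "Os-Cal", "Asacol"]

def similarity(a: str, b: str) -> float:
    """Ratio of matching characters between two names, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

lookalikes = [(a, b, round(similarity(a, b), 2))
              for a, b in combinations(formulary, 2)
              if similarity(a, b) >= 0.7]

for a, b, score in lookalikes:
    print(f"{a} / {b}: {score}")
```

Note the limits of this naive screen: it catches look-alike spellings (hydroxyzine/hydralazine) but can miss sound-alike pairs whose spellings differ, which is why reporting systems and curated lists such as ISMP's remain necessary.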
Clinical Data Risk Assessment
"Hidden" Sources
• User analytics
– Often used to identify poor performers
– Target for training
• Help desk tickets
– Password resets
– "User errors" (vs. use errors)
High Risk Industries
• Human factors engineering
• User-centered design
• Proactive risk assessment
• Failure modes and effects analysis
• Etc.
• All used with success in other industries
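As a concrete instance of one technique on this list, here is a minimal failure modes and effects analysis (FMEA) scoring sketch: each failure mode gets a risk priority number (RPN) equal to severity × occurrence × detection. The EHR failure modes and 1-10 ratings below are hypothetical, not from the presentation.

```python
# Hypothetical FMEA sketch: rank failure modes by risk priority number.
# Ratings use the conventional 1-10 scales (10 = worst / hardest to detect).
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Wrong patient selected from list", 9, 4, 7),
    ("Medication dose defaults to adult value", 8, 3, 5),
    ("Critical alert dismissed due to alert fatigue", 7, 6, 8),
]

# RPN = severity * occurrence * detection; highest RPN gets attention first
ranked = sorted(((desc, s * o * d) for desc, s, o, d in failure_modes),
                key=lambda item: item[1], reverse=True)

for desc, rpn in ranked:
    print(f"RPN {rpn:4d}  {desc}")
```

The value of the exercise is the ranking, not the absolute numbers: it forces a team to examine hazards before any harm occurs, which is exactly the primary-prevention posture described earlier.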
Policy Comparison Across Industries Erica Savage Project Manager and Policy Researcher, National Center for Human Factors in Healthcare, MedStar Health
Government Regulation in High Risk Industries
High Risk Industries
Goal
• Compare regulations and enforcement methods across federal agencies to identify standards or processes that could be leveraged to optimize electronic health record usability.
Method
• Each agency was compared on three dimensions:
1. Rigor of design process used
2. Availability of interface-level design specifications
3. Additional practices
Office of the National Coordinator (ONC)
• Regulations: user-centered design process; attestation-based evaluation
• Rigor of design process: summative testing
• Availability of design specs: no design specifications
Federal Aviation Administration (FAA)
• Regulations: human-centered design process; evidence-based evaluation
• Rigor of design process: test in a realistic environment; standard test scenarios
• Availability of design specs: design specifications provided
• Additional info: certifying personnel with HF/usability expertise
Food and Drug Administration (FDA)
• Regulations: human-centered design process; evidence-based evaluation
• Rigor of design process: formative testing; address usability issues
• Availability of design specs: industry-specific design specifications
• Additional info: innovation
What can we take away?
Across the three dimensions (rigor of design process used, availability of interface design specifications, additional practices):
• Use standardized test scenarios in a simulated or test environment
• Require evidence-based evaluation
• Utilize interface-level design specifications, which may mean different sets for different groups
• Implement evidence-based evaluation, conducted by HF/usability experts
Questions
Raj Ratwani, PhD @RajRatwani [email protected]
Zach Hettinger, MD, MS @ZachHettinger [email protected]
Erica Savage @EricaSavage [email protected]
www.medicalHFE.org
www.healthITusability.org