
Improved Interfaces for Human-Robot Interaction in Urban Search and Rescue

Michael Baker, Robert Casey, Brenden Keyes, Holly A. Yanco

University of Massachusetts Lowell


Why Human-Robot Interaction in Urban Search and Rescue is Hard

Important to avoid secondary collapse by navigating safely
“Tunnel vision” view of the environment
Easy to miss vital information and details
No sense of scale, travel direction or color

Video courtesy of CRASAR


Human-Robot Interaction Issues in Urban Search and Rescue

Usability studies show:
– Users spend a lot of time trying to gain situation awareness
– 30% of run time spent looking around instead of navigating
– Most users focus only on the video display

We are looking to create an interface that is simple yet robust.


Problems with Existing Interfaces

Designed for more than one task

GUI shows extraneous information

Sensor information is too spread out

Large learning curve

Not configurable


Problems with Existing Interfaces

Wasted real estate
Sonar map is difficult to read
Map is not at the same eye level as the video


Our Approach

Capitalize on the user’s natural area of focus
Fuse sensor information to decrease cognitive load

– Present sensor information so it’s readily and easily understood

– Increase situation awareness while decreasing the user’s mental effort

Enhancements to increase user efficiency
– Suggestions
– Additional sensors


UMass Lowell’s USAR Interface


Pan and Tilt Indicators
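This slide shows screenshots of the pan and tilt indicators overlaid on the video window. As a rough illustration only (not the presented implementation), the indicators can be thought of as marks placed according to the camera’s pan and tilt angles, normalized by assumed mechanical limits; the names and limits below are placeholders.

```python
# Hypothetical sketch, not the presented implementation: turn the camera's pan and
# tilt angles into normalized offsets for indicator marks drawn along the edges of
# the video window. The angle limits below are assumptions, not the robot's specs.

PAN_LIMIT_DEG = 100.0   # assumed pan range of the camera, +/- degrees
TILT_LIMIT_DEG = 30.0   # assumed tilt range of the camera, +/- degrees

def indicator_offsets(pan_deg: float, tilt_deg: float) -> tuple[float, float]:
    """Return (x, y) in [-1, 1]: x places a mark along the top edge for pan,
    y places a mark along the side edge for tilt."""
    x = max(-1.0, min(1.0, pan_deg / PAN_LIMIT_DEG))
    y = max(-1.0, min(1.0, tilt_deg / TILT_LIMIT_DEG))
    return x, y

# Example: camera panned halfway to the right and tilted slightly down.
print(indicator_offsets(pan_deg=50.0, tilt_deg=-10.0))  # (0.5, -0.333...)
```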



Single Camera Problems

User has to remember what is behind the robot

Leads to problems
– 41% rear hits
– Poor situation awareness behind the robot


Two Camera Solution

“Rear view mirror”-inspired video display
Automatic remapping of drive commands to simplify navigation (see the sketch below)
Automatic remapping of range information to match robot travel direction
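A minimal sketch of what this remapping could look like, as our own illustration rather than code from the presented system: when the rear camera becomes the active view, drive commands are inverted and range readings are relabeled so that “forward” and “front” always refer to the direction shown in the main video window. The command structure, sensor labels, and sign conventions are assumptions.

```python
# Hypothetical sketch, not the authors' code: remap operator commands and range
# labels when the rear camera is the active view. Names and conventions are assumed.

from dataclasses import dataclass

@dataclass
class DriveCommand:
    translate: float  # +1.0 = drive toward the robot's physical front
    rotate: float     # +1.0 = rotate counter-clockwise

def remap_drive(cmd: DriveCommand, rear_view_active: bool) -> DriveCommand:
    """When driving from the rear view, invert translation so that pushing the
    joystick 'forward' moves the robot in the direction the operator is looking.
    Whether rotation should also be inverted depends on whether the rear image
    is mirrored, so it is left unchanged in this sketch."""
    if not rear_view_active:
        return cmd
    return DriveCommand(translate=-cmd.translate, rotate=cmd.rotate)

def remap_ranges(ranges: dict[str, float], rear_view_active: bool) -> dict[str, float]:
    """Relabel range readings so that 'front' always refers to the travel
    direction currently shown in the main video window."""
    if not rear_view_active:
        return dict(ranges)
    swap = {"front": "rear", "rear": "front", "left": "right", "right": "left"}
    return {swap.get(side, side): dist for side, dist in ranges.items()}

# Example: operator is looking through the rear camera and pushes the joystick forward.
print(remap_drive(DriveCommand(translate=1.0, rotate=0.0), rear_view_active=True))
print(remap_ranges({"front": 2.1, "rear": 0.4, "left": 1.0, "right": 0.8},
                   rear_view_active=True))
```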


Ranging Information

Ranging information is displayed around the video
– Takes advantage of the user’s natural area of focus

Use color, number of bars and location to lessen the user’s cognitive effort

Option to display raw distance values
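As a rough illustration of the color-and-bar idea (not the presented implementation; the thresholds and colors below are invented for the example), each range reading could be bucketed into a bar count and a color before being drawn around the video:

```python
# Hypothetical sketch, not the authors' code: map one range reading to the colored
# "bar" indicators drawn around the video window. Closer obstacles get more bars
# and a more urgent color; all thresholds here are made up for illustration.

def range_to_bars(distance_m: float) -> tuple[int, str]:
    """Return (number_of_bars, color) for one range sensor reading."""
    if distance_m < 0.5:
        return 3, "red"      # obstacle very close: all bars lit, red
    elif distance_m < 1.0:
        return 2, "yellow"   # getting close: two bars, yellow
    elif distance_m < 2.0:
        return 1, "green"    # something detected, but far: one bar, green
    else:
        return 0, "gray"     # nothing nearby: no bars drawn

# Example: the raw distance value can also be shown if the operator enables it.
for side, dist in {"front": 0.4, "left": 1.3, "right": 2.5}.items():
    bars, color = range_to_bars(dist)
    print(f"{side}: {bars} {color} bar(s), raw = {dist:.1f} m")
```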



Map of the Environment


Preliminary Tests

Users liked the ability to switch the camera view
Users preferred the joystick over the keyboard control
Suggestions were helpful
Usability tests in progress


Ongoing Work

Customizations
– People interact with the same interface differently
– Reflect the user’s preference, not the developer’s

Use layered sensor modalities
Variable frame rates for the front and rear cameras (see the sketch below)
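A minimal sketch of the variable-frame-rate idea, assuming the front camera is the main driving view and so is refreshed at full rate while the rear “mirror” view is polled only occasionally; the rates and callback names are placeholders, not values from the system.

```python
# Hypothetical sketch of the variable-frame-rate idea, not the presented system:
# the front (driving) camera is refreshed at full rate while the rear "mirror"
# camera is refreshed only occasionally. The rates and callbacks are placeholders.

import time

FRONT_FPS = 15.0   # main driving view
REAR_FPS = 2.0     # rear-view reference image

def capture_loop(grab_front, grab_rear, run_seconds=5.0):
    """Grab frames from both cameras at different rates within a single loop."""
    start = time.monotonic()
    next_front = next_rear = start
    while time.monotonic() - start < run_seconds:
        now = time.monotonic()
        if now >= next_front:
            grab_front()                        # update the main video window
            next_front = now + 1.0 / FRONT_FPS
        if now >= next_rear:
            grab_rear()                         # update the rear-view window
            next_rear = now + 1.0 / REAR_FPS
        time.sleep(0.005)                       # avoid busy-waiting

# Example with stand-in capture functions:
capture_loop(lambda: print("front frame"), lambda: print("rear frame"), run_seconds=0.5)
```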


UMass Lowell USAR Interface