

Improved Interfaces for Human-Robot Interaction in Urban Search and Rescue

Michael Baker, Robert Casey, Brenden Keyes, Holly A. Yanco

University of Massachusetts Lowell

Why Human-Robot Interaction in Urban Search and Rescue is Hard

Important to avoid secondary collapse by navigating safely

“Tunnel vision” view of the environment

Easy to miss vital information and details

No sense of scale, travel direction or color

Video courtesy of CRASAR

Human-Robot Interaction Issues in Urban Search and Rescue

Usability studies show:
– Users spend a lot of time trying to gain situation awareness
– 30% of run time spent looking around instead of navigating
– Most users focus only on the video display

We are looking to create an interface that is simple yet robust.

Problems with Existing Interfaces

Designed for more than one task

GUI shows extraneous information

Sensor information is too spread out

Large learning curve

Not configurable

Problems with Existing Interfaces

Wasted real estate

Sonar map is difficult to read

Map is not on the same eye level

Our Approach

Capitalize on the user’s natural area of focus

Fuse sensor information to decrease cognitive load
– Present sensor information so it’s readily and easily understood
– Increase situation awareness while decreasing the user’s mental effort

Enhancements to increase user efficiency
– Suggestions
– Additional sensors
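The fusion idea above can be sketched as follows. This is a minimal illustration, not the interface's actual implementation: the sector names and the min-distance fusion rule are assumptions chosen to show how combining overlapping sensors into one value per region can reduce what the operator must track.

```python
def fuse_ranges(sonar, laser):
    """Fuse per-sector sonar and laser readings into one distance per sector.

    `sonar` and `laser` map a sector name (hypothetical names here) to a
    distance in meters; missing sectors are allowed. Keeping the nearer of
    the two readings is a conservative choice: the operator sees a single
    obstacle distance per region instead of two raw sensor values.
    """
    sectors = set(sonar) | set(laser)
    fused = {}
    for s in sectors:
        readings = [r for r in (sonar.get(s), laser.get(s)) if r is not None]
        fused[s] = min(readings)
    return fused

# The sonar and laser disagree about the front-left obstacle; the fused
# map keeps the nearer reading so the display errs toward caution.
fused = fuse_ranges({"front_left": 0.8, "front_right": 2.1},
                    {"front_left": 0.6, "front_right": 2.4})
```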

UMass Lowell’s USAR Interface

Pan and Tilt Indicators


Single Camera Problems

User has to remember what is behind the robot

Leads to problems:
– 41% rear hits
– Poor situation awareness behind the robot

Two Camera Solution

“Rear view mirror”-inspired video display

Automatic remapping of drive commands to simplify navigation

Automatic remapping of range information to match robot travel direction
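The drive-command remapping can be sketched like this. The sign convention below is an illustrative assumption, not the paper's exact mapping: the idea is simply that when the operator drives from the rear camera, "forward" on the joystick should move the robot toward what they see.

```python
def remap_drive(translate, rotate, using_rear_camera):
    """Remap a joystick command to match the active camera view.

    `translate` and `rotate` are normalized joystick axes in [-1, 1].
    When the rear camera is active, both axes are negated so that pushing
    forward drives toward the rear view and turns match the mirrored
    image. (Hypothetical convention for illustration.)
    """
    if using_rear_camera:
        return -translate, -rotate
    return translate, rotate
```

With this remapping the operator never has to mentally invert controls when switching views; the same stick motion always means "go toward what I'm looking at."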

Ranging Information

Ranging information is displayed around the video
– Takes advantage of the user’s natural area of focus

Use color, number of bars and location to lessen the user’s cognitive effort

Option to display raw distance values
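The color-and-bars encoding might look like the sketch below. The thresholds and color names are assumptions for illustration; the slide describes the encoding idea but does not give exact cutoffs.

```python
def range_to_bars(distance_m):
    """Map a range reading (meters) to a (color, bar_count) display cue.

    More bars and hotter colors mean a closer obstacle, so the operator
    reads proximity at a glance without parsing numbers. Thresholds here
    are hypothetical.
    """
    if distance_m < 0.5:
        return "red", 3      # imminent collision risk
    if distance_m < 1.0:
        return "yellow", 2   # obstacle nearby
    if distance_m < 2.0:
        return "green", 1    # obstacle detected, comfortable margin
    return "none", 0         # nothing within display range
```

The raw `distance_m` value can still be shown alongside the bars for users who want exact numbers, matching the "raw distance values" option above.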


Map of the Environment

Preliminary Tests

Users liked the ability to switch the camera view

Users preferred the joystick over keyboard control

Suggestions were helpful

Usability tests in progress

Ongoing Work

Customizations
– People interact with the same interface differently
– Reflect the user’s preferences, not the developer’s

Use layered sensor modalities

Variable frame rates for front and rear cameras
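The variable frame-rate idea can be sketched as a simple bandwidth split. The 80/20 ratio is an illustrative assumption: the point is to give most of the frame budget to the camera facing the direction of travel.

```python
def allocate_frame_rates(total_fps, driving_forward):
    """Split a fixed frame budget between front and rear cameras.

    The camera facing the travel direction gets the larger share so the
    primary view stays smooth, while the secondary view updates just
    often enough to maintain awareness. The 80/20 split is hypothetical.
    """
    primary = int(total_fps * 0.8)
    secondary = total_fps - primary
    if driving_forward:
        return {"front": primary, "rear": secondary}
    return {"front": secondary, "rear": primary}
```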

UMass Lowell USAR Interface
