Human – Robot Communication

Paul Fitzpatrick, Human – Robot Communication


Post on 31-Dec-2015



TRANSCRIPT

Page 1: Human – Robot Communication

Paul Fitzpatrick

Human – Robot Communication

Page 2: Human – Robot Communication

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Human – Robot Communication

Page 3: Human – Robot Communication

Motivation

What is communication for?
– Transferring information
– Coordinating behavior

What is it built from?
– Commonality
– Perception of action
– Protocols

Page 4: Human – Robot Communication

Communication protocols

Computer – computer protocols

TCP/IP, HTTP, FTP, SMTP, …

Page 5: Human – Robot Communication

Communication protocols

Human – human protocols: initiating conversation, turn-taking, interrupting, directing attention, …

Human – computer protocols: shell interaction, drag-and-drop, dialog boxes, …

Page 6: Human – Robot Communication

Communication protocols

Human – human protocols: initiating conversation, turn-taking, interrupting, directing attention, …

Human – computer protocols: shell interaction, drag-and-drop, dialog boxes, …

Human – robot protocols

Page 7: Human – Robot Communication

Requirements on robot

Human-oriented perception
– Person detection, tracking
– Pose estimation
– Identity recognition
– Expression classification
– Speech/prosody recognition
– Objects of human interest

Human-readable action
– Clear locus of attention
– Express engagement
– Express confusion, surprise
– Speech/prosody generation

ENGAGED, ACQUIRED
Pointing (53, 92, 12)
Fixating (47, 98, 37)
Saying “/o’ver[200] \/there[325]”
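The state readout above can be sketched as a small data structure. Everything here (class name, fields, the rendering format) is an invented illustration of a human-readable status display, not the robot's actual software:

```python
from dataclasses import dataclass

@dataclass
class AttentionReadout:
    """Hypothetical snapshot of what the robot is attending to."""
    engaged: bool                 # engaged with a person?
    target_acquired: bool         # locked onto a target?
    pointing: tuple = None        # (x, y, z) direction of pointing gesture
    fixating: tuple = None        # (x, y, z) gaze fixation point

    def describe(self) -> str:
        """Render the state as a short human-readable line."""
        parts = ["ENGAGED" if self.engaged else "IDLE"]
        if self.target_acquired:
            parts.append("ACQUIRED")
        if self.pointing:
            parts.append(f"Pointing {self.pointing}")
        if self.fixating:
            parts.append(f"Fixating {self.fixating}")
        return " ".join(parts)

state = AttentionReadout(engaged=True, target_acquired=True,
                         pointing=(53, 92, 12), fixating=(47, 98, 37))
print(state.describe())
```

The point of such a readout is the "human-readable action" requirement: a person watching the robot should be able to tell at a glance what it is attending to.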

Page 8: Human – Robot Communication

Example: attention protocol

Expressing attention
Influencing other’s attention
Reading other’s attention

Page 9: Human – Robot Communication

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Foveate gaze

Page 10: Human – Robot Communication

Human gaze reflects attention

(Taken from C. Graham, “Vision and Visual Perception”)

Page 11: Human – Robot Communication

Types of eye movement

Vergence angle

Left eye

Right eye

Ballistic saccade to new target

Smooth pursuit and vergence co-operate to track object

(Based on Kandel & Schwartz, “Principles of Neural Science”)
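The two movement types on the slide can be sketched as a one-axis controller that picks between a ballistic jump and a smooth correction; the threshold, gain, and single-axis simplification are assumptions for illustration, not Kismet's actual control law:

```python
# Choose between a ballistic saccade (new, distant target) and smooth
# pursuit (small drift of an already-fixated target). Values are invented.
SACCADE_THRESHOLD_DEG = 2.0   # error beyond this triggers a ballistic jump
PURSUIT_GAIN = 0.3            # fraction of error corrected per pursuit step

def eye_step(gaze_deg: float, target_deg: float) -> tuple:
    """Return (mode, new_gaze) for one control step along one axis."""
    error = target_deg - gaze_deg
    if abs(error) > SACCADE_THRESHOLD_DEG:
        # Ballistic saccade: jump (approximately) onto the new target.
        return "saccade", target_deg
    # Smooth pursuit: close a fraction of the remaining error.
    return "pursuit", gaze_deg + PURSUIT_GAIN * error

mode, gaze = eye_step(0.0, 10.0)    # far target: saccade
mode2, gaze2 = eye_step(gaze, 10.5) # small drift: pursuit
```

Vergence, which the slide shows cooperating with pursuit, would run the same loop on the angle between the two eyes rather than on a shared gaze direction.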

Page 12: Human – Robot Communication

Engineering gaze

Kismet

Page 13: Human – Robot Communication

Collaborative effort

Cynthia Breazeal
Brian Scassellati
And others
Will describe components I’m responsible for

Page 14: Human – Robot Communication

Engineering gaze

Page 15: Human – Robot Communication

Engineering gaze

“Cyclopean” camera
Stereo pair

Page 16: Human – Robot Communication

Tip-toeing around 3D

Wide view camera
Narrow view camera
Object of interest
Field of view
Rotate camera
New field of view
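The rotate-to-recenter trick can be sketched in a few lines: rather than recovering the object's 3D position, rotate the camera until the object of interest falls in the narrow camera's field of view. The field of view, image width, and linear mapping below are assumed values, not the actual calibration:

```python
WIDE_FOV_DEG = 100.0    # assumed horizontal field of view of the wide camera
IMAGE_WIDTH_PX = 320    # assumed image width

def pan_to_center(object_x_px: float) -> float:
    """Camera pan (degrees) that re-centers an object seen at object_x_px."""
    # Offset from image center, as a fraction of the half-width.
    offset = (object_x_px - IMAGE_WIDTH_PX / 2) / (IMAGE_WIDTH_PX / 2)
    # Map linearly onto half the field of view (a small-angle approximation;
    # a real system would use the lens model or just iterate visually).
    return offset * (WIDE_FOV_DEG / 2)
```

An object at the right image edge thus calls for roughly a half-FOV pan to the right, after which the narrow camera can fixate it without any explicit depth estimate.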

Page 17: Human – Robot Communication

Example

Page 18: Human – Robot Communication

Influences on attention

Page 19: Human – Robot Communication

Influences on attention

Built-in biases

Page 20: Human – Robot Communication

Influences on attention

Built-in biases
Behavioral state

Page 21: Human – Robot Communication

Influences on attention

Built-in biases
Behavioral state
Persistence

slipped … recovered
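The three influences can be folded into a single scoring function: built-in feature biases, behavioral-state weighting, and a persistence bonus on the current target. The feature names, weights, and bonus below are invented for illustration:

```python
def attention_score(features: dict, state_weights: dict,
                    is_current_target: bool,
                    persistence_bonus: float = 0.2) -> float:
    """Score one candidate target from its per-feature saliencies."""
    # Behavioral state sets the weights (built-in biases would be defaults).
    score = sum(state_weights.get(name, 0.0) * value
                for name, value in features.items())
    if is_current_target:
        # Persistence: hysteresis that resists flitting between targets,
        # allowing recovery after a brief "slip" of tracking.
        score += persistence_bonus
    return score

# A "seek-face" behavioral state up-weights skin color over raw motion.
seek_face = {"skin": 0.7, "motion": 0.2, "color": 0.1}
toy  = attention_score({"skin": 0.1, "motion": 0.9, "color": 0.8}, seek_face, False)
face = attention_score({"skin": 0.9, "motion": 0.2, "color": 0.1}, seek_face, True)
```

Under this state the face outscores the colorful moving toy, which is the behavior the slides describe: what wins attention depends on what the robot is currently trying to do.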

Page 22: Human – Robot Communication

Directing attention

Page 23: Human – Robot Communication

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Head pose estimation

Page 24: Human – Robot Communication

Head pose estimation (rigid)

* Nomenclature varies

Yaw*, Pitch*, Roll*, Translation in X, Y, Z

Page 25: Human – Robot Communication

Head pose literature

Horprasert, Yacoob, Davis ’97

McKenna, Gong ’98

Wang, Brandstein ’98

Basu, Essa, Pentland ’96

Harville, Darrell, et al ’99

Page 26: Human – Robot Communication

Head pose: Anthropometrics

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 27: Human – Robot Communication

Head pose: Eigenpose

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 28: Human – Robot Communication

Head pose: Contours

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 29: Human – Robot Communication

Head pose: mesh model

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 30: Human – Robot Communication

Head pose: Integration

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 31: Human – Robot Communication

My approach

Integrate changes in pose (after Harville et al)
Use mesh model (after Basu et al)
Need automatic initialization
– Head detection, tracking, segmentation
– Reference orientation
– Head shape parameters
Initialization drives design
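The integration idea (after Harville et al) might look like the following sketch: the tracker measures frame-to-frame pose changes and accumulates them from a known reference, here taken to be the moment of mutual gaze, treated as zero rotation. The additive small-angle composition and the names are simplifications for illustration, not the actual tracker:

```python
def integrate_pose(deltas):
    """Accumulate per-frame (dyaw, dpitch, droll) changes, in degrees,
    starting from the reference orientation (mutual gaze = all zeros).

    Adding Euler deltas is only valid for small per-frame rotations; a
    real implementation would compose rotation matrices or quaternions.
    """
    yaw = pitch = roll = 0.0
    trajectory = []
    for dyaw, dpitch, droll in deltas:
        yaw += dyaw
        pitch += dpitch
        roll += droll
        trajectory.append((yaw, pitch, roll))
    return trajectory

# Two frames of measured change from the mutual-gaze reference:
path = integrate_pose([(2.0, 0.0, 0.0), (3.0, -1.0, 0.0)])
```

The known cost of integration is drift: small per-frame errors accumulate until the next mutual-gaze event re-zeros the reference, which is why the automatic initialization above matters so much.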

Page 32: Human – Robot Communication

Head tracking, segmentation

Segment by color histogram, grouped motion
Match against ellipse model (M. Pilu et al)
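The color-histogram step can be sketched roughly as histogram backprojection: score pixels against a histogram built from known skin samples and keep those above a threshold. The bin count and threshold are invented, and the motion grouping and ellipse fit (M. Pilu et al) are omitted:

```python
def build_histogram(samples, bins=8):
    """Normalized histogram over quantized (r, g, b) skin samples."""
    counts = {}
    for r, g, b in samples:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def backproject(pixels, hist, bins=8, threshold=0.1):
    """Boolean mask: True where a pixel's color bin is likely skin."""
    mask = []
    for r, g, b in pixels:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        mask.append(hist.get(key, 0.0) >= threshold)
    return mask

hist = build_histogram([(200, 150, 120), (205, 155, 125)])  # skin samples
mask = backproject([(202, 152, 122), (10, 200, 30)], hist)  # skin vs. green
```

Coarse bins give some tolerance to lighting; the grouped-motion cue then separates the head from other skin-colored regions before the ellipse match.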

Page 33: Human – Robot Communication

Mutual gaze as reference point

Page 34: Human – Robot Communication

Mutual gaze as reference point

Page 35: Human – Robot Communication

Tracking pose changes

Choose coordinates to suit tracking
4 of 6 degrees of freedom measurable from monocular image
Independent of shape parameters
X translation, Y translation, translation in depth, in-plane rotation
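One way to see why these four degrees of freedom are monocularly measurable: between two frames, tracked image features determine a 2D similarity transform, whose scale corresponds to translation in depth, whose angle is the in-plane rotation, and whose shift is the X/Y translation. A two-point illustration (not the actual estimator, which would fit many features in least squares):

```python
import math

def similarity_from_two_points(p0, p1, q0, q1):
    """Scale, rotation (deg), translation taking segment (p0, p1) to (q0, q1).

    Treats points as complex numbers: q = z * p + t, with z encoding
    scale and rotation. Assumes p0 != p1.
    """
    (x0, y0), (x1, y1) = p0, p1
    (u0, v0), (u1, v1) = q0, q1
    pz = complex(x1 - x0, y1 - y0)
    qz = complex(u1 - u0, v1 - v0)
    z = qz / pz
    scale = abs(z)                                    # depth change -> scale
    angle = math.degrees(math.atan2(z.imag, z.real))  # in-plane rotation
    t = complex(u0, v0) - z * complex(x0, y0)         # X/Y translation
    return scale, angle, (t.real, t.imag)

scale, angle, shift = similarity_from_two_points((0, 0), (1, 0), (2, 2), (2, 3))
```

The remaining two degrees of freedom (out-of-plane rotation) do not show up in this transform, which is exactly why the next slide handles them separately as surface coordinates.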

Page 36: Human – Robot Communication

Remaining coordinates

2 degrees of freedom remaining
Choose as surface coordinate on head
Specify where image plane is tangent to head
Isolates effect of errors in parameters
Tangent region shifts when head rotates in depth

Page 37: Human – Robot Communication

Surface coordinates

Establish surface coordinate system with mesh

Page 38: Human – Robot Communication

Initializing a surface mesh

Page 39: Human – Robot Communication

Example

Page 40: Human – Robot Communication

Typical results

Ground truth due to Sclaroff et al.

Page 41: Human – Robot Communication

Merits

No need for any manual initialization
Capable of running for long periods
Tracking accuracy is insensitive to model
User independent
Real-time

Page 42: Human – Robot Communication

Problems

Greater accuracy possible with manual initialization

Deals poorly with certain classes of head movement (e.g. 360° rotation)

Can’t initialize without occasional mutual regard

Page 43: Human – Robot Communication

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Page 44: Human – Robot Communication

Other protocols

Page 45: Human – Robot Communication

Other protocols

Protocol for negotiating interpersonal distance

Comfortable interaction distance

Too close – withdrawal response

Too far – calling behavior

Person draws closer

Person backs off

Beyond sensor range
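The distance-negotiation protocol on the slide reads naturally as a small state machine: inside a comfortable band the robot interacts, too close triggers a withdrawal response (the person backs off), too far triggers calling behavior (the person draws closer). The band limits below are invented placeholders, not tuned values:

```python
TOO_CLOSE_M = 0.5     # assumed inner limit of comfortable distance
TOO_FAR_M = 2.0       # assumed outer limit of comfortable distance
SENSOR_RANGE_M = 4.0  # assumed sensing horizon

def distance_response(distance_m: float) -> str:
    """Map a sensed person-distance to the robot's social response."""
    if distance_m > SENSOR_RANGE_M:
        return "idle"        # beyond sensor range: no one to negotiate with
    if distance_m < TOO_CLOSE_M:
        return "withdraw"    # withdrawal response, cueing the person to back off
    if distance_m > TOO_FAR_M:
        return "call"        # calling behavior, drawing the person closer
    return "interact"        # comfortable interaction distance

responses = [distance_response(d) for d in (0.3, 1.0, 3.0, 5.0)]
```

The protocol works because the responses are human-readable: the person adjusts their own distance in reaction to behavior they intuitively understand, without any explicit instruction.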

Page 46: Human – Robot Communication

Other protocols

Protocol for negotiating interpersonal distance

Protocol for controlling the presentation of objects

Comfortable interaction speed

Too fast – irritation response

Too fast, too close – threat response

Page 47: Human – Robot Communication

Protocol for negotiating interpersonal distance

Protocol for controlling the presentation of objects

Protocol for conversational turn-taking

Other protocols

Protocol for introducing vocabulary
Protocol for communicating processes

Protocols make good modules

Page 48: Human – Robot Communication

[Slide 48 diagram: system architecture. Speech recognition runs on Linux; speech synthesis and affect recognition on NT; face control, emotion, percept & motor, and drives & behavior on QNX. Vision and control modules (tracker, attention system, distance-to-target, motion filter, eye finder, skin filter, color filter, motor control) communicate over CORBA, sockets, audio speech comms, and dual-port RAM. Hardware: cameras, microphone, speakers, eye/neck/jaw motors, and ear/eyebrow/eyelid/lip motors. Pose pipeline: track head, recognize pose, track pose.]

Page 49: Human – Robot Communication

[Slide 49 diagram: gaze control architecture. Wide, left foveal, and right foveal cameras feed frame grabbers; skin, color, motion, and face detectors feed the attention system and wide tracker, with weights set by behaviors and motivations; an eye finder supplies distance to the tracked and salient targets. Foveal disparity, smooth pursuit & vergence with neck compensation, VOR, saccade with neck compensation, fixed action patterns, and affective postural shifts with gaze compensation feed an arbitor driving eye-head-neck control and the eye-neck motors via a motion control daemon.]

Page 50: Human – Robot Communication

Other protocols

What about robot – robot protocol?
Basically computer – computer
But physical states may be hard to model
Borrow human – robot protocol for these

Page 51: Human – Robot Communication

Current, future work

Protocols for reference
– Know how to point to an object
– How to point to an attribute?
– Or an action?
Until a better answer comes along:
– Communicate task/game that depends on attribute/action
– Pull out number of classes, positive and negative examples for supervised learning
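The proposed workaround can be sketched as a small extraction step: run a task or game whose outcomes depend on the attribute or action, then pull out the positive and negative examples it generates for a supervised learner. The game-log format and helper below are hypothetical:

```python
def extract_training_set(game_log):
    """Split (observation, outcome) rounds into positives and negatives.

    game_log is a hypothetical list of (observation, bool) pairs: each
    round of the game implicitly labels its observation.
    """
    positives = [obs for obs, outcome in game_log if outcome]
    negatives = [obs for obs, outcome in game_log if not outcome]
    return positives, negatives

# e.g. a sorting game that implicitly teaches the attribute "red":
log = [("red ball", True), ("green cube", False), ("red cube", True)]
pos, neg = extract_training_set(log)
```

The attraction of the scheme is that the human never has to point at "redness" directly; the structure of the game supplies the labels that a pointing gesture cannot.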

Page 52: Human – Robot Communication

FIN