Paul Fitzpatrick: Human – Robot Communication



TRANSCRIPT

Page 1:

Paul Fitzpatrick

Human – Robot Communication

Page 2:

Human – Robot Communication

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Page 3:

Motivation

What is communication for?
– Transferring information
– Coordinating behavior

What is it built from?
– Commonality
– Perception of action
– Protocols

Page 4:

Communication protocols

Computer – computer protocols

TCP/IP, HTTP, FTP, SMTP, …

Page 5:

Communication protocols

Human – human protocols: initiating conversation, turn-taking, interrupting, directing attention, …

Human – computer protocols: shell interaction, drag-and-drop, dialog boxes, …

Page 6:

Communication protocols

Human – human protocols: initiating conversation, turn-taking, interrupting, directing attention, …

Human – computer protocols: shell interaction, drag-and-drop, dialog boxes, …

Human – robot protocols

Page 7:

Requirements on robot

Human-oriented perception
– Person detection, tracking
– Pose estimation
– Identity recognition
– Expression classification
– Speech/prosody recognition
– Objects of human interest

Human-readable action
– Clear locus of attention
– Express engagement
– Express confusion, surprise
– Speech/prosody generation

[Figure: example robot state readout – ENGAGED, ACQUIRED; pointing (53, 92, 12); fixating (47, 98, 37); saying "/o'ver[200] \/there[325]"]
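To make the human-readable side concrete, the kind of state bundle hinted at in the readout above could look like the sketch below. The field names and types are mine, purely illustrative, and not the robot's actual interface.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ReadableState:
    """Illustrative bundle of 'human-readable' signals: engagement,
    target acquisition, where the robot is pointing and fixating,
    and what it is saying.  Field names are hypothetical."""
    engaged: bool
    acquired: bool
    pointing_at: Tuple[float, float, float]   # e.g. (53, 92, 12)
    fixating_at: Tuple[float, float, float]   # e.g. (47, 98, 37)
    utterance: str                            # e.g. "/o'ver[200] \/there[325]"
```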

Page 8:

Example: attention protocol

Expressing attention
Influencing the other's attention
Reading the other's attention

Page 9:

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Foveate gaze

Page 10:

Human gaze reflects attention

(Taken from C. Graham, “Vision and Visual Perception”)

Page 11:

Types of eye movement

[Figure: left eye and right eye with vergence angle; ballistic saccade to new target; smooth pursuit and vergence co-operate to track object]

(Based on Kandel & Schwartz, “Principles of Neural Science”)
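As a quick geometric aside (not from the slides): for a target straight ahead, the vergence angle follows directly from the interocular baseline and the target distance.

```python
import math

def vergence_angle(baseline_m, target_distance_m):
    """Angle (radians) between the two eyes' lines of sight when both
    fixate a target straight ahead at the given distance."""
    return 2.0 * math.atan2(baseline_m / 2.0, target_distance_m)

# Eyes about 6 cm apart fixating a face 50 cm away -> roughly 6.9 degrees.
print(math.degrees(vergence_angle(0.06, 0.5)))
```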

Page 12:

Engineering gaze

Kismet

Page 13:

Collaborative effort

Cynthia Breazeal
Brian Scassellati
And others
Will describe components I'm responsible for

Page 14:

Engineering gaze

Page 15:

Engineering gaze

[Figure: "cyclopean" camera plus stereo pair]

Page 16:

Tip-toeing around 3D

[Figure: wide-view camera and narrow-view camera; an object of interest lies within the wide field of view but outside the narrow one; rotating the camera brings it into the new field of view]
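One way to read the figure: a target found in the wide view can be brought into the narrow view by rotating the camera through the angle its pixel offset subtends. A minimal pinhole-model sketch; the camera parameters are hypothetical.

```python
import math

def saccade_angles(u, v, cx, cy, focal_px):
    """Pan/tilt rotation (radians) that would bring pixel (u, v) of the
    wide-view image to the image centre, so the narrow-view camera ends
    up pointed at the object of interest."""
    pan = math.atan2(u - cx, focal_px)    # positive = rotate right
    tilt = math.atan2(cy - v, focal_px)   # positive = rotate up
    return pan, tilt

# Hypothetical 320x240 wide camera with ~270 px focal length.
pan, tilt = saccade_angles(u=250, v=80, cx=160, cy=120, focal_px=270)
```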

Page 17:

Example

Page 18:

Influences on attention

Page 19:

Influences on attention

Built-in biases

Page 20:

Influences on attention

Built-in biases
Behavioral state

Page 21:

Influences on attention

Built-in biases
Behavioral state
Persistence

[Video: tracking slipped … recovered]
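The three influences can be read as terms in a weighted saliency computation. The sketch below is only loosely in the spirit of the attention system shown later: built-in biases and behavioral state become per-feature gains, and persistence/habituation becomes a map subtracted from the result. All names and numbers are illustrative.

```python
import numpy as np

def attention_map(feature_maps, gains, habituation):
    """Combine low-level feature maps (e.g. skin, color, motion) with
    behavior-dependent gains, then subtract a habituation map so a
    recently attended region does not capture attention forever."""
    saliency = np.zeros_like(habituation)
    for name, fmap in feature_maps.items():
        saliency += gains.get(name, 1.0) * fmap
    return saliency - habituation

# Hypothetical behavioral state: "seeking people" boosts the skin map.
gains_seek_people = {"skin": 2.0, "color": 0.5, "motion": 1.0}
```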

Page 22:

Directing attention

Page 23:

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Head pose estimation

Page 24:

Head pose estimation (rigid)

Yaw*, pitch*, roll*, translation in X, Y, Z
(* nomenclature varies)
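For reference, the six rigid degrees of freedom compose into a single 4x4 transform. A small sketch under one common angle convention; as the slide notes, nomenclature varies.

```python
import numpy as np

def head_pose_matrix(yaw, pitch, roll, tx, ty, tz):
    """4x4 rigid transform from yaw, pitch, roll (radians) and translation
    in X, Y, Z.  Rotation order here is roll * pitch * yaw; other
    conventions are equally valid."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])  # yaw about Y
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch about X
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll about Z
    T = np.eye(4)
    T[:3, :3] = Rz @ Rx @ Ry
    T[:3, 3] = [tx, ty, tz]
    return T
```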

Page 25:

Head pose literature

Horprasert, Yacoob, Davis ’97

McKenna, Gong ’98

Wang, Brandstein ’98

Basu, Essa, Pentland ’96

Harville, Darrell, et al ’99

Page 26:

Head pose: Anthropometrics

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 27:

Head pose: Eigenpose

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 28:

Head pose: Contours

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 29:

Head pose: Mesh model

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 30:

Head pose: Integration

Horprasert, Yacoob, Davis

McKenna, Gong

Wang, Brandstein

Basu, Essa, Pentland

Harville, Darrell, et al

Page 31:

My approach

Integrate changes in pose (after Harville et al.)
Use mesh model (after Basu et al.)
Need automatic initialization
– Head detection, tracking, segmentation
– Reference orientation
– Head shape parameters
Initialization drives design
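The core of the "integrate changes in pose" idea can be sketched as composing frame-to-frame transforms onto a reference pose taken at a moment of mutual gaze. This is only my schematic reading of the approach, not the actual implementation.

```python
import numpy as np

def integrate_pose(reference_pose, incremental_poses):
    """Accumulate per-frame pose changes (4x4 rigid transforms) on top of
    a reference pose.  The reference is acquired automatically, e.g. when
    the person looks straight at the robot (mutual gaze); small per-frame
    errors accumulate, which is why re-initialization matters."""
    pose = reference_pose.copy()
    for delta in incremental_poses:
        pose = delta @ pose
    return pose
```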

Page 32:

Head tracking, segmentation

Segment by color histogram and grouped motion
Match against ellipse model (M. Pilu et al.)
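A rough stand-in for the segmentation-to-ellipse step, fitting an ellipse to a binary head mask by image moments; the actual system matches an ellipse model after M. Pilu et al., so treat this purely as a sketch.

```python
import numpy as np

def ellipse_from_mask(mask):
    """Fit an ellipse (centre, axes, orientation) to a non-empty binary
    head mask using second-order image moments."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    cov = np.cov(np.stack([xs - cx, ys - cy]))
    evals, evecs = np.linalg.eigh(cov)            # ascending eigenvalues
    major, minor = 2.0 * np.sqrt(evals[::-1])     # ~1-sigma axis lengths
    angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # orientation of major axis
    return (cx, cy), (major, minor), angle
```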

Page 33:

Mutual gaze as reference point

Page 34:

Mutual gaze as reference point

Page 35:

Tracking pose changes

Choose coordinates to suit tracking
4 of 6 degrees of freedom measurable from monocular image, independent of shape parameters:
X translation, Y translation, translation in depth, in-plane rotation
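Those four image-measurable degrees of freedom amount to a 2-D similarity transform between frames: image translation, scale change (standing in for translation in depth), and in-plane rotation. A least-squares sketch over matched point sets; this is my own illustration, not the thesis code.

```python
import numpy as np

def similarity_from_points(prev_pts, curr_pts):
    """Least-squares 2-D similarity between matched N x 2 point sets:
    returns centroid translation, scale change and in-plane rotation."""
    p0 = prev_pts - prev_pts.mean(axis=0)
    p1 = curr_pts - curr_pts.mean(axis=0)
    # Complex-number form of the 2-D Procrustes problem.
    z0 = p0[:, 0] + 1j * p0[:, 1]
    z1 = p1[:, 0] + 1j * p1[:, 1]
    a = np.vdot(z0, z1) / np.vdot(z0, z0)   # scale * exp(i * rotation)
    translation = curr_pts.mean(axis=0) - prev_pts.mean(axis=0)
    return translation, np.abs(a), np.angle(a)
```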

Page 36:

Remaining coordinates

2 degrees of freedom remaining
Choose as surface coordinates on the head
Specify where the image plane is tangent to the head
Isolates effect of errors in parameters
Tangent region shifts when head rotates in depth

Page 37:

Surface coordinates

Establish surface coordinate system with mesh

Page 38:

Initializing a surface mesh

Page 39:

Example

Page 40:

Typical results

Ground truth due to Sclaroff et al.

Page 41:

Merits

No need for any manual initialization
Capable of running for long periods
Tracking accuracy is insensitive to model
User independent
Real-time

Page 42:

Problems

Greater accuracy possible with manual initialization

Deals poorly with certain classes of head movement (e.g. 360° rotation)

Can’t initialize without occasional mutual regard

Page 43:

Motivation for communication
Human-readable actions
Reading human actions
Conclusions

Page 44:

Other protocols

Page 45:

Other protocols

Protocol for negotiating interpersonal distance

[Figure: comfortable interaction distance; too close – withdrawal response; too far – calling behavior; person draws closer; person backs off; beyond sensor range]
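Read as a control policy, the distance protocol is a small state machine over sensed distance. A toy version with made-up thresholds:

```python
from typing import Optional

def distance_response(distance_m: Optional[float]) -> str:
    """Map sensed person distance to the robot's response; thresholds
    are invented for illustration."""
    if distance_m is None:
        return "search"      # beyond sensor range
    if distance_m < 0.4:
        return "withdraw"    # too close -> withdrawal response
    if distance_m > 2.0:
        return "call"        # too far -> calling behavior
    return "engage"          # comfortable interaction distance
```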

Page 46:

Other protocols

Protocol for negotiating interpersonal distance
Protocol for controlling the presentation of objects

[Figure: comfortable interaction speed; too fast – irritation response; too fast and too close – threat response]

Page 47:

Other protocols

Protocol for negotiating interpersonal distance
Protocol for controlling the presentation of objects
Protocol for conversational turn-taking
Protocol for introducing vocabulary
Protocol for communicating processes

Protocols make good modules

Page 48:

[Figure: system architecture across several machines. Speech recognition runs under Linux and speech synthesis plus affect recognition under NT; the tracker, attention system, distance-to-target estimate, motion filter, eye finder, skin filter, color filter, and motor control run under QNX; the face control, emotion, percept & motor, and drives & behavior modules run on the processor labelled "L". The parts are connected by CORBA, sockets, audio/speech links, and dual-port RAM, and interface to cameras, a microphone, speakers, eye/neck/jaw motors, and ear/eyebrow/eyelid/lip motors. Head tracking, pose recognition, and pose tracking sit within this framework.]

Page 49:

[Figure: visual attention and gaze control architecture. Wide, left foveal, and right foveal cameras feed frame grabbers; skin, color, motion, and face detectors are combined by weights in the attention stage, together with behaviors and motivations; a wide tracker and eye finder supply the salient target, tracked target, and distance to target. Foveal disparity, smooth pursuit and vergence with neck compensation, VOR, saccades with neck compensation, fixed action patterns, and affective postural shifts with gaze compensation feed an arbiter for eye-head-neck control, which drives the eye-neck motors through a motion control daemon.]
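The arbitration among eye-movement types in that diagram could be caricatured as a priority rule over retinal error and head motion. This is a deliberately crude sketch with invented thresholds and inputs, not the actual arbiter.

```python
def select_eye_movement(target_visible, target_offset_deg, head_moving):
    """Pick an eye-movement mode: large retinal error -> ballistic saccade,
    small error -> smooth pursuit (with vergence), head motion -> VOR-style
    compensation, no target -> fall back to a fixed action pattern."""
    if not target_visible:
        return "fixed-action-pattern"
    if abs(target_offset_deg) > 5.0:
        return "saccade"
    if head_moving:
        return "vor"
    return "smooth-pursuit"
```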

Page 50:

Other protocols

What about robot – robot protocols?
Basically computer – computer
But physical states may be hard to model
Borrow human – robot protocol for these

Page 51:

Current, future work

Protocols for reference
– Know how to point to an object
– How to point to an attribute?
– Or an action?

Until a better answer comes along:
– Communicate task/game that depends on attribute/action
– Pull out number of classes, positive and negative examples for supervised learning
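The last bullet, pulling classes and positive/negative examples out of a shared task or game, can be sketched as a simple bookkeeping step over logged interaction episodes; the episode format here is assumed, not taken from the talk.

```python
from collections import defaultdict

def dataset_from_game(episodes):
    """Turn logged (observation, label, is_positive) episodes into a
    supervised-learning dataset: number of classes plus positive and
    negative examples per class.  The episode format is hypothetical."""
    by_class = defaultdict(lambda: {"positive": [], "negative": []})
    for observation, label, is_positive in episodes:
        bucket = "positive" if is_positive else "negative"
        by_class[label][bucket].append(observation)
    return len(by_class), dict(by_class)
```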

Page 52:

FIN