Artificial Intelligence Advanced Visual Interaction Understanding · 2016-09-28
Artificial Intelligence Advanced Visual Interaction Understanding Human Motions and Emotions using Visual Information Atsushi Nakazawa IST, Kyoto University


Page 1:

Artificial Intelligence Advanced Visual Interaction

Understanding Human Motions and Emotions using Visual Information

Atsushi Nakazawa

IST, Kyoto University

Page 2:

KYOTO UNIVERSITY

• 12/2 (today) – Understanding human emotions using visual information

• 12/9 (next week) – Understanding human motions (behaviors) using visual information

Page 3:

• Affective Computing – treating human emotions in computer systems
• Rosalind Picard – TED talk

• Applications of Affective Computing

• How to quantify human emotions?

• Emotion model: primary and secondary emotions

• How to measure human emotions?
• Quantifying facial actions: FACS (Facial Action Coding System)

• Relations between facial expressions and emotions

• Physiological signals and parasympathetic nervous system

• Measurement of human internal states from physiological signals

• Human gaze estimation
• Gaze and its measurement techniques

• Corneal imaging and its applications

• Estimate human internal states from eye images – pupillary response

• Summary

Page 4:

Human Machine Interface (HMI) that understands / expresses emotional signals

Recent trend
• Machines that simply follow commands have been realized by recent AI algorithms (such as deep learning).

e.g. web search, natural language processing (NLP), face detection and identification, character and image recognition

• On the contrary, humans do not change so much!
• Computer allergy
• Humans cannot specify the task:

“I want clothes that look good on me!” “I want to drive somewhere nice.”

• Humans cannot control their emotions: “My car navigation system always shows the wrong routes!”

• Can we give emotions to machines?

Page 5:

Page 6:

• Proposed by Dr. Rosalind Picard (MIT Media Lab.), 1995

• ”Affective Computing”

Discussed why computer systems need emotional sensing and processing.

• Applications of Affective Computing

• Elements of Affective Computing

• Concerns about Affective Computing

TED Talk video

Page 7:

• Generally, emotions are assumed to be more negative than logic.
• People usually think, “Emotions disturb one’s decisions.”

• Observations of a patient with frontal lobe damage (Antonio Damasio)
• He has a normal IQ and cognitive abilities, but does not have emotions (appears to be like Star Trek’s Mr. Spock).

• In reality, he always makes disastrous decisions.
• E.g., typical people stop investing after large losses; however, he continues investing until becoming bankrupt.

• He has similar problems in social interactions.
• He seems to be unable to learn the links between dangerous choices and bad feelings, so he repeats dangerous decisions.

• He will disappear into an endless rational search for anything, even for a simple task such as scheduling an appointment.
• “Well, this time might be good.”
• “Maybe I will have to be on that side of town, so this time would be better…”

Page 8:

• Emotion is vital for human decision making.
• Damasio has hypothesized that Elliot’s brain is missing “somatic markers” that associate positive or negative feelings with certain decisions. These feelings would help limit a mental search by nudging the person away from considering the possibilities with bad associations.

Similar logic applies to future computer (machine) systems:
• Computer systems that can make emotional decisions, beyond logic.

Page 9:

Human–Machine relation
• Physiological sensing
• Pattern analysis
• Sensors
• Emotional expression using machine systems

Human–Human relation
• Affective communication
• Affective wearable computers

Fundamentals
• Modeling and understanding emotions

[Diagram: Affective Computing applications connect physiological sensing, pattern analysis, emotional expression using machine systems, understanding and modeling of emotions, interaction with machine systems, and affective communications.]

Page 10:

MAUI: a Multimodal Affective User Interface

Page 11:

• What is emotion for machines?

Page 12:

• 135 / 379 users perceive a gender.

• 87 / 379 users give names.
• Most commonly, the robots were given a human name such as Sarah, Alex, Joe, or Veronica. Other names were wordplays on the word “Roomba”: Roomie, Roomby, and Ruby.

• 44 / 379 users perceive a personality.

• 42 / 379 users talk to it.

• 43 / 379 users dress it up.

http://www.nbcnews.com/id/21102202/ns/technology_and_science-tech_and_gadgets/t/roombas-fill-emotional-vacuum-owners/#.Vdqjna2hMxA

Housewives or Technophiles?: Understanding Domestic Robot Owners

Page 13:

MAUI: a Multimodal Affective User Interface

Page 14:

• Many efforts have been made to model emotions.

Page 15:

• Facial Action Coding System (FACS) – Ekman
• Facial expressions are decomposed into movements of facial parts (in FACS, 68 parts in 10 groups), called Action Units (AUs).

• The six basic emotions (anger, disgust, fear, joy, sadness, surprise) are expressed as combinations of AUs.
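As a sketch of how AU combinations map to basic emotions, the snippet below encodes commonly cited EMFACS-style prototype AU sets. Exact prototypes vary between sources, so treat the lists as illustrative rather than the definitive FACS coding.

```python
# Illustrative AU prototypes for the six basic emotions.
# NOTE: commonly cited approximations, not the authoritative FACS coding.
BASIC_EMOTION_AUS = {
    "joy":      {6, 12},                   # cheek raiser + lip corner puller
    "sadness":  {1, 4, 15},                # inner brow raiser, brow lowerer, lip corner depressor
    "surprise": {1, 2, 5, 26},             # brow raisers, upper lid raiser, jaw drop
    "fear":     {1, 2, 4, 5, 7, 20, 26},
    "anger":    {4, 5, 7, 23},
    "disgust":  {9, 15},                   # nose wrinkler + lip corner depressor
}

def classify_emotion(active_aus):
    """Return all emotions whose prototype AU set is fully active."""
    active = set(active_aus)
    return [name for name, aus in BASIC_EMOTION_AUS.items() if aus <= active]

print(classify_emotion({6, 12}))     # ['joy']
print(classify_emotion({1, 4, 15}))  # ['sadness']
```

A real AU-based recognizer would score partial matches and AU intensities rather than require an exact subset, but the subset test captures the idea that an emotion is a combination of AUs.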

Page 16:

Sadness:

• close_t_l_eyelid (FAP 19), close_t_r_eyelid (FAP 20), close_b_l_eyelid (FAP 21), close_b_r_eyelid (FAP 22), raise_l_i_eyebrow (FAP 31), raise_r_i_eyebrow (FAP 32), raise_l_m_eyebrow (FAP 33), raise_r_m_eyebrow (FAP 34), raise_l_o_eyebrow (FAP 35), raise_r_o_eyebrow (FAP 36)

Page 17:

• Track facial parts → estimate emotions from the movement of the facial parts

Page 18:

• Waseda University (Takanishi Lab.)

Page 19:

• RaFD (Radboud Faces Database) expression dataset

https://www.behance.net/gallery/10675283/Facial-Expression-Public-Databases

Coding Facial Expressions with Gabor Wavelets – Lyons M., Akamatsu S., Kamachi M., Gyoba J., FG’98

Page 20:

• Do we show such obvious facial expressions in real life?

Page 21:

• Primary and secondary emotion theory (Damasio)
• Primary emotion: direct emotional output to an input signal, such as feeling pain.

• Secondary emotion: a higher level of emotion.

• Primary emotion is related to the response of the amygdala, whereas secondary emotion is the response of the ventromedial prefrontal cortex (VMF).

Page 22:

Page 23:

• They construct a layered model (multi-level Dynamic Bayesian Networks) of mental states and display levels.

Page 24:

Page 25:

• See also http://www.affdex.com/

Page 26:

Page 27:

Page 28:

Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected In-the-Wild

Page 29:

[Roadmap diagram repeated from Page 9: physiological sensing, pattern analysis, emotional expression using machine systems, understanding and modeling of emotions, affective computing applications, interaction with machine systems, affective communications.]

Page 30:

• Mobile EDA (electrodermal activity) sensors measure skin conductivity.

Page 31:

Page 32:

A wearable sensor that reads facial muscles to detect positive facial expressions [Suzuki2013].

Page 33:

• Classify the types of smiles using fEMG (facial EMG) and SCR (skin conductance response)
• fEMG is measured at three sites: the orbicularis oculi, depressor anguli oris, and zygomaticus major muscles.

Page 34:

August 4, 2014

Recall: 98.4%
Precision: 83.3%
F-measure: 90.2%
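The reported F-measure is the harmonic mean of the precision and recall above; a quick check:

```python
def f_measure(precision, recall):
    """F1 score: the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Values from the slide: precision 83.3%, recall 98.4%.
f1 = f_measure(0.833, 0.984)
print(round(f1 * 100, 1))  # → 90.2
```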

Page 35:

• Eye image processing

• Corneal imaging method (non-calibrated eye gaze tracking, peripheral vision estimation, scene–corneal reflection matching)

• Pupillary response

Page 36:

• Find where one is looking.

• Many applications
• Computer science / engineering

• Human behavior analysis

• Saliency estimation

• Driver’s view analysis

• Input devices, human interface design

• Other fields
• Psychology / physiology

• Medical / life sciences

• Marketing

San Agustin et al., 2009

Babcock et al., 2004

Current EGT systems have several problems of usability (calibration, headmount drift)

Page 37:

Georgia Institute of Technology

Page 38:

Where is the most ‘appealing’ point in a vending machine? http://bizmakoto.jp/makoto/articles/1403/26/news045.html

Page 39:

[Diagram: From the eye image, gaze (eye) direction is estimated w.r.t. the eye camera; point-of-gaze (PoG) estimation then maps it to a scene point using the relative pose among camera, eye, and gaze target. Problems: requires calibration, headmount drift, parallax error.]

Page 40:

Corneal reflections

Outside view, corneal shape, and limbus

Corneal imaging [Nishino and Nayar 2006]

Page 41:

• Light arriving at the cornea:
• partially refracts and enters the eye

• partially reflects into the environment

We can observe the human view through corneal reflections.

• Similar optics to a catadioptric imaging system (camera plus mirror); the cornea plays the role of the mirror.

Page 42:

• Display-Eye Calibration using corneal reflections [ICCV2009]

• Solve fundamentals of corneal reflection analysis [J. Computer Vision and Applications 2011]

• Non-calibrated gaze estimation using corneal imaging [ECCV2012]

• Super-resolution scene reconstruction using corneal reflections [BMVC2012]

• Remote non-calibrated EGT system for infants and babies (with James Rehg’s group @ Georgia Tech.)

• Robust registration of scene and eye reflection using RANRESAC algorithm [MIRU2014]

• Corneal imaging camera [WeSAX2015]

Page 43:

• USB2 micro camera (1280 x 1024, 15 fps)
• Two visible LEDs (to identify the corneal center position)
• Tracking speed: 10 fps for 3D eye pose estimation and GRP estimation

Page 44:

Flow overview: eye image projection

• Uses weak-perspective projection.
• The 3D eye pose is described by five parameters (x, y, s, t, f).
• These five parameters are estimated using a particle filter.
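A minimal sketch of particle-filter estimation for a five-parameter pose. The toy likelihood and `true_pose` values are hypothetical stand-ins for the real image-matching score that evaluates how well a hypothesized eye pose explains the observed eye image.

```python
import random

def particle_filter(likelihood, n_particles=500, n_iters=30, noise=0.05):
    """Estimate a 5-dimensional pose by iterated importance resampling.
    `likelihood(p)` scores a 5-vector hypothesis (higher = better)."""
    # Initialize particles uniformly in a normalized parameter space.
    particles = [[random.random() for _ in range(5)] for _ in range(n_particles)]
    for _ in range(n_iters):
        weights = [likelihood(p) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Resample in proportion to the weights, then diffuse.
        particles = random.choices(particles, weights=weights, k=n_particles)
        particles = [[v + random.gauss(0.0, noise) for v in p] for p in particles]
    # Plain-mean estimate over the final particle cloud.
    return [sum(p[i] for p in particles) / n_particles for i in range(5)]

# Toy demonstration: a synthetic likelihood sharply peaked at a
# hypothetical "true" eye pose (x, y, s, t, f) in normalized coordinates.
random.seed(0)
true_pose = [0.3, 0.7, 0.5, 0.2, 0.9]
def toy_likelihood(p):
    return 1.0 / (1e-6 + sum((a - b) ** 2 for a, b in zip(p, true_pose)))

estimate = particle_filter(toy_likelihood)
```

In the actual system the likelihood would compare the limbus ellipse and iris appearance predicted by each particle against the eye image, rather than a synthetic distance.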

Page 45:

Geometric eye model: eyeball, corneal surface, and optical axis.

Problem: find the corneal surface point that reflects the light arriving from the human viewing direction, called the Gaze Reflection Point (GRP). With the 3D geometry model, the GRP links the point of gaze to the eye image.
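The GRP search can be illustrated with a brute-force sketch on a spherical cornea model. The sphere center, radius, incoming light direction, and camera position below are hypothetical; the real system solves the reflection geometry from the estimated 3D eye pose rather than by grid search.

```python
import math

def reflect(d, n):
    """Reflect direction d about unit normal n (law of reflection)."""
    dot = sum(a * b for a, b in zip(d, n))
    return [a - 2 * dot * b for a, b in zip(d, n)]

def find_grp(center, radius, incoming_dir, camera=(0.0, 0.0, 0.0), steps=120):
    """Grid-search the corneal sphere for the surface point where the
    incoming ray reflects toward the camera (the GRP)."""
    best, best_err = None, float("inf")
    for i in range(steps):
        theta = math.pi * i / (steps - 1)        # polar angle
        for j in range(steps):
            phi = 2.0 * math.pi * j / steps      # azimuth
            n = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            p = [c + radius * a for c, a in zip(center, n)]
            to_cam = [c2 - c1 for c1, c2 in zip(p, camera)]
            norm = math.sqrt(sum(v * v for v in to_cam))
            to_cam = [v / norm for v in to_cam]
            # Skip the back side of the cornea (not visible from the camera).
            if sum(a * b for a, b in zip(n, to_cam)) <= 0:
                continue
            r = reflect(incoming_dir, n)
            # Misalignment between the reflected ray and the camera ray.
            err = 1.0 - sum(a * b for a, b in zip(r, to_cam))
            if err < best_err:
                best, best_err = p, err
    return best

# Hypothetical setup: corneal sphere 10 units in front of the camera,
# light arriving along +z; the GRP is then the camera-facing pole.
grp = find_grp(center=(0.0, 0.0, 10.0), radius=1.0, incoming_dir=(0.0, 0.0, 1.0))
```

The published method replaces this grid search with a closed-form solution derived from the weak-perspective eye-pose estimate, but the reflection condition being satisfied is the same.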

Page 46:

Page 47:

Page 48:

Page 49:

Page 50:

Page 51:

Page 52:

• Designing the 3rd generation for practical use in marketing and health-care fields.

• Much higher resolution (full HD) and frequency (30 Hz).

• Smarter!

Page 53:

• We want to capture the eye gaze of infants in everyday scenes.

• Problems with existing methods:
– Require body-attached devices.

– Narrow field of movement.

– Restricted to indoor environments.

– Require calibration.

– System errors, including drift and parallax errors.

[Franchak2010] (NYU)

Page 54:

• Strong relation between ASD and gaze

• Develop a method to observe the gaze behavior of infants / children

• Requirements
• Easy measurement (without body-attached devices or a headmount)

• Without calibration

Page 55:

Page 56:

[System: camera array with a zoom and focus controller (video).]

Page 57:

• Point-of-gaze estimation during a card magic trick.

Page 58:

• Eye tracking alone cannot tell whether one is looking seriously or not.

• We want to estimate human concentration → pupillary response.

Normal vs. concentrating (pupil dilates)

Approach: observe pupil dilation.
• When one is concentrating, the pupil dilates.
• Build a pupil-dilation model with respect to the task difficulty.

Page 59:

Task

• Ask 10 users to play a path-tracking game (irairabou).

Experiment

• Observe pupil size using an eye-observation camera (with IR illumination).

[Game screen: start, goal, and cursor.]

Page 60:

Page 61:

[Graph: pupil size (pixels) over time (ms) while the pathway width changes between 10 px and 5 px. The pupil dilates when entering the narrow pathway and becomes smaller after going out to the wider pathway.]

Page 62:

a_i (random-effects analysis): a_i > 0 ** (p = 0.0013)

Statistically confirmed that the difficulty of the game (pathway width) is related to pupil dilation.

The model between ID (task difficulty) and pupil change ratio:

ID = 1 / PathWidth

PupilChangeRatio_i = -a_i * PathWidth + b_i (equivalently, increasing in ID, since ID = 1 / PathWidth)

[Plot: pupil dilation (ratio) vs. pathway width (pixels).]
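The per-subject model PupilChangeRatio = -a_i * PathWidth + b_i can be fitted by ordinary least squares; the sample data below are hypothetical, not the experiment's measurements.

```python
def fit_pupil_model(path_widths, change_ratios):
    """Ordinary least squares for ratio = -a * width + b; returns (a, b)."""
    n = len(path_widths)
    mx = sum(path_widths) / n
    my = sum(change_ratios) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(path_widths, change_ratios))
    sxx = sum((x - mx) ** 2 for x in path_widths)
    slope = sxy / sxx            # fitted slope equals -a
    a = -slope
    b = my - slope * mx
    return a, b

# Hypothetical data: narrower pathways give larger pupil change ratios.
widths = [5, 10, 15, 20]
ratios = [1.20, 1.15, 1.10, 1.05]
a, b = fit_pupil_model(widths, ratios)
print(a, b)  # a > 0, matching the a_i > 0 finding above
```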

Page 63:

Real-scene page views:
1,000 people/day · 3,000 people/day · 5,000 people/day · 10,000 people/day

• Collecting mass gaze data
• What is the most viewed object in a city?
• How many people looked at a particular advertisement?
• Where should we show critical information?

Page 64:

Memorability of an object depends on how the subject sees the object, the status of the subject, and personal differences (the latter two are assumed to be constant in our experiments).

Object in a scene → first-person-view (FPV) image.

Features retrieved from the FPV image:
• Frequency: how many times the object appeared in the FPV video (1st appearance, 2nd appearance, ...).
• Duration: how many seconds the object appeared in the FPV video (dwell time).
• Position: where the object appeared in the FPV image (central vs. peripheral part).

Scene memorability is related to how the subject sees the object.
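A sketch of computing the three features from per-frame object detections. The input format (a list of per-frame object centers, with None when the object is absent) is an assumption for illustration.

```python
def memorability_features(centers, fps=30.0, frame_size=(640, 480)):
    """centers[i] is the object's (x, y) center in frame i, or None.
    Returns (frequency, duration in seconds, central-position ratio)."""
    frequency = 0          # number of separate appearances
    duration_frames = 0    # total frames where the object is visible
    central_frames = 0     # frames where the object is near the image center
    w, h = frame_size
    prev_visible = False
    for c in centers:
        visible = c is not None
        if visible:
            duration_frames += 1
            if not prev_visible:
                frequency += 1   # a new appearance starts here
            x, y = c
            # "Central" if inside the middle half of the image.
            if w / 4 <= x <= 3 * w / 4 and h / 4 <= y <= 3 * h / 4:
                central_frames += 1
        prev_visible = visible
    duration = duration_frames / fps
    central_ratio = central_frames / duration_frames if duration_frames else 0.0
    return frequency, duration, central_ratio

# Demo: visible centrally for 1 s, absent for 1 s, then visible
# peripherally for 0.5 s (30 fps).
f, d, r = memorability_features([(320, 240)] * 30 + [None] * 30 + [(50, 50)] * 15)
print(f, d, r)  # 2 appearances, 1.5 s total, 2/3 of visible frames central
```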

Page 65:

• Looking frequently and longer causes higher memorability.

[Scatter plots for Subjects 1–4 and the total: duration [sec] vs. frequency [times]. Red: correctly remembered; blue: wrongly remembered.]

Page 66:

• A novel eye gaze tracking method using the corneal imaging technique.

• Calibration-free, no headmount drift

• Obtains the human peripheral view as well (not only the point of gaze)

• Applications for marketing and health-care purposes, as well as user interfaces

• Pupillary response indicates human concentration.

• Showed a model relating task difficulty to pupil dilation.

• Potentially usable as a new bio-sensor.