
You Are How You Walk: Uncooperative MoCap Gait Identification for Video Surveillance with Incomplete and Noisy Data

Michal Balazia · Petr Sojka
Masaryk University, Faculty of Informatics, Botanická 68a, 602 00 Brno, Czech Republic

https://gait.fi.muni.cz

Identification Pipeline

[Figure: the identification pipeline. A spotted walker is captured as MoCap data, a gait sample is detected, a gait template is extracted, and the walker is identified. Phase I: Acquiring MoCap data; Phase II: Detecting gait; Phase III: Extracting gait features; Phase IV: Identifying walkers.]

Phase I – Acquiring Motion Capture Data

Motion capture (MoCap) technology provides video clips of moving individuals containing an overall structure of the human body and estimated 3D joint coordinates.

MoCap data can be collected online by a readily available system of multiple cameras (Vicon) or by a depth camera (Microsoft Kinect).

To visualize MoCap data, a stick figure that represents the human skeleton can be recovered from joint spatial coordinates in time.
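Recovering the stick figure amounts to connecting joint positions with the bone segments of a skeleton. A minimal sketch follows; the joint names and bone list are hypothetical, since each MoCap rig (Vicon, Kinect) defines its own skeleton:

```python
# Hypothetical minimal skeleton: joint names and the bone segments
# (parent-child pairs) that form the stick figure.
JOINTS = ["hip", "spine", "head", "l_knee", "l_foot", "r_knee", "r_foot"]
BONES = [("hip", "spine"), ("spine", "head"),
         ("hip", "l_knee"), ("l_knee", "l_foot"),
         ("hip", "r_knee"), ("r_knee", "r_foot")]

def stick_figure(frame):
    """frame: dict joint -> (x, y, z). Returns line segments
    ((x1, y1, z1), (x2, y2, z2)) ready for a 3D line plot."""
    return [(frame[a], frame[b]) for a, b in BONES]

# Toy frame: joints laid out along the x axis for illustration.
frame = {j: (i * 0.1, 1.0, 0.0) for i, j in enumerate(JOINTS)}
segments = stick_figure(frame)
```

Drawing one such segment list per frame yields the animated stick figure.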

[Figure: a walking stick figure rendered at frames 0, 25, 50, 75, 100, 125, and 150 in (x, y, z) coordinates.]

Phase II – Detecting Gait Cycles

People spotted in our tracking space do not walk all the time; on the contrary, they perform various activities.

Identifying people from gait requires video segments where they are actually walking.

Clean gait cycles first need to be filtered out.

There are methods for detecting gait cycles directly, as well as action recognition methods that need a demonstrative example of a gait cycle to query general motion sequences.

[Figure: feet-distance signal over frames 0–450, with the detected gait cycles marked.]
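A minimal sketch of direct gait-cycle detection based on the feet-distance signal: the distance between the two feet reaches a local minimum each time the feet pass each other, and every second minimum closes one full gait cycle. The joint layout and `min_gap` value are illustrative assumptions:

```python
import numpy as np

def feet_distance(left_foot, right_foot):
    """Per-frame Euclidean distance between the two feet.
    left_foot, right_foot: (T, 3) arrays of 3D joint coordinates."""
    return np.linalg.norm(left_foot - right_foot, axis=1)

def cycle_boundaries(dist, min_gap=10):
    """Indices of local minima of the feet-distance signal.
    Consecutive minima bound half-cycles; every second minimum closes
    one full gait cycle. min_gap suppresses jittery double detections."""
    minima = []
    for t in range(1, len(dist) - 1):
        if dist[t] <= dist[t - 1] and dist[t] < dist[t + 1]:
            if not minima or t - minima[-1] >= min_gap:
                minima.append(t)
    return minima

# Synthetic walk: the feet-distance signal oscillates once per step,
# with one step every 37.5 frames (one gait cycle every 75 frames).
t = np.arange(300)
dist = 0.5 + 0.4 * np.abs(np.sin(2 * np.pi * t / 75))
boundaries = cycle_boundaries(dist)
```

On real data, the signal would first be smoothed and the minima thresholded; this sketch shows only the segmentation idea.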

Phase III – Extracting Gait Features

Let a model of the human body have J joints and all N_L learning samples of C_L walkers be linearly normalized to their average length T.

Labeled learning data in a sample space have the form G_L = {(g_n, ℓ_n)}_{n=1}^{N_L}, where

g_n = [[γ_1(1) ⋯ γ_J(1)] ⋯ [γ_1(T) ⋯ γ_J(T)]]^⊤

is a gait sample (one gait cycle) in which γ_j(t) ∈ ℝ³ are the 3D spatial coordinates of joint j ∈ {1, …, J} at time t ∈ {1, …, T}, normalized to the person's position and walk direction.

Each learning sample falls into one of the learning identity classes {I_c}_{c=1}^{C_L} labeled by ℓ_n.
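The normalization to the person's position and walk direction can be sketched as follows: translate so the root joint starts at the origin, then rotate about the vertical axis so the walk direction points along one fixed axis. The choice of y as the vertical axis and +x as the target direction is an assumption of this sketch:

```python
import numpy as np

def normalize_walk(gait, root=0):
    """Sketch of the assumed normalization. gait: (T, J, 3) array of
    joint coordinates; root: index of the root joint (e.g. the hip)."""
    g = gait - gait[0, root]          # position: root starts at the origin
    d = g[-1, root]                   # net displacement over the cycle
    yaw = np.arctan2(d[2], d[0])      # walk direction in the ground plane
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])      # yaw rotation aligning the walk with +x
    return g @ R.T

# Toy walker with a single joint moving along the z axis.
walk = np.array([[[0.0, 1.0, float(t)]] for t in range(4)])
norm = normalize_walk(walk)           # trajectory now runs along +x
```

After this step, two cycles of the same walker are comparable regardless of where and in which direction they were recorded.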

Feature extraction is set by a matrix Φ ∈ ℝ^(D×D̂) from the D-dimensional sample space G = {g_n}_{n=1}^{N} to the D̂-dimensional feature space Ĝ = {ĝ_n}_{n=1}^{N} with D̂ < D.

A given gait sample g_n can be transformed into a gait template by ĝ_n = Φ^⊤ g_n.

Templates are compared by the Mahalanobis distance.
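The template extraction and comparison can be sketched as below. Plain PCA serves here only as a stand-in for the learned projection Φ; the poster's MMC or PCA+LDA methods would learn Φ from the labeled data G_L instead:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sample space: N gait samples of dimension D (D = 3*J*T flattened).
N, D, D_hat = 40, 30, 4
G = rng.normal(size=(N, D))

# Stand-in for the learned projection Phi (D x D_hat): the top
# principal directions of the centered samples.
G_centered = G - G.mean(axis=0)
_, _, Vt = np.linalg.svd(G_centered, full_matrices=False)
Phi = Vt[:D_hat].T                      # (D, D_hat)

# Gait template: g_hat = Phi^T g for each sample.
templates = G_centered @ Phi            # (N, D_hat)

# Mahalanobis distance between two templates in the feature space.
cov = np.cov(templates, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis(a, b):
    d = a - b
    return float(np.sqrt(d @ cov_inv @ d))

d01 = mahalanobis(templates[0], templates[1])
```

Whichever method supplies Φ, identification then reduces to nearest-template search under this distance.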

Evaluated methods:
• 8 hand-designed geometric feature sets
• Features learned by MMC
• Features learned by PCA+LDA

Evaluated metrics:
• Correct classification rate
• Discriminativeness
• Robustness to noisy and incomplete data
• Clusterability
• Scalability

[Figure: area under the ROC curve (0.50–0.80) versus % noise (0–100) for 9 learning and 55 evaluation identities.]

Legend: Ahmed, Ali, Andersson, Ball, Dikovski, Kwolek, Preis, Sinha, MMC, PCA+LDA.

[Figure: area under the ROC curve (0.60–0.85) across configurations in the form (# learning identities, # evaluation identities), from (2, 62) to (32, 32).]

Specifications of the evaluated methods are available at https://gait.fi.muni.cz.

Phase IV – Identifying Walkers

Person re-identification (Re-ID) focuses on spotting people of interest in multiple cameras.

But in video surveillance, labeled data for all the people encountered is never available.

Training the model on an auxiliary database creates an unsupervised environment suitable for searching for similar gait templates and for clustering them into potential walker identities.
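The clustering into potential identities can be sketched with a simple greedy scheme: assign each template to the nearest existing cluster centroid if it is close enough, otherwise open a new cluster. The greedy pass, the Euclidean distance, and the threshold value are illustrative assumptions, not the poster's method:

```python
import numpy as np

def cluster_templates(templates, threshold):
    """Greedy single-pass clustering of gait templates into potential
    walker identities: join the nearest cluster within threshold,
    else start a new one. Returns per-template labels and centroids."""
    centroids, labels, members = [], [], []
    for g in templates:
        if centroids:
            dists = [np.linalg.norm(g - c) for c in centroids]
            k = int(np.argmin(dists))
            if dists[k] < threshold:
                labels.append(k)
                members[k].append(g)
                centroids[k] = np.mean(members[k], axis=0)
                continue
        centroids.append(g.copy())
        members.append([g])
        labels.append(len(centroids) - 1)
    return labels, centroids

# Two well-separated synthetic "walkers", five templates each.
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.1, size=(5, 4))
b = rng.normal(5.0, 0.1, size=(5, 4))
templates = np.vstack([a, b])
labels, centroids = cluster_templates(templates, threshold=1.0)
```

Each resulting cluster stands for one potential identity whose appearances can then be traced across cameras.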

One can retrieve information about a person's appearances captured by the video surveillance system – their location trace that includes the timestamp and geolocation of each appearance.

[Figure: a query gait template matched against stored appearances (accepted/rejected), each with geolocation and timestamp: 39° 51' 05'' N 104° 40' 34'' W at 2017/06/21 07:55:16; 39° 49' 06'' N 104° 52' 10'' W at 2017/06/23 13:24:19; 39° 43' 04'' N 104° 51' 50'' W at 2017/06/24 22:49:38.]

Acknowledgements

Michal Balazia and Petr Sojka: You Are How You Walk: Uncooperative MoCap Gait Identification for Video Surveillance with Incomplete and Noisy Data. International Joint Conference on Biometrics (IJCB), Denver, 2017.

Michal Balazia and Petr Sojka: Gait Recognition from Motion Capture Data. ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM), Special Issue on Representation, Analysis and Recognition of 3D Humans, 2017.

The data used in this project was created with funding from NSF EIA-0196217 and was obtained from http://mocap.cs.cmu.edu. The evaluation framework and database are available online at https://gait.fi.muni.cz.
