
Page 1:

Activity Recognition

Taiwoo Park, May 7, 2013

Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin Heidelberg, 2004. 1-17.
Park, Taiwoo, et al. "E-gesture: a collaborative architecture for energy-efficient gesture recognition with hand-worn sensor and mobile devices." Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems. ACM, 2011.

Some slides are from CSCI 546 course materials by James Reinebold, USC

Page 2:

Activity?
• Higher-level activities
  – Giving a lecture, having breakfast, playing soccer, …
• Lower-level activities
  – Lying on a bed, standing still, running, walking, …

Page 3:

An easy example
• Assumptions:
  – Only smartphone sensors are available
    • Accelerometer, compass, gyroscope, light, …
    • You can attach the smartphone to your body
  – Only three target activities to recognize
    • Running
    • Standing still
    • Lying on a bed
• How can we recognize these activities?

Page 4:

An easy example (cont’d)

Is the phone being shaken? | Phone orientation | Activity
No  | Upright    | ?
Yes | Upright    | ?
No  | Lying down | ?
Yes | Lying down | ?

"Is the phone being shaken?" = variance of the accelerometer signal over the last 3 seconds
"Phone orientation" = average of the accelerometer y-axis signal over the last 3 seconds

Candidate activities to fill in: Standing still, Running, Lying on a bed, Nothing

(Figure: smartphone with x, y, z axes)

Page 5:

Activity recognition pipeline

Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).

Page 6:

An easy example (revisited)

Is the phone being shaken? | Phone orientation | Activity
No  | Upright    | Standing still
Yes | Upright    | Running
No  | Lying down | Lying on a bed
Yes | Lying down | …?

Windowing: take the last 3 seconds of sensor data
Feature extraction: variance of the accelerometer signal ("shaken?") and average of the accelerometer y-axis signal ("orientation")
Classification: map the feature values to an activity via the table above
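To make the pipeline concrete, here is a minimal sketch of this toy example in Python. The 3-second window, the two features, and the decision table come from the slides; the sampling rate, threshold values, and names such as SHAKE_VAR_THRESHOLD are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

# Illustrative thresholds; in practice these would be tuned on training data.
SHAKE_VAR_THRESHOLD = 2.0       # (m/s^2)^2: above this, the phone counts as "being shaken"
UPRIGHT_MEAN_Y_THRESHOLD = 5.0  # m/s^2: mean y-axis accel near +g means "upright"

def classify_window(accel_xyz, fs=50):
    """accel_xyz: array of shape (n_samples, 3); fs: assumed sampling rate in Hz."""
    window = accel_xyz[-3 * fs:]                 # windowing: keep the last 3 seconds
    variance = window.var(axis=0).sum()          # feature 1: overall signal variance
    mean_y = window[:, 1].mean()                 # feature 2: average of the y axis
    shaken = variance > SHAKE_VAR_THRESHOLD
    upright = mean_y > UPRIGHT_MEAN_Y_THRESHOLD
    # classification: the decision table from the slide
    if not shaken and upright:
        return "Standing still"
    if shaken and upright:
        return "Running"
    if not shaken and not upright:
        return "Lying on a bed"
    return "...?"                                # shaken while lying down
```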

Page 7:

Data collection
• Semi-naturalistic, user-driven data collection
  – Obstacle course / worksheet
  – No researcher supervision while subjects performed the tasks
• Timer synchronization
• Discard data within 10 seconds of the start and finish times of each activity

(Worksheet example: "12:30 – 12:50, Walking")

Bao, Ling, and Stephen S. Intille. "Activity recognition from user-annotated acceleration data." Pervasive Computing. Springer Berlin Heidelberg, 2004. 1-17.

Page 8:

Activities
• Walking
• Sitting and relaxing
• Standing still
• Watching TV
• Running
• Stretching
• Scrubbing
• Folding laundry
• Brushing teeth
• Riding elevator
• Walking carrying items
• Working on computer
• Eating or drinking
• Reading
• Bicycling
• Strength-training
• Vacuuming
• Lying down and relaxing
• Climbing stairs
• Riding escalator

Page 9:

Data collection

Source: Bao 2004

Page 10:

Sensors Used
• Five ADXL210E accelerometers (manufactured by Analog Devices)
  – Range of +/- 10 g
  – 5 mm x 5 mm x 2 mm
  – Low power, low cost
  – Measures both static and dynamic acceleration
• Sensor data was stored on a memory card using the "Hoarder Board"

Source: http://vadim.oversigma.com/Hoarder/LayoutFront.htm

Page 11:

Example Signals

Source: Bao 2004

Page 12:

Activity recognition pipeline

Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).

Page 13:

Classification

(Figure: a new data sample is compared against data samples collected in advance for 'Running', 'Standing still', 'Lying on a bed', and 'Walking')

Question: Which of the collected samples is most similar to the current one?
Methods: Naïve Bayes, nearest neighbor, decision table/tree, HMM (hidden Markov models), …

Page 14:

Decision Table

Is the phone being shaken? | Phone orientation | Activity
No  | Upright    | ?
Yes | Upright    | ?
No  | Lying down | ?
Yes | Lying down | ?

Candidate activities: Standing still, Running, Lying on a bed, Nothing

(Figure: smartphone with x, y, z axes)

Page 15:

Decision Trees
• Make a tree where the non-leaf nodes are the features and each leaf node is a classification. Each edge of the tree represents a value range of the feature.
• Move through the tree until you arrive at a leaf node.
• Generally, the smaller the tree, the better.
  – Finding the smallest tree is NP-hard.

Source: http://pages.cs.wisc.edu/~dyer/cs540/notes/learning.html
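For illustration only, a tree over the two toy features (accelerometer variance, mean y-axis value) could be learned with scikit-learn as below; the training values are made up and nothing here corresponds to the classifiers actually trained in Bao 2004.

```python
from sklearn.tree import DecisionTreeClassifier

# Made-up training points: [accel variance, mean accel y]
X = [[0.1, 9.5], [3.5, 9.2], [0.1, 0.3], [3.2, 9.4]]
y = ["standing", "running", "lying", "running"]

clf = DecisionTreeClassifier(max_depth=2)  # keep the tree small
clf.fit(X, y)
print(clf.predict([[0.05, 9.6]]))          # -> ['standing']
```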

Page 16:

Decision Tree Example

Phone orientation?
  – Lying down → Lying on a bed
  – Upright (not lying down) → Is the phone being shaken?
      – Yes → Running
      – No → Standing still

Page 17:

Nearest Neighbor
• Split the domain into dimensions, with each dimension corresponding to a feature.
• Classify an unknown point by having its K nearest neighbors "vote" on which class it belongs to.
• Simple, easy-to-implement algorithm. Does not work well when there are no clusters.

Source: http://pages.cs.wisc.edu/~dyer/cs540/notes/learning.html
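A minimal sketch of the K-nearest-neighbor "voting" idea on the same two toy features; the training points and k are invented for illustration.

```python
import math
from collections import Counter

# (features, label): [accel variance, mean accel y] -- made-up values
train = [((0.1, 9.5), "standing"), ((3.5, 9.2), "running"),
         ((0.1, 0.3), "lying"),    ((3.2, 9.4), "running")]

def knn_classify(point, k=3):
    # sort the labeled points by Euclidean distance to the query point
    nearest = sorted(train, key=lambda item: math.dist(point, item[0]))[:k]
    votes = Counter(label for _, label in nearest)  # the k neighbors "vote"
    return votes.most_common(1)[0][0]

print(knn_classify((3.0, 9.0)))  # -> running
```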

Page 18:

Nearest Neighbor Example

(Figure: scatter plot with accelerometer variance on one axis and average accelerometer y-axis value on the other, showing clusters for Lying on a bed, Standing still, and Running)

Page 19:

Naïve Bayes Classifier
• Computes the probability of each class for an observed data point using likelihoods and prior probabilities estimated from the training set.
  – P(B|A) = P(A|B) * P(B) / P(A)
• Assumes that the features are independent.
• Relatively fast.

Source: cis.poly.edu/~mleung/FRE7851/f07/naiveBayesianClassifier.pdf
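A tiny sketch of how the P(B|A) = P(A|B) * P(B) / P(A) rule is used for classification with a single binary feature ("is the phone shaken?"). All probabilities below are made-up numbers; with more (assumed independent) features one would multiply in one likelihood term per feature before normalizing.

```python
# Made-up priors P(class) and likelihoods P(shaken | class)
priors = {"running": 0.3, "standing": 0.5, "lying": 0.2}
p_shaken_given = {"running": 0.9, "standing": 0.1, "lying": 0.05}

def posterior(observed_shaken=True):
    # P(class | observation) = P(observation | class) * P(class) / P(observation)
    unnorm = {c: (p_shaken_given[c] if observed_shaken else 1.0 - p_shaken_given[c]) * priors[c]
              for c in priors}
    evidence = sum(unnorm.values())        # P(observation)
    return {c: v / evidence for c, v in unnorm.items()}

print(posterior(True))  # "running" gets the highest posterior probability
```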

Page 20:

Activity recognition pipeline

Bader, Sebastian, and Thomas Kirste. "A Tutorial Introduction to Automated Activity and Intention Recognition." (2011).

Page 21:

Feature Extraction
• Time-domain features [Maurer 2006]
  – Mean (average), root mean square, variance, …
• FFT-based feature computation [Bao 2004]
  – Sample at 76.25 Hz
  – 512-sample windows (about 6.71 sec)
  – Extract mean, energy, entropy, and correlation features

Maurer, Uwe, et al. "Activity recognition and monitoring using multiple sensors on different body positions." International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2006). IEEE, 2006.
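A sketch of how such features can be computed per window with NumPy. The sampling rate and window size follow the slide; the exact definitions of energy, entropy, and correlation are our assumptions and may differ in detail from Bao 2004.

```python
import numpy as np

FS = 76.25      # sampling rate (Hz) from the slide
WINDOW = 512    # samples per window, about 6.71 s

def axis_features(signal):
    """Time-domain and FFT-based features for one accelerometer axis window."""
    w = np.asarray(signal[:WINDOW], dtype=float)
    mean = w.mean()                                 # time-domain mean
    rms = np.sqrt(np.mean(w ** 2))                  # root mean square
    var = w.var()                                   # variance
    spectrum = np.abs(np.fft.rfft(w - mean)) ** 2   # power spectrum, DC removed
    energy = spectrum.sum() / len(w)                # spectral energy
    p = spectrum / (spectrum.sum() + 1e-12)         # normalized spectrum
    entropy = -np.sum(p * np.log2(p + 1e-12))       # frequency-domain entropy
    return mean, rms, var, energy, entropy

def axis_correlation(a, b):
    """Correlation feature between two axes over the same window."""
    return np.corrcoef(a[:WINDOW], b[:WINDOW])[0, 1]
```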

Page 22:

Source: Bao 2004

Page 23:

Source: Bao 2004

Page 24:

Results
• Decision tree was the best performer, but…

Classifier | Classification accuracy (%, leave-one-subject-out training)
Decision Table | 46.75 +/- 9.296
Nearest Neighbor | 82.70 +/- 6.416
Decision Tree | 84.26 +/- 5.178
Naïve Bayes | 52.35 +/- 1.690

Page 25:

Per-activity accuracy breakdown

Page 26:

Trying With Fewer Sensors

Accelerometer(s) left in | Difference in recognition accuracy (%)
Hip | -34.12 +/- 7.115
Wrist | -51.99 +/- 12.194
Arm | -63.65 +/- 13.143
Ankle | -37.08 +/- 7.601
Thigh | -29.47 +/- 4.855
Thigh and Wrist | -3.27 +/- 1.062
Hip and Wrist | -4.78 +/- 1.331

With only two accelerometers we can still get good performance.

Page 27:

Lessons
• Accelerometers can be used to effectively distinguish between everyday activities.
• Decision trees and nearest neighbor algorithms are good choices for activity recognition.
• Some sensor locations are more important than others.
• Selecting a good feature set is important to increase recognition accuracy.

Page 28:

E-Gesture: A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices

Page 29:

Motivation

(Figure: a mobile gesture interaction framework connects a wristwatch-type motion sensor (accelerometer, gyroscope) worn on the hand to a smartphone, supporting mobile applications that use hand gestures while the user is on the move)

Page 30:

Challenge: Energy and Accuracy

• Conventional gesture processing pipeline (designed for gesture recognition in a stationary setting):

Sensor (accelerometer, gyroscope) → continuous raw data → Mobile device: gesture segmentation (button or algorithms) → candidate gesture samples → classification (HMM, DTW) → result: gesture 'A' or 'B', or non-gesture
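The classification box compares a segmented candidate against stored gesture templates. Below is a generic dynamic time warping (DTW) matcher of the kind the slide alludes to; it is a sketch with invented names (dtw_distance, classify), not the pipeline used in E-Gesture, whose classifier is HMM-based.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of feature vectors."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(np.asarray(a[i - 1]) - np.asarray(b[j - 1]))
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(candidate, templates):
    """templates: list of (label, sequence); return the nearest template's label."""
    return min(templates, key=lambda t: dtw_distance(candidate, t[1]))[0]
```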

Page 31:

Challenge: Energy and Accuracy

Problems with the conventional pipeline (the sensor streams continuous raw data to the mobile device):
• Energy
  – Continuous data transmission: the sensor lasts only 20 hrs on a 250 mAh battery; smartphone lifetime drops from 24 hrs to 17 hrs
  – Energy-hungry gyroscope (56%)
• Accuracy
  – Mobility noises: over 90% false segmentation, only 70% classification accuracy

Page 32:

E-Gesture Architecture

(Figure: collaborative gesture sensing and segmentation on the wristwatch sensor device, where the accelerometer triggers the gyroscope and the gyroscope feeds adaptation back to the accelerometer; candidate gesture samples are sent to the mobile device, which runs classification with an adaptive, multi-situation HMM and returns the result, e.g. 'lay-down')

1. Device-wise collaboration: detection on the wristwatch, classification on the smartphone
2. Sensor-wise collaboration:
  – The accelerometer turns on the gyroscope for energy efficiency
  – The gyroscope adapts the accelerometer's sensitivity to mobility changes

Accelerometer: (+) energy-efficient, (-) mobility-vulnerable
Gyroscope: (-) energy-hungry, (+) mobility-robust
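The sensor-wise collaboration can be pictured as a small feedback loop. The sketch below is our own schematic reading of the two bullets above (the class name, fields, and thresholds are invented); it is not code from the E-Gesture implementation.

```python
class CollaborativeTrigger:
    """Accel gates the gyro; gyro output adapts the accel's sensitivity."""

    def __init__(self, base_threshold=0.5):
        self.base_threshold = base_threshold
        self.accel_threshold = base_threshold  # motion-trigger sensitivity (arbitrary units)
        self.gyro_on = False

    def on_accel(self, accel_variance):
        # Accel turns on gyro for energy efficiency: power the gyroscope
        # only while the cheap accelerometer sees gesture-like motion.
        self.gyro_on = accel_variance > self.accel_threshold

    def on_gyro(self, body_motion_level):
        # Gyro adapts accel's sensitivity for mobility changes: raise the
        # trigger threshold while walking/running so that body motion alone
        # does not flood the phone with false gesture segments.
        self.accel_threshold = self.base_threshold + body_motion_level
```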

Page 33:

Sensor-side Energy Savings

Configuration | Power | Lifetime (250 mAh Li-ion battery)
Continuous sensing + transmission | 46 mW | 20 hrs
Device-wise collaboration (reduced transmission) | 39 mW (↓15%) | 23.7 hrs (1.2x)
Device-wise + sensor-wise collaboration (gyroscope power control, reduced transmission) | 19 mW (↓59%) | 48.7 hrs (2.4x)

59% less energy consumption, 2.4x longer lifetime
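The lifetime figures are consistent with simple energy arithmetic, assuming a nominal 3.7 V Li-ion cell (the voltage is our assumption; the slide gives only the 250 mAh capacity and the power numbers):

```python
capacity_mwh = 250 * 3.7  # 250 mAh * 3.7 V = 925 mWh

for label, power_mw in [("continuous sensing + transmission", 46),
                        ("device-wise collaboration", 39),
                        ("device-wise + sensor-wise collaboration", 19)]:
    print(f"{label}: {capacity_mwh / power_mw:.1f} hrs")
# -> 20.1 hrs, 23.7 hrs, 48.7 hrs, matching the slide
```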

Page 34:

Mobile-side Energy Savings

Configuration | Power | Lifetime (Nexus One, 1400 mAh Li-ion battery, 3G/WiFi on)
All processing on mobile | 122 mW | 42.1 hrs
Device-wise collaboration (reduced transmission) | 70 mW (↓43%) | 74 hrs (1.8x)

Page 35:

Implementation
• Sensor node
  – ATmega128L MCU
  – Bluetooth, ZigBee
  – Sensors
    • 3-axis accelerometer (ADXL335)
    • 3-axis gyroscope (3x XV-3500CB)
    • 40 Hz sensing
  – Vibration motor
• Smartphones
  – Nokia N96, Google Nexus One
  – Bluetooth radio

(Figure: Google Nexus One, sensor node, Nokia N96, Bluetooth headset)

Page 36:

Sample Applications
• Swan Boat [Ubicomp09][MM09][ACE09]
  – Collaborative boat-racing exertion game
  – Utilizes hand gestures as additional game input
    • Punching together, flapping together
• Mobile Music Player, Phone Call Manager
  – Eye-free, touch-free controls
  – The user controls the application with hand gestures

Page 37:

Conclusion
• Mobile gestural interaction platform
  – Collaborative gesture processing
    • 1.8x longer battery lifetime for the smartphone
    • 2.4x longer battery lifetime for the hand-worn sensor
    • while preserving the gyroscope's detection performance
  – Mobility-robust gesture classification using HMM
    • Up to 94.6% classification accuracy under mobile usage, via a mobility-aware classification architecture
  – Will greatly facilitate gesture-based mobile applications
• Provided a novel sensor fusion scheme
  – Serial fusion + feedback control
  – Saves energy while preserving detection accuracy

Page 38: