Wearable technologies: what's brewing in the lab?

Post on 16-Apr-2017


Wearable technologies: what's brewing in the lab?

http://www.sussex.ac.uk/strc/research/wearable

Dr. Daniel Roggen, Wearable Technologies Lab

Sensor Technologies Research Centre, University of Sussex

500 London police officers will be equipped with Taser wearable cameras

[1] http://thenextweb.com/uk/2014/05/08/500-london-police-officers-will-equipped-taser-wearable-cameras-today/

Naylor, G.: Modern hearing aids and future development trends, http://www.lifesci.sussex.ac.uk/home/Chris_Darwin/BSMS/Hearing%20Aids/Naylor.ppt

Wearable form factor dictated by an application

1961: Thorp & Shannon's wearable computer

Edward O. Thorp. The Invention of the First Wearable Computer, Proc Int Symp on Wearable Computers, 1998

+44% wins

• Cigarette pack size
• Toe-triggered timer
• Audio feedback
• 12 transistors

Marion, Heinsen, Chin, Helmso. Wrist instrument opens new dimension in personal information, Hewlett-Packard Journal, 1977

• "It's a digital electronic wristwatch, a personal calculator, an alarm clock, a stopwatch, a timer, and a 200-year calendar, and its functions can interact to produce previously unavailable results"

• 38K transistors
• 20 µW / 36 mW (screen off / on)

• Reliability: "Shock and vibrations, temperature and humidity changes, body chemicals, abrasive dust, and constant friction against clothing presented a challenge to the designers"

• Design: "Requirements for a small and visually pleasing product imposed additional difficulties rarely encountered at HP"

Wearables driven by miniaturization

Flexible and stretchable electronics (Münzenrieder et al., University of Sussex)

Bent finger Straight finger

Accordion-like electronics for electronic skin

Processor, battery, display
Sensors: touch, motion, proximity, camera
Bone-conducting speaker

A new definition of wearables: "smart assistant"

Pay attention!

Augmenting the user
• Sense from a first-person perspective
• Always with the user
• Learns behaviors, habits, needs

Vannevar Bush, As we may think. Life magazine, 1945

Cyclops camera, speech recognition, access to all human knowledge (Memex)

"Let us project this trend ahead to a logical, if not inevitable, outcome"

Wearable computer = smart assistant

* Augment and mediate interactions

+ No barrier between you and the world

* Constant access to information

• Self-contained / personal

× Micro-interactions

• Proactive / implicit interaction

* Sense and model context

* Adapt interaction modalities based on context

+ Starner, ISWC 2013 Closing Keynote, September 2013, Zürich
• Starner, The challenges of wearable computing: Part 1, IEEE Pervasive Computing Magazine, 2001
× Ashbrook, Enabling mobile microinteractions, PhD thesis, 2010

• What did I do yesterday?
• What am I doing in the kitchen?
• You went to the supermarket, and enjoyed a coffee with Lisa
• If you want to cook spaghetti, think of heating the water

Recognition of human activities and their context

Activity diarisation, memory augmentation(e.g. memory assistant for dementia)

Supporting behaviour change: Lab is on 4th floor

Stairs? Lift?

Sensing and recognising activities

Motion sensor (accelerometer)

Custom wearables
• Flexible form factor
• Application-specific needs (e.g. 1 kHz motion sensing)
• Sensor research
• Low-power research
• Interaction research
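Accelerometer-based activity sensing typically starts by sliding a window over the signal and computing simple statistics per window. A minimal sketch (function name, window size and feature choice are illustrative, not from the slides):

```python
import math

def extract_features(samples, window=32, step=16):
    """Slide a fixed-size window over a 1-D accelerometer stream and
    compute simple statistical features per window: mean, standard
    deviation, and mean absolute sample-to-sample delta. These are
    typical inputs to an activity classifier."""
    features = []
    for start in range(0, len(samples) - window + 1, step):
        w = samples[start:start + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        mad = sum(abs(b - a) for a, b in zip(w, w[1:])) / (window - 1)
        features.append((mean, math.sqrt(var), mad))
    return features
```

Higher sampling rates (such as the 1 kHz motion sensing mentioned above) mainly change the window and step lengths, not the structure of this loop.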

Activities of daily living

The OPPORTUNITY dataset for reproducible research(avail. on UCI ML repository)

• 12 subjects
• > 30'000 interaction primitives (object, environment)

Roggen et al., Collecting complex activity datasets in highly rich networked sensor environments, INSS 2010
http://opportunity-project.eu/challengeDataset
http://vimeo.com/8704668

Sensor rich
• Body, objects, environment
• 72 sensors (28 sensors in 2.4 GHz band)
• 10 modalities
• 15 wired and wireless systems

Low-level activity models (primitives)

Design-time: Training phase

Sensor data and annotations are used to optimize the low-level activity models (primitives) and the high-level activity models; activity and context reasoning (symbolic processing) feeds an activity-aware application with a stream of recognised activities (A1, p1, t1), (A2, p2, t2), ... over time.

[1] Roggen et al., Wearable Computing: Designing and Sharing Activity-Recognition Systems Across Platforms, IEEE Robotics&Automation Magazine, 2011

Runtime: Recognition phase

Sensor sampling → preprocessing → segmentation → feature extraction → classification → decision fusion, with null-class rejection (subsymbolic processing).
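The classification and null-class rejection stages of this chain can be sketched as a nearest-centroid classifier with a distance threshold; a simplified stand-in (names and threshold are illustrative), omitting segmentation, feature extraction and decision fusion:

```python
import math

def classify(features, centroids, reject_threshold):
    """Nearest-centroid classification with null-class rejection:
    a feature vector whose distance to the closest class centroid
    exceeds the threshold is labelled None (the 'null' class, i.e.
    no gesture of interest is occurring)."""
    best_label, best_dist = None, float("inf")
    for label, c in centroids.items():
        d = math.dist(features, c)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= reject_threshold else None
```

Null-class rejection matters in continuous recognition: most of the time the wearer is doing none of the modelled gestures, so every window must be allowed to map to "nothing".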

• Public challenge carried out in 2011
• Any method
• Any combination of 113 wearable channels

17 gestures
• Open / close door 1
• Open / close door 2
• Open / close fridge
• Open / close dishwasher
• Open / close drawer 1
• Open / close drawer 2
• Open / close drawer 3
• Clean table
• Drink from cup
• Toggle light switch

Method  Performance
LDA     0.25
QDA     0.24
NCC     0.19
1NN     0.55
3NN     0.56
UP      0.22
NStar   0.65
SStar   0.70
CStar   0.77

2011 results [1]

[1] Chavarriaga et al., The Opportunity challenge: A benchmark database for on-body sensor-based activity recognition, Pattern Recognition Letters, 2013
[2] Ordones Morales et al., Deep LSTM recurrent neural networks for multimodal wearable activity recognition, In preparation

2015 results: ConvLSTM [2] 0.86 (+9% with "deep learning")

Parkinson’s assistance

EC grant FP6-018474-2; EC grant FP7-288516

M. Bächlin, M. Plotnik, D. Roggen, I. Maidan, J. M. Hausdorff, N. Giladi, and G. Tröster. Wearable Assistant for Parkinson's Disease Patients With the Freezing of Gait Symptom. IEEE Transactions on Information Technology in Biomedicine, 14(2):436 - 446, 2010.

Freezing of gait (transient motor block)

System: thigh sensor, shank sensor, trunk sensor, earphones, wearable computer

• Sensitivity = 73.1%
• Specificity = 81.6%
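Freezing-of-gait detectors of this kind build on the freeze index: the ratio of spectral power in the "freeze" band (roughly 3-8 Hz) to the locomotion band (roughly 0.5-3 Hz) of an acceleration window. A sketch using a naive DFT (this is an illustration of the principle, not the authors' implementation):

```python
import cmath
import math

def freeze_index(window, fs):
    """Freeze-index sketch: ratio of spectral power in the 3-8 Hz
    'freeze' band to the 0.5-3 Hz locomotion band of an acceleration
    window sampled at fs Hz. The O(n^2) DFT is purely illustrative."""
    n = len(window)

    def band_power(lo, hi):
        power = 0.0
        for k in range(1, n // 2):
            f = k * fs / n          # frequency of DFT bin k
            if lo <= f < hi:
                x = sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
                power += abs(x) ** 2
        return power

    # Small epsilon keeps the ratio defined when the window is idle.
    return band_power(3.0, 8.0) / (band_power(0.5, 3.0) + 1e-12)
```

When the index exceeds a tuned threshold, the wearable assistant triggers a rhythmic auditory cue through the earphones to help the wearer resume walking.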

Glass & people with Parkinson’sWorkshop @ Newcastle University (28.08.2013)

• Accept positive "benefit – privacy" tradeoffs
• "Sharing under my control to whom I choose"
• "Same as a phone / computer", "just another interaction"
• "Gives me confidence back, that is what I need"
• "I cannot use a phone with shopping bags and a stick, Glass would be always ready"
• "Everybody is different – interface should be customizable"

McNaney et al. Exploring the Acceptability of Google Glass as an Everyday Assistive Device for People with Parkinson’s, CHI 2014

Contextual support in the assembly line

Quality control in car manufacturing

Continuously, 8 hours/day!

Automatic electronic checklist

Inertial measurement unit (orientation sensor)

Motion capture

Stiefmeier et al., Wearable Activity Tracking in Car Manufacturing, Pervasive Computing Magazine, 2008

Automatic electronic checklist

Advantages
• Automatic documentation
• Reproducibility
• Guarantees quality
• Improved usability

Learning

Micro-learning (Tin Man Labs, LLC)

Passive haptic learning for rehabilitation

Crowd behaviour analytics

Managing collective behaviors

Lord Mayor’s Show – November 12th, 2011, London


Roggen et al., Recognition of crowd behavior from mobile sensors with pattern analysis and graph clustering methods, Networks and Heterogeneous Media 6(3), 2011
Lukowicz et al., On-body sensing: from gesture-based input to activity-driven interactions, IEEE Computer, October 2010

Advanced behavioral analysis

Sports

Sports analysis

• Sensors on arm & hand

Low-power pattern recognition (template matching)

Atmel AVR8 ATmega324 (8-bit, no FPU)

ARM Cortex-M4 STM32F407 (32-bit, with FPU)

• Real-time
• High-speed: 67 (AVR), 140 (M4) motifs with 8 mW, 10 mW @ 8 MHz
• Low-power: single gesture spotter (AVR) with 135 µW @ 120 kHz
• Tunable tradeoffs: power/performance, sensitivity/specificity
• Suitable for hardware implementation

LM-WLCSS

Roggen & Cuspinera, Limited-Memory Warping LCSS for Real-Time Low-Power Pattern Recognition in Wireless Nodes, Proc. EWSN 2015
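The idea behind a limited-memory warping-LCSS matcher can be sketched as follows: the score table is updated one column per incoming sample, so memory stays bounded by the template length regardless of stream length. The reward, penalty and tolerance constants below are illustrative, not the paper's tuned values:

```python
def wlcss_stream(template, stream, reward=1, penalty=1, eps=0):
    """Streaming warping-LCSS-style matcher sketch: keep one score
    column (length of the template) and update it for every incoming
    stream sample. Samples within eps of a template sample earn a
    reward; mismatches propagate the best neighbouring score minus a
    distance-weighted penalty. Returns the peak matching score."""
    prev = [0] * (len(template) + 1)
    best = 0
    for x in stream:
        cur = [0]
        for i, t in enumerate(template, start=1):
            d = abs(x - t)
            if d <= eps:
                score = prev[i - 1] + reward          # match
            else:
                score = max(prev[i - 1], prev[i], cur[i - 1]) - penalty * d
            cur.append(score)
        prev = cur
        best = max(best, prev[-1])
    return best
```

A gesture is "spotted" when the running score crosses a threshold; the O(template) memory footprint is what makes this style of matcher attractive on microcontrollers like the AVR and Cortex-M4 above.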

Beach volleyball serves from wrist-worn gyro

Removing sand Serve

Distinguish subtle pattern differences (e.g. serve styles)

Next steps: play and style analysis
Uetsuji et al., Wearable sensing and classification of beach volleyball styles, In preparation

Insight into research

www.opportunity-project.eu
EC grant n° 225938

pattern recognition in opportunistic configurations of sensors(problem of distributed signal processing and machine learning)

EU funding ~ 1.5M€ / 3yr

Walkthrough: knowledge discovery - using unknown sensors

• Static properties: "3D skeleton"
• ExperienceItems: "HCI-VolumeUp", "HCI-VolumeDn", "HCI-Next", "HCI-Prev"

• Static properties: "Acceleration"
• Dynamic properties: "Wrist"

Physical and geometrical relation between sensors


• Static properties: "Acceleration"
• Dynamic properties: "Wrist"
• ExperienceItems: "HCI-VolumeUp", "HCI-VolumeDn", "HCI-Next", "HCI-Prev"

Baños et al, Kinect=IMU? Learning MIMO Models to Automatically Translate Activity Recognition Models Across Sensor Modalities, ISWC 2012

Translation performance

• Same-limb translation: accuracy <4% below baseline (accuracy ~95%)
• System identification: 3 seconds
• Self-spreading of recognition capabilities!
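The system-identification step can be illustrated, reduced to a single channel, as a least-squares fit mapping one sensor's readings into another's frame. The cited work learns MIMO models across modalities; this one-dimensional sketch (hypothetical function name) only shows the principle:

```python
def fit_translation(src, dst):
    """Least-squares fit of dst ~ a*src + b from a short stretch of
    simultaneous recordings of two sensors. Once fitted, existing
    activity models trained on dst can be reused on translated src
    readings, spreading recognition capabilities to the new sensor."""
    n = len(src)
    sx, sy = sum(src), sum(dst)
    sxx = sum(x * x for x in src)
    sxy = sum(x * y for x, y in zip(src, dst))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

With a few seconds of overlap (the slides report 3 seconds of system identification), the mapping can be estimated online and the new, unknown sensor inherits the recognition chain.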

Walkthrough: self-adaptation to gradual changes

Förster, Roggen, Tröster, Unsupervised classifier self-calibration through repeated context occurences: is there robustness against sensor displacement to gain?, Proc. Int. Symposium Wearable Computers, 2009

Calibration dynamics

Self-calibration to displaced sensors increases accuracy:
• by 33.3% in HCI dataset
• by 13.4% in fitness dataset

“expectation maximization”
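The self-calibration loop has an expectation-maximization flavour; a minimal sketch assuming nearest-centroid class models (names and structure are illustrative, not the paper's algorithm):

```python
import math

def self_calibrate(centroids, stream, rounds=3):
    """Unsupervised self-calibration sketch (EM flavour): assign each
    incoming feature vector to its nearest class centroid (E-step),
    then move every centroid to the mean of the samples assigned to
    it (M-step). Repeated over incoming data, the class models track
    gradual drift such as a slowly displacing sensor."""
    centroids = dict(centroids)
    for _ in range(rounds):
        assigned = {label: [] for label in centroids}
        for x in stream:                      # E-step: hard assignment
            label = min(centroids, key=lambda l: math.dist(x, centroids[l]))
            assigned[label].append(x)
        for label, pts in assigned.items():   # M-step: re-estimate model
            if pts:
                centroids[label] = tuple(
                    sum(c) / len(pts) for c in zip(*pts))
    return centroids
```

The key assumption is that class identity is preserved while the signal drifts slowly, so the classifier's own confident decisions can safely retrain it.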

Walkthrough: minimally user-supervised self-adaptation

Adaptation leads to:
• Higher accuracy in the adaptive case vs. control
• Higher input rate
• More "personalized" gestures

Förster et al., Online user adaptation in gesture and activity recognition - what’s the benefit? Tech Rep.

Förster et al., Incremental kNN classifier exploiting correct - error teacher for activity recognition, ICMLA 2010

Förster et al., On the use of brain decoded signals for online user adaptive gesture recognition systems, Pervasive 2010

Walkthrough: brain-guided self-adaptation

• ~9% accuracy increase with perfect brain signal recognition
• ~3% accuracy increase with effective brain signal recognition accuracy
• Adaptation guided by the user's own perception of the system
• User in the loop

Conclusion!

What is it that makes a device a "wearable"?

Always with the user

Personalised

Autonomous

Preempt needs

Augments our capabilities!

Acknowledgements

Sakura Uetsuji Dr Luis Ponce Cuspinera

Former colleagues at ETHZ: Dr Alberto Calatroni, Dr Kilian Foerster, Dr Michael Hardegger, Dr Martin Wirz, Dr Long-Van Nguyen-Dinh and others

Dr Francisco Javier Ordones Morales
