Simple and Complex Activity Recognition Through Smart Phones
DESCRIPTION
This paper was presented at the International Conference on Intelligent Environments, 2012, where it received the "Most Commended Paper Award".
TRANSCRIPT
- 1. Simple and Complex Activity Recognition Through Smart Phones. Stefan Dernbach, Barnan Das, Narayanan C. Krishnan, Brian L. Thomas, Diane J. Cook. Presented by: Dr. Aaron S. Crandall, Center for Advanced Studies in Adaptive Systems (CASAS), Washington State University, June 27, 2012.
- 2. Activity Recognition: Aging in place. Remote health monitoring.
- 3. Environmental Sensors: cooking, exercising, hand washing.
- 4. Sensors on Phone.
- 5. Our Endeavor: The Transition.
- 6. Proposed Architecture.
- 7. Activity Types.
- 8. Experimental Setup. Device: Samsung Captivate™. Operating system: Android 2.1 Froyo. Sensors used: accelerometer and gyroscope. Sampling rate: 30 Hz. Participants: 10. Machine learning algorithms: Multilayer Perceptron, Naive Bayes, Bayes Net, Decision Table, Best-First Tree, K-star.
- 9. Feature Generation:

  | Feature                | Acceleration  | Orientation          |
  |------------------------|---------------|----------------------|
  | Mean                   | X, Y, Z       | Azimuth, Pitch, Roll |
  | Min                    | X, Y, Z       | Azimuth, Pitch, Roll |
  | Max                    | X, Y, Z       | Azimuth, Pitch, Roll |
  | Standard Deviation     | X, Y, Z       | Azimuth, Pitch, Roll |
  | # Zero-Crossings       | X, Y, Z       |                      |
  | Pair-wise Correlation  | X/Y, X/Z, Y/Z |                      |

- 10. Results: Accuracy. Figure: Performance of Different Classifiers.
- 11. Results: Sliding Window. Figure: Classification Accuracies for K-star with Different Window Lengths. "Und" corresponds to the scenario in which no sliding window is used.
- 12. Results: Orientation Data. Figure: Accuracy of K-star with and without orientation information from the gyroscope.
- 13. Conclusion: Simple activities are recognized very accurately. Accuracy for complex activities is not as high; however, the results indicate the potential of phone sensor data.
- 14. Contact Us: Dr. Diane Cook, [email protected]; [email protected]; http://casas.wsu.edu
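
The feature-generation slide lists per-window statistics (mean, min, max, standard deviation, zero-crossings, pair-wise axis correlation) computed over tri-axial accelerometer data, and the sliding-window slide varies the window length. A minimal sketch of that pipeline is below; the 1-second window and 50% overlap are illustrative choices, not the paper's exact parameters, and the synthetic data stands in for real 30 Hz sensor samples. Orientation (azimuth/pitch/roll) features would be computed analogously.

```python
import numpy as np

def extract_features(window):
    """Compute the slide's statistics for one window of tri-axial
    accelerometer samples (shape: n_samples x 3 for X, Y, Z)."""
    feats = []
    # Mean, min, max, standard deviation per axis (X, Y, Z)
    feats.extend(window.mean(axis=0))
    feats.extend(window.min(axis=0))
    feats.extend(window.max(axis=0))
    feats.extend(window.std(axis=0))
    # Number of zero-crossings per axis
    for axis in range(3):
        signs = np.sign(window[:, axis])
        feats.append(int(np.sum(signs[:-1] * signs[1:] < 0)))
    # Pair-wise correlation between axes: X/Y, X/Z, Y/Z
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        feats.append(float(np.corrcoef(window[:, i], window[:, j])[0, 1]))
    return np.array(feats)

def sliding_windows(samples, window_len, step):
    """Yield fixed-length windows; window_len and step are tunable,
    mirroring the window-length experiment in the results."""
    for start in range(0, len(samples) - window_len + 1, step):
        yield samples[start:start + window_len]

# Example: 2 seconds of synthetic 30 Hz data, 1-second windows, 50% overlap
rng = np.random.default_rng(0)
data = rng.normal(size=(60, 3))
features = [extract_features(w) for w in sliding_windows(data, 30, 15)]
```

Each window yields an 18-dimensional feature vector (4 statistics x 3 axes, plus 3 zero-crossing counts and 3 correlations), which would then be fed to a classifier such as K-star.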