On the Development of A Real-Time Multi-Sensor Activity Recognition System

There exist multiple activity recognition solutions offering good results under controlled conditions. However, little attention has been given to the development of functional systems operating in realistic settings. In that vein, this work presents the complete process for the design, implementation and evaluation of a real-time activity recognition system. The proposed recognition system consists of three wearable inertial sensors used to register the user's body motion, and a mobile application that collects and processes the sensory data to recognize the user's activity. The system shows good recognition capabilities not only in offline evaluation but also in the analysis at runtime. In view of the obtained results, this system may serve for the recognition of some of the most frequent daily physical activities.

  1. On the Development of A Real-Time Multi-Sensor Activity Recognition System
     IWAAL 2015 (Puerto Varas, Chile)
     Oresti Banos¹, Miguel Damas², Alberto Guillen², Luis-Javier Herrera², Hector Pomares², Ignacio Rojas², Claudia Villalonga¹,² and Sungyong Lee¹
     ¹ Ubiquitous Computing Lab, Kyung Hee University, Korea
     ² Department of Computer Architecture and Computer Technology, University of Granada, Spain
  2. Wearable activity recognition: the Activity Recognition Chain (ARC)
     Phenomena (human activity, i.e. body motion) → Measurement (sensing with wearable/on-body devices) → Processing (data curation and knowledge inference) → Recognized activity
     http://orestibanos.com/
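As a rough illustration of how the ARC stages above map onto code, the sketch below chains segmentation, feature extraction and classification. The function names, window length and feature choices are illustrative placeholders, not part of the presented system.

# Minimal sketch of the ARC stages (segmentation -> features -> inference).
# All names and parameters here are illustrative, not taken from the talk.
import numpy as np

def segment(signal, win_len):
    """Cut a (samples x channels) stream into non-overlapping windows."""
    n = (len(signal) // win_len) * win_len
    return signal[:n].reshape(-1, win_len, signal.shape[1])

def extract_features(windows):
    """Per-window, per-channel mean and standard deviation."""
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

def recognize(model, features):
    """Map feature vectors to activity labels with a previously trained classifier."""
    return model.predict(features)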
  3. Roadmap for the development of an online WAR system
     Problem specification → Activities, sensors and processing hub selection → Dataset collection → Offline model implementation and evaluation → Online model implementation and evaluation → (Market…)
     http://orestibanos.com/
  4. Activities, Sensors and Processing
     Activities: jumping, standing, sitting, lying down, climbing/descending stairs, waist bending, arms elevation, walking, crouching, cycling, jogging, running
     Body-general motion: postures | translation | fitness
     Body-part-specific motion: trunk, upper extremities, lower extremities
     http://orestibanos.com/
  5. Activities, Sensors and Processing: single sensor vs. multiple sensors
     Resiliency to sensor failures/faults and to sensor misplacements (ideal vs. self-placement):

     AR model / #faulty sensors     0      1      2      3      4      5
     New dynamic range = 30% of the original dynamic range → [-3g, 3g]
     SARC (hip)                   82±5   66±4     -      -      -      -
     SARC (wrist)                 88±5   54±6     -      -      -      -
     SARC (arm)                   80±3   58±7     -      -      -      -
     SARC (ankle)                 83±4   58±8     -      -      -      -
     SARC (thigh)                 89±2   72±4     -      -      -      -
     HWC                          96±2   96±2   93±3   86±5   73±8   65±14
     New dynamic range = 10% of the original dynamic range → [-1g, 1g]
     SARC (hip)                   82±5  21±11     -      -      -      -
     SARC (wrist)                 88±5   18±9     -      -      -      -
     SARC (arm)                   80±3  26±14     -      -      -      -
     SARC (ankle)                 83±4   21±7     -      -      -      -
     SARC (thigh)                 89±2   20±6     -      -      -      -
     HWC                          96±2   94±2   87±6   53±2  27±17  25±19

     Banos, O., Damas, M., Guillen, A., Herrera, L.J., Pomares, H., Rojas, I., Villalonga, C. Multi-sensor fusion based on asymmetric decision weighting for robust activity recognition. Neural Processing Letters, vol. 42, no. 1, pp. 5-26 (2015).
     Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I. Dealing with the effects of sensor displacement in wearable activity recognition. Sensors, vol. 14, no. 6, pp. 9995-10023 (2014).
     http://orestibanos.com/
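The multi-sensor results above (HWC) come from combining per-sensor decisions. The snippet below shows a generic decision-level fusion by weighted voting, only to illustrate the idea; it is not the asymmetric decision-weighting scheme of the cited paper, and the array shapes and weights are assumptions.

# Generic decision-level fusion by weighted voting: each per-sensor classifier
# outputs class probabilities, combined with one weight per sensor (e.g., its
# validation accuracy). Illustrative only; not the asymmetric weighting scheme
# of the cited HWC work.
import numpy as np

def fuse_decisions(per_sensor_probs, sensor_weights):
    """per_sensor_probs: list of (n_windows, n_classes) arrays, one per sensor.
    sensor_weights: iterable of scalar weights, one per sensor."""
    combined = sum(w * p for w, p in zip(sensor_weights, per_sensor_probs))
    return np.argmax(combined, axis=1)   # fused class index per window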
  6. Activities, Sensors and Processing: smartphone vs. cloud
     Smartphone as a platform:
     ✓ Reasonable computing resources
     ✗ Limited storage (and the data is trapped in the device)
     ✓ Widely available (“cheap”)
     ✓ Data privacy is guaranteed as long as the data is not distributed to other platforms or services
     Cloud as a platform:
     ✓ “Unlimited” processing resources
     ✓ “Unlimited” storage resources
     ✗ Expensive for small-scale applications
     ✗ Complex management of data privacy
     http://orestibanos.com/
  7. Activities, Sensors and Processing
     http://orestibanos.com/
  8. Dataset collection: the MHEALTH dataset
     http://archive.ics.uci.edu/ml/datasets/MHEALTH+Dataset
     Sensing modalities: 3D ACC, 3D GYR, 3D MAG (+ 2-lead ECG)
     Subjects: 10 males (27-45 years old)
     Activities:
     • Standing still (1 min)
     • Sitting and relaxing (1 min)
     • Lying down (1 min)
     • Walking (1 min)
     • Climbing/descending stairs (1 min)
     • Waist bends forward (20x)
     • Frontal elevation of arms (20x)
     • Knees bending (crouching) (20x)
     • Cycling (1 min)
     • Jogging (1 min)
     • Running (1 min)
     • Jump front & back (20x)
     http://orestibanos.com/
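For readers who want to reproduce the experiments, a hedged loading sketch is shown below. It assumes the whitespace-separated .log layout described on the UCI page (one row per sample at the documented 50 Hz rate, sensor channels first, activity label in the last column); verify the column order against the dataset README before indexing specific sensors.

# Hedged loader for one MHEALTH subject file, under the layout assumptions
# stated above. The example path follows the dataset's naming scheme and is
# hypothetical.
import numpy as np

def load_mhealth_log(path):
    data = np.loadtxt(path)              # whitespace-separated columns
    signals = data[:, :-1]               # all sensor channels
    labels = data[:, -1].astype(int)     # 0 = null class, 1..12 = activities
    return signals, labels

# signals, labels = load_mhealth_log('MHEALTHDATASET/mHealth_subject1.log')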
  9. Offline model implementation and evaluation
     Parameters:
     - Data: 3D ACC from chest, wrist and ankle
     - Segmentation: sliding window, 2 s (no overlap)
     - Feature sets:
       • FS1: mean + std
       • FS2: FS1 + max + min + mcr
       • FS3: FS2 + mode + median + kurtosis
     - Classifiers:
       • NB (Naïve Bayes)
       • DT (Decision Tree)
       • NCC (Nearest Centroid Classifier)
       • KNN (K-Nearest Neighbor)
     Evaluation procedure: 10-fold CV, 100 iterations
     [Figure: F-score (0.5-1.0) per feature set (FS1-FS3) and classifier (NB, DT, NCC, KNN)]
     http://orestibanos.com/
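A minimal sketch of this offline setup, under stated assumptions: 2 s non-overlapping windows at the dataset's documented 50 Hz rate, the FS1 features (mean + std per channel), KNN as one of the listed classifiers, and a plain 10-fold cross-validation. The helper names are hypothetical, and the slide does not specify the exact KNN settings or how the 100 iterations were run.

# Sketch of the offline evaluation: 2 s non-overlapping windows, FS1 features,
# KNN under 10-fold cross-validation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 50            # sampling rate in Hz (from the dataset documentation)
WIN = 2 * FS       # 2-second window, no overlap

def make_windows(signals, labels):
    n = (len(signals) // WIN) * WIN
    w = signals[:n].reshape(-1, WIN, signals.shape[1])
    l = labels[:n].reshape(-1, WIN)
    y = np.array([np.bincount(row).argmax() for row in l])        # majority label per window
    X = np.concatenate([w.mean(axis=1), w.std(axis=1)], axis=1)   # FS1: mean + std per channel
    return X, y

# X, y = make_windows(signals, labels)   # arrays from the loader sketched earlier
# scores = cross_val_score(KNeighborsClassifier(), X, y, cv=10, scoring='f1_macro')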
  10. Offline model implementation and evaluation
      Parameters and evaluation procedure as on the previous slide.
      [Figures: F-score (0.5-1.0) per feature set (FS1-FS3) and classifier (NB, DT, NCC, KNN), several panels]
      http://orestibanos.com/
  11. Online model implementation and evaluation
      https://github.com/mHealthTechnologies/mHealthDroid
      Banos, O., Garcia, R., Holgado, J. A., Damas, M., Pomares, H., Rojas, I., Saez, A., Villalonga, C. mHealthDroid: a novel framework for agile development of mobile health applications. IWAAL 2014, Belfast, Northern Ireland, December 2-5 (2014).
      Banos, O., Villalonga, C., Garcia, R., Saez, A., Damas, M., Holgado, J. A., Lee, S., Pomares, H., Rojas, I. Design, implementation and validation of a novel open framework for agile development of mobile health applications. BioMedical Engineering OnLine, vol. 14, no. S2:S6, pp. 1-20 (2015).
      http://orestibanos.com/
  12. Online model implementation and evaluation
      http://orestibanos.com/
  13. Online model implementation and evaluation
      Activities:
      • L1: Standing still (1 min)
      • L2: Sitting and relaxing (1 min)
      • L3: Lying down (1 min)
      • L4: Walking (1 min)
      • L5: Climbing/descending stairs (1 min)
      • L6: Waist bends forward (20x)
      • L7: Frontal elevation of arms (20x)
      • L8: Knees bending (crouching) (20x)
      • L9: Cycling (1 min)
      • L10: Jogging (1 min)
      • L11: Running (1 min)
      • L12: Jump front & back (20x)
      F-score = 0.97
      http://orestibanos.com/
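A score like the one reported above can be computed from the per-window predictions and ground truth. The sketch below shows a macro-averaged F-score over the activity classes, which is one common convention; the slide does not state which averaging was used, and the input arrays are placeholders.

# Macro-averaged F-score over the activity classes, computed from per-window
# ground truth and predictions.
import numpy as np

def macro_f1(y_true, y_pred, classes):
    f1s = []
    for c in classes:
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1s.append(2 * precision * recall / (precision + recall) if (precision + recall) else 0.0)
    return float(np.mean(f1s))

# Equivalent via scikit-learn:
# from sklearn.metrics import f1_score
# f1_score(y_true, y_pred, average='macro')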
  14. Lessons learned / Future work
      • Granularity level (movement, gesture, action, activity)
        - “Jogging & Arms elevation”
      • Activities interleaving
        - “Arms Elevation & Sitting”
      • Transitions
        - “Sitting to Standing”
      • Null class (see the rejection sketch below)
        - “¬(Standing | Sitting | Lying down | Climbing/descending stairs | Waist bending | Arms elevation | Crouching | Cycling | Jogging | Running | Jumping)”
      • Resource consumption
        - Battery duration (<3 h)
      • Validation
        - 10 subjects (offline), 5 subjects (online)
      http://orestibanos.com/
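One common way to handle the null-class issue listed above is confidence-based rejection: windows whose top class probability falls below a threshold are labeled as unknown rather than forced into one of the trained activities. The sketch below illustrates that generic technique; it is not the approach taken in the presented system, and the threshold value and null label are arbitrary.

# Generic confidence-based rejection for the null class. Integer activity
# labels and a probabilistic classifier are assumed; threshold and null label
# are arbitrary choices for illustration.
import numpy as np

def predict_with_rejection(model, X, threshold=0.6, null_label=0):
    probs = model.predict_proba(X)
    labels = model.classes_[np.argmax(probs, axis=1)].astype(int)
    labels[np.max(probs, axis=1) < threshold] = null_label   # reject low-confidence windows
    return labels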
  15. Conclusions
      • Most activity recognition works contribute models that are not validated under realistic conditions
      • This work summarized the process of building a multi-sensor activity recognition system oriented to real-time applications
      • We especially contribute an open multimodal activity recognition dataset for the benchmarking and development of new recognition systems
      • Next steps include addressing the identified barriers to creating activity recognition applications for the real world
      http://orestibanos.com/
  16. Work supported by the Junta de Andalucia Project P12-TIC-2082. This work was also supported by the ICTD Program (10049079, Development of mining core technology exploiting personal big data) funded by the Ministry of Trade, Industry and Energy (MOTIE, Korea). We especially want to thank the participants who helped us collect this dataset.
      Thank you for your attention. Questions?
      Oresti Baños Legrán, Ubiquitous Computing Lab, Kyung Hee University, Korea
      Email: oresti@oslab.khu.ac.kr
      Web: http://orestibanos.com/
