This document discusses recognizing the on-body position of wearable sensors. It presents an approach that uses accelerometer data and machine-learning classifiers to determine whether a sensor is worn on the head, breast, or wrist, or carried in a trouser pocket. In an experimental setup, subjects wore sensors at those locations during various activities. The approach classified the location correctly 98% of the time. Future work involves adapting the system for real-time use and analyzing more data covering additional body positions. The goal is to enable sensors and devices to recognize their on-body position and communicate it, so that they can support context recognition.
3. Motivation
• Current state of context recognition:
Dedicated sensors at predefined positions
• Fixed number of sensors
• Known orientation
• Limited to specific recognition tasks
4. Motivation
• Context recognition vision:
Self-organizing body network of sensors
Placement of sensors in body-worn appliances
• Part of the EU-funded ‘RELATE’ project
Partners: TECO Karlsruhe, Lancaster University, TU Delft
5. Motivation
• Issues:
Appliances need to recognize their location on the body.
They need to exchange information (privacy concerns).
Context recognition algorithms need to cope with varying numbers of sensors/appliances.
6. Approach
[Plots: vertical-axis acceleration at the wrist and at the head]
7. Approach
[Plots: vertical-axis acceleration in the trouser pocket; walking signals at the pocket and at the head]
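These slides contrast the vertical-axis acceleration signal across placements while walking. The original plots are not reproduced here, but as a rough illustration of how such a vertical component can be obtained from a 3-axis accelerometer, the sketch below estimates the gravity direction with a low-pass filter and projects the raw signal onto it. This is a minimal sketch, not the authors' code; the sampling rate and filter cutoff are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def vertical_component(acc, fs=100.0, cutoff=0.5):
    """Project 3-axis acceleration onto the (slowly varying) gravity axis.

    acc    : (N, 3) array of accelerometer samples
    fs     : sampling rate in Hz (assumed, not stated on the slides)
    cutoff : low-pass cutoff in Hz used to estimate gravity (assumed)
    """
    # Low-pass filter each axis; what remains is dominated by gravity.
    b, a = butter(2, cutoff / (fs / 2.0), btype="low")
    gravity = filtfilt(b, a, acc, axis=0)
    # Normalize to a unit gravity direction at every sample.
    g_unit = gravity / np.linalg.norm(gravity, axis=1, keepdims=True)
    # Signed component of the raw signal along the vertical axis.
    return np.einsum("ij,ij->i", acc, g_unit)
```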
8. Procedure
Processing pipeline (accelerometer data only; sketched in code below):
• Raw data
• Extract features (1 s sliding window)
• Frame-by-frame walking classification using the absolute-sum feature
• Majority decision (10 s jumping window): walking classification with a penalty on false positives
• Walking? No: discard the window. Yes:
• Frame-by-frame location classification
• Majority decision (10 s jumping window)
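The slide presents this as a flowchart. As a hedged sketch of the two-stage structure, assuming a 100 Hz sampling rate, a placeholder threshold and classifier, and simple non-overlapping frames (the slide uses a 1 s sliding window for features), the following Python illustrates walking detection on the absolute-sum feature followed by location classification with a majority decision. None of the constants come from the talk.

```python
import numpy as np

FS = 100                 # sampling rate in Hz (assumed)
FRAME = FS               # 1 s frames
JUMP = 10 * FS           # 10 s jumping window for the majority decision

def frames(signal, size, step):
    """Yield fixed-size frames (sliding if step < size, jumping if equal)."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

def is_walking(frame, threshold=50.0):
    """Frame-by-frame walking detector on the absolute-sum feature
    (the threshold is a placeholder, not a value from the talk)."""
    return np.sum(np.abs(frame - frame.mean())) > threshold

def majority(votes, penalty=0.0):
    """Majority decision over one jumping window; `penalty` biases the
    decision against false 'walking' positives, as the slide describes."""
    score = sum(1.0 if v else -(1.0 + penalty) for v in votes)
    return score > 0

def classify_locations(acc_mag, location_clf):
    """Two-stage pipeline: detect walking, then classify on-body
    location only on the walking segments."""
    labels = []
    for window in frames(acc_mag, JUMP, JUMP):
        walk_votes = [is_walking(f) for f in frames(window, FRAME, FRAME)]
        if not majority(walk_votes, penalty=0.5):
            labels.append(None)          # not walking: no location decision
            continue
        loc_votes = [location_clf(f) for f in frames(window, FRAME, FRAME)]
        # Majority decision over the per-frame location labels.
        labels.append(max(set(loc_votes), key=loc_votes.count))
    return labels
```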
9. Procedure
Features: 19 computed, 6 finally used (sketched in code below)
• RMS
• 75th percentile
• Interquartile range
• Frequency-range power
• Frequency entropy
• Sum of the power of the wavelet detail coefficients
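A minimal sketch of computing these six features on a single 1 s frame. The frequency band, the single-level Haar wavelet, and the normalizations are assumptions; the slides do not specify them.

```python
import numpy as np

def features(frame, fs=100.0):
    """Compute the six selected features for one frame.

    frame : 1-D numpy array of acceleration samples
    fs    : sampling rate in Hz (assumed)
    """
    f = {}
    f["rms"] = np.sqrt(np.mean(frame ** 2))
    f["p75"] = np.percentile(frame, 75)
    f["iqr"] = np.percentile(frame, 75) - np.percentile(frame, 25)

    # Power spectrum via FFT.
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)

    # Power in an assumed band of interest (e.g. 0.5-5 Hz for walking).
    band = (freqs >= 0.5) & (freqs <= 5.0)
    f["freq_range_power"] = spectrum[band].sum()

    # Frequency (spectral) entropy of the normalized power spectrum.
    p = spectrum / spectrum.sum()
    p = p[p > 0]
    f["freq_entropy"] = -np.sum(p * np.log2(p))

    # Sum of the power of single-level Haar wavelet detail coefficients.
    even, odd = frame[0::2], frame[1::2]
    n = min(len(even), len(odd))
    detail = (even[:n] - odd[:n]) / np.sqrt(2.0)
    f["wavelet_detail_power"] = np.sum(detail ** 2)
    return f
```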
10. Experimental Setup
Measurement course:
• Working at a desk
• Walking along a corridor
• Making coffee
• Walking
• Giving a ‘presentation’
• Walking
• Walking up and down a staircase
• (optional) Working at a desk
Sensor placement:
• Right side of the head (above the eyes)
• Left breast
• Left wrist
• Right trouser pocket
18 data sets recorded (12-15 min each; 6 subjects, 3 trials each)
12. Results
Evaluation variants:
• Reference-labeled, frame-by-frame
• Frame-by-frame with walk smoothing
• Event-based approach
With smoothing, both variants achieve 100% location recognition.
13. Results
Confusion matrix for the mean of C4.5 over all data sets, with both walking and location smoothing (98% correctly classified):

   a    b    c    d   <- classified as
 865    2   31    2   a = Head
   0  847    4   49   b = Trousers
  42    0  883    1   c = Breast
  17   68   10  921   d = Wrist
14. Future Work
• Currently: Matlab scripts / Java, batch processing
Shift to the Sensor Toolbox
Real-time processing
• Analysis of more data (also containing different body locations)
ETH
Georgia Tech students
• Bootstrapping the body network
• Collaboration of devices containing the sensors with ‘intelligent objects’ in the environment
15. Future Work
Recognize On-Body Position → Bootstrapping + Communication → Determine Relative Positioning