1. VISION for ASSISTIVE TECHNOLOGIES
towards a new gaze tracker
Edwige Pissaloux, ISIR/UPMC & CNRS (UMR 7222), Paris
Andrea Carbone, ISIR/UPMC & CNRS (UMR 7222), Paris
Christophe Veigl, FHTW, Vienna
Christophe Weiss, FHTW, Vienna
2. Outline
1. Interaction and human capabilities:
definitions & new technologies.
2. Vision based technologies for interaction with pointing.
3. AsTeRICS project contribution to vision technology design.
4. Results of first user evaluations of AsTeRICS gaze tracker.
5. Conclusions.
3. 1. Interaction and human capabilities:
definitions & new technologies.
Definitions
• Interaction = reciprocal actions/influences
- key concept of modern societies
- based on human attention
- involves different actors
(real person, virtual characters, objects, ubiquitous environments, etc.)
• Interaction allows
- access to all computer-based ICT solutions
- acquisition and development of new skills
- elaboration of new communication modes
(ART = Attention Responsive Technology; multimodal paradigms, etc.).
4. New technologies for interaction
• New assistive technologies consider a mix of human
capabilities not integrated in classic interactive tools:
- brain physiological signals (BCI)
- brain plasticity
- visual perception (visual attention)
- purely technological "intelligent tools"
(tactile, haptic, robots, orthotics, computer vision).
Objectives of this presentation:
• visual perception & computer vision
• the pointing operation
5. 2. Vision-based AT
for interaction with pointing
• Pointing = the designation of an object by mediation of
the arm, hand and sight.
• Natural pointing is usually executed in two steps:
(1) localization of the pointed object in space and its identification via sight
(the point of regard (PoR) "touches" the object);
(2) lifting of the arm/hand for a physical or virtual (distal) "touch" of the object.
• In the case of upper limb impairments the second step
should be executed by other means (other body parts).
6. New AT
for pointing of an object on a PC
• Implementation outline:
- image and vision processing
- detection and tracking of the targeted body part(s)
- simulation of the different mouse operations.
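The outline above reduces to mapping a tracked feature position in the camera frame onto screen coordinates. A minimal sketch of that mapping, where the linear gain and clamping are illustrative assumptions rather than the project's actual calibration:

```python
def feature_to_cursor(feat_x, feat_y, frame_w, frame_h,
                      screen_w, screen_h, gain=1.0):
    """Map a tracked feature position in the camera frame to screen
    coordinates (hypothetical linear mapping with a sensitivity gain)."""
    # Normalise relative to the frame centre, apply gain,
    # then scale to the screen and clamp to stay on-screen.
    nx = 0.5 + gain * (feat_x / frame_w - 0.5)
    ny = 0.5 + gain * (feat_y / frame_h - 0.5)
    sx = min(max(int(nx * screen_w), 0), screen_w - 1)
    sy = min(max(int(ny * screen_h), 0), screen_h - 1)
    return sx, sy
```

A feature at the centre of a 640x480 frame maps to the centre of the screen; a gain above 1.0 lets small head or finger movements cover the whole display.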
7. Finger as a pointer
8. Limb as a pointer
• the concept of a pictorial structure (not fully connected graph) to
represent the investigated part of the human body.
9. Shoulder and elbow
as a pointer
• EU 6th FWP IST IP AMI, University of Twente (1)
- an approach oriented toward the estimation and recognition of poses,
which generalizes the detection of different body parts
(different limb sections such as shoulder, elbow)
10. Face and head as pointer
• Different facial features : nose, global face, eyebrows, and their combination
• Processing
- targeted feature is detected in acquired images during the calibration
- targeted feature is tracked during the interaction.
Mouse functionalities emulation:
- the mouse spatial displacements can be deduced from nose/head/eyebrow movements,
- the mouse click (or object selection) is implemented through (left or right)
eye double blinks or through dwell time.
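The dwell-time click above can be sketched as a small state machine that fires once when the cursor has stayed within a small radius long enough; the radius and dwell duration below are illustrative assumptions:

```python
class DwellClicker:
    """Emit a single 'click' when the cursor dwells within a small
    radius for at least dwell_time seconds (sketch only)."""
    def __init__(self, dwell_time=1.0, radius=15):
        self.dwell_time = dwell_time
        self.radius = radius
        self.anchor = None   # position where the current dwell started
        self.start = None    # time when the current dwell started

    def update(self, x, y, t):
        """Feed one cursor sample; returns True exactly once per dwell."""
        if self.anchor is None or \
           (x - self.anchor[0])**2 + (y - self.anchor[1])**2 > self.radius**2:
            # Cursor moved out of the dwell zone: restart the timer.
            self.anchor = (x, y)
            self.start = t
            return False
        if t - self.start >= self.dwell_time:
            # Dwell complete: reset so one dwell yields one click.
            self.anchor = None
            self.start = None
            return True
        return False
```

Each video frame feeds one `update()` call; the caller translates the returned `True` into a simulated mouse click.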
11. Eye gaze as a pointer
• Object selection via gaze is a fundamental interaction modality,
as the gaze position anticipates, and finally enables,
the execution of an action on the gazed object.
• Two configurations for eye-gaze trackers: remote and wearable.
• The main characteristics:
- remote systems: no devices/sensors have to be mounted on the subject's body,
but movement is restricted within the interaction space;
- wearable systems: high accuracy, gaze estimation in the natural viewing context,
unlimited interaction space.
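Both configurations ultimately rely on locating the pupil in the eye image. A minimal dark-pupil estimator, sketched here as the centroid of below-threshold pixels (the threshold is an illustrative assumption; real trackers use more robust methods such as ellipse fitting):

```python
def pupil_center(gray, threshold=50):
    """Estimate the pupil centre as the centroid of dark pixels,
    since the pupil is the darkest region in an IR eye image.
    'gray' is a 2-D list of intensities 0-255 (sketch only)."""
    xs = ys = n = 0
    for y, row in enumerate(gray):
        for x, v in enumerate(row):
            if v < threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None          # no dark region found (e.g. blink)
    return xs / n, ys / n
```

Returning `None` on an all-bright image doubles as a crude blink detector, which the click logic can exploit.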
12. 3. AsTeRICS project contribution
to vision technology design (http://www.asterics.eu)
Objectives
- the design of an adaptable and scalable system, also supporting
unconventional peripherals (BCI, vision, etc.),
for interaction by people with severely reduced motor capabilities
- the evaluation of the system with primary and secondary users.
13. AsTeRICS wearable gaze tracker
Main characteristics
- adjustable to the end-user anatomy (head size, distance eye/camera, etc.)
- adaptable to specific needs (small-amplitude head movements compensation,
easy to wear, precision of detection compatible with targeted skills for interaction, etc.)
Hardware
- hot-mirrors or telescopic arms
- multiple combinations of sensors coping with different capabilities and interaction
needs
- batteries for long autonomy.
Foreseen combinations :
- a three camera system using only visible lighting
for full gaze tracking in 3D space ;
- a ‘minimal’ set with only one IR eye-camera and
a custom PCB integrating an IR tracker, a gyroscope and a Sip/Puff sensor.
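The gyroscope in the 'minimal' set can serve the head-movement compensation named above, e.g. by subtracting the cursor displacement predicted from the measured head rotation. A sketch under that assumption, where `px_per_deg` is a hypothetical calibration gain:

```python
def compensate_head_motion(gaze_x, gaze_y, yaw_rate, pitch_rate,
                           dt, px_per_deg=30.0):
    """Subtract the gaze displacement predicted from head rotation
    (gyroscope rates in deg/s over a frame interval dt) so that small
    involuntary head movements do not move the cursor. Sketch only;
    px_per_deg is an assumed, per-user calibration gain."""
    dx = yaw_rate * dt * px_per_deg      # horizontal shift from head yaw
    dy = pitch_rate * dt * px_per_deg    # vertical shift from head pitch
    return gaze_x - dx, gaze_y - dy
```

In practice the gain would come from the same calibration procedure that maps eye-camera coordinates to the screen.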
16. 4. Results of first user evaluations.
AsTeRICS system prototype 1 – remote (web-camera based) gaze tracker
- June-August 2011
- different sensors and sensor-combinations
- 50 users in Austria, Poland and Spain
- spasms and involuntary head movements
Tests
- Interaction with a screen
Results
- spasms and involuntary head movements were a major problem,
preventing precise pointing or computer-mouse control
- BUT the highest level of acceptance.
Future developments
- tremor reduction algorithms and
- evaluation of the head mounted eye tracker
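A simple tremor-reduction approach is to smooth the cursor trajectory, e.g. with exponential smoothing; this is an illustrative sketch, not the project's actual algorithm:

```python
class TremorFilter:
    """Exponential smoothing of cursor coordinates to damp tremor.
    alpha trades responsiveness against smoothness (sketch only;
    the value here is an illustrative assumption)."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def __call__(self, x, y):
        if self.state is None:
            self.state = (x, y)          # first sample: no history yet
        else:
            sx, sy = self.state
            # Move a fraction alpha of the way toward the new sample.
            self.state = (sx + self.alpha * (x - sx),
                          sy + self.alpha * (y - sy))
        return self.state
```

A small `alpha` strongly suppresses high-frequency tremor at the cost of cursor lag, which is why such filters are usually tuned per user.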
17. 5. Conclusion
Vast impact of vision technology on the quality of life.
Vision allows new modes
- to access ICT and internet-based services
(games, e-library, e-shopping, e-health, e-rehabilitation, e-learning, etc.)
- to access smart environments which constitute the infrastructure for gaze interaction
with environmental control systems
(lighting control, heating/ventilation or usage of home entertainment devices);
- for new skills acquisition (such as navigation in virtual worlds) (with training).
Future (second phase) of the AsTeRICS project
- the head-mounted gaze estimation system will be finished,
- both the remote and the head-mounted solutions
will be evaluated in qualitative and quantitative user tests.
18. Thanks to
- EU FP7 ICT program
- France Soudage
Francis Martinez, ISIR/UPMC
Darius Mazeika, ISIR/UPMC Kaunas University (Lithuania)
Isabelle Liu, Master Student, UPMC
Jaza Gul Mohammed, Master Student, UPMC
Jacky Chen, EM, MIT
Faith Keza, CS, MIT
Thank you for your participation.