This document discusses considerations for eye-gaze interfaces to help people with disabilities, from a human-computer interaction perspective. It explores how users visually scan unfamiliar digital interfaces to understand their interactive potential. Several examples of new interaction modes are presented, including gestures, head movements, touch and eye tracking for control. The document concludes that interface paradigms need to change to properly incorporate tangible and natural user interfaces, rather than merely adding features to graphical user interfaces. More research is needed on multi-sensory feedback for interfaces controlled by eyes, gestures, voice, touch, emotion and cognition.
Considerations about Eye Gaze interfaces for people with disabilities: from the HCI perspective
1. Considerations about Eye Gaze interfaces for people with disabilities: from the HCI perspective
Jacques Chueke
PhD Researcher, Centre for HCI Design
Dr. George Buchanan
(1st Supervisor)
Senior Lecturer, Centre for HCI Design
Stephanie Wilson
(2nd Supervisor)
Senior Lecturer, Centre for HCI Design
Web Accessibility London (a11yLDN) event
London, UK, May 2011
2. Opportunity
I am especially interested in user exploration of new and unfamiliar digital interfaces, to better
understand how users visually scan them to obtain the gist of their interactive potential. The
theory of Perceptible Affordances (Gibson, 1986; Norman, 1988; Gaver, 1991) was identified as
a suitable tool for understanding users' interactions with novel forms of input and new
generations of user interface.
MIT Media Lab: DepthJS – 2011
3. Norman’s Theory of Action
[Diagram: Execution cycle – Formulation of Intention → Specification of Action Sequence → Execution of Actions; Evaluation cycle – Perception → Interpretation → Evaluation; both linked through Interaction.]
Preece et al. (2009: 121), quoting Norman (1986)
4. New Modes of Interaction
Xbox 360 (Kinect) Dashboard, 2011
5. New Modes of Interaction
• PrimeSense / MS Kinect: Swim Browser
PrimeSense browser competition winner: Stolarsky's 'SwimBrowser' – 2011
6. New Modes of Interaction
• eviGroup Paddle Pro
Front-facing webcam to track head movements for cursor control – 2011
7. New Modes of Interaction
• Activity Centered Design: the tool is the way (Norman, 2005) +
Embodiment (Heidegger's 'being-in-the-world' and Dourish): we are the tool.
Microsoft Surface, 2007
8. New Modes of Interaction
• Hitachi: Gesture Remote Control TV Prototype (CES 2009)
11. New Modes of Interaction
• Fast text input: 8Pen-style Android keyboard / FlowMenu
The marking menu system teaches users to make pen-based gestures (pg. 150).
• The Dasher Project: www.inference.phy.cam.ac.uk/dasher/
18. Tobii Lenovo
Pointing With Your Eyes, to Give the Mouse a Break – 2011
19. New Visual Cues/Feedback
Just-in-time chrome can be triggered by touch or proximity – hover effect (pg. 153).
UI affordances are shown on tap. Applying the principle of scaffolding will lead you to far more successful multi-touch and gesture UIs (pg. 154).
Tethers indicate that a size constraint (MS Surface) has been reached on an item being scaled (pg. 91).
Wigdor, D., Wixon, D. Brave NUI World, 2011
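The "just-in-time chrome" principle above can be sketched in a few lines: controls stay hidden until the pointer (or a tracked hand) comes within a reveal radius of the target. This is a minimal illustrative sketch, not any real toolkit's API; the function name and the 80-pixel radius are assumptions chosen for the example.

```python
# Minimal sketch of "just-in-time chrome": interface controls are
# revealed only when the pointer comes near the target item,
# echoing the proximity/hover principle quoted above.
# `chrome_visible` and the 80 px radius are illustrative assumptions.

def chrome_visible(pointer_pos, target_pos, reveal_radius=80):
    """Return True when the pointer is close enough to reveal controls."""
    dx = pointer_pos[0] - target_pos[0]
    dy = pointer_pos[1] - target_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= reveal_radius

# Controls appear only as the pointer approaches the item.
print(chrome_visible((100, 100), (150, 140)))  # ~64 px away -> True
print(chrome_visible((0, 0), (300, 300)))      # far away -> False
```

A real implementation would fade the chrome in and out rather than toggle it, so the affordance itself scaffolds the gesture.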
20. Conclusions
The paradigms that define the GUI and establish its conventions for manipulation, as well as the
presentation of the visual interface itself across different digital environments (e.g. Mac OS,
Windows and the WWW), do not adequately support the most suitable use of novel modes of
interaction.
The interface should change to encompass TUI and NUI, rather than merely co-exist with
additive features bolted onto an already overstretched GUI.
Research on the different kinds of (multi-sensory) feedback is needed in order to encompass
more efficiently the possibilities of interfacing with eyes, gestures, voice, touch, emotions and
the mind itself.
21. Thank you for your attention!
Jacques Chueke
Jacques.chueke.1@city.ac.uk
22. Bibliography
A Guide to Picture and Symbol Sets for Communication (2010). http://www.callscotland.org.uk/Common-Assets/spaw2/uploads/files/A-Guide-to-Picture-and-Symbol-Sets-for-Communication-2010.pdf
Buxton, W. (2001). Less is More (More or Less). In P. Denning (Ed.), The Invisible Future: The Seamless Integration of Technology in Everyday Life. New York: McGraw-Hill, 145–179.
ITU Internet Reports 2005: The Internet of Things – Executive Summary.
van Dam, A. (February 1997). "Post-WIMP User Interfaces". Communications of the ACM (ACM Press) 40 (2): 63–67. doi:10.1145/253671.253708.
Dourish, P. (2004). Where the Action Is: The Foundations of Embodied Interaction. A Bradford Book. Cambridge, MA: MIT Press.
Gaver, W. (1991). Technology Affordances. Proceedings of CHI '91. ACM.
Jacob, R. et al. (2008). "Reality-Based Interaction: A Framework for Post-WIMP Interfaces". CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems. Florence, Italy: ACM, 201–210. doi:10.1145/1357054.1357089. ISBN 978-1-60558-011-1.
23. Bibliography
Nielsen, J. (April 1993). "Noncommand User Interfaces". Communications of the ACM (ACM Press) 36 (4): 83–99. doi:10.1145/255950.153582. http://www.useit.com/papers/noncommand.html.
Norman, D. (1999). Affordance, Conventions and Design. ACM Interactions (May–June 1999), 38–42.
Picard, R. (1998). Affective Computing. Cambridge, MA: MIT Press.
Preece, J., Sharp, H., Rogers, Y. (2009). Interaction Design: Beyond Human-Computer Interaction (2nd edition). West Sussex, UK: John Wiley & Sons.
PsychNology Journal: Communication by Gaze Interaction. Vol. 7, No. 2, ISSN 1720-7525. In cooperation with COGAIN.
Sorensen, M. (2009). Making a Case for Biological and Tangible Interfaces. Proceedings of the Third International Workshop on Physicality. Cambridge, England.
Wigdor, D., Wixon, D. (2011). Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Morgan Kaufmann.
Editor's notes
The way we perceive things is changing. The way we interact with digital artifacts is changing. We need to re-interpret the shift we are living through and review the very language that would better convey the message of NUI. I have also participated in a workshop on Assistive Technology with an eye-tracking device for interface control (Tobii P-10). In June 2011 I met researchers from the main eye-tracking equipment manufacturer, Tobii, in London. They presented me with the Tobii-Lenovo laptop prototype with active eye tracking for control, aimed at able-bodied users. I was really impressed with the experience and the possibility of interfacing with the computer through eye tracking in combination with mouse and keyboard/trackpad. This was a valuable contact, and the Tobii team displayed great interest in my research and in continuing to exchange findings, with the possibility of leaving me the prototype for testing purposes. I have just returned from the first Tobii conference on eye-tracking behavioural research in Frankfurt, and my knowledge from that venue may be of particular value to others who are using or considering eye-tracking methods.
Mention iPhone, iPad. The most important achievement of TUI is bridging the gap between input and output by displaying outputs and inputs on the same surface, helping to integrate perception and action seamlessly into one environment. Sorensen (2009), apud Sharlin (2004). When a tool is physically acted upon, the result is twofold: causality is instantly observed and time is inherently felt. This coupling between user and tool allows for embodiment. ACD models and TUIs, alternatively, are more open-ended in application, requiring the user to explore rather than CONFORM or ADAPT to the tool. 'In other words, as we act through technology that has become ready-to-hand, the technology itself disappears from our immediate concerns. We are caught up in the performance of the work; our mode of being is one of "absorbed coping." The equipment fades into the background. This unspoken background against which our actions are played out is at the heart of Heidegger's view of being-in-the-world.' Dourish (2004: 109)
I am going to let you decide what's wrong with this picture.
Accessibility: the technology conveys opportunities for people with special needs to interface with digital devices (multi-sensory).
Great transdisciplinary research involving HCI, Cognitive Psychology, Linguistics, Semiotics, Visual Perception Theory, Gestalt Theory, etc. A language mixing symbols and icons, almost an ideogram.
TASK: Open software > Print Screen > Open software > Paste > Save File > Select Folder > Tried to change file format (FAIL) > Close software. Change system settings > look down. Change mouse settings > look left.
Alt + Tab with eye-gaze selection of window + Spacebar. Picture Viewer > gaze to zoom in (non-reactive visual cues for switching left/right / bringing back the thumbnail bar). Zooming in at the focus point. Presentation slider (gaze + Spacebar). Scroll text (top/down gaze) – no visual cues for top/down.
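The gaze-plus-spacebar tasks above combine two selection mechanisms: dwelling on a target long enough, or confirming the fixated target immediately with a key press. This is a minimal sketch of that combination under stated assumptions; the class name, the 800 ms dwell threshold, and the `update` signature are all illustrative, not a real Tobii API.

```python
# Hedged sketch of dwell-based gaze selection with spacebar confirmation,
# as in the "Gaze + Spacebar" tasks described above.
# GazeSelector and its parameters are illustrative assumptions.

class GazeSelector:
    def __init__(self, dwell_ms=800):
        self.dwell_ms = dwell_ms       # dwell threshold in milliseconds
        self.current_target = None     # target currently fixated
        self.dwell_start = None        # timestamp when fixation began

    def update(self, target, time_ms, spacebar=False):
        """Feed one gaze sample; return the selected target or None."""
        if target != self.current_target:
            # Gaze moved: restart the dwell timer on the new target.
            self.current_target = target
            self.dwell_start = time_ms
            # Spacebar confirms the freshly fixated target at once.
            return target if (spacebar and target is not None) else None
        if target is None:
            return None
        # Select on explicit confirmation or once the dwell time elapses.
        if spacebar or time_ms - self.dwell_start >= self.dwell_ms:
            return target
        return None

sel = GazeSelector(dwell_ms=800)
print(sel.update("zoom", 0))     # fixation starts -> None
print(sel.update("zoom", 500))   # still dwelling -> None
print(sel.update("zoom", 900))   # dwell reached -> zoom
print(sel.update("scroll", 1000, spacebar=True))  # spacebar -> scroll
```

The design point the tasks highlight still applies: without reactive visual cues (e.g. a dwell-progress indicator), users cannot tell how close they are to triggering a selection.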
The way we perceive things is changing. We need to re-interpret the shift we are living through and review the very language that would better convey the message of Post-WIMP/NUI and encompass the experience of the interaction itself. GUI additions such as Natural User Interfaces, Microsoft's Surface computer, eye tracking and other haptic interfaces are not transforming the underlying problems created with the GUI. Sorensen (2009)