ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments

Marco Porta, Alice Ravarelli, Giovanni Spagnoli
Dipartimento di Informatica e Sistemistica – Università di Pavia
Via Ferrata, 1 – 27100 – Pavia – Italy
marco.porta@unipv.it, alice.ravarelli@unipv.it, giovanni.spagnoli01@ateneopv.it

Abstract

Eye gaze interaction for disabled people is often dealt with by designing ad-hoc interfaces, in which the large size of their elements compensates for both the inaccuracy of eye trackers and the instability of the human eye. Unless solutions for reliable eye cursor control are employed, gaze pointing in ordinary graphical operating environments is a very difficult task. In this paper we present an eye-driven cursor for MS Windows which behaves differently according to the "context". When the user's gaze is perceived within the desktop or a folder, the cursor can be shifted discretely from one icon to another. Within an application window, or where there are no icons, the cursor can instead be moved continuously and precisely. Shifts in the four directions (up, down, left, right) occur through dedicated buttons. To increase user awareness of the currently pointed spot on the screen while continuously moving the cursor, a replica of the spot is provided within the active direction button, resulting in improved pointing performance.

CR Categories: H.1.2 [Models and Principles]: User/Machine Systems—Human Factors; H.5.2 [Information Interfaces and Presentation]: User Interfaces—Input Devices and Strategies, Interaction Styles

Keywords: gaze interaction, eye tracking, eye cursor, eye pointing, assistive technology, alternative communication

Copyright © 2010 by the Association for Computing Machinery, Inc.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org.
ETRA 2010, Austin, TX, March 22 – 24, 2010.
© 2010 ACM 978-1-60558-994-7/10/0003 $10.00

1       Introduction

People affected by severe motor impairments need effective methods for providing input to the computer. Exploiting eye gaze as a substitute for the mouse is potentially the most intuitive way to interact with a PC without using the hands: the "point-and-click" paradigm at the basis of current operating environments is universally adopted, and probably also the one most suitable for two-dimensional interfaces.

However, while pointing tasks are inherently connected with eye fixations [Smith et al. 2000] — using the mouse, we look at a target and then move the cursor to it by means of precise ocular-hand coordination — there are both physiological and technological obstacles which limit pure eye-based pointing. On the one hand, even during fixations the eyes are not perfectly still, but are characterized by jitters of different kinds [Yarbus 1967], such as microsaccades [Engbert and Kliegl 2004]; unless a mechanism for stabilizing the detected gaze position is employed, the eye-controlled pointer will tremble to some extent. On the other hand, even very recent eye trackers have limited precision (typically 0.5 degrees), and consecutive gaze samples acquired by the device cannot be exactly centered on the same point. For these reasons, the basic approach which simply displays the cursor where the user's gaze is detected on the screen is hardly practicable — a shaking cursor is generally annoying, and precise pointing at small targets is practically impossible.

Indeed, most existing eye-controlled interfaces are specifically designed to make up for such limitations. For instance, they are characterized by big graphical elements (e.g. buttons), which can be easily selected even if the user's gaze is detected in slightly different positions. For object selection, the dwell time principle is typically exploited: the mouse click (or double click) is simulated by looking at a target for a certain time. Usually, continuous gaze feedback is avoided, thus eliminating the disturbing effect of a trembling cursor constantly displayed on the screen. Depending on the application, other kinds of feedback may be used, associated with elements of the interface (for example, a button may change its color progressively as it is fixated and the dwell time approaches). Program suites developed for commercially-available eye trackers (e.g. MyTobii [Tobii 2009]) are collections of applications sharing a graphical look and interaction mechanisms, designed specifically for eye gaze interaction.

While a dedicated environment for the execution of eye-controlled programs undoubtedly has a number of advantages, it has some limitations as well. First of all, it constrains the user to employ only the software available in the suite: any other application installed on the computer cannot be controlled by means of the eyes (or, if it can, the task is very difficult, because elements of ordinary graphical user interfaces are usually small and not designed for easy eye pointing). Moreover, program suites are often associated with specific eye trackers: if, for any reason, the user wants to change the device, the old applications may not work properly on the new system. When the market of eye trackers expands (hopefully in the not too distant future), the resulting decrease in prices is likely to accentuate this problem.

Whatever the reason why software purposely designed for eye pointing is not available, the possibility to efficiently use the eyes like a mouse is desirable in many situations. However, until eye trackers become extremely precise machines, proper interaction mechanisms are necessary to compensate for their lack of accuracy, as well as to make up for the intrinsically unstable behavior of the human eye. Several approaches have been proposed to date for reliable eye pointing, trying to find good tradeoffs between accuracy and ease of use. In this paper we present ceCursor, a special pointer which can be controlled through the eyes in different ways, according to the specific context. The cursor, designed for Microsoft Windows operating systems, allows both "rough" and accurate pointing within application windows, while icon selection (within folders and on the desktop) occurs in a "discrete" way.

The paper is structured as follows. Section 2 briefly presents some research projects related to eye cursors and eye pointing in general. Section 3 describes the features of ceCursor and the way it can be employed. Section 4 provides a few technical details about the system. Section 5 illustrates and discusses experimental results. Section 6, finally, draws some conclusions.

2       Related Work

The implementation of reliable eye-controlled cursors has been a stimulating challenge for many years.

Among the oldest projects, it is worth citing Eye Mouse, a communication aid based on electrooculogram (EOG) signals allowing the user to control a normal mouse with a combination of eye movements and blinks [Norris and Wilson 1997]. While rather primitive, Eye Mouse was one of the first attempts at reliably controlling an on-screen cursor for general computer interaction. The famous MAGIC (Manual And Gaze Input Cascaded) pointing project by IBM came shortly after [Zhai et al. 1999]. Starting from the observation that it is unnatural to overload a perceptual channel such as vision with motor control duties, gaze in MAGIC is only used to approximately position the pointing cursor, while the small movements necessary to precisely move it are made by hand — a good approach for people with normal motor abilities, but unfortunately a totally unsuitable strategy for severely disabled users. After MAGIC, several techniques for eye-hand mixed input have been developed, aimed at improving the performance of common mouse-based operations. For example, the very recent Ninja [Räihä and Špakov 2009] and Rake [Blanch and Ortega 2009] cursor methods display several cursors on the screen at the same time and exploit eye gaze to select the currently active one.

Limiting our investigation to pure eye-based interaction, and given the small size of ordinary interface components, help for precise eye pointing can come from zooming. If the fixated area on the screen is enlarged, it becomes easier to select small elements. The first studies in this direction date back ten years [Bates 1999], with experiments aimed at comparing eye-only and eye-with-zoom interaction in target acquisition tests. Subsequent research demonstrated that zooming makes usable eye interaction possible, and that target size is the overriding factor affecting device performance [Bates and Istance 2002]. One of the first projects where zooming was practically exploited to interact with a "normal" operating environment (Microsoft Windows, in particular) is ERICA [Lankford 2000]. In this system, if the user looks at a specific spot on the screen for more than a dwell time, a window appears in which the region the user was fixating is displayed magnified. By looking at a certain point within such a window, mouse clicks are triggered, again using the dwell time principle. An analogous approach is followed by Kumar et al. [2007] in the more recent EyePoint project. In this case, if the user looks at a certain location on the screen and, at the same time, presses a specific key on the keyboard, the observed screen portion is magnified and a grid of dots appears over it. Single, double and right click actions are then performed as soon as the user releases the key. Although this method requires the user to perform a certain physical action (e.g. press a key) to accomplish the selection process, which may not be possible for a disabled person, other solutions could be adopted as well (e.g. dwell time). An interesting variant of the zooming technique is the so-called "fish eye" lens effect [Ashmore et al. 2005]. As when looking through a magnifying lens, the fixated area is expanded, allowing the user to maintain an overview of the screen while selectively zooming in on the region of interest.

Whatever the pointing strategy adopted, the improvement of eye pointing precision is among the main desiderata of people needing eye-based interaction. For instance, Zhang et al. [2008] propose three methods to increase eye cursor stability, namely force field, speed reduction, and warping to target center. The purpose of these techniques is to adjust eye cursor trajectories by offsetting eye jitters, which are the main cause of eye cursor destabilization. As another example of recent research of this kind, Kumar et al. [2008] propose an algorithm for real-time saccade detection, which is used to smooth eye tracking data in real time. The algorithm tries to identify gaze jitters within saccades, which could be mistaken for new saccades and deceive the eye tracker.

Because of the limitations in the steadiness and accuracy of cursor control provided by eye trackers, there are also approaches which combine gaze detection with electromyogram (EMG) signals generated by the facial muscles (e.g. [Chin et al. 2008]). These solutions, although generally slower, can be more accurate than eye-only control, but are unfortunately more invasive, since the user has to wear electrodes on the face.

There are also several implementations of very cheap eye input systems which use normal webcams as an input source. For example, the systems by Gorodnichy and Roth [2004] and by Siriluck et al. [2007] exploit face movements to control the mouse pointing position, and eye blinking to generate mouse clicks. The performance of such solutions, however, is usually very limited and may not be suitable for individuals who can only move their eyes.

3       System Description

ceCursor is basically composed of a square (whose central point indicates the actual pointer position) and of four direction buttons placed around it (Figure 1).

Figure 1 ceCursor

Direction buttons are in the shape of triangles, and are pressed by eye gaze. The cursor is displayed on the screen with a semi-transparent effect, and its size depends on the precision of the employed eye tracker, as well as on the eye pointing ability of the user (a cursor 300 pixels high and wide is usually fine, unless the user is a total novice). The side of the central square is one third of the cursor width.

As will be explained in the next subsections, ceCursor behaves differently according to where it is at a certain moment. In any case, looking inside the central square causes a mouse click to be generated in its center after a dwell time (for instance, one second). The lapse of time is graphically represented by concentric circles progressively appearing within the square and filling it toward the center (Figure 2). After the first click, if another click is generated in the same position, it is interpreted as a double-click.

Figure 2 Click generation process

If the user looks outside the cursor (that is, neither within the central square nor at the direction buttons), after a dwell time the cursor is shifted to a new position — the nearest icon if the cursor is on the desktop or within a folder, or the user's fixation point if the cursor is within an application. A typical dwell time value is one second.

The 'M' placed in the lower-right area near ceCursor, when fixated for a certain time, causes the icon of a mouse to appear (Figure 3): by looking at it, the user can toggle the currently active mouse button (right/left and vice versa).

Figure 3 Icon for changing the active mouse button

The small circle located in the upper-right area near ceCursor is instead used to "stick" it in a certain place on the screen (the cursor becomes more transparent and its color changes to red). This function keeps the cursor out of the way of other user activities (e.g. reading) when it is not needed.

3.1     Case 1: ceCursor on the Desktop or within a Folder

In the presence of icons, ceCursor is "captured" by them. In other words, if the user looks at an area where there are icons, the cursor is automatically positioned on the nearest one. This behavior is in line with the usual activities carried out within a folder or on the desktop, which necessarily involve icons.

When ceCursor is positioned over an icon and the user looks at a direction button, the cursor "jumps" to the next icon in that direction (if there is one). This way, if the direct pointing was not successful, it is very easy to shift the cursor to the right icon. On the one hand, precise pointing is difficult, and it may be hard for the user to select an icon at the first attempt (especially if it is small). On the other hand, since there are no other possible actions that can be performed, it would be useless — or rather, slower — to move the cursor in a "continuous" manner by means of direction buttons: a discrete motion strategy has the advantage of both simplifying the pointing task and speeding up the selection process. Figure 4 shows an example with icons on the desktop.

Figure 4 Discrete movement of ceCursor for icon selection on the Desktop

On the desktop, there is a threshold distance from icons beyond which the "capture process" does not occur (350 pixels in our experiments), and the cursor is moved as within an application window (see Section 3.2). The reason for this is that on the desktop the cursor may be moved to select other elements besides icons, such as parts of application windows. Moreover, when ceCursor is too close to a screen edge where there are icons, it is automatically shifted to the nearest outermost one. "Too close" means that the cursor, if moved further, would not be totally included in the screen, because a direction button would be partially or totally concealed by the edge (which would make further shifts in that direction difficult, or even impossible). Once "hooked" to an icon on the edge, ceCursor can be easily moved to the desired icon using the opposite direction button.

Within a folder, ceCursor can operate with any visualization mode of MS Windows (small and big icons, preview, details, etc.): the cursor is able to recognize the way icons are arranged, as well as their size, so as to move correctly among them (Figure 5).

Figure 5 ceCursor with big (left) and small (right) icons
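The capture-and-jump behavior described in this subsection can be summarized in a short sketch. The following Python fragment is an illustrative reconstruction, not the authors' C# implementation: the representation of icons as a list of center coordinates and the tie-breaking rules are our assumptions, while the 350-pixel desktop capture threshold comes from the text above.

```python
import math

CAPTURE_THRESHOLD = 350  # px; beyond this distance, desktop icons do not "capture" the cursor


def nearest_icon(gaze, icons, threshold=CAPTURE_THRESHOLD):
    """Return the icon center closest to the gaze point, or None if every
    icon is farther than the threshold (continuous motion is used then)."""
    best, best_d = None, float("inf")
    for icon in icons:
        d = math.hypot(icon[0] - gaze[0], icon[1] - gaze[1])
        if d < best_d:
            best, best_d = icon, d
    return best if best_d <= threshold else None


def jump(current, icons, direction):
    """Discrete shift: move to the nearest icon strictly in the given
    direction ('up', 'down', 'left', 'right'); stay put if none exists."""
    dx, dy = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}[direction]
    # Keep only icons with a positive component along the chosen direction.
    candidates = [i for i in icons
                  if (i[0] - current[0]) * dx + (i[1] - current[1]) * dy > 0]
    if not candidates:
        return current
    return min(candidates,
               key=lambda i: math.hypot(i[0] - current[0], i[1] - current[1]))
```

In the real system the icon positions would be obtained from the Windows shell (see Section 4); here they are just coordinate pairs for illustration.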
Actually, it is especially with small icons that the "jumping" motion modality of ceCursor can be appreciated, since in this case the direct pointing task becomes extremely difficult.

To simplify the three common operations performed on a folder window, i.e. "Minimize", "Up one level" and "Close", three big buttons are displayed over a folder when it is opened; they work like the standard buttons of any window (Figure 6). By looking at them for a certain time, the corresponding actions are performed.

Figure 6 Control buttons displayed over a folder window

3.2     Case 2: ceCursor within an "Icon Free" Area

When ceCursor is within an application window, or on the desktop but sufficiently far from icons, it can be precisely moved to point at the desired target.

Looking anywhere within an "icon free" area causes the cursor to be shifted to the fixated spot. However, small interface elements may be difficult to reach at the first attempt. To position the cursor exactly, the user can then use the direction buttons. As long as a direction button is fixated, the cursor is continuously and smoothly moved in that direction (Figure 7). The speed, initially relatively low (50 pixels/sec), increases progressively (by 50 pixels/sec every two seconds).

Figure 7 Schematization of the continuous motion of ceCursor (1 pixel every 1/50 sec in the first two seconds)

The motion of the cursor stops as soon as the user looks outside the button. Once the center of ceCursor (identified by a red circle) is over the target, the user can look inside the central square and start the click generation process.

Indeed, recognizing that the cursor is over the desired (maybe small) target is not always easy. After a first implementation of ceCursor, we soon realized that the pointing task through direction buttons is characterized by very frequent shifts between the button and the central square: accurate adjustments require the user to look alternately at the pointed spot, to check whether the target has been reached, and at the direction buttons, to move the cursor further. Through several informal trials, we found that such a pointing mechanism, besides not being as fast as we would expect, may become annoying in the long run. We therefore implemented a new version of ceCursor, which turned out to be more effective.

In this new version, during cursor movement the area included in the central square is replicated within the active direction button (Figure 8). This way, the user can always be aware of what is being pointed at by the cursor at a certain moment, even while constantly looking at a direction button to reach the target.

Figure 8 Replica of the currently pointed area displayed within the direction button (the cursor is moving rightward in a and downward in b)

Such a solution makes it possible for the user not to lose the "context" of the cursor, avoiding repeated gaze shifts between the central square and the direction button. Indeed, the adopted strategy is especially effective if two "mental steps" are followed in sequence:

  1. Identification of a desired target in the central square
  2. Cursor movement by means of direction buttons, with the target clearly in mind

As will be illustrated in Section 5, our experiments have shown that this last implementation of ceCursor, besides being much appreciated by users, provides better performance in terms of time to complete pointing tasks.

Analogously to what happens within an area containing icons, when ceCursor gets too close to a screen edge (that is, one of the direction buttons starts disappearing), it is shifted so that its center is exactly on the border. The cursor can then be moved precisely to the desired target using the opposite direction button. Without such a mechanism, it would, for example, be impossible to click the 'close' button of an application opened in full screen, or to select one of its menus (Figure 9).
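The continuous motion described in this subsection (an initial speed of 50 pixels/sec, increased by 50 pixels/sec every two seconds, with position updates at the tracker's 50 Hz sampling rate) can be expressed compactly. The sketch below is a Python reconstruction for illustration only, not the original C# code; the function names are ours.

```python
def cursor_speed(t):
    """Speed in pixels/second after the user has fixated a direction
    button for t seconds: 50 px/s initially, +50 px/s every 2 s."""
    return 50 * (1 + int(t // 2))


def step(position, direction, t, rate_hz=50):
    """One update of the continuously moving cursor at the sampling rate
    (50 Hz, i.e. 1 px per sample during the first two seconds, as in Figure 7)."""
    dx, dy = direction  # unit vector, e.g. (1, 0) for rightward motion
    px = cursor_speed(t) / rate_hz  # pixels covered in one sample interval
    return (position[0] + dx * px, position[1] + dy * px)
```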
Figure 9 ceCursor is automatically shifted to the upper border of the screen

4       A Few Technical Details

ceCursor is implemented in C# within the Microsoft .NET framework. As an eye tracker, we used the Tobii 1750 [Tobii Technology AB 2004], which integrates all its components (camera, near-infrared lighting, etc.) into a 17'' monitor. The sampling rate of the device is 50 Hz, i.e. gaze data are acquired 50 times a second.

The system was developed for and tested with MS Windows XP Home Edition. To access the several Windows data and features necessary for ceCursor to work (e.g. information on folder visualization modes, icon size and position, etc.), functions from the user32.dll and kernel32.dll libraries were imported in C#. Cursor rendering was double-buffered, to avoid flickering effects.

A control panel allows all system parameters (e.g. dwell times and level of transparency) to be set through textboxes and sliders, as well as to perform eye tracker calibration.

5       Experiments

Besides informally testing ceCursor many times during its development, we also carried out two more structured experiments (E1 and E2) once it was fully implemented.

Nine testers (aged between 19 and 45, 25.22 on average; seven males and two females) took part in experiment E1. None of these testers had any previous experience with eye tracking devices or eye-controlled interfaces. Two testers (26 and 20, males) participated in experiment E2. Neither of them was a total novice, as they had been involved in some eye tracking tests before.

5.1     Procedure

Both E1 and E2 were composed of two tests, TA and TB, structured as follows:

  TA. Within a folder containing seven other folders in the form of icons (Figure 10a), the user had to open 'folder3' (task 1) and then, in that folder, which contained in turn seven small folders (Figure 10b), open 'folder5' (task 2).

  TB. Within an empty panel displayed in full screen (Figure 11), the user had to click, five times, a small button appearing in five random positions. The size of the button was the same as that of the "close window" button of folders in MS Windows XP.

Figure 10 Folders used for test TA

Figure 11 Panel used for test TB

For both TA and TB, the dependent variable was the time to complete the task (on a single button). Moreover, we introduced a binary sub-variable "Success", whose value was 1 if the user finished the task correctly within a timeout of 30 seconds, and 0 otherwise. "Correctly" means that no wrong operations were performed (such as, for example in TA, opening the wrong folder). For TB, we used a further variable, "Number of Attempts", which measured the number of clicks generated until the button was correctly pressed (unless the timeout was reached).

In order to compare ceCursor with the more "traditional" way of interacting with interfaces through the eyes, we also implemented a simple cursor (simpleC in the following) which merely displayed a small empty square where the user's gaze was perceived. For the equivalent of a mouse double-click to be generated, 100 consecutive gaze samples (i.e. a dwell time of two seconds) had to be detected within a circle with a radius of 10 pixels; the click was centered on the point whose coordinates were the mean values of the acquired samples.

For test TB, we employed two versions of ceCursor: one with the replica of the currently pointed area — we will simply indicate this version as ceCursor — and one without the replica, as in the first implementation — we will call this other version ceCursorWR. For both, the parameter values used in the experiments were the following:

  - Cursor size: 351 pixels (a relatively big cursor, since all the testers in experiment E1 were new to eye gaze input and had a very short training period)

  - Number of samples to be perceived within the central square for the first click to be generated: 60 (a dwell time of a little more than one second)




                                                                     335
Number of samples for the second click (double-click) to be           5.4      Experiment E2 - Test TA
      generated: 60
      Number of samples outside the cursor area for the cursor to           Task 1: opening a folder within a folder containing big icons.
      move there (both in an area with icons and not): 60                   With both cursors, the testers succeeded in the task. Times
      Number of samples on a direction button for the cursor to             measured with simpleC were 5.1 sec for the first tester and 3.5
      move in that direction (both in an area with icons and not):          for the second (mean: 4.3). Times measured with ceCursor were
      60                                                                    3.6 sec for the first tester and 4.5 of the second (mean: 4.05).
                                                                            Comparing these values with the corresponding means for the
Each tester tried both TA and TB. For TA, only simpleC and                  same test and task of experiment E1 (4.03 and 8.14 for the two
ceCursor were used (since in areas with icons there are no repli-           cursors, respectively), it is evident how in the two cases the per-
cas), while for TB all the three cursors were employed. Cursor              formances of simpleC are similar, while they are very different
order was randomized. Screen resolution was 1280x1024.                      for ceCursor (Figure 12a): it seems that a longer training period
                                                                            can actually help speeding up the pointing action.
In E1, prior to the actual test session each tester was clearly ex-
plained how to use the cursors and assisted in exercising with              Task 2: opening a folder within a folder containing small
them (five minutes for each one, thus resulting in a 15 minutes             icons. None of the two testers succeeded in the task with simp-
total training time). The two testers of E2 could instead exercise          leC (they both opened the wrong folder). Despite the extended
with the three cursors for a much longer time — 15 minutes                  training time, the pointing precision is so limited that opening
each, with a total training period of 45 minutes.                           the right folder becomes probably a matter of pure chance. Defi-
                                                                            nitely better results were instead provided by ceCursor: 4 and
5.2      Experiment E1 - Test TA                                            8.5 sec, with a mean of 6.25 sec. Comparing this value with the
                                                                            corresponding mean for the same test and task of experiment E1
Task 1: opening a folder within a folder containing big icons.              (11.95 sec), also in this case a longer training period seems to be
With both simpleC and ceCursor, all the testers succeeded in the            helpful (Figure 12b).
task. A repeated-measures ANOVA (within-subjects design) did
not show a clear relation between cursor type and times (F=3.37,
p=.1), but the means were significantly different (4.03 sec for              10,00
simpleC and 8.14 sec for ceCursor). As could be expected, with                8,00                      E1
big elements that are sufficiently separated each other simpleC               6,00
can provide good results in terms of time to complete the task: if            4,00
                                                                                                                        15,00
                                                                                              E2
the user is able to maintain the gaze adequately focused on a                 2,00
                                                                                         E1                   E2        10,00     E1
                                                                                                                                       E2
(large) target, there is no real need to use mechanisms for pre-              0,00
                                                                                                                         5,00
                                                                                                                         0,00
cisely tuning the position of the cursor.                                               simpleC        ceCursor                  ceCursor

Task 2: opening a folder within a folder containing small                                          a                               b
icons. In this case, all the testers succeeded with ceCursor, but
only two out of nine (22.22%) managed to open the small folder                 Figure 12 Test TA: results of experiment E1 vs. results of
with simpleC: the trembling behavior of this cursor makes it ex-                  experiment E2 (mean times) – (a) task 1, (b) task 2
tremely difficult to aim at small targets. When successful, simp-
leC was relatively fast (mean of 4.2 sec for the two positive tri-
als, versus 11.95 sec for the nine positive outcomes of ceCur-
                                                                            5.5      Experiment E2 - Test TB
sor), but cannot be used for reliable pointing.
                                                                            Considering a time value of 31 seconds when the timeout of 30
                                                                            seconds was reached, the following results (average times to
5.3      Experiment E1 - Test TB                                            click the button, in seconds) were obtained.
Considering a time value of 31 seconds when the timeout of 30                     simpleC: Tester 1      13.91, Tester 2     15.73, Tester 1 +
seconds was reached (i.e. the trial was unsuccessful), a repeated-                Tester 2    14.82 (four successful trials out of five for both
measures ANOVA did not show any relation between cursor                           Tester 1 and Tester 2).
type and time to complete the task (F=.86, p=0.43). Nonetheless,
although means were similar (15.32 sec for simpleC, 14.64 sec                     ceCursorWR: Tester 1       8.64, Tester 2    13.8, Tester 1 +
for ceCursorWR and 13.1 sec for ceCursor), ceCursor showed a                      Tester 2 11.22 (all successful trials).
slightly better performance.                                                      ceCursor: Tester 1      6.7, Tester 2      10.86, Tester 1 +
                                                                                  Tester 2 8.78 (all successful trials).
Looking at success percentages (73.33% for simpleC, 93.33%
for ceCursorWR and 97.78% for ceCursor), it is clear that ce-               As can be seen, while the mean time for simpleC is about the
Cursor resulted a little more effective than its counterpart with-          same as for experiment E1, for ceCursorWR and ceCursor sig-
out the replica — and much more effective than the basic cursor.            nificant reductions can be noted (Figure 13). Moreover, in this
This becomes even more evident if we consider the number of                 case too, ceCursor provided a better performance compared to
clicks generated until button press (or until the available 30              ceCursorWR.
seconds were over). A repeated-measures ANOVA showed a
plain relation between cursor type and number of clicks                     As for the number of clicks generated until button press (or until
(F=26.39, p<.001), with mean values of 4.33 for simpleC, 1.44               the available 30 seconds were over), while only one attempt was
for ceCursorWR and 1.2 for ceCursor.                                        necessary with both ceCursorWR and ceCursor, an average of




                                                                      336
5.2 attempts for Tester 1 and of 6.2 for Tester 2 were needed with simpleC. In a real usage scenario with MS Windows applications, employing simpleC would mean having a very high probability of clicking the wrong target.

Figure 13 Test TB: results of experiment E1 vs. results of experiment E2 (mean times)

5.6      User Preference

Both in E1 and E2, at the end of the experiments the testers were asked to express a preference among the three cursors. In E1, eight testers out of nine said they preferred ceCursor, and one preferred ceCursorWR. In E2, both testers said they preferred ceCursor.

6      Conclusions

Reliable eye pointing in ordinary operating environments is a challenging problem. Small graphical elements need specific mechanisms for precise selection: as demonstrated by our tests, and as easily guessed by anybody who has experienced eye gaze input, a trembling cursor (visible or not) can only be used when targets are sufficiently wide and spaced from each other.

ceCursor has been designed with the purpose of allowing potentially any interface element in MS Windows to be effectively selected. Its simple structure, made up of a central pointing area and of four direction buttons around it, implicitly suggests its use. Compared with strategies based on zooming, ceCursor has the advantage of not requiring any desktop enlargement or deformation, which, if frequent, may be annoying for the user.

One distinctive feature of the solution adopted for ceCursor is its different behavior according to the "context": it can be moved discretely in areas containing icons and continuously within application windows and in "icon-free" areas on the desktop. This makes it simpler and faster for the user to accomplish tasks of different kinds, such as opening applications, navigating the folder structure, selecting links within web pages, pressing small control buttons, etc.

Another distinguishing characteristic of our approach is (for continuous motion) the replica of the currently pointed area within the direction buttons. This strategy has proved to be very helpful for precise pointing, allowing the user not to lose the "context" of the cursor (i.e. what is being aimed at in a certain moment), without the need for constant shifts between the central square and the direction buttons. As our tests have shown, the performance of ceCursor is generally better than that of its counterpart without the replica. Once the pointing mechanism is clear (step 1: identification of a definite target; step 2: search for that target in the direction button), there is little chance of being confused by content duplication within the cursor: in our experiments, ten out of eleven testers said they preferred this solution.

Our tests also show that times to accomplish the pointing tasks exhibit a decreasing trend as the training period increases. Although we were not able to run experiment E2 with the same number of testers as experiment E1, the tendency seems clear. Moreover, times could be further reduced by diminishing cursor size (especially in test TB, ceCursor was occasionally "captured" by screen borders) and by lowering dwell times.

Acknowledgement

This work was supported by funds from the Italian FIRB project "Software and Communication Platforms for High-Performance Collaborative Grid" (grant RBIN043TKY).

References

ASHMORE, M., DUCHOWSKI, A. T., AND SHOEMAKER, G. 2005. Efficient Eye Pointing with a Fisheye Lens. In Proceedings of Graphics Interface, Victoria, British Columbia, 203-210.

BATES, R. 1999. Multimodal Eye-Based Interaction for Zoomed Target Selection on a Standard Graphical User Interface. In Proceedings of Interact'99, vol. II, Edinburgh, Scotland, UK, British Computer Society, 7-8.

BATES, R., AND ISTANCE, H. 2002. Zooming Interfaces!: Enhancing the Performance of Eye Controlled Pointing Devices. In Proceedings of the 5th International ACM Conference on Assistive Technologies, Edinburgh, Scotland, UK, 119-126.

BLANCH, R., AND ORTEGA, M. 2009. Rake Cursor: Improving Pointing Performance with Concurrent Input Channels. In Proceedings of CHI 2009, Boston, MA, USA, 1415-1418.

CHIN, C. A., BARRETO, A., CREMADES, J. G., AND ADJOUADI, M. 2008. Integrated Electromyogram and Eye-Gaze Tracking Cursor Control System for Computer Users with Motor Disabilities. Journal of Rehabilitation Research & Development, Vol. 45, No. 1, 161-174.

ENGBERT, R., AND KLIEGL, R. 2004. Microsaccades Keep the Eyes' Balance During Fixation. Psychological Science, Vol. 15, No. 6, 431-436.

GORODNICHY, D. O., AND ROTH, G. 2004. Nouse 'Use Your Nose as a Mouse' Perceptual Vision Technology for Hands-Free Games and Interfaces. Image and Vision Computing, Vol. 22, No. 12, 931-942.

KUMAR, M., PAEPCKE, A., AND WINOGRAD, T. 2007. EyePoint: Practical Point and Selection Using Gaze and Keyboard. In Proceedings of CHI 2007, 421-430.

KUMAR, M., KLINGNER, J., PURANIK, R., WINOGRAD, T., AND PAEPCKE, A. 2008. Improving the Accuracy of Gaze Input. In Proceedings of ETRA 2008, Savannah, GA, USA, 65-68.

LANKFORD, C. 2000. Effective Eye Gaze Input into Windows. In Proceedings of ETRA 2000, Palm Beach Gardens, FL, USA, ACM, 23-27.
NORRIS, G., AND WILSON, E. 1997. The Eye Mouse, an Eye Communication Device. In Proceedings of the 23rd IEEE Northeast Bioengineering Conference, Durham, NH, USA, 66-67.

RÄIHÄ, K., AND ŠPAKOV, O. 2009. Disambiguating Ninja Cursors with Eye Gaze. In Proceedings of CHI 2009, Boston, MA, USA, 1411-1414.

SIRILUCK, W., KAMOLPHIWONG, S., KAMOLPHIWONG, T., AND SAE-WHONG, S. 2007. Blink and Click. In Proceedings of the 1st International Convention on Rehabilitation Engineering & Assistive Technology, Singapore, 43-46.

SMITH, B. A., HO, J., ARK, W., AND ZHAI, S. 2000. Hand Eye Coordination Patterns in Target Selection. In Proceedings of ETRA 2000, Palm Beach Gardens, FL, USA, 117-122.

TOBII 2009. MyTobii User Manual, Version 2.4. Available: http://www.tobii.com/assistive_technology/support_downloads/downloads.aspx (retrieved September 18, 2009).

TOBII Technology AB 2004. Tobii 50 Series – Product Description. Available: http://tobii.se/downloads/Tobii_50series_PD_Aug04.pdf (retrieved October 5, 2009).

YARBUS, A. L. 1967. Eye Movements and Vision. New York: Plenum Press.

ZHAI, S., MORIMOTO, C., AND IHDE, S. 1999. Manual and Gaze Input Cascaded (MAGIC) Pointing. In Proceedings of CHI 1999, Pittsburgh, PA, USA, 246-253.

ZHANG, X., REN, X., AND ZHA, H. 2008. Improving Eye Cursor's Stability for Eye Pointing Tasks. In Proceedings of CHI 2008, Florence, Italy, 525-534.
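Appendix: the dwell-based click generation described for the simpleC baseline in Section 5.1 (N consecutive gaze samples within a circle of 10-pixel radius, with the click centered on their mean) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the paper does not specify the circle's center, so this sketch assumes it is the mean of the sample window, and the default of 100 samples corresponds to the two-second dwell time the paper reports.

```python
from math import hypot

def dwell_click(samples, radius=10.0, n_required=100):
    """Scan a stream of (x, y) gaze samples and return a click point, or None.

    A click fires when `n_required` consecutive samples all fall within
    `radius` pixels of their own mean; the click is centered on that mean,
    as described for simpleC in Section 5.1.
    """
    window = []
    for x, y in samples:
        window.append((x, y))
        if len(window) > n_required:
            window.pop(0)  # keep only the most recent n_required samples
        if len(window) == n_required:
            cx = sum(p[0] for p in window) / n_required
            cy = sum(p[1] for p in window) / n_required
            # Every sample in the window must lie inside the dwell circle.
            if all(hypot(px - cx, py - cy) <= radius for px, py in window):
                return (cx, cy)
    return None
```

With a 50 Hz tracker, 100 samples correspond to the two-second dwell time used for simpleC, and 60 samples to the "little more than one second" thresholds used for ceCursor.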

Stellmach.2011.designing gaze supported multimodal interactions for the explo...
 
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural TasksRyan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
Ryan Match Moving For Area Based Analysis Of Eye Movements In Natural Tasks
 
Istance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of PerformanceIstance Designing Gaze Gestures For Gaming An Investigation Of Performance
Istance Designing Gaze Gestures For Gaming An Investigation Of Performance
 
Control the computer with your eyes
Control the computer with your eyesControl the computer with your eyes
Control the computer with your eyes
 
Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...Schneider.2011.an open source low-cost eye-tracking system for portable real-...
Schneider.2011.an open source low-cost eye-tracking system for portable real-...
 
Facial-Expression Based Mouse Cursor Control for Physically Challenged Indivi...
Facial-Expression Based Mouse Cursor Control for Physically Challenged Indivi...Facial-Expression Based Mouse Cursor Control for Physically Challenged Indivi...
Facial-Expression Based Mouse Cursor Control for Physically Challenged Indivi...
 
Engelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrievalEngelman.2011.exploring interaction modes for image retrieval
Engelman.2011.exploring interaction modes for image retrieval
 
V4 n2 139
V4 n2 139V4 n2 139
V4 n2 139
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language Interpreter
 
IRJET- Finger Gesture Recognition Using Linear Camera
IRJET-  	  Finger Gesture Recognition Using Linear CameraIRJET-  	  Finger Gesture Recognition Using Linear Camera
IRJET- Finger Gesture Recognition Using Linear Camera
 
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALAHUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
HUMAN COMPUTER INTERACTION TECHNIQUES BY SAIKIRAN PANJALA
 
Moshnyaga The Use Of Eye Tracking For Pc Energy Management
Moshnyaga The Use Of Eye Tracking For Pc Energy ManagementMoshnyaga The Use Of Eye Tracking For Pc Energy Management
Moshnyaga The Use Of Eye Tracking For Pc Energy Management
 
IRJET- Sixth Sense Technology in Image Processing
IRJET-  	  Sixth Sense Technology in Image ProcessingIRJET-  	  Sixth Sense Technology in Image Processing
IRJET- Sixth Sense Technology in Image Processing
 
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
Yamamoto Development Of Eye Tracking Pen Display Based On Stereo Bright Pupil...
 
G0342039042
G0342039042G0342039042
G0342039042
 

Plus de Kalle

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsKalle
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Kalle
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Kalle
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Kalle
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlKalle
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Kalle
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingKalle
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Kalle
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeKalle
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerKalle
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingKalle
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchKalle
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyKalle
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Kalle
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Kalle
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Kalle
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Kalle
 
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Kalle
 
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...Kalle
 
Mulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementMulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementKalle
 

Plus de Kalle (20)

Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of HeatmapsBlignaut Visual Span And Other Parameters For The Generation Of Heatmaps
Blignaut Visual Span And Other Parameters For The Generation Of Heatmaps
 
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
Zhang Eye Movement As An Interaction Mechanism For Relevance Feedback In A Co...
 
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
Wastlund What You See Is Where You Go Testing A Gaze Driven Power Wheelchair ...
 
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
Vinnikov Contingency Evaluation Of Gaze Contingent Displays For Real Time Vis...
 
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze ControlUrbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
Urbina Pies With Ey Es The Limits Of Hierarchical Pie Menus In Gaze Control
 
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
Urbina Alternatives To Single Character Entry And Dwell Time Selection On Eye...
 
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic TrainingTien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
Tien Measuring Situation Awareness Of Surgeons In Laparoscopic Training
 
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
Takemura Estimating 3 D Point Of Regard And Visualizing Gaze Trajectories Und...
 
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser OphthalmoscopeStevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
Stevenson Eye Tracking With The Adaptive Optics Scanning Laser Ophthalmoscope
 
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze TrackerSan Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
San Agustin Evaluation Of A Low Cost Open Source Gaze Tracker
 
Rosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem SolvingRosengrant Gaze Scribing In Physics Problem Solving
Rosengrant Gaze Scribing In Physics Problem Solving
 
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual SearchQvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
Qvarfordt Understanding The Benefits Of Gaze Enhanced Visual Search
 
Prats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement StudyPrats Interpretation Of Geometric Shapes An Eye Movement Study
Prats Interpretation Of Geometric Shapes An Eye Movement Study
 
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
Pontillo Semanti Code Using Content Similarity And Database Driven Matching T...
 
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
Park Quantification Of Aesthetic Viewing Using Eye Tracking Technology The In...
 
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
Palinko Estimating Cognitive Load Using Remote Eye Tracking In A Driving Simu...
 
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
Nakayama Estimation Of Viewers Response For Contextual Understanding Of Tasks...
 
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
Nagamatsu User Calibration Free Gaze Tracking With Estimation Of The Horizont...
 
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...
Nagamatsu Gaze Estimation Method Based On An Aspherical Model Of The Cornea S...
 
Mulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head MovementMulligan Robust Optical Eye Detection During Head Movement
Mulligan Robust Optical Eye Detection During Head Movement
 

Porta Ce Cursor A Contextual Eye Cursor For General Pointing In Windows Environments

ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments

Marco Porta, Alice Ravarelli, Giovanni Spagnoli
Dipartimento di Informatica e Sistemistica – Università di Pavia
Via Ferrata, 1 – 27100 – Pavia – Italy
marco.porta@unipv.it, alice.ravarelli@unipv.it, giovanni.spagnoli01@ateneopv.it

Copyright © 2010 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. ETRA 2010, Austin, TX, March 22–24, 2010. © 2010 ACM 978-1-60558-994-7/10/0003

Abstract

Eye gaze interaction for disabled people is often dealt with by designing ad-hoc interfaces, in which the big size of their elements compensates for both the inaccuracy of eye trackers and the instability of the human eye. Unless solutions for reliable eye cursor control are employed, gaze pointing in ordinary graphical operating environments is a very difficult task. In this paper we present an eye-driven cursor for MS Windows which behaves differently according to the "context". When the user's gaze is perceived within the desktop or a folder, the cursor can be discretely shifted from one icon to another. Within an application window or where there are no icons, on the contrary, the cursor can be continuously and precisely moved. Shifts in the four directions (up, down, left, right) occur through dedicated buttons. To increase user awareness of the currently pointed spot on the screen while continuously moving the cursor, a replica of the spot is provided within the active direction button, resulting in improved pointing performance.

CR Categories: H.1.2 [Models and Principles]: User/Machine Systems—Human Factors; H.5.2 [Information Interfaces and Presentation]: User Interfaces—Input Devices and Strategies, Interaction Styles

Keywords: gaze interaction, eye tracking, eye cursor, eye pointing, assistive technology, alternative communication

1 Introduction

People affected by severe motor impairments need effective methods for providing input to the computer. Exploiting eye gaze as a substitute for the mouse is potentially the most intuitive way to interact with a PC without using the hands: the "point-and-click" paradigm at the basis of current operating environments is universally adopted, and probably also the one most suitable for two-dimensional interfaces.

However, while pointing tasks are inherently connected with eye fixations [Smith et al. 2000] — using the mouse, we look at a target and then move the cursor to it by means of precise ocular-hand coordination — there are both physiological and technological obstacles which limit pure eye-based pointing. On the one hand, even during fixations the eyes are not perfectly still, but are characterized by jitters of different kinds [Yarbus 1967], such as microsaccades [Engbert and Kliegl 2004]; unless a mechanism for stabilizing the detected gaze position is employed, the eye-controlled pointer will tremble to some extent. On the other hand, even very recent eye trackers have a limited precision (typically, 0.5 degrees), and consecutive gaze samples acquired by the device cannot be exactly centered on the same point. For these reasons, the basic approach which simply displays the cursor where the user's gaze is detected on the screen is hardly practicable — a shaking cursor is generally annoying, and precise pointing on small targets is practically impossible.

Indeed, most existing eye-controlled interfaces are specifically designed to make up for such limitations. For instance, they are characterized by big graphical elements (e.g. buttons), which can be easily selected even if the user's gaze is detected in slightly different positions. For object selection, the dwell time principle is typically exploited: the mouse click (or double click) is simulated by looking at a target for a certain time. Usually, continuous gaze feedback is avoided, thus eliminating the bad effect of a trembling cursor constantly displayed on the screen. Depending on the application, other kinds of feedback may be used, associated with elements of the interface (for example, a button may change its color progressively as it is fixated and the dwell time elapses). Program suites developed for commercially available eye trackers (e.g. MyTobii [Tobii 2009]) are collections of applications sharing graphical look and interaction mechanisms, designed specifically for eye gaze interaction.

While a dedicated environment for the execution of eye-controlled programs undoubtedly has a number of advantages, it has some limitations as well. First of all, it constrains the user to employ only the software available in the suite: any other application installed on the computer cannot be controlled by means of the eyes (or, if so, the task is very difficult, because elements of ordinary graphical user interfaces are usually small and not designed for easy eye pointing). Moreover, program suites are often associated with specific eye trackers: if, for any reason, the user wants to change the device, the old applications may not work properly on the new system. When the market of eye trackers expands (in a hopefully not too distant future), the resulting decrease in prices is likely to accentuate this problem.

Whatever the reason why software purposely designed for eye pointing is not available, the possibility to efficiently use the eyes like a mouse is desirable in many situations. However, until eye trackers become extremely precise machines, proper interaction mechanisms are necessary to compensate for their lack of accuracy, as well as to make up for the intrinsically unstable behavior of the human eye. Several approaches have been proposed to date for reliable eye pointing, trying to find good trade-offs between accuracy and ease of use.
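A common stabilization mechanism of the kind mentioned above is a sliding-window average over the raw gaze samples. The following Python sketch is purely illustrative (the paper does not describe its own filter, and Python is not the system's implementation language); the window size is an arbitrary choice:

```python
from collections import deque

class GazeSmoother:
    """Stabilize jittery gaze samples with a sliding-window average.

    The window length (and the coordinates below) are illustrative
    choices, not parameters of the ceCursor system itself.
    """

    def __init__(self, window=6):
        self.samples = deque(maxlen=window)  # oldest samples drop out

    def update(self, x, y):
        """Add a raw gaze sample; return the smoothed (x, y) position."""
        self.samples.append((x, y))
        n = len(self.samples)
        sx = sum(p[0] for p in self.samples) / n
        sy = sum(p[1] for p in self.samples) / n
        return (sx, sy)

# Jittery samples around a fixation at (400, 300) average out:
smoother = GazeSmoother(window=4)
for raw in [(398, 301), (403, 298), (400, 302), (399, 299)]:
    smoothed = smoother.update(*raw)
print(smoothed)  # → (400.0, 300.0)
```

A longer window gives a steadier pointer but a more sluggish response to real saccades, which is exactly the trade-off the saccade-detection approaches discussed in Section 2 try to avoid.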
In this paper we present ceCursor, a special pointer which can be controlled through the eyes in different ways, according to the specific context. The cursor, designed for Microsoft Windows operating systems, allows both "rough" and accurate pointing within application windows, while icon selection (within folders and on the desktop) occurs in a "discrete" way.

The paper is structured as follows. Section 2 briefly presents some research projects related to eye cursors and eye pointing in general. Section 3 describes the features of ceCursor and the way it can be employed. Section 4 provides a few technical details about the system. Section 5 illustrates and discusses experimental results. Section 6, at last, draws some conclusions.

2 Related Work

The implementation of reliable eye-controlled cursors has been a stimulating challenge for many years.

Among the oldest projects, it is worth citing Eye Mouse, a communication aid based on electrooculogram (EOG) signals allowing the user to control a normal mouse with a combination of eye movements and blinks [Norris and Wilson 1997]. While rather primitive, Eye Mouse was one of the first attempts at reliably controlling an on-screen cursor for general computer interaction. The famous MAGIC (Manual And Gaze Input Cascaded) pointing project by IBM came shortly after [Zhai et al. 1999]. Starting from the observation that it is unnatural to overload a perceptual channel such as vision with motor control duties, gaze in MAGIC is only used to approximately position the pointing cursor, while the small movements necessary to precisely move it are made by hand — a good approach for people with normal motor abilities, but unfortunately a totally unsuitable strategy for severely disabled users. After MAGIC, several techniques for eye-hand mixed input have been developed, aimed at improving the performance of common mouse-based operations. Two very recent examples are the Ninja [Räihä and Špakov 2009] and Rake [Blanch and Ortega 2009] cursor methods, where several cursors are displayed on the screen at the same time and eye gaze is exploited to select the currently active one.

Limiting our investigation to pure eye-based interaction, given the small size of ordinary interface components, help with precise eye pointing can come from zooming: if the fixated area on the screen is enlarged, it becomes easier to select small elements. The first studies in this direction date back ten years [Bates 1999], with experiments aimed at comparing eye-only and eye-with-zoom interaction in target acquisition tests. Subsequent research definitely demonstrated that zooming makes usable eye interaction possible, and that target size is the overriding factor affecting device performance [Bates and Istance 2002]. One of the first projects where zooming was practically exploited to interact with a "normal" operating environment (Microsoft Windows, in particular) is ERICA [Lankford 2000]. In this system, if the user looks at a specific spot on the screen for more than a dwell time, a window appears in which the region around the user's fixation is displayed magnified. Looking at a certain point within such a window, mouse clicks are triggered, again using the dwell time principle. An analogous approach is followed by Kumar et al. [2007] in the more recent EyePoint project. In this case, if the user looks at a certain location on the screen and, at the same time, presses a specific key on the keyboard, the observed screen portion is magnified, and a grid of dots appears over it. Single, double and right click actions are then performed as soon as the user releases the key. Although this method requires the user to perform a certain physical action (e.g. press a key) to accomplish the selection process, which may not be possible for a disabled person, other solutions could be adopted as well (e.g. dwell time). An interesting variant of the zooming technique is the so-called "fish eye" lens effect [Ashmore et al. 2005]: as when looking through a magnifying lens, the fixated area is expanded, allowing the user to maintain an overview of the screen while selectively zooming in on the region of interest.

Whatever the pointing strategy adopted, the improvement of eye pointing precision is among the main desiderata of people needing eye-based interaction. For instance, Zhang et al. [2008] propose three methods to increase eye cursor stability, namely force field, speed reduction, and warping to target center. The purpose of these techniques is to adjust eye cursor trajectories by offsetting eye jitters, which are the main cause of cursor instability. As another example of recent research of this kind, Kumar et al. [2008] propose an algorithm for real-time saccade detection, which is used to smooth eye tracking data in real time. The algorithm tries to identify gaze jitters within saccades, which could be mistaken for new saccades and deceive the eye tracker.

Because of the limitations in the steadiness and accuracy of cursor control provided by eye trackers, there are also approaches which combine gaze detection with electromyogram (EMG) signals generated by the facial muscles (e.g. [Chin et al. 2008]). These solutions, although generally slower, can be more accurate than eye-only control, but are unfortunately more invasive, since the user has to wear electrodes on the face.

There are also several implementations of very cheap eye input systems which use normal webcams as an input source. For example, the systems by Gorodnichy and Roth [2004] and by Siriluck et al. [2007] exploit face movements to control the mouse pointing position, and eye blinking to generate mouse clicks. The performance of such solutions, however, is usually very limited, and they may not be suitable for individuals who can only move the eyes.

3 System Description

ceCursor is basically composed of a square (whose central point indicates the actual pointer position) and of four direction buttons placed around it (Figure 1).

Figure 1: ceCursor

Direction buttons are in the shape of triangles, and are pressed by eye gaze.
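The dwell time principle used throughout the system — a click after roughly one second of gaze, and a second click at the same position interpreted as a double-click — can be modeled as a small state machine. The sketch below is an interpretation, not the authors' code; the class name, the explicit `now` timestamp, and the string events are all illustrative:

```python
class DwellClicker:
    """Accumulate gaze time inside the central square and emit a click
    when the dwell threshold elapses; a second click generated at the
    same position is reported as a double-click.

    Illustrative model of the behavior described in the paper, not the
    ceCursor implementation itself.
    """

    def __init__(self, dwell_s=1.0):
        self.dwell_s = dwell_s
        self.enter_time = None       # when gaze entered the square
        self.last_click_pos = None   # position of the previous click

    def update(self, gaze_in_square, pos, now):
        """Call once per gaze sample; returns 'click', 'double-click' or None."""
        if not gaze_in_square:
            self.enter_time = None   # dwell restarts when gaze leaves
            return None
        if self.enter_time is None:
            self.enter_time = now
            return None
        if now - self.enter_time >= self.dwell_s:
            self.enter_time = None   # restart accumulation after firing
            if self.last_click_pos == pos:
                self.last_click_pos = None
                return "double-click"
            self.last_click_pos = pos
            return "click"
        return None

# Samples every 0.5 s at one position: first dwell clicks, second dwell
# at the same position double-clicks.
clicker = DwellClicker(dwell_s=1.0)
events = [clicker.update(True, (400, 300), t * 0.5) for t in range(7)]
print([e for e in events if e])  # → ['click', 'double-click']
```

In the real system the elapsing time is additionally visualized by the concentric circles filling the central square, described next.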
transparent effect, and its size depends on the precision of the employed eye tracker, as well as on the eye pointing ability of the user (a cursor 300 pixels wide and high is usually fine, unless the user is a complete novice). The side of the central square is one third of the cursor width.

As will be explained in the next subsections, ceCursor behaves differently according to where it is at a certain moment. In any case, looking inside the central square causes a mouse click to be generated at its center after a dwell time (for instance, one second). The passing of time is graphically represented by concentric circles progressively appearing within the square and filling it toward the center (Figure 2). After the first click, if another click is generated in the same position, it is interpreted as a double-click.

Figure 2 Click generation process

If the user looks outside the cursor (that is, neither within the central square nor at a direction button), after a dwell time the cursor is shifted to a new position: the nearest icon if the cursor is on the desktop or within a folder, or the user's fixation point if the cursor is within an application. A typical dwell time value is one second.

The 'M' placed in the lower-right area near ceCursor, when fixated for a certain time, causes the icon of a mouse to appear (Figure 3): by looking at it, the user can change the currently active mouse button (right to left and vice versa, alternately).

Figure 3 Icon for changing the active mouse button

The small circle located in the upper-right area near ceCursor is instead used to "stick" the cursor in a certain place on the screen (it becomes more transparent and its color changes to red). This function keeps the cursor out of the way of other user activities (e.g. reading) when it is not needed.

3.1 Case 1: ceCursor on the Desktop or within a Folder

In the presence of icons, ceCursor is "captured" by them. In other words, if the user looks at an area where there are icons, the cursor is automatically positioned on the nearest one. This behavior is in line with the usual activities carried out within a folder or on the desktop, which necessarily involve icons.

When ceCursor is positioned over an icon and the user looks at a direction button, the cursor "jumps" to the next icon in that direction (if there is one). This way, if direct pointing was not successful, it is very easy to shift the cursor to the right icon. On the one hand, precise pointing is difficult, and it may be hard for the user to select an icon at the first attempt (especially if it is small). On the other hand, since there are no other possible actions to perform, it would be useless (or rather, slower) to move the cursor in a "continuous" manner by means of the direction buttons: a discrete motion strategy has the advantage of both simplifying the pointing task and speeding up the selection process. Figure 4 shows an example with icons on the desktop.

Figure 4 Discrete movement of ceCursor for icon selection on the desktop

On the desktop, there is a threshold distance from icons beyond which the "capture process" does not occur (350 pixels in our experiments), and the cursor is moved as it would be within an application window (see Section 3.2). This is because on the desktop the cursor may be moved to select elements other than icons, such as parts of application windows. Moreover, when ceCursor is too close to a screen edge where there are icons, it is automatically shifted to the nearest outermost icon. "Too close" means that the cursor, if moved further, would no longer be entirely included in the screen, because a direction button would be partially or totally concealed by the edge (which would make further shifts in that direction difficult, or even impossible). Once "hooked" to an icon on the edge, ceCursor can be easily moved to the desired icon using the opposite direction button.

Within a folder, ceCursor can operate with any visualization mode of MS Windows (small and big icons, preview, details, etc.): the cursor is able to recognize the way icons are arranged, as well as their size, and to move correctly among them (Figure 5).

Figure 5 ceCursor with big (left) and small (right) icons
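The capture and jump behavior described above can be sketched in a few lines. The following is an illustrative Python fragment (the actual system is written in C#); the function names and the icon representation as (x, y) center points are our own assumptions, and the jump rule is simplified to "nearest icon in the half-plane of the chosen direction", whereas ceCursor also accounts for how icons are arranged:

```python
import math

CAPTURE_THRESHOLD = 350  # desktop capture distance (pixels) reported in the paper

def nearest_icon(gaze, icons, threshold=CAPTURE_THRESHOLD):
    """Return the icon center closest to the gaze point, or None if every
    icon is farther than the capture threshold (continuous motion applies)."""
    if not icons:
        return None
    best = min(icons, key=lambda i: math.dist(gaze, i))
    return best if math.dist(gaze, best) <= threshold else None

def jump(current, icons, direction):
    """Discrete move to the next icon in the given direction
    ('left', 'right', 'up' or 'down'); stay put if there is none."""
    dx, dy = {'left': (-1, 0), 'right': (1, 0),
              'up': (0, -1), 'down': (0, 1)}[direction]
    # Icons strictly beyond the current position along the chosen axis
    candidates = [i for i in icons
                  if (i[0] - current[0]) * dx + (i[1] - current[1]) * dy > 0]
    return min(candidates, key=lambda i: math.dist(current, i)) if candidates else current
```

For example, with icons at (100, 100) and (300, 100), a gaze sample at (120, 110) snaps the cursor to the first icon, and fixating the right direction button then jumps it to the second.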
Actually, it is especially with small icons that the "jumping" motion modality of ceCursor can be appreciated, since in this case direct pointing becomes extremely difficult.

To simplify the three common operations performed on a folder window, i.e. "Minimize", "Up one level" and "Close", when a folder is opened three big buttons are displayed over it, which work like the standard buttons of any window (Figure 6). Looking at one of them for a certain time performs the corresponding action.

Figure 6 Control buttons displayed over a folder window

3.2 Case 2: ceCursor within an "Icon Free" Area

When ceCursor is within an application window, or on the desktop but sufficiently far from icons, it can be precisely moved to point at the desired target.

Looking anywhere within an "icon free" area causes the cursor to be shifted to the fixated spot. However, small interface elements may be difficult to hit at the first attempt. To position the cursor exactly, the user can then use the direction buttons. As long as a direction button is fixated, the cursor is continuously and smoothly moved in that direction (Figure 7). The speed, initially relatively low (50 pixels/sec), rises progressively, with an increase of 50 pixels/sec every two seconds.

Figure 7 Schematization of the continuous motion of ceCursor (1 pixel every 1/50 sec in the first two seconds)

The motion of the cursor stops as soon as the user looks outside the button. Once the center of ceCursor (identified by a red circle) is over the target, the user can look inside the central square and start the click generation process.

Indeed, recognizing that the cursor is over the desired (maybe small) target is not always easy. After a first implementation of ceCursor, we soon realized that the pointing task through direction buttons is characterized by very frequent shifts between the button and the central square: accurate adjustments require the user to look alternately at the pointed spot, to check whether the target has been reached, and at the direction buttons, to move the cursor further. Through several informal trials, we found that such a pointing mechanism, besides not being as fast as we expected, may become annoying in the long run.

We therefore implemented a new version of ceCursor, which turned out to be more effective. In this new version, during cursor movement the area included in the central square is replicated within the active direction button (Figure 8). This way, the user can always be aware of what the cursor is pointing at, even while constantly looking at a direction button to reach the target.

Figure 8 Replica of the currently pointed area displayed within the direction button (the cursor is moving rightward in a and downward in b)

Such a solution makes it possible for the user not to lose the "context" of the cursor, avoiding repeated gaze shifts between the central square and the direction button. Indeed, the adopted strategy is especially effective if two "mental steps" are followed in sequence:

1. Identification of a desired target in the central square
2. Cursor movement by means of direction buttons, with the target clearly in mind

As will be illustrated in Section 5, our experiments have shown that this last implementation of ceCursor, besides being much appreciated by users, provides better performance in terms of time to complete pointing tasks.

Analogously to what happens within an area containing icons, when ceCursor gets too close to a screen edge (that is, one of the direction buttons starts disappearing), it is shifted so that its center is exactly on the border. The cursor can then be moved precisely to the desired target using the opposite direction button. Without such a mechanism, it would, for example, be impossible to click the 'close' button of an application opened in full screen, or to select one of its menus (Figure 9).
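The accelerating motion profile described above (50 pixels/sec initially, plus 50 pixels/sec every two seconds, updated at the 50 Hz sampling rate of the tracker) can be sketched as follows. This is a minimal Python illustration of the speed schedule, not the paper's C# code; the function name is ours:

```python
TICK = 1 / 50          # one update per gaze sample (50 Hz)
BASE_SPEED = 50        # pixels per second at the start of the motion
SPEED_STEP = 50        # speed increase applied every STEP_PERIOD seconds
STEP_PERIOD = 2.0

def displacement(fixation_seconds):
    """Total distance (pixels) covered while a direction button stays
    fixated, integrating the piecewise-constant speed tick by tick."""
    x = 0.0
    ticks = int(round(fixation_seconds / TICK))
    for i in range(ticks):
        t = i * TICK
        speed = BASE_SPEED + SPEED_STEP * int(t // STEP_PERIOD)
        x += speed * TICK
    return x
```

With this schedule, a two-second fixation moves the cursor 100 pixels (1 pixel per tick, matching Figure 7), and a four-second fixation moves it 300 pixels, since the second two-second interval runs at 100 pixels/sec.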
Figure 9 ceCursor is automatically shifted to the upper border of the screen

4 A Few Technical Details

ceCursor is implemented in C# within the Microsoft .NET framework. As an eye tracker, we used the Tobii 1750 [Tobii Technology AB 2004], which integrates all its components (camera, near-infrared lighting, etc.) into a 17'' monitor. The sampling rate of the device is 50 Hz, i.e. gaze data are acquired 50 times a second.

The system was developed for and tested with MS Windows XP Home Edition. To access the several Windows data and features necessary for ceCursor to work (e.g. information on folder visualization modes, icon size and position, etc.), functions from the user32.dll and kernel32.dll libraries were imported in C#. Cursor rendering was double-buffered, to avoid flickering effects.

A control panel allows all system parameters (e.g. dwell times and level of transparency) to be set through textboxes and sliders, as well as eye tracker calibration to be performed.

5 Experiments

Besides informally testing ceCursor many times during its development, we also carried out two more structured experiments (E1 and E2) once it was fully implemented.

Nine testers (aged between 19 and 45, 25.22 on average; seven males and two females) took part in experiment E1. None of these testers had any previous experience with eye tracking devices and eye-controlled interfaces. Two testers (26 and 20, males) participated in experiment E2. Neither of them was a complete novice, as both had been involved in some eye tracking tests before.

5.1 Procedure

Both E1 and E2 were composed of two tests, TA and TB, structured as follows:

TA. Within a folder containing seven other folders in the form of icons (Figure 10a), the user had to open 'folder3' (task 1) and then, in that folder, which in turn contained seven small folders (Figure 10b), to open 'folder5' (task 2).

TB. Within an empty panel displayed in full screen (Figure 11), the user had to click, five times, a small button appearing in five random positions. The size of the button was the same as that of the "close window" button of folders in MS Windows XP.

Figure 10 Folders used for test TA

Figure 11 Panel used for test TB

For both TA and TB, the dependent variable was the time to complete the task (on a single button). Moreover, we introduced a binary sub-variable "Success", whose value was 1 if the user finished the task correctly within a timeout of 30 seconds, 0 otherwise. "Correctly" means that no wrong operations were performed (such as, for example in TA, opening the wrong folder). For TB, we used a further variable, "Number of Attempts", which measured the number of clicks generated until the button was correctly pressed (unless the timeout was reached).

In order to compare ceCursor with the more "traditional" way of interacting with interfaces through the eyes, we also implemented a simple cursor (simpleC in the following) which merely displayed an empty small square where the user's gaze was perceived. For the equivalent of a mouse double-click to be generated, 100 consecutive gaze samples (i.e. a dwell time of two seconds) had to be detected within a circle with a radius of 10 pixels; the click was centered on the point whose coordinates were the mean values of the acquired samples.

For test TB, we employed two versions of ceCursor: one with the replica of the currently pointed area (we will simply indicate this version with ceCursor) and one without the replica, as in the first implementation (we will call this other version ceCursorWR). For both cases, the parameter values used in the experiments were the following:

Cursor size: 351 pixels (a relatively big cursor, since all the testers in experiment E1 were new to eye gaze input and had a very short training period)

Number of samples to be perceived within the central square for the first click to be generated: 60 (dwell time of a little more than one second)
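The simpleC selection rule (100 consecutive samples within a 10-pixel-radius circle, click at their mean) can be sketched as below. This is an illustrative Python fragment, not the paper's C# code; in particular, the paper does not say where the circle is centered, so centering it on the running mean of the buffered samples is our assumption, as is the class name:

```python
from collections import deque
import math

DWELL_SAMPLES = 100   # two seconds at the tracker's 50 Hz rate
RADIUS = 10           # pixels

class SimpleCClicker:
    """Sketch of simpleC: once 100 consecutive gaze samples fall within
    a 10-pixel-radius circle, a click is issued at their mean position."""

    def __init__(self):
        self.buffer = deque(maxlen=DWELL_SAMPLES)

    def feed(self, sample):
        """Add one (x, y) gaze sample; return a click position or None."""
        self.buffer.append(sample)
        if len(self.buffer) < DWELL_SAMPLES:
            return None
        cx = sum(x for x, _ in self.buffer) / DWELL_SAMPLES
        cy = sum(y for _, y in self.buffer) / DWELL_SAMPLES
        if all(math.dist((cx, cy), s) <= RADIUS for s in self.buffer):
            self.buffer.clear()   # restart the dwell after a click
            return (cx, cy)
        return None
```

A steady fixation produces a click after exactly two seconds of samples, while a gaze that jitters beyond the 10-pixel circle never does, which is precisely why simpleC struggles with small targets.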
Number of samples for the second click (double-click) to be generated: 60

Number of samples outside the cursor area for the cursor to be moved there (both in areas with icons and without): 60

Number of samples on a direction button for the cursor to move in that direction (both in areas with icons and without): 60

Each tester tried both TA and TB. For TA, only simpleC and ceCursor were used (since in areas with icons there are no replicas), while for TB all three cursors were employed. Cursor order was randomized. Screen resolution was 1280x1024.

In E1, prior to the actual test session, each tester received a clear explanation of how to use the cursors and was assisted in practicing with them (five minutes for each one, resulting in a 15-minute total training time). The two testers of E2 could instead practice with the three cursors for a much longer time: 15 minutes each, for a total training period of 45 minutes.

5.2 Experiment E1 - Test TA

Task 1: opening a folder within a folder containing big icons. With both simpleC and ceCursor, all the testers succeeded in the task. A repeated-measures ANOVA (within-subjects design) did not show a clear relation between cursor type and times (F=3.37, p=.1), although the means were noticeably different (4.03 sec for simpleC and 8.14 sec for ceCursor). As could be expected, with big elements that are sufficiently separated from each other, simpleC can provide good results in terms of time to complete the task: if the user is able to keep the gaze adequately focused on a (large) target, there is no real need for mechanisms that precisely tune the position of the cursor.

Task 2: opening a folder within a folder containing small icons. In this case, all the testers succeeded with ceCursor, but only two out of nine (22.22%) managed to open the small folder with simpleC: the trembling behavior of this cursor makes it extremely difficult to aim at small targets. When successful, simpleC was relatively fast (mean of 4.2 sec for the two positive trials, versus 11.95 sec for the nine positive outcomes of ceCursor), but it cannot be used for reliable pointing.

5.3 Experiment E1 - Test TB

Considering a time value of 31 seconds when the timeout of 30 seconds was reached (i.e. the trial was unsuccessful), a repeated-measures ANOVA did not show any relation between cursor type and time to complete the task (F=.86, p=0.43). Nonetheless, although the means were similar (15.32 sec for simpleC, 14.64 sec for ceCursorWR and 13.1 sec for ceCursor), ceCursor showed a slightly better performance.

Looking at the success percentages (73.33% for simpleC, 93.33% for ceCursorWR and 97.78% for ceCursor), ceCursor proved a little more effective than its counterpart without the replica, and much more effective than the basic cursor. This becomes even more evident if we consider the number of clicks generated until the button was pressed (or until the available 30 seconds were over). A repeated-measures ANOVA showed a clear relation between cursor type and number of clicks (F=26.39, p<.001), with mean values of 4.33 for simpleC, 1.44 for ceCursorWR and 1.2 for ceCursor.

5.4 Experiment E2 - Test TA

Task 1: opening a folder within a folder containing big icons. With both cursors, the testers succeeded in the task. Times measured with simpleC were 5.1 sec for the first tester and 3.5 sec for the second (mean: 4.3). Times measured with ceCursor were 3.6 sec for the first tester and 4.5 sec for the second (mean: 4.05). Comparing these values with the corresponding means for the same test and task of experiment E1 (4.03 and 8.14 sec for the two cursors, respectively), it is evident that the performance of simpleC is similar in the two cases, while it is very different for ceCursor (Figure 12a): it seems that a longer training period can actually help speed up the pointing action.

Task 2: opening a folder within a folder containing small icons. Neither of the two testers succeeded in the task with simpleC (they both opened the wrong folder). Despite the extended training time, the pointing precision is so limited that opening the right folder probably becomes a matter of pure chance. Definitely better results were instead obtained with ceCursor: 4 and 8.5 sec, with a mean of 6.25 sec. Comparing this value with the corresponding mean for the same test and task of experiment E1 (11.95 sec), in this case too a longer training period seems to be helpful (Figure 12b).

Figure 12 Test TA: results of experiment E1 vs. results of experiment E2 (mean times) – (a) task 1, (b) task 2

5.5 Experiment E2 - Test TB

Considering a time value of 31 seconds when the timeout of 30 seconds was reached, the following results (average times to click the button, in seconds) were obtained.

simpleC: Tester 1 13.91, Tester 2 15.73, Tester 1 + Tester 2 14.82 (four successful trials out of five for both Tester 1 and Tester 2).

ceCursorWR: Tester 1 8.64, Tester 2 13.8, Tester 1 + Tester 2 11.22 (all trials successful).

ceCursor: Tester 1 6.7, Tester 2 10.86, Tester 1 + Tester 2 8.78 (all trials successful).

As can be seen, while the mean time for simpleC is about the same as in experiment E1, for ceCursorWR and ceCursor significant reductions can be noted (Figure 13). Moreover, in this case too, ceCursor performed better than ceCursorWR.

As for the number of clicks generated until the button was pressed (or until the available 30 seconds were over), while only one attempt was necessary with both ceCursorWR and ceCursor, an average of
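The scoring convention used above (timeouts counted as 31 seconds, plus a binary Success variable) can be made concrete with a short helper. This is an illustrative Python sketch with made-up trial data, not the actual analysis script; the function name and the use of None to mark a timed-out trial are our own conventions:

```python
TIMEOUT = 30   # seconds available per trial
PENALTY = 31   # value substituted for a trial that reached the timeout

def summarize(trials):
    """Mean completion time (with timeouts counted as 31 s) and success
    percentage. Each trial is a time in seconds, or None on timeout."""
    times = [PENALTY if t is None else t for t in trials]
    mean_time = sum(times) / len(times)
    success_pct = 100 * sum(t is not None for t in trials) / len(trials)
    return mean_time, success_pct
```

For instance, five hypothetical trials of 10, 20, 15 and 5 seconds plus one timeout yield a mean of 16.2 seconds and an 80% success rate, mirroring how the simpleC averages above absorb their unsuccessful trials.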
Figure 13 Test TB: results of experiment E1 vs. results of experiment E2 (mean times)

5.2 attempts for Tester 1 and 6.2 for Tester 2 were needed with simpleC. In a real usage scenario with MS Windows applications, employing simpleC would mean a very high probability of clicking the wrong target.

5.6 User Preference

Both in E1 and E2, at the end of the experiments the testers were asked to express a preference among the three cursors. In E1, eight testers out of nine said they preferred ceCursor, and one ceCursorWR. In E2, both testers said they preferred ceCursor.

6 Conclusions

Reliable eye pointing in ordinary operating environments is a challenging problem. Small graphical elements need specific mechanisms for precise selection: as demonstrated by our tests, and easily guessed by anybody who has experienced eye gaze input, a trembling cursor (visible or not) can only be used when targets are sufficiently wide and spaced apart.

ceCursor has been designed with the purpose of allowing potentially any interface element in MS Windows to be effectively selected. Its simple structure, made up of a central pointing area and four direction buttons around it, implicitly suggests its use. Compared with strategies based on zooming, ceCursor has the advantage of not requiring any desktop enlargement or deformation, which, if frequent, may be annoying for the user.

One distinctive feature of the solution adopted for ceCursor is its different behavior according to the "context": it can be moved discretely in areas containing icons, and continuously within application windows and in "icon-free" areas on the desktop. This makes it simpler and faster for the user to accomplish tasks of different kinds, such as opening applications, navigating the folder structure, selecting links within web pages, pressing small control buttons, etc.

Another distinguishing characteristic of our approach is (for continuous motion) the replica of the currently pointed area within direction buttons. This strategy has proved very helpful for precise pointing, allowing the user not to lose the "context" of the cursor (i.e. what is being aimed at in a certain moment), without the need for constant shifts between the central square and the direction buttons. As our tests have shown, the performance of ceCursor is generally better than that of its counterpart without the replica. Once the pointing mechanism is clear (step 1: identification of a definite target; step 2: search for that target in the direction button), there is little chance of being confused by content duplication within the cursor: in our experiments, ten out of eleven testers said they preferred this solution.

Our tests also show that the times to accomplish the pointing tasks exhibit a decreasing trend as the training period increases. Although we were not able to carry out experiment E2 with the same number of testers as experiment E1, this seems to be the tendency. Moreover, times could be further reduced by diminishing the cursor size (especially in test TB, ceCursor was occasionally "captured" by the screen borders) and by lowering dwell times.

Acknowledgement

This work was supported by funds from the Italian FIRB project "Software and Communication Platforms for High-Performance Collaborative Grid" (grant RBIN043TKY).

References

ASHMORE, M., DUCHOWSKI, A. T., AND SHOEMAKER, G. 2005. Efficient Eye Pointing with a Fisheye Lens. In Proceedings of Graphics Interface, Victoria, British Columbia, 203-210.

BATES, R. 1999. Multimodal Eye-Based Interaction for Zoomed Target Selection on a Standard Graphical User Interface. In Proceedings of Interact'99, vol. II, Edinburgh, Scotland, UK, British Computer Society, 7-8.

BATES, R., AND ISTANCE, H. 2002. Zooming interfaces!: enhancing the performance of eye controlled pointing devices. In Proceedings of the 5th International ACM Conference on Assistive Technologies, Edinburgh, Scotland, UK, 119-126.

BLANCH, R., AND ORTEGA, M. 2009. Rake Cursor: Improving Pointing Performance with Concurrent Input Channels. In Proceedings of CHI 2009, Boston, MA, USA, 1415-1418.

CHIN, C. A., BARRETO, A., CREMADES, J. G., AND ADJOUADI, M. 2008. Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities. Journal of Rehabilitation Research & Development, Vol. 45, No. 1, 161-174.

ENGBERT, R., AND KLIEGL, R. 2004. Microsaccades Keep the Eyes' Balance During Fixation. Psychological Science, Vol. 15, No. 6, 431-436.

GORODNICHY, D. O., AND ROTH, G. 2004. Nouse 'use your nose as a mouse' perceptual vision technology for hands-free games and interfaces. Image and Vision Computing, Vol. 22, No. 12, 931-942.

KUMAR, M., PAEPCKE, A., AND WINOGRAD, T. 2007. EyePoint: Practical Point and Selection Using Gaze and Keyboard. In Proceedings of CHI 2007, 421-430.

KUMAR, M., KLINGNER, J., PURANIK, R., WINOGRAD, T., AND PAEPCKE, A. 2008. Improving the Accuracy of Gaze Input. In Proceedings of ETRA 2008, Savannah, GA, USA, 65-68.

LANKFORD, C. 2000. Effective Eye Gaze Input into Windows. In Proceedings of ETRA 2000, Palm Beach Garden, FL, USA, ACM, 23-27.
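The context-dependent behavior summarized in the conclusions (discrete jumps when icons are involved, continuous motion otherwise) amounts to a small mode-selection rule. The following Python fragment is a simplified sketch of that dispatch, under our own naming and the 350-pixel desktop threshold reported in Section 3.1; the real C# implementation also handles folder visualization modes and screen edges:

```python
import math

CAPTURE_THRESHOLD = 350  # desktop icon-capture distance (pixels)

def motion_mode(gaze, icons, in_folder):
    """Pick ceCursor's motion strategy from the current context:
    'discrete' when icons are involved, 'continuous' otherwise."""
    if in_folder:
        return 'discrete'   # folder contents are always icon-based
    near = icons and min(math.dist(gaze, i) for i in icons) <= CAPTURE_THRESHOLD
    return 'discrete' if near else 'continuous'
```

For example, a gaze sample landing near a desktop icon selects the discrete jumping mode, while the same sample in an empty region of an application window selects the continuous, direction-button-driven mode.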
NORRIS, G., AND WILSON, E. 1997. The Eye Mouse, an eye communication device. In Proceedings of the 23rd IEEE Northeast Bioengineering Conference, Durham, NH, USA, 66-67.

RÄIHÄ, K., AND ŠPAKOV, O. 2009. Disambiguating Ninja Cursors with Eye Gaze. In Proceedings of CHI 2009, Boston, MA, USA, 1411-1414.

SIRILUCK, W., KAMOLPHIWONG, S., KAMOLPHIWONG, T., AND SAE-WHONG, S. 2007. Blink and Click. In Proceedings of the 1st International Convention on Rehabilitation Engineering & Assistive Technology, Singapore, 43-46.

SMITH, B. A., HO, J., ARK, W., AND ZHAI, S. 2000. Hand eye coordination patterns in target selection. In Proceedings of ETRA 2000, Palm Beach Garden, FL, USA, 117-122.

TOBII 2009. MyTobii User Manual, Version 2.4. Available: http://www.tobii.com/assistive_technology/support_downloads/downloads.aspx (retrieved September 18th, 2009).

TOBII Technology AB 2004. Tobii 50 Series – Product Description. Available: http://tobii.se/downloads/Tobii_50series_PD_Aug04.pdf (retrieved October 5, 2009).

YARBUS, A. L. 1967. Eye Movements and Vision. New York: Plenum Press.

ZHAI, S., MORIMOTO, C., AND IHDE, S. 1999. Manual And Gaze Input Cascaded (MAGIC) Pointing. In Proceedings of CHI 1999, Pittsburgh, PA, USA, 246-253.

ZHANG, X., REN, X., AND ZHA, H. 2008. Improving Eye Cursor's Stability for Eye Pointing Tasks. In Proceedings of CHI 2008, Florence, Italy, 525-534.