An Overview of Interactive Surfaces: Applications, Sensors, and Dimensions

                           Song Li¹, Kathryn Clagett¹, Geisa Bugs¹

                ¹ Mobile Interaction
                Institute for Geoinformatics – IFGI, University of Münster, Germany
             felisong@gmail.com, kathryn.clagett@gmail.com, geisa80@yahoo.com.br



       Abstract. This paper summarizes several aspects of interactive surface technology
       by exploring its applications, sensors, and dimensions. Application areas are
       surveyed and sub-divided into the categories of entertainment, collaboration,
       communication, computer interface, and customer-vendor interface. Sensing
       technology is likewise broken down into smaller groups: three sensors (capacitive
       touch screens, optical imaging, and frustrated total internal reflection) are
       investigated in depth, with a general overview given to the other types. In regard to
       dimension, the size and complexity of different interactive surfaces are reviewed.
       The paper concludes by discussing the possible benefits of using interactive surface
       technology.
       Keywords: interactive surfaces, touch screen, sensing technology.



1 Introduction

At the end of the 1980s, Mark Weiser, the author of Ubiquitous Computing [1], which
addresses the role of human/computer relationships, proposed a ‘third computing era,’ in
which technology recedes into the background of our lives. In other words, in the third era
computers are everywhere, with each user accessing many computers on a regular basis.
Twenty years later this vision is being realized; it is now possible not only to transfer
digital photographs to a wireless phone without having to plug into a computer,
but also to move a cursor with your fingers to interact with interfaces, or even to enter
commands by voice. We are now living in an era where people are surrounded by
intelligent surfaces and where digital information is coupled with everyday physical
objects and environments.
   Interactive surfaces are one manifestation of this new era. In general, an interactive
surface is characterized by “the transformation of …surface[s] within architectural space (e.g.,
walls, desktops, ceilings, doors, windows) into an active interface between the physical
and virtual worlds [2].” Broadly speaking, interactive surfaces are those that can track
human gestures and a user’s voluntary manipulation of the objects, media, and space
involved in the interaction. Consequently, interactive surfaces can be much more
intuitive than platforms based on keyboard and mouse, and therefore much closer to
spontaneous human behavior. This is a whole new way of interacting with information.
   The way people relate to a surface depends on the technology applied. One of the
greatest advances in user interface design is touch screen technology. Although research in
this area started in the 1980s, the technology has gained increasing exposure recently
through the launch of the iPhone, widely regarded as the first cellular phone to let users
touch digital content directly with their fingers. This technology will undoubtedly become
more ubiquitous, with interactive surfaces likely to become common on the surface of a
kitchen table or on a television screen, for example. Basically, touch screen technology
allows for interaction with the surface, and for the manipulation of more than one thing at
a time with hand movements or gestures from the human body.
   The goal of this research is to give an overview of interactive surfaces today, focusing
for the most part on touch screen technology, where the hands make the gestures to which
the surface responds. The paper divides the overview into three main parts: application
areas, technologies, and dimensions of interactive surfaces. First, the application areas are
classified according to their function and objective, along with example devices. Next, the
many solutions available for particular applications, or simply the technologies behind
them, are explored. The third section then addresses the size and complexity dimensions
of interactive surfaces. Lastly, the conclusion summarizes the findings, emphasizing the
benefits, disadvantages, and future prospects of interactive surfaces.


2 Application Areas

This section presents an effort to illustrate the broad spectrum of viable applications of
interactive surfaces. The devices were classified in five areas based on the
objective/function: 1) entertainment, 2) collaboration, 3) communication, 4) computer
interface, and 5) customer-vendor interface. Each application area is described along with
examples. It is important to highlight that the categories are not mutually exclusive and
some devices can fall into more than one category.


2.1 Entertainment

There are a considerable number of current and possible applications for entertainment
purposes. It is quite difficult to evaluate all of the achievable applications for surfaces of
any shape or size that sense the location of objects and gestures. Commercial multi-touch
displays [3, 4] are already being employed, mainly in bars, discos, and events. Through
fantastic graphic design on tables, floors, and walls, these surfaces attract curiosity,
transforming customers of these bars or discos into active participants. For example, by
setting their glasses on the bar, clients can create light ‘paths’ on the tables, or patrons may
have fun ‘drawing’ on the floor while dancing with their feet. One can imagine a lounge
full of wall projections that respond to body movement or touch, changing color and
display. It is at least an interesting and attractive application of interactive surfaces, and
may serve to draw in more customers, thereby being a worthwhile investment for business
owners.
   Additionally, interactive games are another entertainment area with huge potential for
employing technologies that let content be controlled in the real world by users’ gestures.
In order to offer more ‘degrees of freedom,’ the game design industry is concentrating on
the concept of augmented reality, which combines real and virtual world elements in real-
time interactive 3D. The most popular example may be the Nintendo Wii. Even though
players do not touch the screen, by using the Wii Remote they are able to control the game
with physical gestures and also navigate the menu of the operating system, rather than
using traditional button presses.
   For an example where touch screen technology is used, the Invisible Train [5] is a
simple multi-player game in which players guide virtual trains, visible only through a
PDA’s (Personal Digital Assistant) video display, over a real wooden miniature railroad
track. In the game, track switches and speed adjustments are activated by a tap on the
PDA’s touch screen. Likewise, entertainment can be linked with knowledge acquisition.
The iFloor [6] is an interactive floor whose ultimate goal is to make the physical space of
libraries more attractive and appealing to its users. The iFloor allows multiple people to
post questions and answers with a cursor that is dragged around the floor via body
movement (Figure 1). The purpose of this application is closely related to the
collaboration area, which is addressed next.




                                          Fig. 1. iFloor

2.2 Collaboration

Oversized displays like whiteboards, projections, or plasma screens are typical in
organizations and educational spaces where the intent is to share information. The
interactive whiteboard was the precursor in bringing interaction to these displays. The
difference compared to a simple computer desktop projection is that users can interact
with the board using a pen, finger, or other device (some are even adapting the Wii Remote).
   Several projects intend to enhance these public surfaces and make them yet more
interactive in an attempt to facilitate group collaboration in meeting rooms, supporting a
wide range of activities. These surfaces can promote the sharing of knowledge,
information, and even objects during brainstorming sessions. Public interactive surfaces
are also able to save stages of the work process by employing multiple displays.
   For instance, the DiamondTouch Table [7] is one of the first multi-touch technologies
to support small group collaboration (Figure 2). The table differentiates users by running
an electronic signal to each user’s fingers via their chair. Another example, the Dynamo
[8] multi-user interactive surface, was designed to enable the sharing and exchange of
digital media in public spaces. The surface is composed of one or more displays that can
be arranged horizontally or vertically, and users can attach multiple USB mice and
keyboards to the surface or access it remotely. The company Perceptive Pixel [9] also
creates large and medium-sized multi-touch displays that claim to enhance collaborative
spaces, allowing multiple users, with up to 20 fingers, to work and interact. In fact, the
technology is currently being used in CNN news broadcasts.




                                Fig. 2. DiamondTouch Table

2.3 Communication

Large surfaces are often employed to communicate useful information about products or
services in public spaces like museums, airports, and shopping malls. However, being
simple projections, these surfaces usually do not allow direct public interaction; it is likely
that greater levels of interaction will be reached soon, since this is a topic of significant
interest among researchers. Handheld, mobile, and wearable devices, by contrast, are more
likely to accommodate user input. Large or small, these devices can equally be explored as
communication tools using touch screen technology.
   The Gesture Wall project [10], for example, first introduced at the Brain Opera in
1996, makes use of electrodes to measure the position and movement of the user's hands
and body in front of a projection screen. Additionally, Sensitive Wall [4] is a
commercially available interactive surface with a large vertical touchless display able to
detect the presence of hands in real time, up to 30 centimeters from the surface.
Alternatively, it is easy to envision an interactive surface device that accompanies a user
through an exhibition, similar to the PEACH (Personal Experience with Active Cultural
Heritage) project [11], a suite of interactive and user-adaptive technologies for museum
visitors that, among other things, uses video documentaries on mobile devices.


2.4 Computer Interface

Multi-touch technology will rapidly become commonplace in computer interfaces. It will
likely overtake traditional user interface techniques such as the graphical user interface
and the desktop, in which people traditionally use a mouse on a horizontal surface to
control a visual pointer on the screen. Interactive surfaces are situation-aware or
assistance-oriented rather than command-oriented.
   Following the success of the iPhone, Apple expanded the use of multi-touch
computing to the iPod Touch (a portable media player and wi-fi mobile platform), as well
as to its notebook lines. Additionally, Asus has already included a multi-touch function on
the touchpad of its Eee PC, a small portable notebook announced last April.
   Although multi-touch technology evidences the technical evolution of computer
interfaces, still other kinds of interaction may be supported on newer types of interactive
computer interfaces. This may lead to a complete change in operating systems, make
USB cords obsolete, and/or enable new recognition commands. Users will be able to
interact with a real world that is augmented by computer-synthesized information.


2.5 Customer-Vendor Interface

Of course, the marketplace has also embraced this new technology. Interactive
surfaces are captivating and thereby an effective advertising tool, catching a consumer’s
attention instantaneously, in a crowd-stopping effect. Consequently, there is a significant
field for customer-vendor interface applications, ranging from organizing pictures and
videos to ordering and paying.
    Microsoft Surface [12] detects gestures and movements through infrared cameras, and
the image is shown on the screen through a projection system. This technology has been
explored by Microsoft partners since April 2008. One demonstration of the device shows
how customers could use it to customize snowboards. By taking a tag containing an
identifying chip from a snowboard in a shop, the user can place the tag on the interactive
surface, which immediately recognizes which snowboard the customer is interested in.
The customer is then able to resize, rotate, and change the colors of graphic symbols that
can be added to the product. After virtually ‘creating’ their product design, the layout can
be saved on a mobile phone or other storage device, meaning that the customer can leave
the store but keep the design should they wish to come back and purchase that model. An
additional example from advertising videos shows a restaurant table on which wine menus
pop up on screen, showing a list of wines by category along with information about them,
such as cities of origin, producers, the winery, and a virtual map of the region. After
choosing an appropriate wine to accompany their dinner, customers can then pay the bill
by simply placing a credit card on the surface; the table recognizes the card and charges
the bill accordingly.
    In the same fashion, the system called Tap Tracker [10] lets customers interact with
the glass windows of stores (Figure 3). Four contact microphones were glued to the inside
corners of a glass window, and users could choose to watch brief video clips or engage in
a game designed to draw people into the store. Results showed that the store’s sales
increased during the experiment, suggesting that in this instance the use of an interactive
surface was an effective means of bringing in potential customers.




                                     Fig. 3. Tap Tracker

3 Technology

Alex Pentland wrote in Scientific American: “The problem, in my opinion, is that our
current computers are both deaf and blind: they experience the world only by way of a
keyboard and a mouse…I believe computers must be able to see and hear what we do
before they can prove truly helpful [13].” Today, all kinds of sensing technology are used
as the eyes and ears of interactive surfaces. The signal collected by the sensors is processed
by software to identify properties of the action (location, direction, speed, and so on).
Based on the identified properties, corresponding reactions, which can be visual, acoustic,
or even olfactory, are executed by the computer.
    Sensors are the first step in an interactive surface and, as such, they play a key role in
the whole process. Different sensors, varying in function, resolution, installation
conditions, etc., are used to meet the requirements of the different application areas
illustrated above. Since it is impossible to include all technologies in this report, three
different sensor types (capacitive touch screens, optical imaging touch screens, and
frustrated total internal reflection touch screens) are introduced in detail, while the others
are only overviewed briefly.


3.1 Capacitive touch screens

The idea of using capacitive sensing in the field of human-computer interfaces has a long
history. Generally it works with an array of wires behind the board that interacts with
fingers touching the screen. The interaction between the different wires (laminated in an
X- and Y-axis arrangement) and the tip of the finger is measured and converted to an
(x, y) coordinate. However, the interaction is sensitive to temperature and humidity.
   Smart-Skin [14], a capacitive sensing architecture developed by Jun Rekimoto, is an
example. It is constructed by laying a mesh of vertical transmitter and horizontal receiver
electrodes on the surface (Figure 4). When one of the transmitters is excited by a wave
signal (typically several hundred kilohertz), the receiver picks up this signal because each
crossing point (a transmitter/receiver pair) acts as a (very weak) capacitor. The
magnitude of the received signal is proportional to the frequency and voltage of the
transmitted signal, as well as to the capacitance between the two electrodes. When a
conductive and grounded object approaches a crossing point, it couples to the electrodes,
and drains the wave signal. As a result, the received signal amplitude becomes weak. By
measuring this effect, it is possible to detect the proximity of a conductive object, such as
a human hand.

    Fig. 4. The Smart-Skin sensor configuration: a mesh-shaped sensor grid is used to determine the
                                     hand’s position and shape.

   When the user’s hand is placed within 5-10 cm from the table, the system recognizes
the effect of the capacitance change. A potential field is created when the hand is in
proximity to the table surface. To accurately determine the hand position, which is the
peak of the potential field, a bi-cubic interpolation method is used to analyze the sensed
data (Figure 5). By using this interpolation, the position of the hand can be determined by
finding the peak on the interpolated curve.
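
   For illustration, the following Python sketch (not the authors’ implementation) upsamples a
hypothetical grid of raw Smart-Skin sensor readings with cubic interpolation, in the spirit of the
bi-cubic step described above, and reports the location of the strongest response as the estimated
hand position. The grid size and values are invented for the example.

import numpy as np
from scipy.ndimage import zoom

def estimate_hand_position(raw, upsample=8):
    """Estimate the (row, col) peak of a capacitive sensor grid.

    raw      -- 2D array of sensed values (higher = stronger hand effect)
    upsample -- interpolation factor; order=3 gives cubic spline smoothing
    """
    smooth = zoom(raw.astype(float), upsample, order=3)       # cubic interpolation
    peak = np.unravel_index(np.argmax(smooth), smooth.shape)  # peak of the field
    # Map the peak back to fractional coordinates on the original sensor grid
    return (peak[0] * (raw.shape[0] - 1) / (smooth.shape[0] - 1),
            peak[1] * (raw.shape[1] - 1) / (smooth.shape[1] - 1))

# Toy example: a bump around sensor cell (3, 5) standing in for a nearby hand
grid = np.zeros((8, 9))
grid[2:5, 4:7] = [[0.2, 0.5, 0.2],
                  [0.5, 1.0, 0.6],
                  [0.2, 0.5, 0.3]]
print(estimate_hand_position(grid))   # approximately (3.0, 5.0)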
   In the future, Smart-Skin may not be limited to a tabletop; instead, a non-flat or even
flexible surface, such as interactive paper, could act as a vehicle for such a sensor. Smart-
Skin could also be combined with tactile feedback: if it could make the surface vibrate by
using a transducer or a piezo actuator, the user could “feel” as if he or she were
manipulating a real object.

Fig. 5. Gestures and corresponding sensor values (top: a hand on the sensor mesh, middle: raw input
                             values, bottom: after bicubic interpolation)


3.2 Optical imaging touch screens

The optical imaging touch screen is a relatively modern development in touch screen
technology. Two or more image sensors are placed around the edges (usually the corners)
of the screen, and infrared backlights are placed in the cameras' field of view on the other
sides of the screen. A touch shows up as a shadow, and each pair of cameras can then
triangulate it to locate the touch. This technology is growing in popularity due to its
scalability, versatility, and affordability, especially for larger units.
   An example is the Digital Vision Touch (DViT) technology developed by SMART
Technologies Inc. Four cameras, one in each corner, constantly scan the surface to detect
a target object (Figure 6). When a camera detects a target, its processor identifies the
affected pixel and calculates the angle at which it occurs. Each camera detects the target
and calculates its own angle. Mathematical formulas are developed that automatically
record the distance between two cameras and their viewing angles in relation to each
other. With this information known, the technology can then triangulate the location of the
contact point. Mathematically, only two cameras are needed to calculate a contact point,
but to ensure a robust system, DViT technology uses four cameras and has every camera
pair report its triangulated result. The location of the contact point is sent to the user’s
computer.
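
   To make the triangulation step concrete, the following Python sketch (an illustration, not
SMART's actual formulas) places two cameras at adjacent corners of the screen, a known
baseline apart, and intersects their two viewing rays. The angles are assumed to be measured
from the edge joining the two cameras.

import math

def triangulate(baseline_cm, angle_a_deg, angle_b_deg):
    """Locate a contact point from two corner cameras.

    baseline_cm -- distance between the two cameras (e.g. the screen width)
    angle_a_deg -- angle at camera A between the baseline and the target
    angle_b_deg -- angle at camera B between the baseline and the target
    Returns (x, y) in cm, with camera A at the origin and camera B at (baseline_cm, 0).
    """
    ta = math.tan(math.radians(angle_a_deg))
    tb = math.tan(math.radians(angle_b_deg))
    x = baseline_cm * tb / (ta + tb)   # intersection of the two viewing rays
    y = x * ta
    return x, y

# A touch seen at 45 degrees from both corners lies midway along the baseline
print(triangulate(100.0, 45.0, 45.0))   # -> (50.0, 50.0)

With four cameras, each camera pair can report such an estimate, and the results can be
averaged or cross-checked for robustness, as described above.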

                         Fig. 6. Camera identification of a contact point

   In addition, it’s possible to detect both when an object contacts the surface and when it
hovers above it. The cameras scan a layer of stacked pixels. If an object breaks into this
pixel area, it registers with the cameras regardless of whether it has touched the surface.
   With specially designed cameras that have a high frame rate (100 FPS) and support fast
image processing techniques, touch accuracy is high enough for mouse operation and
writing. It is also scalable to almost any practical size because the technology resides in
the corners, not the surface area. This technology can also support multi-touch after some
improvement and in the future, it may even be possible to distinguish between different
pointers.


3.3 Frustrated Total Internal Reflection touch screens

When light encounters an interface to a medium with a lower index of refraction (e.g.
glass to air), the light is refracted to an extent that depends on its angle of incidence, and
beyond a certain critical angle it undergoes total internal reflection (TIR).
Fiber optics, light pipes, and other optical waveguides rely on this phenomenon to
transport light efficiently with very little loss. However, another material at the interface
can frustrate this total internal reflection, causing light to escape the waveguide there
instead. This phenomenon is well known as Frustrated Total Internal Reflection (FTIR).
   The first application to touch input appears to have been disclosed in 1970 in a binary
device that detects the attenuation of light through a platen waveguide caused by a finger
in contact [13]. This technique was further applied to fingerprint sensing and robotics by
Mallos [14] and Kasday [15], respectively. Based on the Mallos/Kasday designs, Jefferson
Y. Han [16] proposed what is essentially a scaled-up FTIR fingerprint or robot tactile
sensor, used as an interactive surface.
   This technique provides full imaging touch information without occlusion or ambiguity
issues. The touch sense is zero-force and true: it accurately discriminates touch from a
very slight hover. It samples at both high temporal and spatial resolutions. Pressure is not
sensed, though this is largely mitigated by the precision with which it can determine the
contact area of a depressed finger. It is inexpensive to construct, and trivially scalable to
much larger surfaces.
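
   Since FTIR setups are camera-based, extracting touches typically reduces to finding bright
blobs in the infrared image of the rear of the surface. The following Python sketch (an
illustration, not Han's pipeline) thresholds a frame and returns the centroid of each connected
bright region as a touch point; the threshold and frame contents are invented.

import numpy as np
from scipy import ndimage

def detect_touches(frame, threshold=0.5):
    """Return (row, col) centroids of bright blobs in an IR camera frame.

    frame     -- 2D float array; light escaping at touch points appears bright
    threshold -- intensity cutoff separating touches from background
    """
    mask = frame > threshold                     # keep only bright pixels
    labels, count = ndimage.label(mask)          # connected components = fingertips
    return ndimage.center_of_mass(frame, labels, range(1, count + 1))

# Toy frame with two bright patches standing in for two fingertips
frame = np.zeros((120, 160))
frame[30:35, 40:45] = 0.9
frame[80:86, 100:107] = 0.8
print(detect_touches(frame))   # two centroids, near (32, 42) and (82.5, 103)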
   A drawback of the approach is that, being camera-based, it requires a significant
amount of space behind the interaction surface, though it is primarily expected to be used
in application scenarios where rear-projection would have been employed anyway (e.g.
interactive walls and tables). Also, as an optical system, it remains susceptible to harsh
lighting environments.
   As a more robust alternative to frustrated total internal reflection, Zachary Scott
Carpenter developed the Diffused Laser Imaging (DLI) touch screen. DLI uses the same
software and imaging system as FTIR. Instead of the light coming from inside the screen,
it comes from diffused laser light conducted through human flesh and onto the diffuser of
the screen. A two-dimensional plane of laser light is emitted directly over a rigid screen
(diffused glass). When a human finger breaks this plane of laser light, the light is diffused
by the finger and thus passes from the intersection point through the finger and down to
the diffuser of the rigid screen. This system works just as well as FTIR but is not
vulnerable to smudges or dirt on the screen. Furthermore, most multi-touch functions can
be achieved with only two laser line sources, versus the many light sources required to
illuminate the interior of an FTIR screen. DLI can also be used on almost any kind of flat
monitor by placing one or two position-sensing cameras in front of the display; in this
case the cameras pick up the diffused fingertips as blobs of light rather than as the dots
seen from the back side.


3.4 Other sensing technologies

Resistive touch screens are composed of two flexible sheets coated with a resistive
material and separated by an air gap or microdots. When contact is made with the surface
of the touch screen, the two sheets are pressed together, registering the precise location of
the touch. Resistive touch screens typically have a relatively high resolution, providing
accurate touch control, and allow one to use a finger, a stylus, or any other pointing device
on the surface of the board. However, by their nature, they do not support the full
functionality of a mouse, and interface effects such as hover and pop-ups require
additional interactions or software helper functions [18].
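
   As a rough sketch of how a 4-wire resistive controller recovers a position (details vary by
controller; the ADC range and panel size below are assumptions), each axis is read as a voltage
divider formed by the two pressed sheets, so the coordinate is simply the ratio of the ADC
reading to full scale:

def resistive_position(adc_x, adc_y, adc_max=1023, width_mm=98.0, height_mm=56.0):
    """Convert two voltage-divider ADC readings into a touch position.

    adc_x, adc_y -- readings taken while driving one plate and sensing on the other
    adc_max      -- full-scale ADC value (a 10-bit converter is assumed here)
    Returns the touch location in millimetres on a hypothetical 98 x 56 mm panel.
    """
    x = adc_x / adc_max * width_mm    # position is proportional to the divided voltage
    y = adc_y / adc_max * height_mm
    return x, y

print(resistive_position(512, 256))   # roughly mid-width, a quarter of the height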
   Electromagnetic touch screens feature an array of wires embedded behind the board
surface that interacts with a coil in the stylus tip to determine the (x, y) coordinate of the
stylus. In other words, there are magnetic sensors in the board that react and send a
message back to the computer when they are activated by a magnetic pen. Styli are either
active (requiring a battery or a wire back to the whiteboard) or passive (altering electrical
signals produced by the board, but containing no power source). These screens typically
have a very high resolution and usually support the full range of mouse functions;
moreover, they may accommodate pressure sensing and multiple users touching and
pointing on the surface [19].
   Surface Acoustic Wave (SAW) touch screens employ transducers on the borders of a
glass surface to vibrate the glass and produce acoustic waves that ripple over its surface.
When contact is made with the glass, the waves reflect back and the contact position is
determined from the signature of the reflected waves. However, SAW touch screens suffer
several drawbacks: 1) they provide only medium resolution; 2) they exhibit noticeable
parallax due to the thickness of the vibrating glass placed over the surface of the video or
computer display; 3) hovering is not supported; and 4) they cannot scale beyond a few feet
diagonally [20].
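
   As a highly simplified, toy illustration of how a position can be read off the reflected-wave
signature (real SAW controllers are considerably more sophisticated), the sketch below locates
the attenuation dip in the received burst envelope and converts the delay into a distance along
one axis; the sampling rate and wave speed are invented values.

import numpy as np

def saw_touch_position(envelope, fs_hz=1_000_000, wave_speed_m_s=3000.0):
    """Toy estimate of a touch coordinate from a SAW received-burst envelope.

    envelope       -- amplitude envelope of the received burst; a touch absorbs
                      energy, producing a dip at a delay related to its position
    fs_hz          -- sampling rate of the envelope
    wave_speed_m_s -- assumed surface-wave speed in the glass
    The geometry factor relating delay to distance depends on the reflector
    layout, so it is taken as 1.0 here purely for illustration.
    """
    dip = int(np.argmin(envelope))        # sample index of the attenuation dip
    delay_s = dip / fs_hz
    return wave_speed_m_s * delay_s

# Synthetic envelope: flat response with a dip 200 microseconds into the burst
env = np.ones(1000)
env[200] = 0.2
print(f"{saw_touch_position(env):.2f} m along the measured axis")   # 0.60 m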
   Strain gauge touch screens use one or more, preferably symmetrically positioned,
strain gauge sensors to monitor the relative distribution of force. The sensors are
positioned so that the bending strain produced on the touch screen by a touch is detected
and accurately measured by an electronic controller connected to the gauges. The
controller, or associated hardware, typically a charge-balancing and multiplying analog-
to-digital converter, is programmed to relate the relative bending forces to a unique
position on the screen, providing accurate position determination even with very low
forces and with very small differences between the position-related forces [21].
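
   A common way to relate such force readings to a position (a simplified illustration, not the
patented controller logic) is a force-weighted centroid over the sensor locations, for example
with one gauge near each corner of the screen:

def force_centroid(forces, positions):
    """Estimate the touch point as the force-weighted centroid of the sensors.

    forces    -- force reading from each strain gauge
    positions -- (x, y) location of each gauge on the screen, in the same order
    """
    total = sum(forces)
    x = sum(f * p[0] for f, p in zip(forces, positions)) / total
    y = sum(f * p[1] for f, p in zip(forces, positions)) / total
    return x, y

# Four gauges at the corners of a hypothetical 300 x 200 mm screen
corners = [(0, 0), (300, 0), (0, 200), (300, 200)]
print(force_centroid([1.0, 3.0, 1.0, 3.0], corners))   # (225.0, 100.0), toward the right edge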
   An infrared touch screen panel employs one of two very different methods. One
method uses thermally induced changes of the surface resistance; this method is
sometimes slow and requires warm hands. The other method uses an array of vertical and
horizontal IR sensors that detect the interruption of a modulated light beam near the
surface of the screen. IR touch screens have the most durable surfaces and are used in
many military applications that require a touch panel display [22].
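
   The beam-interruption variant essentially reduces to finding which horizontal and vertical
beams are blocked; a minimal sketch (with an invented beam pitch) is:

def ir_grid_position(blocked_cols, blocked_rows, pitch_mm=5.0):
    """Locate a touch from the indices of interrupted IR beams.

    blocked_cols -- indices of blocked vertical beams (give the x position)
    blocked_rows -- indices of blocked horizontal beams (give the y position)
    pitch_mm     -- spacing between adjacent beams
    """
    if not blocked_cols or not blocked_rows:
        return None                                          # nothing blocks the beams
    x = sum(blocked_cols) / len(blocked_cols) * pitch_mm     # centre of the blocked run
    y = sum(blocked_rows) / len(blocked_rows) * pitch_mm
    return x, y

# A fingertip wide enough to block beams 10-12 horizontally and 20-21 vertically
print(ir_grid_position([10, 11, 12], [20, 21]))   # -> (55.0, 102.5)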
   An Acoustic Pulse Recognition (APR) touch screen comprises a glass display overlay
or other rigid substrate with four piezoelectric transducers mounted on the back surface.
The transducers are mounted at two diagonally opposite corners, outside the visible area,
and connected via a flex cable to a controller card. The impact when the screen is touched,
or the friction caused while dragging a finger or stylus across the glass, creates an acoustic
wave. The wave radiates away from the touch point, making its way to the transducers,
which produce electrical signals proportional to the acoustic waves. These signals are
amplified in the controller card and then converted into a digital stream of data. The touch
location is determined by comparing the data to a stored profile. APR is designed to reject
ambient and extraneous sounds, as these do not match any stored sound profile. The touch
screen itself is pure glass, giving it the optics and durability of the glass from which it is
made. It works with scratches and dust on the screen, and accuracy is very good [23].
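
   In essence, the controller matches each recorded waveform against profiles captured during
calibration. The Python sketch below does this with a simple normalized correlation over a
hypothetical dictionary of calibration recordings; the real matching algorithm is proprietary and
certainly more involved.

import numpy as np

def apr_locate(recorded, profiles):
    """Pick the calibration point whose stored waveform best matches the recording.

    recorded -- 1D array with the digitised transducer signal of the new touch
    profiles -- dict mapping (x, y) calibration points to stored reference waveforms
    """
    def similarity(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)       # normalise both signals
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.dot(a, b)) / len(a)         # normalized correlation

    return max(profiles, key=lambda pos: similarity(recorded, profiles[pos]))

# Toy calibration set: two points with distinct acoustic 'signatures'
rng = np.random.default_rng(0)
profiles = {(10, 10): rng.normal(size=256), (90, 40): rng.normal(size=256)}
touch = profiles[(90, 40)] + 0.1 * rng.normal(size=256)   # noisy repeat of one touch
print(apr_locate(touch, profiles))                        # -> (90, 40)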
   Dispersive signal touch screens, introduced in 2002, use sensors to detect the
mechanical energy in the glass that occurs due to a touch. Complex algorithms then
interpret this information and provide the actual location of the touch. The technology
claims to be unaffected by dust and other outside elements, including scratches. Since
there is no need for additional elements on screen, it also claims to provide excellent
optical clarity. Also, since mechanical vibrations are used to detect a touch event, any
object can be used to generate these events, including fingers and styli. A downside is that
after the initial touch the system cannot detect a motionless finger [22]. More information
can be found in [24].


3.5 How to choose an appropriate interactive surface?

To compare all the sensing technologies and find the most suitable one for an application,
many technical factors need to be considered besides cost. Scale is the first limitation:
capacitive technology cannot be used for interactive floors and walls, while some optical
technologies cannot be used on mobile phones. Secondly, the application requirements
should be identified. Is multi-touch necessary? Is the full functionality of a mouse
required? Is the ability to identify objects required? Is input done with a finger or a stylus?
Last, but not least, important factors include, for example, the resistance to scratches and
to harsh working conditions, the reaction speed, the existence of parallax, and whether an
external power supply is needed. The factors mentioned above, along with some others,
are listed in Table 1. A full comparison of these technologies would be a very difficult
task, especially considering the day-by-day innovation in the technology; however, a
complete and authoritative comparison could be very helpful for potential users of
interactive surfaces.
               Table 1. Factors in Choosing Proper Sensing Technologies [adapted from 25]

Performance:           speed, sensitivity, resolution, accuracy, calibration stability, drag, z-axis,
                       double touch, parallax (kind of)

Input flexibility:     glove, finger, fingernail, credit card, pen, signature capture, handwriting
                       recognition

Optics:                light transmission, reflection (lack thereof), clarity, color purity

Mechanical:            small sizes (<10’’), large sizes (>19’’), curved CRTs, ease of integration,
                       sealability, IP 65¹/NEMA 4¹

Electrical:            controller chip available, low-power battery option, operation with poor
                       ground, ESD¹, EMI¹, RFI¹

Environmental:         temperature, humidity, shock/vibration, altitude, in-vehicle, chemical
                       resistance, scratch resistance, breakage resistance, safe break pattern, dust/dirt,
                       rain/liquids, snow/ice, metal, ambient/UV light, fly on screen, non-glass
                       surface possible, works through other materials, durability/wear, EMI¹/RFI¹,
                       surroundings



¹ These factors are not directly referred to in the text; see the source for more information.
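
   As a toy illustration of the selection process described above (the capability values below are
invented placeholders, not taken from Table 1 or from [25]), one can encode a few technologies
and requirements and filter for candidates:

# Hypothetical capability table; real values should come from vendor data sheets.
TECHNOLOGIES = {
    "capacitive":      {"multi_touch": True,  "max_diag_in": 30},
    "resistive":       {"multi_touch": False, "max_diag_in": 20},
    "optical_imaging": {"multi_touch": True,  "max_diag_in": 100},
    "saw":             {"multi_touch": False, "max_diag_in": 40},
}

def candidates(need_multi_touch, diagonal_in):
    """Return the technologies compatible with the stated requirements."""
    return [name for name, caps in TECHNOLOGIES.items()
            if (caps["multi_touch"] or not need_multi_touch)
            and caps["max_diag_in"] >= diagonal_in]

# A wall-sized multi-touch display rules out most of the toy table
print(candidates(need_multi_touch=True, diagonal_in=80))   # -> ['optical_imaging']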

4 Dimensions of Interactive Surfaces

Interactive surfaces come in many shapes, sizes, and complexities. While this may seem
self-evident, it is important to consider, as it is this very characteristic that makes them
such a valuable and viable technology: interactive surfaces are not constrained to just one
form, which would limit their applicability. This section of the paper explores some of the
many manifestations that interactive surfaces can take.


4.1 Size

As the technology has advanced, a wide variety of sizes for interactive surfaces now
exists. The smallest devices are mobile and hand-held. Popular examples include the
iPod Touch and the iPhone, both introduced in the summer of 2007. These devices
measure just 11.5 cm x 6.1 cm x 1.2 cm for the iPhone and 11 cm x 6.2 cm x 0.8 cm for
the iPod Touch [26]. Such small sizes make these gadgets extremely portable while
maintaining sophisticated interaction capabilities. Other common small interactive
surfaces include most GPS systems and PDAs, many of which have no keyboard or
mouse and rely only on surface touch. In fact, these systems may be even smaller than the
Apple products: GPS systems for bicycle navigation have only recently hit the market,
and the Garmin model (Edge 705) comes in at only 5.1 cm x 10.9 cm x 2.5 cm, weighing
slightly less than the iPod Touch [27].
   Also to be included in this ‘small’ category are other common touch screens, such as
those on bank machines or those for buying train tickets. Although the screens in these
systems are small, they are generally part of a much larger machine, which leaves this
example straddling the line between smaller and larger interactive surfaces.
   Moving up in scale are those surfaces that can be defined as being of ‘medium’ size,
including all tabletop interactive surfaces. There is, of course, some variation in
dimension; for example, the Microsoft Surface, the first commercially available device of
its kind, has a 30-inch screen, while the entire unit measures 56 cm x 53 cm x 107 cm [12].
Obviously, a unit such as this is not particularly mobile, although since it is marketed as an
alternative to an actual tabletop, one assumes that it is portable to some degree.
   Since many of these medium-sized surfaces are created for the purpose of group
interaction, it is important to consider how the dimensions of such a device might affect its
success at facilitating collaboration. One study explores how the extent of the display
influences the speed with which a group was able to complete a certain
the display influences the speed with which a group was able to complete a certain
activity; in this case the recreation of a poem based on words scattered about the surface.
The research found that in fact the different sizes of interactive surfaces used in this
experiment (87cm and 107 cm diagonally) had no effect on the success of the given
activity, although they did note that the size of the group seemed to influence the group’s
success [28]. In the comparison of these two extents, apparently size is not a determining
factor of success, although it seems likely that, looking at the global range of dimensions
of interactive surfaces, this medium size in general is probably the most effective for this
type of group collaboration.
   Moving up to the largest devices, it is possible to find surfaces that often require
entire-body movements for interaction. In this category are interactive floors, walls, and entire
spaces. All of these can generally be as large as the designer wants them to be. Floors
react to footsteps across them while walls generally act as a large tabletop surface
mounted vertically; spaces, of course, such as the SensitiveSpaceSystem designed by iO,
combine elements of both floors and walls to make entire rooms or hallways that interact
with the people in them.
   The size of all of these units can, to some degree, be determined either for the display
or for the unit as a whole. For example, the actual surface of a tabletop display is quite
thin while the entire machine is quite large, and in a ticket kiosk the screen is only a small
part of the larger device. Perhaps the best example of this is projected interactive surfaces,
where the technology is a projector and the actual size of the display depends on the
surface being projected upon, be it a floor, a wall, or even a window.


4.2 Complexity

Beyond size, the complexity of an interactive surface can range widely. The ‘complexity’
of an interactive surface can be defined in many ways: according to the exact way humans
can interact with it (single-touch versus multi-touch), by the type of application, and by
the operating system employed in the technology.
   At the simplest end of traditional interactive surfaces are single-touch interactive
surfaces; included in this category are most information kiosks, bank machines, GPS
devices, and most current PDAs. These surfaces function primarily with strain gauge
touch screens and react to only one touch. Many of these devices, such as some
information kiosks and bank machines, are set up to look like their predecessors, where
you hit actual buttons rather than virtual ones. Some single-touch platforms combine this
button model with that of typical computer interfaces, such as many museum displays
where your touch often drags a pointer around the screen. While these interfaces are very
simple, in the context of their purpose they are perfectly adequate and appropriate.
   Multi-touch systems, where the user can use more than one point to interact on a
surface, have been slowly moving into more mainstream technology, particularly with the
release of the iPhone and iPod Touch last summer, as well as with the MacBook models
that come equipped with a multi-touch trackpad. Multi-touch functionality allows users
to, for example, resize images by dragging out the corners or scroll horizontally/vertically
on a laptop touch pad by using two fingers as well as encouraging collaboration so that
many users can be working in the same interface simultaneously. Besides these
computing interface examples of multi-touch, there are other surfaces that respond to
multiple touch points such as interactive floors and bars that have primarily aesthetic
displays that respond somehow to pressure from human gestures. Clearly, relative to
single-touch surfaces, multi-touch surfaces are more interactive and the possibilities for
applications of such a technology are limitless.
   Complexity can also be considered with regards to the application of the interactive
surface. As mentioned in Section 2, there is an enormous range of applications for
interactive surfaces. Surfaces such as dance floors and bars are interesting, but serve as a
means to a primarily aesthetic end: while the technology behind the surface may still be
quite sophisticated, the difficulty in using them is minimal, and the actual benefit from
using the surface (other than entertainment) is fairly limited. While this type of
application is fairly basic, as presented in Section 2, many interactive surfaces provide
excellent, highly interactive platforms for computing, collaboration, and showcasing
products, among others. Therefore, applications too can be a further measure of
complexity for interactive surfaces.
   Lastly, in reference to complexity, one can look at the degree of sophistication of the
operating system behind the interactive surface. This measure parallels the degree of
complexity as determined by application. On one hand, there are the previously
mentioned dance floors and bars, where the user simply touches the surface and something
of aesthetic interest happens at that point. On the other end of the spectrum are
actual computer interfaces built as operating systems of sorts, often for collaboration on
group projects, such as tabletop computing systems, and now increasingly as handheld
devices, such as Apple's iPhone. These systems not only react to multi-touch, but also
serve a much more complex purpose than the previously mentioned floor or bar. These
platforms allow users to do almost everything they can do on their regular computer, but
in a way that eliminates the need for a mouse or touchpad. The sophistication of the
applications of interactive surfaces can thus vary widely, from purely entertainment-based
functionality to complete operating systems.


5 Conclusion

This paper emphasizes the diversity inherent in all aspects of interactive surfaces, be it in
their application, the technology that makes them work, or their dimensions. Applications
run the gamut from purely aesthetic to sophisticated collaborative tools. Furthermore, a
seemingly endless and ever growing array of sensing technologies supports these
applications. What’s more, the shape, size, and complexity of these surfaces can vary
greatly: some will fit in your pocket and can organize your life, while others decorate
huge spaces, providing aesthetic appeal. The versatility of interactive surfaces is what
makes them so appealing to the general public and ensures that they will become only
more prevalent in our lives.
   While the future prevalence of interactive surfaces is all but ensured, the precise future
is wide open. The success of technologies such as the iPhone would seem to suggest that
the general population is willing to embrace interactive surface technology. Since
technology often evolves less to fill a void than to solve a problem people never
even knew they had in the first place, it is hard to determine exactly what interactive
surfaces may look like or be used for in the future. What is certain is that there are already
some clear benefits from this technology. Firstly, and most basically, this technology has
a profound crowd-stopping effect; people are fascinated by it and even when it is serving
no practical purpose (such as with the interactive surface bar), it still manages to provoke
great interest. More pragmatically, this technology facilitates communication in many
forms, be it between customers and vendors or between colleagues, as many people can
easily collaborate on one surface.
   Furthermore, many argue that this type of interface is more natural and therefore more
intuitive for users [4]. Perhaps interactive surfaces will democratize computing the way
that virtual globes have democratized Geographic Information Systems [29]. It remains to
be seen exactly to what degree interactive surfaces will be adopted by the general public,
and this will likely not be known for many years yet, as many of the surfaces described
here are either only prototypes or not yet available to the general public. It is apparent,
however, that the potential is there for interactive surfaces to revolutionize not only the
way we perform our everyday computing tasks, but also the way we order meals at
restaurants, shop for goods, and work with our peers. Interactive surfaces are a
technology we are best served to fully embrace as they will likely develop to be a major
presence in our everyday lives.


References

1. Ubiquitous Computing, http://www.ubiq.com/hypertext/weiser/UbiHome.html
2. Ishii, H., Ullmer B.: Tangible Bits: Towards Seamless Interfaces between People, Bits and
   Atoms. In CHI ’97: Proceedings of the SIGCHI conference on human factors in computing
   systems, pp. 234-241, ACM Press (1997)
3. Mindstorm Interactive Surface Solution, http://www.mindstorm.eu.com/
4. Valli, A.: Natural Interaction White Paper.
   http://www.naturalinteraction.org/images/whitepaper.pdf (2007)
5. Wagner, D., Pintaric, T., Ledermann, F., Schmalstieg, D.: Towards Massively Multi-user
   Augmented Reality on Handheld Devices. In Pervasive Computing: Third International
   Conference, PERVASIVE 2005, Lecture Notes in Computer Science (LNCS). Springer-Verlag,
   (2005)
6. Petersen, M. G., Krogh, P. G., Ludvigsen, M., Lykke-Olesen, A.: Floor Interaction: HCI
   Reaching New Ground. In CHI’05 extended abstracts on human factors in computing systems,
   Portland, OR, USA (2005)
7. Dietz, P., Leigh, D.: DiamondTouch: A Multi-User Touch Technology. Mitsubishi Electric
   Research Laboratories, http://www.merl.com/papers/docs/TR2003-125.pdf (2003).
8. Izadi, S., Brignull, H., Rodden, T., Rogers, Y., Underwood, M.: Dynamo: A public interactive
   surface supporting the cooperative sharing and exchange of media. UIST’03, Vancouver, BC,
   Canada (2003)
9. Perceptive Pixel, http://www.perceptivepixel.com/
10. Paradiso, J. A.: Tracking contact and free gesture across large interactive surfaces.
    Communications of the ACM, vol. 46, pp. 62-69 (2003)
11. Stock, O., Zancanaro, M., Busetta, P., Callaway, C., Krüger, A., Kruppa, M., Kuflik, T., Not, E.,
    Rocchi, C.: Adaptive, intelligent presentation of information for the museum visitor in PEACH.
    User Model User-Adap Inter, vol. 17, pp. 257-304 (2007)
12. Microsoft Surface, http://www.microsoft.com/surface/index.html
13. Johnson, R., Fryberger, D.: Touch Actuable Data Input Panel Assembly. U.S. Patent 3,673,327
    (1972)
14. Mallos, J.: Touch Position Sensitive Surface. U.S. Patent 4,346,376 (1982)
15. Kasday, L.: Touch Position Sensitive Surface. U.S. Patent 4,484,179 (1984)
16. Han, J. Y.: Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection.
17. Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology
    (UIST '05). ACM Press (2005),
    http://delivery.acm.org/10.1145/1100000/1095054/p115-han.pdf
18. http://en.wikipedia.org/wiki/Analog_resistive_touchscreen. Retrieved on June 10, 2008
19. http://en.wikipedia.org/wiki/Interactive_whiteboard#Interactive_Whiteboard_Technologies.
    Retrieved on June 10, 2008
20. Passive touch system and method of detecting user input. http://www.freshpatents.com/Passive-
    touch-system-and-method-of-detecting-user-input-
    dt20070405ptan20070075982.php?type=description. Retrieved on June 10, 2008
21. US Patent 5708460 - Touch screen, http://www.patentstorm.us/patents/5708460/fulltext.html
22. http://en.wikipedia.org/wiki/Touchscreen#Surface_acoustic_wave. Retrieved on June 10, 2008
23. http://media.elotouch.com/pdfs/marcom/apr_wp.pdf
24. http://multimedia.mmm.com/mws/mediawebserver.dyn?6666660Zjcf6lVs6EVs66SJAbCOrrrrQ-.
    Retrieved on June 10, 2008
25. http://www.elotouch.com/Technologies/compare_all.asp#notes
26. Apple Computers, http://www.apple.com
27. Garmin GPS. http://buy.garmin.com
28. Ryall, K., Forlines, C., Shen, C., Ringel Morris, M.: Exploring the effects of group size and
    table size on interactions with tabletop shared-display groupware, Proceedings of the 2004 ACM
    conference on Computer supported cooperative work, Chicago, Illinois, USA, (2004)
29. Butler, D.: Virtual globes: the web-wide world. Nature, vol. 439, pp. 776-778, (2006)

DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
Apidays New York 2024 - Passkeys: Developing APIs to enable passwordless auth...
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024FWD Group - Insurer Innovation Award 2024
FWD Group - Insurer Innovation Award 2024
 
MS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectorsMS Copilot expands with MS Graph connectors
MS Copilot expands with MS Graph connectors
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 
Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)Introduction to Multilingual Retrieval Augmented Generation (RAG)
Introduction to Multilingual Retrieval Augmented Generation (RAG)
 
WSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering DevelopersWSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering Developers
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdfRising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
Rising Above_ Dubai Floods and the Fortitude of Dubai International Airport.pdf
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Six Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal OntologySix Myths about Ontologies: The Basics of Formal Ontology
Six Myths about Ontologies: The Basics of Formal Ontology
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 

An Overview of Interactive Surfaces: Applications, Sensors, and Dimensions

In essence, touch screen technology allows for interaction directly on the surface and for the manipulation of more than one object at a time through hand movements and gestures of the human body.
   The goal of this research is to give an overview of interactive surfaces today, focusing for the most part on touch screen technology, where the hands make the gestures to which the surface responds. The paper divides the overview into three main parts: application areas, technologies, and dimensions of interactive surfaces. First, the application areas are classified according to function/objective, along with device examples. Next, the many solutions available for particular applications, or simply the technologies behind them, are explored. Finally, the third section addresses the size and complexity dimensions of interactive surfaces. Lastly, the conclusion summarizes the findings, emphasizing the benefits, disadvantages, and future speculation on interactive surfaces.

2 Application Areas

This section presents an effort to illustrate the broad spectrum of viable applications of interactive surfaces. The devices were classified into five areas based on objective/function: 1) entertainment, 2) collaboration, 3) communication, 4) computer interface, and 5) customer-vendor interface. Each application area is described along with examples. It is important to highlight that the categories are not mutually exclusive and some devices can fall into more than one category.

2.1 Entertainment

There are a considerable number of current and possible applications for entertainment purposes. It is quite difficult to evaluate all of the achievable applications for surfaces of any shape or size that sense the location of objects and gestures. Commercial multi-touch displays [3] [4] are already being employed mainly in bars, discos, and events. Through striking graphic design on tables, floors, and walls, these surfaces attract curiosity, transforming customers of these bars or discos into active participants.
For example, by setting their glasses on the bar, clients can create light 'paths' across the tabletop, or patrons may have fun 'drawing' on the floor while dancing with their feet. One can imagine a lounge full of wall projections that respond to body movement or touch, changing color and display. It is at the very least an interesting and attractive application of interactive surfaces, and it may serve to draw in more customers, thereby being a worthwhile investment for business owners.
   Interactive games are another entertainment area with huge potential for employing technologies that let content be controlled in the real world by the user's gestures. In order to offer more 'degrees of freedom,' the game design industry is concentrating on the concept of augmented reality, which combines real and virtual world elements in real-time interactive 3D. The most popular example may be the Nintendo Wii: even though players do not touch the screen, with the Wii Remote they are able to control the game through physical gestures and to navigate the menu system, rather than relying on the traditional button press. For an example where touch screen technology is used, the Invisible Train [5] is a simple multi-player game in which players guide virtual trains, visible only through a PDA's (Personal Digital Assistant) video display, over a real wooden miniature railroad track. In the game, track switches and speed adjustments are activated by a tap on the PDA's touch screen.
   Likewise, entertainment can be linked with knowledge acquisition. The iFloor [6] is an interactive floor whose ultimate goal is to make the physical space of libraries more attractive and appealing to its users. The iFloor allows multiple people to post questions and answers with a cursor that is dragged around the floor via body movement (Figure 1). The purpose of this application is closely related to the collaboration area, which is addressed next.

Fig. 1. iFloor
2.2 Collaboration

Oversized displays such as whiteboards, projections, or plasma screens are typical in organizations and educational spaces intended for sharing information. The interactive whiteboard was the precursor in bringing interaction to these displays: the difference compared with a simple computer desktop projection is that users can interact with the board using a pen, a finger, or another device (some are even adapting the Wii Remote). Several projects intend to enhance these public surfaces and make them still more interactive in an attempt to facilitate group collaboration in meeting rooms, supporting a wide range of activities. Such surfaces can promote the sharing of knowledge, information, and even objects during brainstorming, and they are also able to save stages of the work process by employing multiple displays. For instance, the DiamondTouch Table [7] is one of the first multi-touch technologies to support small-group collaboration (Figure 2). The table works by running an electronic signal to each user's fingers via their chairs, allowing it to differentiate users. Another example, the Dynamo [8] multi-user interactive surface, was designed to enable the sharing and exchange of digital media in public spaces. The surface is composed of one or more displays that can be arranged horizontally or vertically; users can attach multiple USB mice and keyboards to the surface or access it remotely. The company Perceptive Pixel [9] also creates large and medium-sized multi-touch displays that claim to enhance collaborative spaces, allowing multiple users, and up to 20 simultaneous finger contacts, to work and interact. In fact, the technology is currently being used in CNN news broadcasts.

Fig. 2. DiamondTouch Table
2.3 Communication

Large surfaces are often employed to communicate useful information about products or services in public spaces such as museums, airports, and shopping malls. Being mere projections, however, these surfaces usually do not allow direct public contact; it is likely that greater levels of interaction will be reached soon, since this is a topic of significant interest among researchers. Conversely, handheld, mobile, and wearable devices are more likely to accommodate user input. Large or small, these devices can equally be explored as communication tools using touch screen technology. The Gesture Wall project [10], for example, first introduced at the Brain Opera in 1996, makes use of electrodes to measure the position and movement of the user's hands and body in front of a projection screen. Additionally, Sensitive Wall [4] is a commercially available interactive surface with a large vertical touchless display able to detect the presence of hands in real time, up to 30 centimeters from the surface. Alternatively, it is easy to envision an interactive surface device that accompanies a user during an exhibition, similar to the PEACH (Personal Experience with Active Cultural Heritage) project [11], a suite of interactive and user-adaptive technologies for museum visitors that combines, among other things, video documentaries on mobile devices.

2.4 Computer Interface

Multi-touch technology will rapidly become commonplace in computer interfaces. It will likely overtake traditional user interface techniques such as the graphical desktop, in which people use a mouse on a horizontal surface to control a visual pointer on the screen. Interactive surfaces are more situation-aware or assistance-oriented rather than command-oriented. Following the success of the iPhone, Apple expanded the use of multi-touch computing to the iPod Touch (a portable media player and wi-fi mobile platform) as well as to its notebook lines. Additionally, Asus has already included multi-touch functionality on the touchpad of its Eee PC, a small portable notebook announced last April. Although multi-touch technology evidences the technical evolution of computer interfaces, still other kinds of interaction may be supported on newer types of interactive computer interfaces. This may lead to completely new operating systems, make USB cords obsolete, and/or enable new recognition commands. Users will be able to interact with a real world that is augmented by computer-synthesized information.

2.5 Customer-Vendor Interface

Of course, the marketplace has also embraced this new technology. Interactive surfaces are captivating and thereby an effective advertising tool, catching a consumer's attention instantaneously in a crowd-stopping effect.
Consequently, there is a significant field for customer-vendor interface applications, ranging from organizing pictures and videos to ordering and paying. Microsoft Surface [12] detects gestures and movements through infrared cameras, and the image is shown on the screen through a projection system. This technology has been explored by Microsoft partners since April 2008. One demonstration of the device shows how customers could use it to customize snowboards: by taking a tag containing an identifying chip from a snowboard in a shop and placing it on the interactive surface, the user lets the system immediately recognize which snowboard the customer is interested in. The customer is then able to resize, rotate, and change the colors of graphic symbols that can be added to the product. After virtually 'creating' the product design, the layout can be saved to a mobile phone or other storage device, meaning that the customer can leave the store but keep the design should they wish to come back and purchase that model. An additional example from advertising videos shows a restaurant table on which wine menus pop up on screen, listing preferences by category together with information such as city of origin, producer, the winery, and a virtual map of the region. After purchasing an appropriate wine to accompany dinner, the customer can pay the bill by simply placing a credit card on the surface; the table recognizes the card and charges the bill accordingly. In the same fashion, the system called Tap Tracker [10] lets customers interact with the glass windows of stores (Figure 3). Four contact microphones were glued to the inside corners of a glass window, and users could choose to watch brief video clips or engage in a game that served as a ploy to get people into the store. Results showed that the store's sales increased during the experiment, suggesting that in this instance the use of an interactive surface was an effective means of bringing in potential customers.

Fig. 3. Tap Tracker
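The description above does not spell out how Tap Tracker turns the four microphone signals into a tap position; systems of this kind commonly compare the arrival times of the acoustic pulse at the corner sensors. The listing below is therefore only a hypothetical sketch of that idea: the corner coordinates, the assumed wave speed in glass, and the brute-force grid search are illustrative choices, not details taken from the original system.

    import numpy as np

    # Hypothetical corner-microphone positions on a 1.5 m x 1.0 m window (metres)
    MICS = np.array([[0.0, 0.0], [1.5, 0.0], [1.5, 1.0], [0.0, 1.0]])
    WAVE_SPEED = 4000.0  # assumed propagation speed of the tap pulse in glass, m/s

    def estimate_tap(arrival_times, grid_step=0.005):
        """Estimate the tap position from the four arrival times (in seconds).

        Grid-searches the window for the point whose predicted
        time-differences-of-arrival best match the measured ones.
        """
        arrival_times = np.asarray(arrival_times, dtype=float)
        xs = np.arange(0.0, 1.5 + grid_step, grid_step)
        ys = np.arange(0.0, 1.0 + grid_step, grid_step)
        best, best_err = None, np.inf
        for x in xs:
            for y in ys:
                dists = np.hypot(MICS[:, 0] - x, MICS[:, 1] - y)
                predicted = dists / WAVE_SPEED
                # Only the *differences* between channels matter, so remove each mean.
                err = np.sum(((predicted - predicted.mean())
                              - (arrival_times - arrival_times.mean())) ** 2)
                if err < best_err:
                    best, best_err = (x, y), err
        return best

    # Simulated tap at (0.9, 0.4): feed the resulting arrival times back in.
    true_pos = np.array([0.9, 0.4])
    times = np.hypot(MICS[:, 0] - true_pos[0], MICS[:, 1] - true_pos[1]) / WAVE_SPEED
    print(estimate_tap(times))   # close to (0.9, 0.4)

In a real deployment the arrival times would come from detecting the onset of the tap pulse in each microphone channel, and a closed-form or least-squares solution would replace the grid search.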
3 Technology

Alex Pentland wrote in Scientific American: "The problem, in my opinion, is that our current computers are both deaf and blind: they experience the world only by way of a keyboard and a mouse… I believe computers must be able to see and hear what we do before they can prove truly helpful [13]." Today, all kinds of sensing technology are used as the eyes and ears of interactive surfaces. The signal collected by the sensors is processed by software to identify properties of the action (location, direction, speed, and so on). Based on the identified properties, corresponding reactions, which can be visual, acoustic, or even olfactory, are executed by the computer. Sensors are the first step in an interactive surface and, as such, play a key role in the whole process. Different sensors, varying in function, resolution, installation conditions, and so forth, are used to meet the requirements of the different application areas illustrated above. Since it is impossible to include all technologies in this report, three sensing approaches (capacitive touch screens, optical imaging touch screens, and frustrated total internal reflection touch screens) are introduced in detail, while the others are only overviewed briefly.

3.1 Capacitive touch screens

The idea of using capacitive sensing in the field of human-computer interfaces has a long history. Generally it works with an array of wires behind the board that interacts with fingers touching the screen. The interaction between the different wires (laminated along the X and Y axes) and the tip of the finger is measured and converted into an (x, y) coordinate. However, the interaction is sensitive to temperature and humidity. Smart-Skin [14], a capacitive sensing architecture developed by Jun Rekimoto, is an example. It is constructed by laying a mesh of vertical transmitter and horizontal receiver electrodes on the surface (Figure 4); because each crossing point (a transmitter/receiver pair) acts as a very weak capacitor, the receiver picks up a wave signal whenever one of the transmitters is excited with a wave signal of typically several hundred kilohertz. The magnitude of the received signal is proportional to the frequency and voltage of the transmitted signal, as well as to the capacitance between the two electrodes. When a conductive and grounded object approaches a crossing point, it couples to the electrodes and drains the wave signal. As a result, the received signal amplitude becomes weak. By measuring this effect, it is possible to detect the proximity of a conductive object, such as a human hand.
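As a rough illustration of this scanning scheme, the sketch below treats one scan of the mesh as a matrix of received amplitudes (one row per excited transmitter, one column per receiver) and flags crossing points whose amplitude drops well below a no-touch baseline. The mesh size, threshold, and synthetic data are assumptions made for the example, not values from the SmartSkin implementation.

    import numpy as np

    DROP_THRESHOLD = 0.3  # fractional amplitude drop treated as "hand nearby" (illustrative)

    def proximity_map(amplitudes, baseline):
        """Convert one scan of received amplitudes into a proximity map.

        `amplitudes` and `baseline` are (n_transmitters, n_receivers) arrays.
        A grounded hand near a crossing point drains the wave signal, so the
        relative drop from the no-touch baseline serves as a proximity value.
        """
        drop = 1.0 - np.asarray(amplitudes) / np.asarray(baseline)
        touched = np.argwhere(drop > DROP_THRESHOLD)
        return drop, touched

    # Tiny synthetic example: an 8x8 mesh with a signal drain near crossing (3, 4)
    baseline = np.ones((8, 8))
    scan = np.ones((8, 8))
    scan[2:5, 3:6] -= np.array([[0.1, 0.2, 0.1],
                                [0.2, 0.6, 0.2],
                                [0.1, 0.2, 0.1]])
    drop, touched = proximity_map(scan, baseline)
    print(touched)   # -> [[3 4]]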
Fig. 4. The Smart-Skin sensor configuration: a mesh-shaped sensor grid is used to determine the hand's position and shape.

When the user's hand is placed within 5-10 cm of the table, the system registers the resulting capacitance change: a potential field is created when the hand is in proximity to the table surface. To accurately determine the hand position, which corresponds to the peak of the potential field, a bi-cubic interpolation method is used to analyze the sensed data (Figure 5); the position of the hand is then found as the peak of the interpolated surface. In the future, Smart-Skin may not be limited to a tabletop; instead, a non-flat or even flexible surface, such as interactive paper, could act as a vehicle for such sensing. Smart-Skin could also be combined with tactile feedback: if the surface were made to vibrate with a transducer or a piezo actuator, the user could "feel" as if he or she were manipulating a real object.
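As a minimal sketch of this peak-finding step, the listing below upsamples a coarse grid of sensed values with SciPy's cubic spline interpolation and reads off the location of the maximum. The grid size, spacing, and the use of RectBivariateSpline are illustrative choices standing in for SmartSkin's own bi-cubic analysis.

    import numpy as np
    from scipy.interpolate import RectBivariateSpline

    def find_hand_position(values, spacing_cm=1.0, upsample=20):
        """Locate the peak of a coarse sensor-value grid via cubic interpolation.

        `values` is the (rows, cols) matrix of sensed proximity values; the peak
        of the interpolated surface is taken as the hand position in centimetres.
        """
        rows, cols = values.shape
        y = np.arange(rows) * spacing_cm
        x = np.arange(cols) * spacing_cm
        spline = RectBivariateSpline(y, x, values, kx=3, ky=3)  # bi-cubic fit

        y_fine = np.linspace(y[0], y[-1], rows * upsample)
        x_fine = np.linspace(x[0], x[-1], cols * upsample)
        surface = spline(y_fine, x_fine)
        iy, ix = np.unravel_index(np.argmax(surface), surface.shape)
        return x_fine[ix], y_fine[iy]

    # Synthetic 8x8 scan with a broad bump centred between grid points
    yy, xx = np.mgrid[0:8, 0:8]
    scan = np.exp(-((xx - 3.4) ** 2 + (yy - 4.6) ** 2) / 2.0)
    print(find_hand_position(scan))   # approximately (3.4, 4.6)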
Fig. 5. Gestures and corresponding sensor values (top: a hand on the sensor mesh; middle: raw input values; bottom: after bi-cubic interpolation)

3.2 Optical imaging touch screens

The optical imaging touch screen is a relatively modern development in touch screen technology. Two or more image sensors are placed around the edges (primarily the corners) of the screen, and infrared backlights are placed in the cameras' field of view on the opposite sides of the screen. A touch shows up as a shadow, and each pair of cameras can then triangulate the location of the touch. This technology is growing in popularity due to its scalability, versatility, and affordability, especially for larger units. An example is the Digital Vision Touch (DViT) technology developed by SMART Technologies Inc. Four cameras, one in each corner, constantly scan the surface to detect a target object (Figure 6). When a camera detects a target, its processor identifies the affected pixels and calculates the angle at which the target appears; each camera detects the target and calculates its own angle. Because the distance between any two cameras and their viewing angles relative to each other are known, the system can then triangulate the location of the contact point. Mathematically, only two cameras are needed to calculate a contact point, but to ensure a robust system, DViT uses four cameras and has every camera pair report its triangulated result. The location of the contact point is then sent to the user's computer.
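The triangulation step itself is simple plane geometry. The sketch below shows one way a single camera pair might compute the contact point from the two observed angles, assuming the cameras sit at the two top corners of a screen of known width and measure angles from the top edge; the coordinate conventions are illustrative, not DViT's actual formulation.

    import math

    def triangulate(width, angle_left, angle_right):
        """Locate a touch from two corner cameras along the top edge of the screen.

        `width` is the distance between the cameras. Angles are in radians,
        measured from the top edge looking down into the surface: the left
        camera sits at (0, 0) and the right camera at (width, 0).
        """
        # The two lines of sight are y = x * tan(angle_left) and
        # y = (width - x) * tan(angle_right); solve for their intersection.
        tl, tr = math.tan(angle_left), math.tan(angle_right)
        x = width * tr / (tl + tr)
        y = x * tl
        return x, y

    # A touch seen at 45 degrees by both cameras lies midway between them.
    print(triangulate(1.0, math.radians(45), math.radians(45)))  # (0.5, 0.5)

With four cameras, each of the six possible pairs can report its own estimate, and the results can be averaged or cross-checked for robustness, as the DViT description suggests.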
Fig. 6. Camera identification of a contact point

In addition, it is possible to detect both when an object contacts the surface and when it hovers above it. The cameras scan a layer of stacked pixels; if an object breaks into this pixel area, it registers with the cameras regardless of whether it has touched the surface. With specially designed cameras that have a high frame rate (100 FPS) and support fast image processing techniques, touch accuracy is high enough for mouse operation and writing. The approach is also scalable to almost any practical size because the technology resides in the corners, not in the surface area. With some improvement this technology can also support multi-touch, and in the future it may even be possible to distinguish between different pointers.

3.3 Frustrated Total Internal Reflection touch screens

When light encounters an interface to a medium with a lower index of refraction (e.g., glass to air), it is refracted to an extent that depends on its angle of incidence; beyond a certain critical angle, the light undergoes total internal reflection (TIR). Fiber optics, light pipes, and other optical waveguides rely on this phenomenon to transport light efficiently with very little loss. However, another material at the interface can frustrate this total internal reflection, causing light to escape the waveguide there instead. This phenomenon is well known as Frustrated Total Internal Reflection (FTIR). The first application to touch input appears to have been disclosed in 1970, in a binary device that detects the attenuation of light through a platen waveguide caused by a finger in contact [13]. The technique was later applied to fingerprint sensing and robotics by Mallos [14] and Kasday [15], respectively. Building on the Mallos/Kasday designs, Jefferson Y. Han [16] proposed what is essentially a scaled-up FTIR fingerprint sensor, or an FTIR robot tactile sensor, used as an interactive surface. The technique provides full imaging touch information without occlusion or ambiguity issues. The touch sense is zero-force and true: it accurately discriminates touch from a very slight hover, and it samples at both high temporal and spatial resolutions. Pressure is not sensed, though this is largely mitigated by the precision with which the contact area of a depressed finger can be determined. The approach is inexpensive to construct and trivially scalable to much larger surfaces. A drawback is that, being camera-based, it requires a significant amount of space behind the interaction surface, though it is primarily intended for application scenarios where rear projection would have been employed anyway (e.g., interactive walls and tables). Also, as an optical system, it remains susceptible to harsh lighting environments.
   As a more robust alternative to Frustrated Total Internal Reflection, Zachary Scott Carpenter developed the Diffused Laser Imaging (DLI) touch screen. DLI uses the same software and imaging system as FTIR, but instead of the light coming from inside the screen, it comes from diffused laser light conducted through human flesh onto the diffuser of the screen. A two-dimensional plane of laser light is emitted directly over a rigid screen (diffused glass). When a human finger breaks this plane of laser light, the light is diffused by the finger and passes from the intersection point through the finger down to the diffuser of the rigid screen. This system works just as well as FTIR but is not vulnerable to smudges or dirt on the screen. Furthermore, most multi-touch functions can be achieved with only two laser line sources, versus the many light sources required to illuminate the interior of an FTIR screen. DLI can also be used on almost any kind of flat monitor by placing one or two position-sensing cameras in front of the display; in this case the cameras pick up the diffused fingertips as blobs of light rather than as dots viewed from the back side.
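Both FTIR and its DLI variant are camera-based, so the sensing problem ultimately reduces to finding the bright blobs that finger contacts produce in the camera image and converting them into touch coordinates. The sketch below is a minimal, hypothetical version of that step using thresholding and connected-component labelling; the threshold and minimum blob size are illustrative parameters, not values from Han's or Carpenter's systems.

    import numpy as np
    from scipy import ndimage

    def detect_touches(frame, threshold=0.5, min_pixels=4):
        """Extract touch points from one camera frame of an FTIR/DLI-style setup.

        `frame` is a 2-D array of pixel intensities in [0, 1]. Pixels brighter
        than `threshold` are grouped into connected blobs, tiny blobs are
        discarded as noise, and each remaining blob's centroid is reported as a
        touch position (row, col). Blob area stands in for contact area.
        """
        bright = frame > threshold
        labels, n_blobs = ndimage.label(bright)
        touches = []
        for blob_id in range(1, n_blobs + 1):
            area = int(np.sum(labels == blob_id))
            if area < min_pixels:
                continue  # ignore speckle noise
            centroid = ndimage.center_of_mass(bright, labels, blob_id)
            touches.append({"position": centroid, "area": area})
        return touches

    # Synthetic frame with two finger-sized bright patches
    frame = np.zeros((60, 80))
    frame[10:14, 20:24] = 0.9
    frame[40:45, 60:66] = 0.8
    for touch in detect_touches(frame):
        print(touch)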
3.4 Other sensing technologies

Resistive touch screens are composed of two flexible sheets coated with a resistive material and separated by an air gap or microdots. When contact is made with the surface of the touch screen, the two sheets are pressed together, registering the precise location of the touch. Resistive touch screens typically have a relatively high resolution, providing accurate touch control, and allow one to use a finger, a stylus, or any other pointing device on the surface of the board. However, by their nature they do not support the full functionality of a mouse, and interface effects such as hover and pop-ups require additional interactions or software helper functions [18].
   Electromagnetic touch screens feature an array of wires embedded behind the board surface that interacts with a coil in the stylus tip to determine the (X, Y) coordinate of the stylus. In other words, magnetic sensors in the board react and send a message back to the computer when they are activated by a magnetic pen. Styli are either active (requiring a battery or a wire back to the whiteboard) or passive (altering electrical signals produced by the board, but containing no power source). These screens typically have a very high resolution and usually support the full range of mouse functions; moreover, they may accommodate pressure sensing and multiple users touching and pointing on the surface [19].
Surface Acoustic Wave (SAW) touch screens employ transducers on the borders of a glass surface to vibrate the glass and produce acoustic waves that ripple over the glass surface. When contact is made on the glass surface, the waves reflect back, and the contact position is determined from the signature of the reflected waves. SAW touch screens, however, suffer several drawbacks: 1) they provide only medium resolution; 2) they exhibit noticeable parallax due to the thickness of the vibrating glass, which is placed over the surface of the video or computer display; 3) hovering is not supported; and 4) they cannot scale beyond a few feet diagonally [20].
   Strain gauge touch screens use one or more, preferably symmetrically positioned, strain gauge sensors to monitor the relative distribution of force. The sensors are positioned such that bending strain on the touch screen, engendered by a touch, is detected by the strain gauges and accurately measured by an electronic controller connected to them. The controller, or associated hardware, is programmed to relate the relative bending force to a unique position on the screen; it is a charge-balancing and multiplying analog-to-digital converter that provides accurate position determination, even with very low forces and with very minor differentiation in position-related forces [21].
   An infrared touch screen panel employs one of two very different methods. One method uses thermally induced changes of the surface resistance; this method is sometimes slow and requires warm hands. The other method uses an array of vertical and horizontal IR sensors that detect the interruption of a modulated light beam near the surface of the screen. IR touch screens have the most durable surfaces and are used in many military applications that require a touch panel display [22].
   An Acoustic Pulse Recognition (APR) touch screen comprises a glass display overlay, or other rigid substrate, with four piezoelectric transducers mounted on the back surface, outside the visible area, and connected via a flex cable to a controller card. The impact when the screen is touched, or the friction caused while dragging a finger or stylus across the glass, creates an acoustic wave. The wave radiates away from the touch point to the transducers, which produce electrical signals proportional to the acoustic waves. These signals are amplified in the controller card and converted into a digital stream of data, and the touch location is determined by comparing the data to a stored profile. APR is designed to reject ambient and extraneous sounds, as these do not match a stored sound profile. The touch screen itself is pure glass, giving it the optics and durability of the glass from which it is made; it works with scratches and dust on the screen, and accuracy is very good [23].
   The dispersive signal touch screen, introduced in 2002, uses sensors to detect the mechanical energy in the glass that occurs due to a touch. Complex algorithms then interpret this information and provide the actual location of the touch. The technology claims to be unaffected by dust and other outside elements, including scratches. Since there is no need for additional elements on the screen, it also claims to provide excellent optical clarity. Also, since mechanical vibrations are used to detect a touch event, any object can be used to generate these events, including fingers and styli. A downside is that after the initial touch the system cannot detect a motionless finger [22]. More information can be found in [24].

3.5 How to choose an appropriate interactive surface?

To compare all the sensing technologies and find the most suitable one for an application, many technical factors need to be considered besides cost. Scale is the first limitation: capacitive technology cannot be used for interactive floors and walls, while some optical technologies cannot be used on a mobile phone. Secondly, the application requirements should be identified. Is multi-touch necessary? Is the full functionality of a mouse required? Is the ability to identify objects required? Is input made with a finger or a stylus? Last, but not least, important factors include, for example, resistance to scratches and harsh working conditions, reaction speed, the presence of parallax, and whether an external power supply is needed. These factors, along with some others, are listed in Table 1. A full comparison of these technologies would be a very difficult task, especially considering the day-by-day pace of innovation; however, a complete and authoritative comparison would be helpful for potential users of interactive surfaces.

Table 1. Factors in choosing proper sensing technologies [adapted from 25]

Performance: speed, sensitivity, resolution, accuracy, calibration stability, drag, z-axis, double touch, parallax (kind of)
Input flexibility: glove, finger, fingernail, credit card, pen, signature capture, handwriting recognition
Optics: light transmission, reflection (lack thereof), clarity, color purity
Mechanical: small sizes (<10"), large sizes (>19"), curved CRTs, ease of integration, sealability, IP 65¹/NEMA 4¹
Electrical: controller chip available, low-power battery option, operation with poor ground, ESD¹, EMI¹, RFI¹
Environmental: temperature, humidity, shock/vibration, altitude, in-vehicle use, chemical resistance, scratch resistance, breakage resistance, safe break pattern, dust/dirt, rain/liquids, snow/ice, metal, ambient/UV light, fly on screen, non-glass surface possible, works through other materials, durability/wear, EMI¹/RFI¹, surroundings

¹ These factors are not directly referred to in the text; see the source for more information.
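As a toy illustration of this selection process, the sketch below scores a few of the technologies discussed above against a small set of boolean requirements. The capability matrix and the scoring are simplifications made up for the example; a real comparison would draw on the full factor list in Table 1 and on up-to-date vendor data.

    # Illustrative capability matrix distilled from Section 3; entries are rough
    # simplifications, not authoritative ratings.
    CAPABILITIES = {
        "capacitive":   {"multi_touch": True,  "stylus": False, "wall_scale": False, "hover": True},
        "resistive":    {"multi_touch": False, "stylus": True,  "wall_scale": False, "hover": False},
        "optical_dvit": {"multi_touch": True,  "stylus": True,  "wall_scale": True,  "hover": True},
        "ftir":         {"multi_touch": True,  "stylus": False, "wall_scale": True,  "hover": False},
        "saw":          {"multi_touch": False, "stylus": True,  "wall_scale": False, "hover": False},
    }

    def rank_technologies(requirements):
        """Rank technologies by how many of the required capabilities they meet."""
        scores = {
            name: sum(caps.get(req, False) for req in requirements)
            for name, caps in CAPABILITIES.items()
        }
        return sorted(scores.items(), key=lambda item: item[1], reverse=True)

    # Example: a wall-sized collaborative display that needs multi-touch.
    print(rank_technologies(["multi_touch", "wall_scale"]))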
4 Dimensions of Interactive Surfaces

Interactive surfaces come in many shapes, sizes, and complexities. While this may seem self-evident, it is important to consider, as it is this very characteristic that makes them such a valuable and viable technology: interactive surfaces are not constrained to one single form that would limit their applicability. This section of the paper explores some of the many manifestations that interactive surfaces can take.

4.1 Size

As the technology has advanced, a wide variety of sizes for interactive surfaces has emerged. The smallest devices are mobile and hand-held. Popular examples include the iPod Touch and the iPhone, both introduced in the summer of 2007; their dimensions come in at just 11.5 cm x 6.1 cm x 1.2 cm for the iPhone and 11 cm x 6.2 cm x 0.8 cm for the iPod Touch [26]. Such small sizes make these gadgets extremely portable while maintaining sophisticated interaction capabilities. Other common small interactive surfaces include most GPS systems and PDAs, many of which have no keyboard or mouse and rely only on surface touch. In fact, these systems may be even smaller than the Apple products: GPS systems for bicycle navigation have only recently hit the market, and the Garmin Edge 705 comes in at only 5.1 cm x 10.9 cm x 2.5 cm, weighing slightly less than the iPod Touch [27]. Also included in this 'small' category are other common touch screens such as those on bank machines or ticket kiosks. Although the screens in these systems are small, they are generally part of a much larger machine, leaving this example straddling the line between smaller and larger interactive surfaces.
   Moving up in scale are those surfaces that can be defined as 'medium' sized, including all tabletop interactive surfaces. There is, of course, some variation in dimension, but, for example, the Microsoft Surface, the first commercially available device of its kind, has a 30-inch screen while the entire unit measures 56 cm x 53 cm x 107 cm [12]. Obviously, a unit such as this is not particularly mobile, although since it is marketed as an alternative to an actual tabletop, one assumes that it is portable to some degree. Since many of these medium-sized surfaces are created for the purpose of group interaction, it is important to consider how the dimensions of the device might affect its success at facilitating collaboration. One study explored how the extent of the display influences the speed with which a group is able to complete a certain activity, in this case the recreation of a poem based on words scattered about the surface. The research found that the different sizes of interactive surfaces used in the experiment (87 cm and 107 cm diagonally) had no effect on the success of the given activity, although the authors did note that the size of the group seemed to influence the group's success [28].
In the comparison of these two extents, size apparently was not a determining factor of success, although it seems likely that, looking at the global range of dimensions of interactive surfaces, this medium size in general is probably the most effective for this type of group collaboration.
   Moving up to the largest devices, it is possible to find surfaces that often require entire-body movements for interaction. In this category live interactive floors, walls, and entire spaces, all of which can generally be as large as the designer wants them to be. Floors react to footsteps across them, while walls generally act as a large tabletop surface mounted vertically; spaces, such as the SensitiveSpaceSystem designed by iO, combine elements of both floors and walls to make entire rooms or hallways that interact with the people in them. The size of all of these units can, to some degree, be stated either for the display or for the unit as a whole; for example, the actual surface of a tabletop display is quite thin while the entire machine is quite large, and in a ticket kiosk the screen is only a small part of the larger device. Perhaps the best example of this is projected interactive surfaces, where the technology is a projector and the actual size of the display depends on the surface being projected upon, be it a floor, a wall, or even a window.

4.2 Complexity

Beyond size, the complexity of an interactive surface can range widely. The 'complexity' of an interactive surface can be defined in many ways: by the exact way humans can interact with it (single-touch versus multi-touch), by the type of application, and by the operating system employed in the technology.
   At the simplest end of traditional interactive surfaces are single-touch surfaces; included in this category are most information kiosks, bank machines, GPS devices, and most current PDAs. These surfaces function primarily with strain gauge touch screens and react to only one touch. Many of these devices, such as some information kiosks and bank machines, are set up to look like their predecessors, where you press actual buttons rather than virtual ones. Some single-touch platforms combine this button model with that of typical computer interfaces, such as many museum displays where a touch drags a pointer around the screen. While these interfaces are very simple, in the context of their purpose they are perfectly adequate and appropriate.
   Multi-touch systems, where the user can use more than one contact point to interact with a surface, have been slowly moving into more mainstream technology, particularly with the release of the iPhone and iPod Touch last summer, as well as with the MacBook models that come equipped with a multi-touch trackpad. Multi-touch functionality allows users, for example, to resize images by dragging out the corners or to scroll horizontally and vertically on a laptop touchpad using two fingers, as well as encouraging collaboration, since many users can work in the same interface simultaneously. Besides these computing-interface examples of multi-touch, there are other surfaces that respond to multiple touch points, such as interactive floors and bars with primarily aesthetic displays that respond in some way to pressure from human gestures.
Clearly, relative to single-touch surfaces, multi-touch surfaces are more interactive, and the possibilities for applications of such a technology are nearly limitless.
   Complexity can also be considered with regard to the application of the interactive surface. As mentioned in Section 2, there is an enormous range of applications for interactive surfaces. Surfaces such as dance floors and bars are interesting, but serve a primarily aesthetic end: while the technology behind the surface may still be quite sophisticated, the difficulty in using them is minimal, and the actual benefit of using the surface (other than entertainment) is likewise fairly minimal. While this type of application is fairly basic, as presented in Section 2 many interactive surfaces provide excellent, highly interactive platforms for computing, collaboration, and showcasing products, among other uses. Therefore, the application too can be a further measure of complexity for interactive surfaces.
   Lastly, with reference to complexity, one can look at the degree of involvedness of the operating system behind the interactive surface. This measure parallels the degree of complexity as determined by application. On one end are the previously mentioned dance floors and bars, where the user simply touches the surface and something of aesthetic interest happens at that point. On the other end of the spectrum are actual computer interfaces built as an operating system of sorts, often for collaboration on group projects, such as tabletop computing systems, and now increasingly on handheld devices such as Apple's iPhone. These systems not only react to multi-touch but also serve a much more complex purpose than the previously mentioned floor or bar; they allow users to do most of what they can do on a regular computer, but in a way that eliminates the need for a mouse or touchpad. The sophistication of interactive surface applications can thus vary widely, from purely entertainment-based functionality to complete operating systems.
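As a small illustration of the kind of interaction that multi-touch enables, the sketch below turns two tracked touch points into the scale and rotation applied to an on-screen object, the way a pinch gesture resizes a photo. The data layout and function name are hypothetical; real multi-touch APIs differ, but the underlying geometry is the same.

    import math

    def pinch_transform(prev_points, curr_points):
        """Derive scale and rotation from two touch points tracked over time.

        `prev_points` and `curr_points` are pairs of (x, y) tuples for the same
        two fingers in the previous and current frame. The ratio of the
        finger-to-finger distances gives the scale factor; the change in the
        angle of the line joining them gives the rotation in radians.
        """
        (ax0, ay0), (bx0, by0) = prev_points
        (ax1, ay1), (bx1, by1) = curr_points

        dist_prev = math.hypot(bx0 - ax0, by0 - ay0)
        dist_curr = math.hypot(bx1 - ax1, by1 - ay1)
        scale = dist_curr / dist_prev if dist_prev else 1.0

        angle_prev = math.atan2(by0 - ay0, bx0 - ax0)
        angle_curr = math.atan2(by1 - ay1, bx1 - ax1)
        rotation = angle_curr - angle_prev
        return scale, rotation

    # Fingers move from 100 px apart to 150 px apart: the object grows by 1.5x.
    print(pinch_transform([(0, 0), (100, 0)], [(0, 0), (150, 0)]))  # (1.5, 0.0)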
5 Conclusion

This paper emphasizes the diversity inherent in all aspects of interactive surfaces, be it in their applications, the technology that makes them work, or their dimensions. Applications run the gamut from purely aesthetic displays to sophisticated collaborative tools. Furthermore, a seemingly endless and ever-growing array of sensing technologies supports these applications. What is more, the shape, size, and complexity of these surfaces can vary greatly: some will fit in your pocket and can organize your life, while others decorate huge spaces and provide aesthetic appeal. The versatility of interactive surfaces is what makes them so appealing to the general public and ensures that they will become only more prevalent in our lives.
   While the future prevalence of interactive surfaces is all but ensured, their precise future is wide open. The success of technologies such as the iPhone suggests that the general population is willing to embrace interactive surface technology. Since technology often evolves less to fill a void than to solve a problem people never even knew they had in the first place, it is hard to determine exactly what interactive surfaces may look like or be used for in the future. What is certain is that there are already some clear benefits to this technology. Firstly, and most basically, it has a profound crowd-stopping effect; people are fascinated by it, and even when it serves no practical purpose (such as with the interactive bar), it still manages to provoke great interest. More pragmatically, this technology facilitates communication in many forms, be it between customers and vendors or between colleagues, as many people can easily collaborate on one surface. Furthermore, many argue that this type of interface is more natural and therefore more intuitive for users [4]. Perhaps interactive surfaces will democratize computing the way that virtual globes have democratized Geographic Information Systems [29].
   It remains to be seen to what degree interactive surfaces will be adopted by the general public, and this will likely not be known for many years yet, as many of the surfaces described here are either only prototypes or not yet available to the general public. It is apparent, however, that the potential is there for interactive surfaces to revolutionize not only the way we perform our everyday computing tasks, but also the way we order meals at restaurants, shop for goods, and work with our peers. Interactive surfaces are a technology we are best served to fully embrace, as they will likely develop into a major presence in our everyday lives.

References

1. Ubiquitous Computing, http://www.ubiq.com/hypertext/weiser/UbiHome.html
2. Ishii, H., Ullmer, B.: Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. In: CHI '97: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 234-241. ACM Press (1997)
3. Mindstorm Interactive Surface Solution, http://www.mindstorm.eu.com/
4. Valli, A.: Natural Interaction White Paper. http://www.naturalinteraction.org/images/whitepaper.pdf (2007)
5. Wagner, D., Pintaric, T., Ledermann, F., Schmalstieg, D.: Towards Massively Multi-user Augmented Reality on Handheld Devices. In: Pervasive Computing: Third International Conference, PERVASIVE 2005, Lecture Notes in Computer Science (LNCS). Springer-Verlag (2005)
6. Petersen, M. G., Krogh, P. G., Ludvigsen, M., Lykke-Olesen, A.: Floor Interaction: HCI Reaching New Ground. In: CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA (2005)
7. Dietz, P., Leigh, D.: DiamondTouch: A Multi-User Touch Technology. Mitsubishi Electric Research Laboratories, http://www.merl.com/papers/docs/TR2003-125.pdf (2003)
8. Izadi, S., Brignull, H., Rodden, T., Rogers, Y., Underwood, M.: Dynamo: A Public Interactive Surface Supporting the Cooperative Sharing and Exchange of Media. In: UIST '03, Vancouver, BC, Canada (2003)
9. Perceptive Pixel, http://www.perceptivepixel.com/
10. Paradiso, J. A.: Tracking Contact and Free Gesture Across Large Interactive Surfaces. Communications of the ACM, vol. 46, pp. 62-69 (2003)
11. Stock, O., Zancanaro, M., Busetta, P., Callaway, C., Krüger, A., Kruppa, M., Kuflik, T., Not, E., Rocchi, C.: Adaptive, Intelligent Presentation of Information for the Museum Visitor in PEACH. User Modeling and User-Adapted Interaction, vol. 17, pp. 257-304 (2007)
12. Microsoft Surface, http://www.microsoft.com/surface/index.html
13. Johnson, R., Fryberger, D.: Touch Actuable Data Input Panel Assembly. U.S. Patent 3,673,327, Jun. 1972
14. Mallos, J.: Touch Position Sensitive Surface. U.S. Patent 4,346,376, Aug. 1982
15. Kasday, L.: Touch Position Sensitive Surface. U.S. Patent 4,484,179, Nov. 1984
16. Han, J. Y.: Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection (2005)
17. Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology: http://delivery.acm.org/10.1145/1100000/1095054/p115-han.pdf?key1=1095054&key2=1556913121&coll=GUIDE&dl=GUIDE&CFID=72511020&CFTOKEN=44570496
18. http://en.wikipedia.org/wiki/Analog_resistive_touchscreen. Retrieved June 10, 2008
19. http://en.wikipedia.org/wiki/Interactive_whiteboard#Interactive_Whiteboard_Technologies. Retrieved June 10, 2008
20. Passive Touch System and Method of Detecting User Input. http://www.freshpatents.com/Passive-touch-system-and-method-of-detecting-user-input-dt20070405ptan20070075982.php?type=description. Retrieved June 10, 2008
21. US Patent 5,708,460 - Touch Screen, http://www.patentstorm.us/patents/5708460/fulltext.html
22. http://en.wikipedia.org/wiki/Touchscreen#Surface_acoustic_wave. Retrieved June 10, 2008
23. http://media.elotouch.com/pdfs/marcom/apr_wp.pdf
24. http://multimedia.mmm.com/mws/mediawebserver.dyn?6666660Zjcf6lVs6EVs66SJAbCOrrrrQ-. Retrieved June 10, 2008
25. http://www.elotouch.com/Technologies/compare_all.asp#notes
26. Apple Computers, http://www.apple.com
27. Garmin GPS, http://buy.garmin.com
28. Ryall, K., Forlines, C., Shen, C., Ringel Morris, M.: Exploring the Effects of Group Size and Table Size on Interactions with Tabletop Shared-Display Groupware. In: Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, Chicago, Illinois, USA (2004)
29. Butler, D.: Virtual Globes: The Web-Wide World. Nature, vol. 439, pp. 776-778 (2006)