2 December 2005
Next Generation User Interfaces
Multimodal Interaction
Prof. Beat Signer
Department of Computer Science
Vrije Universiteit Brussel
http://www.beatsigner.com
Human Senses and Modalities
 Input as well as output modalities
Sensory Perception Sense Organ Modality
sight eyes visual
hearing ears auditory
smell nose olfactory
taste tongue gustatory
touch skin tactile
(balance) vestibular system vestibular
Modality “refers to the type of communication channel used to convey
or acquire information. It also covers the way an idea is expressed or
perceived, or the manner an action is performed.”
Nigay and Coutaz, 1993
Bolt's "Put-that-there" (1980)
Bolt, 1980
Video: “Put-that-there”
Bolt's "Put-that-there" (1980) ...
 Combination of two input modalities
 speech is used to issue the semantic part of the command
 gestures are used to provide the locations on the screen
 Complementary use of both modalities
 use of only speech or gestures does not enable the same
commands
 The "Put-that-there" interaction is still regarded as a rich
case of multimodal interaction
 however, it is based on an old mouse-oriented metaphor for
selecting objects
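
The complementary fusion of speech and pointing can be illustrated with a small sketch. This is not Bolt's original implementation; the data structures and the resolve_put_that_there helper are hypothetical and assume that spoken tokens and pointing gestures arrive with timestamps.

```python
from dataclasses import dataclass

@dataclass
class PointingEvent:
    """A pointing gesture resolved to screen coordinates at a given time (seconds)."""
    x: float
    y: float
    timestamp: float

def resolve_put_that_there(speech_tokens, pointing_events, max_skew=1.5):
    """Pair each deictic token ('that', 'there') with the pointing event
    closest in time; returns (source_position, target_position) or None."""
    deictics = [(tok, t) for tok, t in speech_tokens if tok in ("that", "there")]
    if len(deictics) != 2 or len(pointing_events) < 2:
        return None  # the command is incomplete without two referents
    resolved = []
    for _, spoken_at in deictics:
        closest = min(pointing_events, key=lambda p: abs(p.timestamp - spoken_at))
        if abs(closest.timestamp - spoken_at) > max_skew:
            return None  # no gesture close enough in time to disambiguate
        resolved.append((closest.x, closest.y))
    return tuple(resolved)

# Usage: "put that there" accompanied by two pointing gestures
speech = [("put", 0.0), ("that", 0.4), ("there", 1.1)]
gestures = [PointingEvent(120, 80, 0.5), PointingEvent(400, 300, 1.2)]
print(resolve_put_that_there(speech, gestures))  # ((120, 80), (400, 300))
```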
Multimodal Interaction
 Multimodal interaction is about human-machine
interaction involving multiple modalities
 input modalities
- speech, gesture, gaze, emotions, …
 output modalities
- voice synthesis, visual cues, …
 Humans are inherently multimodal!
Example: SpeeG2
[Architecture diagram: the user's speech is processed by speech recognition (Microsoft SAPI 5.4) and hand movements by skeletal tracking (Microsoft Kinect); both feed into the SpeeG2 GUI. Diagram by Sven De Kock]
Example: SpeeG2 …
 While a user speaks a sentence, the most probable words are shown on the screen
 By using hand gestures
the user can correct words
(by selection) if necessary
 Improved recognition rate, with text entry speeds of up to 21 words per minute (WPM)
SpeeG2, WISE Lab
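
A minimal sketch of the general idea behind SpeeG2 (not its actual implementation): the speech recogniser proposes candidate words for each position, and the vertical hand position selects among the candidates shown on screen. The names and the row-based mapping below are assumptions for illustration.

```python
def select_word(candidates, pointer_y, row_height=1.0):
    """Map the vertical hand position to one of the candidate words that are
    displayed as a column on screen (row 0 is the recogniser's best guess)."""
    row = int(pointer_y // row_height)
    row = max(0, min(row, len(candidates) - 1))  # clamp to the visible list
    return candidates[row]

def correct_sentence(hypotheses, pointer_trace):
    """For every word position, pick the candidate the user points at;
    fall back to the top speech hypothesis when there is no gesture."""
    sentence = []
    for position, candidates in enumerate(hypotheses):
        if position in pointer_trace:
            sentence.append(select_word(candidates, pointer_trace[position]))
        else:
            sentence.append(candidates[0])
    return " ".join(sentence)

# Usage: the recogniser's best guess at position 0 is wrong ("too"),
# the user points at row 1 and thereby selects "two"
hypotheses = [["too", "two", "to"], ["cats", "cat's"]]
print(correct_sentence(hypotheses, {0: 1.3}))  # "two cats"
```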
Video: SpeeG2
GUIs vs. Multimodal Interfaces
Graphical User Interfaces vs. Multimodal Interfaces
 single event input stream that controls the event loop vs. typically continuous and simultaneous input from multiple input streams
 sequential processing vs. parallel processing
 centralised architecture vs. often distributed architectures (e.g. multi-agent) with high computational and memory requirements
 no temporal constraints vs. timestamping and temporal constraints required for multimodal fusion
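
The last point is worth a sketch: because multimodal input arrives on several parallel streams, events have to be timestamped, merged and grouped into temporal windows before they can be fused. The event structure and the 0.8 second window below are hypothetical.

```python
import heapq
from dataclasses import dataclass, field
from typing import Any

@dataclass(order=True)
class InputEvent:
    timestamp: float                      # seconds; the only field used for ordering
    modality: str = field(compare=False)
    payload: Any = field(compare=False)

def merge_streams(*streams):
    """Merge already time-ordered per-modality streams into one global stream."""
    return list(heapq.merge(*streams))

def group_by_window(events, window=0.8):
    """Group events whose timestamps lie within `window` seconds of the first
    event of the group -- these groups are the candidates for multimodal fusion."""
    groups, current = [], []
    for event in events:
        if current and event.timestamp - current[0].timestamp > window:
            groups.append(current)
            current = []
        current.append(event)
    if current:
        groups.append(current)
    return groups

speech = [InputEvent(0.40, "speech", "delete that")]
gesture = [InputEvent(0.55, "gesture", {"x": 120, "y": 80})]
print(group_by_window(merge_streams(speech, gesture)))  # one group with both events
```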
Multimodal Human-Machine Interaction Loop
Dumas et al., 2009
Multimodal Fusion
 Fusion in a multimodal system can happen at three
different levels
 data level
 feature level
 decision level
Dumas et al., 2009
Multimodal Fusion …
 Data-level fusion
 focuses on the fusion of identical or tightly linked types of
multimodal data
- e.g. two video streams capturing the same scene from different angles
- low semantics and highly sensitive to noise (no preprocessing)
 Feature-level fusion
 fusion of features extracted from the data
- e.g. speech and lip movements
 Decision-level fusion
 focuses on interpretation based on semantic data
- e.g. Bolt's "Put-that-there": fusion of speech and gesture
 fusion of high-level information derived from data- and feature-
level fusion
- highly resistant to noise and failures
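
As a rough illustration of decision-level fusion, the sketch below merges partial semantic frames produced by individual recognisers (e.g. speech contributing the action, a pointing gesture contributing the location). The frame representation is an assumption, not a prescribed format.

```python
def decision_level_fusion(*partial_frames):
    """Merge partial semantic frames produced by individual recognisers
    (e.g. speech -> action, gesture -> location) into one command frame;
    conflicting slots are reported instead of silently overwritten."""
    fused, conflicts = {}, []
    for frame in partial_frames:
        for slot, value in frame.items():
            if slot in fused and fused[slot] != value:
                conflicts.append(slot)
            else:
                fused[slot] = value
    return fused, conflicts

# Speech contributes the action and object, the pointing gesture the target location
speech_frame = {"action": "move", "object": "blue square"}
gesture_frame = {"target": (400, 300)}
print(decision_level_fusion(speech_frame, gesture_frame))
# ({'action': 'move', 'object': 'blue square', 'target': (400, 300)}, [])
```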
Comparison of Fusion Levels
 Data-level fusion
- used for raw data of the same modality
- highest level of detail
- highly sensitive to noise and failures
- not frequently used for multimodal interfaces
 Feature-level fusion
- used for closely coupled modalities
- moderate level of detail
- moderately sensitive to noise and failures
- used to improve the recognition rate
 Decision-level fusion
- used for loosely coupled modalities
- level of detail: difficult to recover data that has been fused on the data and feature level
- not very sensitive to noise and failures
- most widely used level of fusion for multimodal interfaces
Advantages of Multimodal Input
 Support and accommodate a user's perceptual
and communicative capabilities
 natural user interfaces with new ways of engaging interaction
 Integration of the computational skills of computers in the real world by offering more natural ways of human-machine interaction
 Enhanced robustness due to the combination of different
(partial) information sources
 Flexible personalisation by using different modalities
based on user preferences and context
 Helps visually or physically impaired users
Multimodal Fission
 Forms a comprehensible answer for the user by making
use of multiple output modalities based on the device
and context
 builds an abstract message through a combination of channels
 Three main steps
 selection and structuring of content
 selection of modalities
 output coordination
 Coordination of the output on each channel is necessary
to form a coherent message
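
The three fission steps could be sketched as follows; the content types, modality preferences and device capabilities used here are hypothetical placeholders.

```python
def select_content(answer):
    """Step 1: select and structure the content of the abstract message."""
    return [{"type": "summary", "text": answer["text"]},
            {"type": "detail", "data": answer.get("details", [])}]

def select_modalities(parts, available):
    """Step 2: assign each content part to an output modality that the
    current device and context actually support."""
    preference = {"summary": ["speech", "display"], "detail": ["display", "speech"]}
    return [(next(m for m in preference[part["type"]] if m in available), part)
            for part in parts]

def coordinate_output(plan):
    """Step 3: render the parts in a coordinated order so that the
    different channels form one coherent message."""
    for modality, part in plan:
        print(f"[{modality}] {part}")

answer = {"text": "There are three flights to Oslo today.",
          "details": ["08:10", "12:45", "17:30"]}
coordinate_output(select_modalities(select_content(answer),
                                    available={"speech", "display"}))
```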
Ten Myths of Multimodal Interaction
1. If you build a multimodal system, users
will interact multimodally
 users show a strong preference to interact
multimodally but this does not mean that they
always interact multimodally
 unimodal and multimodal interactions are mixed
depending on the task to be carried out
2. Speech and pointing is the dominant
multimodal integration pattern
 heritage from Bolt's "Put-that-there" where gesture and speech
are used to select an object (mouse-oriented metaphor)
 modalities can be used for more than just object selection
- written input (e.g. SpeeG2), facial expressions, ...
Sharon Oviatt
Ten Myths of Multimodal Interaction ...
3. Multimodal input involves simultaneous signals
 signals are often not overlapping
 users frequently introduce (consciously or not) a small delay
between two modal inputs
 developers should not rely on an overlap for the fusion process
4. Speech is the primary input mode in any multimodal
system that includes it
 in human-human communication speech is indeed the primary
input mode
 in human-machine communication speech is not the exclusive carrier of important information and also does not have temporal precedence over other modalities
Ten Myths of Multimodal Interaction ...
5. Multimodal language does not differ linguistically from
unimodal language
 different modalities complement each other
- e.g. avoid error-prone voice descriptions of spatial locations by pointing
 in many respects multimodal language is substantially simplified
- might lead to more robust systems
6. Multimodal integration involves redundancy of content
between modes
 early research in multimodal interaction assumed that different
redundant modalities could improve the recognition
 complementary modalities are more important and designers
should not rely on duplicated content
Ten Myths of Multimodal Interaction ...
7. Individual error-prone recognition technologies combine
multimodally to produce even greater unreliability
 assumes that using two error-prone input modes such as speech
and handwriting recognition results in greater unreliability
 however, increased robustness due to mutual disambiguation between modalities can be observed (see the sketch below)
8. All users’ multimodal commands are integrated in a
uniform way
 different users use different integration patterns (e.g. parallel
vs. sequential)
 system that can detect and adapt to a user’s dominant integration
pattern might lead to increased recognition rates
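
The mutual disambiguation mentioned under myth 7 can be sketched as combining the n-best lists of two error-prone recognisers and keeping only jointly plausible hypotheses; the confidence values and the compatibility relation below are made up for illustration.

```python
def mutually_disambiguate(speech_nbest, gesture_nbest, compatible):
    """Rank joint hypotheses by the product of the individual confidence scores,
    keeping only combinations that are semantically compatible."""
    joint = [
        (s_conf * g_conf, s_hyp, g_hyp)
        for s_hyp, s_conf in speech_nbest
        for g_hyp, g_conf in gesture_nbest
        if compatible(s_hyp, g_hyp)
    ]
    return max(joint, default=None)

# The top speech hypothesis ("male") is incompatible with any pointed object,
# so the jointly best interpretation recovers the intended command ("mail").
speech = [("male", 0.55), ("mail", 0.45)]
gesture = [("envelope_icon", 0.7), ("trash_icon", 0.3)]
compatible = lambda word, target: (word, target) in {("mail", "envelope_icon"),
                                                     ("delete", "trash_icon")}
print(mutually_disambiguate(speech, gesture, compatible))  # (0.315, 'mail', 'envelope_icon')
```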
Ten Myths of Multimodal Interaction ...
9. Different input modes are capable of transmitting
comparable content
 some modalities can be compared (e.g. speech and writing)
 however, each modality has its own very distinctive gestures
- e.g. gaze vs. a pointing device such as the Wiimote
10.Enhanced efficiency is the main advantage of
multimodal systems
 often there is only a minimal increase in efficiency
 however, there are many other advantages
- fewer errors, enhanced usability, flexibility, mutual disambiguation, …
Formal Models for Combining Modalities
 Formalisation of multimodal human-machine interaction
 Conceptualise the different possible relationships
between input and output modalities
 Two conceptual spaces
 CASE: use and combination of modalities on the system side (fusion engine)
 CARE: possible combination of modalities at the user level
CASE Model
 3 dimensions in the
CASE design space
 levels of abstraction
 use of modalities
 fusion
Nigay and Coutaz, 1993
CASE Model: Levels of Abstraction
 Data received from a
device can be processed
at multiple levels of
abstraction
 Speech analysis example
 signal (data) level
 phonetic (feature) level
 semantic level
 A multimodal system is
always classified under
the meaning category
Nigay and Coutaz, 1993
CASE Model: Use of Modalities
 The use of modalities
expresses the temporal
availability of multiple
modalities
 parallel use allows the user
to employ multiple modalities
simultaneously
 sequential use forces the
user to use the modalities
one after another
Nigay and Coutaz, 1993
CASE Model: Fusion
 Possible combination of
different types of data
 combined means that there
is fusion
 independent means that there is an absence of fusion
Nigay and Coutaz, 1993
CASE Model: Classification
 Concurrent
 two distinctive tasks
executed in parallel
 e.g. draw a circle and draw
a square next to it
 Alternate
 task with temporal alternation
of modalities
 e.g. "Draw a circle there"
followed by pointing to the
location
Nigay and Coutaz, 1993
CASE Model: Classification
 Synergistic
 task using several coreferent
modalities in parallel
 e.g. "Draw a circle here" with
concurrent pointing to the
location
 synergistic multimodal
systems are the ultimate goal
 Exclusive
 one task after the other with
one modality at a time (no
coreference)
Nigay and Coutaz, 1993
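
A compact way to read the CASE classification is as two binary dimensions, parallel vs. sequential use of modalities and combined vs. independent (fused) data. The helper below is only a mnemonic sketch of that mapping, not part of the original model.

```python
def classify_case(parallel_use, fused):
    """Map the two CASE dimensions -- use of modalities (parallel vs. sequential)
    and fusion (combined vs. independent) -- to the four classes."""
    if parallel_use and fused:
        return "synergistic"  # e.g. "draw a circle here" with concurrent pointing
    if parallel_use and not fused:
        return "concurrent"   # two independent tasks carried out in parallel
    if fused:
        return "alternate"    # "draw a circle there", followed by pointing
    return "exclusive"        # one task after the other, one modality at a time

print(classify_case(parallel_use=True, fused=True))    # synergistic
print(classify_case(parallel_use=False, fused=True))   # alternate
```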
CARE Properties
 Four properties to characterise and assess aspects of
multimodal interaction in terms of the combination of
modalities at the user level [Coutaz et al., 1995]
 Complementarity
 Assignment
 Redundancy
 Equivalence
Complementarity
 Multiple modalities have to be used within a temporal window in order to reach a given state
 No single modality on its own is sufficient to reach the
state
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Please show me this list”
and <pointing at ‘list of flights’ label>
Assignment
 Only one modality can be used to reach a given state
 Absence of choice
 Example
 Moving the mouse to change the position of a window
(considering that the mouse is the only modality available for that
operation)
Redundancy
 If multiple modalities have the same expressive power
(equivalent) and if they are used within the same
temporal window
 Repetitive behaviour without increasing the expressive
power
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Could you show me the list of flights?”
and <click on ‘list of flights’ button>
Equivalence
 It is necessary and sufficient to use any single one of the available modalities
 Choice between multiple modalities
 No temporal constraints
 Example
 “Could you show me the list of flights?”
or <click on ‘list of flights’ button>
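
The four CARE properties can be summarised in a small classification sketch; the sets of modalities passed in are hypothetical and the rules are a simplified reading of Coutaz et al., 1995.

```python
def care_properties(modalities_reaching_state, used_together):
    """Classify how a set of modalities relates to reaching a given state.
    `modalities_reaching_state`: modalities that on their own reach the state.
    `used_together`: modalities actually combined within the temporal window."""
    if len(modalities_reaching_state) == 0 and len(used_together) > 1:
        return "complementarity"  # no single modality suffices, only their combination
    if len(modalities_reaching_state) == 1 and not used_together:
        return "assignment"       # exactly one modality can be used, no choice
    if len(used_together) > 1 and used_together <= modalities_reaching_state:
        return "redundancy"       # equivalent modalities used within the same window
    if len(modalities_reaching_state) > 1:
        return "equivalence"      # any single modality suffices, the user chooses
    return "unclassified"

# "Show me the list of flights": speech or pen input each suffice on their own
print(care_properties({"speech", "pen"}, used_together=set()))               # equivalence
print(care_properties({"speech", "pen"}, used_together={"speech", "pen"}))   # redundancy
print(care_properties(set(), used_together={"speech", "gesture"}))           # complementarity
```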
Example for Equivalence: EdFest 2004
 EdFest 2004 prototype consisted of a digital pen and paper interface in combination with speech input and voice output [Belotti et al., 2005]
 Users had the choice to execute some commands either via pen input or via speech input
Multimodal Interaction Frameworks
 Software frameworks for the creation of multimodal
interfaces
 Squidy
 OpenInterface
 HephaisTK
 Mudra
 Three families of frameworks
 stream based
- Squidy, OpenInterface
 event based
- HephaisTK
 hybrid
- Mudra
Squidy
 Integrated tool with a GUI
 Interfaces are defined by pipelining components and filters
 Online knowledge base of
device components and
filters
 Semantic zooming
 e.g. zooming on a filter
shows a graph of the data
passing through the filter
http://www.squidy-lib.de
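
The pipelining of device components and filters can be illustrated with a generic dataflow sketch; this is not Squidy's actual API, just the underlying idea expressed with hypothetical node names.

```python
from typing import Callable, Iterable

Node = Callable[[Iterable[dict]], Iterable[dict]]  # a pipeline node transforms an event stream

def kinect_source(samples):
    """Device component: emits raw (hypothetical) hand position samples."""
    def run(_):
        yield from samples
    return run

def low_pass_filter(alpha=0.5):
    """Filter component: smooths jittery positions with an exponential average."""
    def run(events):
        last = None
        for event in events:
            last = event["x"] if last is None else alpha * event["x"] + (1 - alpha) * last
            yield {**event, "x": last}
    return run

def pipeline(*nodes: Node):
    """Chain device components and filters into one processing pipeline."""
    def run(events=()):
        for node in nodes:
            events = node(events)
        return events
    return run

samples = [{"x": 10.0}, {"x": 14.0}, {"x": 9.0}]
print(list(pipeline(kinect_source(samples), low_pass_filter())()))
# [{'x': 10.0}, {'x': 12.0}, {'x': 10.5}]
```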
OpenInterface Framework
 European research project
(2006-2009)
 Component-based
framework
 Online component library
with components for
numerous modalities and
devices
 Design an application by
pipelining components
http://www.openinterface.org
HephaisTK
 Software agent-based framework
 Focus on events
 Different fusion algorithms
 Description of the
dialogue via the SMUIML
language
http://www.hephaistk.org
UsiXML
 User Interface eXtended
Markup Language
 XML-based declarative User Interface Description Language (UIDL) to describe abstract interfaces
 can later be mapped to different physical characteristics (including multimodal user interfaces)
Stanciulescu et al., 2005
Mudra
 Fusion across different levels of abstraction
 unified fusion framework based on shared fact base
 Interactions defined via declarative rule-based language
 Rapid prototyping
 simple integration of new input devices
 integration of external gesture recognisers
Hoste et al., 2011
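
Mudra describes interactions via a declarative rule-based language over a shared fact base; the sketch below mimics that idea in plain Python (it is not Mudra's actual rule language): recognisers assert facts, and a rule fires when a spoken command and two pointing facts co-occur within a temporal window.

```python
class FactBase:
    """A shared fact base: recognisers assert timestamped facts and
    declarative rules are evaluated against the accumulated facts."""
    def __init__(self, rules):
        self.rules = rules
        self.facts = []

    def assert_fact(self, **fact):
        self.facts.append(fact)
        fired = [rule(fact, self.facts) for rule in self.rules]
        return [result for result in fired if result is not None]

def put_that_there_rule(new_fact, facts, window=1.0):
    """Fires when a spoken 'move' command co-occurs with two pointing facts
    within the given temporal window (in seconds)."""
    if new_fact.get("type") != "speech" or new_fact.get("value") != "move":
        return None
    points = [f for f in facts if f.get("type") == "point"
              and abs(f["t"] - new_fact["t"]) <= window]
    if len(points) < 2:
        return None
    return {"command": "move", "from": points[0]["pos"], "to": points[1]["pos"]}

fb = FactBase(rules=[put_that_there_rule])
fb.assert_fact(type="point", pos=(120, 80), t=0.4)
fb.assert_fact(type="point", pos=(400, 300), t=0.9)
print(fb.assert_fact(type="speech", value="move", t=1.0))
# [{'command': 'move', 'from': (120, 80), 'to': (400, 300)}]
```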
References
 R.A. Bolt, "Put-that-there": Voice and Gesture at the Graphics Interface, In Proceedings of SIGGRAPH 1980, 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, USA, July 1980
 http://dx.doi.org/10.1145/965105.807503
 L. Nigay and J. Coutaz, A Design Space for Multimodal Systems: Concurrent Processing and Data Fusion, In Proceedings of CHI 1993, 3rd International Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, April 1993
 http://dx.doi.org/10.1145/169059.169143
References …
 J. Coutaz et al., Four Easy Pieces for Assessing the Usability of Multimodal Interaction: The CARE Properties, In Proceedings of Interact 1995, 5th International Conference on Human-Computer Interaction, Lillehammer, Norway, June 1995
 http://dx.doi.org/10.1007/978-1-5041-2896-4_19
 B. Dumas, D. Lalanne and S. Oviatt, Multimodal Interfaces: A Survey of Principles, Models and Frameworks, Human Machine Interaction, LNCS 5440, 2009
 http://dx.doi.org/10.1007/978-3-642-00437-7_1
References …
 S. Oviatt, Ten Myths of Multimodal Interaction, Communications of the ACM 42(11), November 1999
 http://dx.doi.org/10.1145/319382.319398
 Put-that-there demo
 https://www.youtube.com/watch?v=RyBEUyEtxQo
 R. Belotti, C. Decurtins, M.C. Norrie, B. Signer and L. Vukelja, Experimental Platform for Mobile Information Systems, In Proceedings of MobiCom 2005, 11th Annual International Conference on Mobile Computing and Networking, Cologne, Germany, August 2005
 http://dx.doi.org/10.1145/1080829.1080856
References …
 L. Hoste and B. Signer, SpeeG2: A Speech- and Gesture-based Interface for Efficient Controller-free Text Entry, In Proceedings of ICMI 2013, 15th International Conference on Multimodal Interaction, Sydney, Australia, December 2013
 http://beatsigner.com/publications/hoste_ICMI2013.pdf
 L. Hoste, B. Dumas and B. Signer, Mudra: A Unified Multimodal Interaction Framework, In Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction, Alicante, Spain, November 2011
 http://beatsigner.com/publications/hoste_ICMI2011.pdf
References …
 L. Hoste, A Declarative Approach for Engineering Multimodal Interaction, PhD thesis, Vrije Universiteit Brussel (VUB), Brussels, Belgium, June 2015
 http://beatsigner.com/theses/PhdThesisLodeHoste.pdf
 A. Stanciulescu, Q. Limbourg, J. Vanderdonckt, B. Michotte and F. Montero, A Transformational Approach for Multimodal Web User Interfaces Based on UsiXML, In Proceedings of ICMI 2005, 7th International Conference on Multimodal Interfaces, Trento, Italy, October 2005
 http://dx.doi.org/10.1145/1088463.1088508
References …
 SpeeG2 demo
 https://www.youtube.com/watch?v=ItKySNv8l90
2 December 2005
Next Lecture
Pen-based Interaction

Contenu connexe

Tendances

Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Beat Signer
 
Introducing Tangible Holograms for Data Physicalisation and Big Data Exploration
Introducing Tangible Holograms for Data Physicalisation and Big Data ExplorationIntroducing Tangible Holograms for Data Physicalisation and Big Data Exploration
Introducing Tangible Holograms for Data Physicalisation and Big Data ExplorationBeat Signer
 
Towards a Framework for Dynamic Data Physicalisation
Towards a Framework for Dynamic Data PhysicalisationTowards a Framework for Dynamic Data Physicalisation
Towards a Framework for Dynamic Data PhysicalisationBeat Signer
 
Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Beat Signer
 
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...Beat Signer
 
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)Beat Signer
 
Cross-Media Information Systems - Quo Vadis?
Cross-Media Information Systems - Quo Vadis?Cross-Media Information Systems - Quo Vadis?
Cross-Media Information Systems - Quo Vadis?Beat Signer
 
Interactive Paper: Past, Present and Future
Interactive Paper: Past, Present and FutureInteractive Paper: Past, Present and Future
Interactive Paper: Past, Present and FutureBeat Signer
 
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...Beat Signer
 
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...Beat Signer
 
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...Beat Signer
 
Paper-Digital User Interfaces - Applications, Frameworks and Future Challenges
Paper-Digital User Interfaces - Applications, Frameworks and Future ChallengesPaper-Digital User Interfaces - Applications, Frameworks and Future Challenges
Paper-Digital User Interfaces - Applications, Frameworks and Future ChallengesBeat Signer
 
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)Beat Signer
 
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)Beat Signer
 
From PaperPoint to MindXpres - Towards Enhanced Presentation Tools
From PaperPoint to MindXpres - Towards Enhanced Presentation ToolsFrom PaperPoint to MindXpres - Towards Enhanced Presentation Tools
From PaperPoint to MindXpres - Towards Enhanced Presentation ToolsBeat Signer
 
Introduction - Web Technologies (1019888BNR)
Introduction - Web Technologies (1019888BNR)Introduction - Web Technologies (1019888BNR)
Introduction - Web Technologies (1019888BNR)Beat Signer
 
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...Beat Signer
 
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)Beat Signer
 
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)Future Trends - Lecture 12 - Web Information Systems (4011474FNR)
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)Beat Signer
 
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...Crossing Spaces: Towards Cross-Media Personal Information Management User Int...
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...Beat Signer
 

Tendances (20)

Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)
 
Introducing Tangible Holograms for Data Physicalisation and Big Data Exploration
Introducing Tangible Holograms for Data Physicalisation and Big Data ExplorationIntroducing Tangible Holograms for Data Physicalisation and Big Data Exploration
Introducing Tangible Holograms for Data Physicalisation and Big Data Exploration
 
Towards a Framework for Dynamic Data Physicalisation
Towards a Framework for Dynamic Data PhysicalisationTowards a Framework for Dynamic Data Physicalisation
Towards a Framework for Dynamic Data Physicalisation
 
Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)Cross-Media Information Spaces and Architectures (CISA)
Cross-Media Information Spaces and Architectures (CISA)
 
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (401816...
 
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)
Introduction - Lecture 1 - Advanced Topics in Information Systems (4016792ENR)
 
Cross-Media Information Systems - Quo Vadis?
Cross-Media Information Systems - Quo Vadis?Cross-Media Information Systems - Quo Vadis?
Cross-Media Information Systems - Quo Vadis?
 
Interactive Paper: Past, Present and Future
Interactive Paper: Past, Present and FutureInteractive Paper: Past, Present and Future
Interactive Paper: Past, Present and Future
 
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...
Requirements Analysis, Prototyping and Evaluation - Lecture 03 - Next Generat...
 
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...
Implicit Human-Computer Interaction - Lecture 11 - Next Generation User Inter...
 
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...
Gesture-based Interaction - Lecture 08 - Next Generation User Interfaces (401...
 
Paper-Digital User Interfaces - Applications, Frameworks and Future Challenges
Paper-Digital User Interfaces - Applications, Frameworks and Future ChallengesPaper-Digital User Interfaces - Applications, Frameworks and Future Challenges
Paper-Digital User Interfaces - Applications, Frameworks and Future Challenges
 
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)
Pen-based Interaction - Lecture 6 - Next Generation User Interfaces (4018166FNR)
 
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)
Interaction Design - Lecture 2 - Next Generation User Interfaces (4018166FNR)
 
From PaperPoint to MindXpres - Towards Enhanced Presentation Tools
From PaperPoint to MindXpres - Towards Enhanced Presentation ToolsFrom PaperPoint to MindXpres - Towards Enhanced Presentation Tools
From PaperPoint to MindXpres - Towards Enhanced Presentation Tools
 
Introduction - Web Technologies (1019888BNR)
Introduction - Web Technologies (1019888BNR)Introduction - Web Technologies (1019888BNR)
Introduction - Web Technologies (1019888BNR)
 
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...
Virtual and Augmented Reality - Lecture 10 - Next Generation User Interfaces ...
 
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)
Mobile Information Systems - Lecture 08 - Web Information Systems (4011474FNR)
 
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)Future Trends - Lecture 12 - Web Information Systems (4011474FNR)
Future Trends - Lecture 12 - Web Information Systems (4011474FNR)
 
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...Crossing Spaces: Towards Cross-Media Personal Information Management User Int...
Crossing Spaces: Towards Cross-Media Personal Information Management User Int...
 

Similaire à Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (4018166FNR)

Multimodal man machine interaction
Multimodal man machine interactionMultimodal man machine interaction
Multimodal man machine interactionDr. Rajesh P Barnwal
 
Modeling For Sustainability: Or How to Make Smart CPS Smarter?
Modeling For Sustainability: Or How to Make Smart CPS Smarter?Modeling For Sustainability: Or How to Make Smart CPS Smarter?
Modeling For Sustainability: Or How to Make Smart CPS Smarter?Benoit Combemale
 
Caaa07 Presentation February Final
Caaa07 Presentation February FinalCaaa07 Presentation February Final
Caaa07 Presentation February Finalpbihler
 
Deep Context-Awareness: Context Coupling and New Types of Context Information...
Deep Context-Awareness: Context Coupling and New Types of Context Information...Deep Context-Awareness: Context Coupling and New Types of Context Information...
Deep Context-Awareness: Context Coupling and New Types of Context Information...Hong-Linh Truong
 
Context-Aware Computing
Context-Aware ComputingContext-Aware Computing
Context-Aware Computinglogus2k
 
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014Journal For Research
 
To the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesTo the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesJean Vanderdonckt
 
The Recurated Museum: IV. Collections Management & Sustainability
The Recurated Museum: IV. Collections Management & SustainabilityThe Recurated Museum: IV. Collections Management & Sustainability
The Recurated Museum: IV. Collections Management & SustainabilityChristopher Morse
 
Chapter 4 universal design
Chapter 4 universal designChapter 4 universal design
Chapter 4 universal designPado Pado
 
Experience-Interface-Interaction.pdf
Experience-Interface-Interaction.pdfExperience-Interface-Interaction.pdf
Experience-Interface-Interaction.pdfHasseyWijetunge
 
Multimodal and Affective Human Computer Interaction - Abhinav Sharma
Multimodal and Affective Human Computer Interaction - Abhinav SharmaMultimodal and Affective Human Computer Interaction - Abhinav Sharma
Multimodal and Affective Human Computer Interaction - Abhinav SharmaAbhinav Sharma
 
Mobile speech and advanced natural language solutions
Mobile speech and advanced natural language solutionsMobile speech and advanced natural language solutions
Mobile speech and advanced natural language solutionsSpringer
 
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...IJERA Editor
 
Automated Media Player using Hand Gesture
Automated Media Player using Hand GestureAutomated Media Player using Hand Gesture
Automated Media Player using Hand GestureIRJET Journal
 
Jason Brush: Filmmaking As User Experience Design, SXSW 2013
Jason Brush: Filmmaking As User Experience Design, SXSW 2013Jason Brush: Filmmaking As User Experience Design, SXSW 2013
Jason Brush: Filmmaking As User Experience Design, SXSW 2013jasonbrush
 
Ambient Intelligence Ambient Intelligence
Ambient Intelligence Ambient IntelligenceAmbient Intelligence Ambient Intelligence
Ambient Intelligence Ambient IntelligenceBRNSSPublicationHubI
 

Similaire à Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (4018166FNR) (20)

Multimodal man machine interaction
Multimodal man machine interactionMultimodal man machine interaction
Multimodal man machine interaction
 
Modeling For Sustainability: Or How to Make Smart CPS Smarter?
Modeling For Sustainability: Or How to Make Smart CPS Smarter?Modeling For Sustainability: Or How to Make Smart CPS Smarter?
Modeling For Sustainability: Or How to Make Smart CPS Smarter?
 
Caaa07 Presentation February Final
Caaa07 Presentation February FinalCaaa07 Presentation February Final
Caaa07 Presentation February Final
 
Multimodal Interaction
Multimodal InteractionMultimodal Interaction
Multimodal Interaction
 
Deep Context-Awareness: Context Coupling and New Types of Context Information...
Deep Context-Awareness: Context Coupling and New Types of Context Information...Deep Context-Awareness: Context Coupling and New Types of Context Information...
Deep Context-Awareness: Context Coupling and New Types of Context Information...
 
Context-Aware Computing
Context-Aware ComputingContext-Aware Computing
Context-Aware Computing
 
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014
HCI BASED APPLICATION FOR PLAYING COMPUTER GAMES | J4RV4I1014
 
To the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User InterfacesTo the end of our possibilities with Adaptive User Interfaces
To the end of our possibilities with Adaptive User Interfaces
 
The Recurated Museum: IV. Collections Management & Sustainability
The Recurated Museum: IV. Collections Management & SustainabilityThe Recurated Museum: IV. Collections Management & Sustainability
The Recurated Museum: IV. Collections Management & Sustainability
 
2 4-10
2 4-102 4-10
2 4-10
 
C0353018026
C0353018026C0353018026
C0353018026
 
Chapter 4 universal design
Chapter 4 universal designChapter 4 universal design
Chapter 4 universal design
 
Experience-Interface-Interaction.pdf
Experience-Interface-Interaction.pdfExperience-Interface-Interaction.pdf
Experience-Interface-Interaction.pdf
 
Multimodal and Affective Human Computer Interaction - Abhinav Sharma
Multimodal and Affective Human Computer Interaction - Abhinav SharmaMultimodal and Affective Human Computer Interaction - Abhinav Sharma
Multimodal and Affective Human Computer Interaction - Abhinav Sharma
 
Mobile speech and advanced natural language solutions
Mobile speech and advanced natural language solutionsMobile speech and advanced natural language solutions
Mobile speech and advanced natural language solutions
 
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...
Extending Aural and Music Interfaces to Mobile Device Interaction Design Arch...
 
Inredis And Machine Learning Nips
Inredis And Machine Learning NipsInredis And Machine Learning Nips
Inredis And Machine Learning Nips
 
Automated Media Player using Hand Gesture
Automated Media Player using Hand GestureAutomated Media Player using Hand Gesture
Automated Media Player using Hand Gesture
 
Jason Brush: Filmmaking As User Experience Design, SXSW 2013
Jason Brush: Filmmaking As User Experience Design, SXSW 2013Jason Brush: Filmmaking As User Experience Design, SXSW 2013
Jason Brush: Filmmaking As User Experience Design, SXSW 2013
 
Ambient Intelligence Ambient Intelligence
Ambient Intelligence Ambient IntelligenceAmbient Intelligence Ambient Intelligence
Ambient Intelligence Ambient Intelligence
 

Plus de Beat Signer

Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)
Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)
Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)Beat Signer
 
Indoor Positioning Using the OpenHPS Framework
Indoor Positioning Using the OpenHPS FrameworkIndoor Positioning Using the OpenHPS Framework
Indoor Positioning Using the OpenHPS FrameworkBeat Signer
 
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...Beat Signer
 
Cross-Media Technologies and Applications - Future Directions for Personal In...
Cross-Media Technologies and Applications - Future Directions for Personal In...Cross-Media Technologies and Applications - Future Directions for Personal In...
Cross-Media Technologies and Applications - Future Directions for Personal In...Beat Signer
 
Bridging the Gap: Managing and Interacting with Information Across Media Boun...
Bridging the Gap: Managing and Interacting with Information Across Media Boun...Bridging the Gap: Managing and Interacting with Information Across Media Boun...
Bridging the Gap: Managing and Interacting with Information Across Media Boun...Beat Signer
 
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming Curricula
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming CurriculaCodeschool in a Box: A Low-Barrier Approach to Packaging Programming Curricula
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming CurriculaBeat Signer
 
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions Beat Signer
 
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...Beat Signer
 
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)Dashboards - Lecture 11 - Information Visualisation (4019538FNR)
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)Beat Signer
 
Interaction - Lecture 10 - Information Visualisation (4019538FNR)
Interaction - Lecture 10 - Information Visualisation (4019538FNR)Interaction - Lecture 10 - Information Visualisation (4019538FNR)
Interaction - Lecture 10 - Information Visualisation (4019538FNR)Beat Signer
 
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...Beat Signer
 
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)Beat Signer
 
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...Beat Signer
 
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...Beat Signer
 
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)Beat Signer
 
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)Beat Signer
 
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)Data Representation - Lecture 3 - Information Visualisation (4019538FNR)
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)Beat Signer
 
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...Beat Signer
 
Introduction - Lecture 1 - Information Visualisation (4019538FNR)
Introduction - Lecture 1 - Information Visualisation (4019538FNR)Introduction - Lecture 1 - Information Visualisation (4019538FNR)
Introduction - Lecture 1 - Information Visualisation (4019538FNR)Beat Signer
 
Cross-Media Document Linking and Navigation
Cross-Media Document Linking and NavigationCross-Media Document Linking and Navigation
Cross-Media Document Linking and NavigationBeat Signer
 

Plus de Beat Signer (20)

Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)
Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)
Introduction - Lecture 1 - Human-Computer Interaction (1023841ANR)
 
Indoor Positioning Using the OpenHPS Framework
Indoor Positioning Using the OpenHPS FrameworkIndoor Positioning Using the OpenHPS Framework
Indoor Positioning Using the OpenHPS Framework
 
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...
Personalised Learning Environments Based on Knowledge Graphs and the Zone of ...
 
Cross-Media Technologies and Applications - Future Directions for Personal In...
Cross-Media Technologies and Applications - Future Directions for Personal In...Cross-Media Technologies and Applications - Future Directions for Personal In...
Cross-Media Technologies and Applications - Future Directions for Personal In...
 
Bridging the Gap: Managing and Interacting with Information Across Media Boun...
Bridging the Gap: Managing and Interacting with Information Across Media Boun...Bridging the Gap: Managing and Interacting with Information Across Media Boun...
Bridging the Gap: Managing and Interacting with Information Across Media Boun...
 
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming Curricula
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming CurriculaCodeschool in a Box: A Low-Barrier Approach to Packaging Programming Curricula
Codeschool in a Box: A Low-Barrier Approach to Packaging Programming Curricula
 
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions
The RSL Hypermedia Metamodel and Its Application in Cross-Media Solutions
 
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...
Case Studies and Course Review - Lecture 12 - Information Visualisation (4019...
 
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)Dashboards - Lecture 11 - Information Visualisation (4019538FNR)
Dashboards - Lecture 11 - Information Visualisation (4019538FNR)
 
Interaction - Lecture 10 - Information Visualisation (4019538FNR)
Interaction - Lecture 10 - Information Visualisation (4019538FNR)Interaction - Lecture 10 - Information Visualisation (4019538FNR)
Interaction - Lecture 10 - Information Visualisation (4019538FNR)
 
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...
View Manipulation and Reduction - Lecture 9 - Information Visualisation (4019...
 
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)
Visualisation Techniques - Lecture 8 - Information Visualisation (4019538FNR)
 
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...
Design Guidelines and Principles - Lecture 7 - Information Visualisation (401...
 
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...
Data Processing and Visualisation Frameworks - Lecture 6 - Information Visual...
 
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)
Data Presentation - Lecture 5 - Information Visualisation (4019538FNR)
 
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)
Analysis and Validation - Lecture 4 - Information Visualisation (4019538FNR)
 
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)Data Representation - Lecture 3 - Information Visualisation (4019538FNR)
Data Representation - Lecture 3 - Information Visualisation (4019538FNR)
 
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...
Human Perception and Colour Theory - Lecture 2 - Information Visualisation (4...
 
Introduction - Lecture 1 - Information Visualisation (4019538FNR)
Introduction - Lecture 1 - Information Visualisation (4019538FNR)Introduction - Lecture 1 - Information Visualisation (4019538FNR)
Introduction - Lecture 1 - Information Visualisation (4019538FNR)
 
Cross-Media Document Linking and Navigation
Cross-Media Document Linking and NavigationCross-Media Document Linking and Navigation
Cross-Media Document Linking and Navigation
 

Dernier

PROCESS RECORDING FORMAT.docx
PROCESS      RECORDING        FORMAT.docxPROCESS      RECORDING        FORMAT.docx
PROCESS RECORDING FORMAT.docxPoojaSen20
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxnegromaestrong
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhikauryashika82
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
An Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfAn Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfSanaAli374401
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Disha Kariya
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...christianmathematics
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphThiyagu K
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxVishalSingh1417
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (4018166FNR)

  • 13. Multimodal Fusion …
 Data-level fusion
 focuses on the fusion of identical or tightly linked types of multimodal data
- e.g. two video streams capturing the same scene from different angles
- low semantics and highly sensitive to noise (no preprocessing)
 Feature-level fusion
 fusion of features extracted from the data
- e.g. speech and lip movements
 Decision-level fusion
 focuses on interpretation based on semantic data
- e.g. Bolt's "Put-that-there" (speech and gesture)
 fusion of high-level information derived from data- and feature-level fusion
- highly resistant to noise and failures
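To make the decision-level case concrete, the following minimal sketch fuses timestamped semantic events from a speech and a gesture recogniser in the spirit of "Put-that-there". The event structure, the 1.5-second window and the slot names are illustrative assumptions, not the design of any particular system.

```python
from dataclasses import dataclass

@dataclass
class Event:
    modality: str   # "speech" or "gesture"
    payload: dict   # recogniser output at the semantic (decision) level
    t: float        # timestamp in seconds

WINDOW = 1.5        # assumed temporal window for fusion (seconds)

def fuse(speech, gestures):
    """Bind the deictic slots ('that', 'there') of a spoken command to the
    pointing events that fall inside the temporal window, in time order."""
    command = dict(speech.payload)
    nearby = sorted((g for g in gestures if abs(g.t - speech.t) <= WINDOW),
                    key=lambda g: g.t)
    for slot in ("object", "target"):
        if command.get(slot) in ("that", "there"):
            if not nearby:
                return None                      # unresolved deixis, no fusion
            command[slot] = nearby.pop(0).payload["location"]
    return command

speech = Event("speech", {"action": "move", "object": "that", "target": "there"}, 10.0)
points = [Event("gesture", {"location": (120, 80)}, 9.8),
          Event("gesture", {"location": (400, 300)}, 10.6)]
print(fuse(speech, points))
# {'action': 'move', 'object': (120, 80), 'target': (400, 300)}
```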
  • 14. Comparison of Fusion Levels
 Data-level fusion
- used for: raw data of the same modality
- level of detail: highest level of detail
- sensitivity to noise and failures: highly sensitive
- usage: not frequently used for multimodal interfaces
 Feature-level fusion
- used for: closely coupled modalities
- level of detail: moderate level of detail
- sensitivity to noise and failures: moderately sensitive
- usage: used to improve the recognition rate
 Decision-level fusion
- used for: loosely coupled modalities
- level of detail: difficult to recover data that has been fused on the data and feature level
- sensitivity to noise and failures: not very sensitive
- usage: most widely used level of fusion for multimodal interfaces
  • 15. Advantages of Multimodal Input
 Support and accommodate a user's perceptual and communicative capabilities
 natural user interfaces with new ways of engaging interaction
 Integration of the computational skills of computers in the real world by offering more natural ways of human-machine interaction
 Enhanced robustness due to the combination of different (partial) information sources
 Flexible personalisation by using different modalities based on user preferences and context
 Helps visually or physically impaired users
  • 16. Multimodal Fission
 Forms a comprehensible answer for the user by making use of multiple output modalities based on the device and context
 builds an abstract message through a combination of channels
 Three main steps
 selection and structuring of content
 selection of modalities
 output coordination
 Coordination of the output on each channel is necessary to form a coherent message
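A rough sketch of these three fission steps is given below: an abstract answer is structured into parts, each part is assigned to an output channel based on a simple device profile, and the channels are ordered into one coordinated presentation. The message format, device profile and channel names are assumptions made for this example only.

```python
def select_content(answer):
    """Step 1: structure the abstract message into presentable parts."""
    return [{"part": "summary", "text": answer["summary"]},
            {"part": "details", "text": answer["details"]}]

def select_modalities(parts, device):
    """Step 2: pick an output channel per part, based on the device/context."""
    plan = []
    for part in parts:
        if device["screen"] and part["part"] == "details":
            plan.append(("display", part))       # long content goes to the screen
        elif device["speaker"]:
            plan.append(("speech", part))        # short content is spoken
        else:
            plan.append(("display", part))
    return plan

def coordinate(plan):
    """Step 3: order and synchronise the channels into one coherent message."""
    ordered = sorted(plan, key=lambda p: 0 if p[0] == "speech" else 1)
    return [{"t": i * 0.5, "channel": channel, "content": part["text"]}
            for i, (channel, part) in enumerate(ordered)]

answer = {"summary": "Three flights found.", "details": "BRU-EDI at 08:10, 12:40 and 18:05"}
device = {"screen": True, "speaker": True}
print(coordinate(select_modalities(select_content(answer), device)))
```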
  • 17. Ten Myths of Multimodal Interaction
1. If you build a multimodal system, users will interact multimodally
 users show a strong preference to interact multimodally but this does not mean that they always interact multimodally
 unimodal and multimodal interactions are mixed depending on the task to be carried out
2. Speech and pointing is the dominant multimodal integration pattern
 heritage from Bolt's "Put-that-there" where gesture and speech are used to select an object (mouse-oriented metaphor)
 modalities can be used for more than just object selection
- written input (e.g. SpeeG2), facial expressions, ...
Sharon Oviatt
  • 18. Ten Myths of Multimodal Interaction ...
3. Multimodal input involves simultaneous signals
 signals are often not overlapping
 users frequently introduce (consciously or not) a small delay between two modal inputs
 developers should not rely on an overlap for the fusion process
4. Speech is the primary input mode in any multimodal system that includes it
 in human-human communication speech is indeed the primary input mode
 in human-machine communication speech is not the exclusive carrier of important information and also does not have temporal precedence over other modalities
  • 19. Ten Myths of Multimodal Interaction ...
5. Multimodal language does not differ linguistically from unimodal language
 different modalities complement each other
- e.g. avoid error-prone voice descriptions of spatial locations by pointing
 in many respects multimodal language is substantially simplified
- might lead to more robust systems
6. Multimodal integration involves redundancy of content between modes
 early research in multimodal interaction assumed that different redundant modalities could improve the recognition
 complementary modalities are more important and designers should not rely on duplicated content
  • 20. Ten Myths of Multimodal Interaction ...
7. Individual error-prone recognition technologies combine multimodally to produce even greater unreliability
 assumes that using two error-prone input modes such as speech and handwriting recognition results in greater unreliability
 however, increased robustness due to mutual disambiguation between modalities can be observed
8. All users' multimodal commands are integrated in a uniform way
 different users use different integration patterns (e.g. parallel vs. sequential)
 a system that can detect and adapt to a user's dominant integration pattern might lead to increased recognition rates
  • 21. Ten Myths of Multimodal Interaction ...
9. Different input modes are capable of transmitting comparable content
 some modalities can be compared (e.g. speech and writing)
 however, each modality has its own very distinctive gestures
- e.g. gaze vs. a pointing device such as the Wiimote
10. Enhanced efficiency is the main advantage of multimodal systems
 often there is only a minimal increase in efficiency
 however, many other advantages
- fewer errors, enhanced usability, flexibility, mutual disambiguation, …
  • 22. Formal Models for Combining Modalities
 Formalisation of multimodal human-machine interaction
 Conceptualise the different possible relationships between input and output modalities
 Two conceptual spaces
 CASE: use and combination of modalities on the system side (fusion engine)
 CARE: possible combination of modalities at the user level
  • 23. CASE Model
 3 dimensions in the CASE design space
 levels of abstraction
 use of modalities
 fusion
Nigay and Coutaz, 1993
  • 24. CASE Model: Levels of Abstraction
 Data received from a device can be processed at multiple levels of abstraction
 Speech analysis example
 signal (data) level
 phonetic (feature) level
 semantic level
 A multimodal system is always classified under the meaning category
Nigay and Coutaz, 1993
  • 25. CASE Model: Use of Modalities
 The use of modalities expresses the temporal availability of multiple modalities
 parallel use allows the user to employ multiple modalities simultaneously
 sequential use forces the user to use the modalities one after another
Nigay and Coutaz, 1993
  • 26. CASE Model: Fusion
 Possible combination of different types of data
 combined means that there is fusion
 independent means that there is an absence of fusion
Nigay and Coutaz, 1993
  • 27. CASE Model: Classification
 Concurrent
 two distinctive tasks executed in parallel
 e.g. draw a circle and draw a square next to it
 Alternate
 task with temporal alternation of modalities
 e.g. "Draw a circle there" followed by pointing to the location
Nigay and Coutaz, 1993
  • 28. CASE Model: Classification
 Synergistic
 task using several coreferent modalities in parallel
 e.g. "Draw a circle here" with concurrent pointing to the location
 synergistic multimodal systems are the ultimate goal
 Exclusive
 one task after the other with one modality at a time (no coreference)
Nigay and Coutaz, 1993
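The four CASE classes can be read as a two-by-two decision over the "use of modalities" (parallel vs. sequential) and "fusion" (combined vs. independent) dimensions. The tiny sketch below encodes exactly this mapping; the function name and boolean encoding are purely illustrative.

```python
def case_class(parallel_use: bool, fused: bool) -> str:
    """Classify an interaction in the CASE design space (Nigay and Coutaz, 1993)."""
    if parallel_use and fused:
        return "synergistic"   # "Draw a circle here" with concurrent pointing
    if parallel_use and not fused:
        return "concurrent"    # two independent tasks carried out in parallel
    if not parallel_use and fused:
        return "alternate"     # "Draw a circle there" followed by pointing
    return "exclusive"         # one task after the other, one modality at a time

print(case_class(parallel_use=True, fused=True))   # synergistic
```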
  • 29. CARE Properties
 Four properties to characterise and assess aspects of multimodal interaction in terms of the combination of modalities at the user level [Coutaz et al., 1995]
 Complementarity
 Assignment
 Redundancy
 Equivalence
  • 30. Complementarity
 Multiple modalities are to be used within a temporal window in order to reach a given state
 No single modality on its own is sufficient to reach the state
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Please show me this list” and <pointing at ‘list of flights’ label>
  • 31. Assignment
 Only one modality can be used to reach a given state
 Absence of choice
 Example
 Moving the mouse to change the position of a window (considering that the mouse is the only modality available for that operation)
  • 32. Redundancy
 If multiple modalities have the same expressive power (equivalent) and if they are used within the same temporal window
 Repetitive behaviour without increasing the expressive power
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Could you show me the list of flights?” and <click on ‘list of flights’ button>
  • 33. Equivalence
 It is necessary and sufficient to use any single one of the available modalities
 Choice between multiple modalities
 No temporal constraints
 Example
 “Could you show me the list of flights?” or <click on ‘list of flights’ button>
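One way to make the four CARE properties tangible is to check them against a table recording which modality combinations can reach a given state within one temporal window. The sketch below does this for the "list of flights" example; the reach table and all names are illustrative assumptions rather than the formal definitions of Coutaz et al.

```python
MODALITIES = {"speech", "pen"}

# Which sets of modalities are sufficient to reach the goal state
# "list of flights is shown" in this toy system (assumed data).
REACHES = {
    frozenset({"speech"}),          # "Could you show me the list of flights?"
    frozenset({"pen"}),             # tap on the 'list of flights' button
    frozenset({"speech", "pen"}),   # both together within one temporal window
}

def equivalence(reaches=REACHES, modalities=MODALITIES):
    """Each single modality is sufficient on its own (free choice)."""
    return all(frozenset({m}) in reaches for m in modalities)

def assignment(reaches=REACHES, modalities=MODALITIES):
    """Exactly one modality can reach the state (absence of choice)."""
    return sum(frozenset({m}) in reaches for m in modalities) == 1

def redundancy(reaches=REACHES, modalities=MODALITIES):
    """Equivalent modalities are used together within the same window."""
    return equivalence(reaches, modalities) and frozenset(modalities) in reaches

def complementarity(reaches=REACHES, modalities=MODALITIES):
    """No single modality suffices, but their combination does."""
    return (not any(frozenset({m}) in reaches for m in modalities)
            and frozenset(modalities) in reaches)

print(equivalence(), assignment(), redundancy(), complementarity())
# True False True False for this flight-list example
```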
  • 34. Example for Equivalence: EdFest 2004
 The EdFest 2004 prototype consisted of a digital pen and paper interface in combination with speech input and voice output [Belotti et al., 2005]
 The user had a choice to execute some commands either via pen input or through speech input
  • 35. Multimodal Interaction Frameworks
 Software frameworks for the creation of multimodal interfaces
 Squidy
 OpenInterface
 HephaisTK
 Mudra
 Three families of frameworks
 stream based
- Squidy, OpenInterface
 event based
- HephaisTK
 hybrid
- Mudra
  • 36. Squidy
 Integrated tool with a GUI
 Interfaces are defined by pipelining components and filters
 Online knowledge base of device components and filters
 Semantic zooming
 e.g. zooming on a filter shows a graph of the data passing through the filter
http://www.squidy-lib.de
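The pipelining idea can be sketched generically as input nodes pushing data items through a chain of filters, each of which may transform or drop an item. The classes and filters below are a simplified illustration of this stream-based style and not Squidy's actual (Java-based) API.

```python
class Pipeline:
    """A chain of stages; each stage is a callable that returns the
    (possibly transformed) item, or None to drop it from the stream."""
    def __init__(self, *stages):
        self.stages = stages

    def push(self, item):
        for stage in self.stages:
            item = stage(item)
            if item is None:
                return None
        return item

def smooth(item):
    """Placeholder smoothing filter (stands in for e.g. a Kalman filter node)."""
    item["xy"] = tuple(round(v, 1) for v in item["xy"])
    return item

def inside_screen(item):
    """Drop pointer samples that fall outside the assumed 1920x1080 display."""
    x, y = item["xy"]
    return item if 0 <= x <= 1920 and 0 <= y <= 1080 else None

pointer_pipeline = Pipeline(smooth, inside_screen)
print(pointer_pipeline.push({"device": "hand-tracker", "xy": (312.46, 88.91)}))
# {'device': 'hand-tracker', 'xy': (312.5, 88.9)}
```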
  • 37. OpenInterface Framework
 European research project (2006-2009)
 Component-based framework
 Online component library with components for numerous modalities and devices
 Design an application by pipelining components
http://www.openinterface.org
  • 38. HephaisTK
 Software agent-based framework
 Focus on events
 Different fusion algorithms
 Description of the dialogue via the SMUIML language
http://www.hephaistk.org
  • 39. UsiXML
 User Interface eXtended Markup Language
 XML-based declarative User Interface Description Language (UIDL) to describe abstract interfaces
 can later be mapped to different physical characteristics (including multimodal user interfaces)
Stanciulescu et al., 2005
  • 40. Mudra
 Fusion across different levels of abstraction
 unified fusion framework based on a shared fact base
 Interactions defined via a declarative rule-based language
 Rapid prototyping
 simple integration of new input devices
 integration of external gesture recognisers
Hoste et al., 2011
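In the same spirit, declarative fusion over a shared fact base can be sketched as rules that match timestamped facts asserted by different recognisers. The fact format, the rule and the matching code below are a toy illustration of the idea, not Mudra's actual rule language or API.

```python
FACTS = []   # shared fact base into which recognisers assert timestamped facts

def assert_fact(kind, t, **attrs):
    FACTS.append({"kind": kind, "t": t, **attrs})

def rule_put_that_there(window=1.0):
    """Fire when a 'move' speech fact co-occurs with two pointing facts
    inside the temporal window (assumed to be one second)."""
    speech = [f for f in FACTS if f["kind"] == "speech" and f["verb"] == "move"]
    points = sorted((f for f in FACTS if f["kind"] == "point"), key=lambda f: f["t"])
    for s in speech:
        near = [p for p in points if abs(p["t"] - s["t"]) <= window]
        if len(near) >= 2:
            return {"action": "move", "from": near[0]["xy"], "to": near[1]["xy"]}
    return None

assert_fact("point", 0.2, xy=(10, 20))
assert_fact("speech", 0.5, verb="move")
assert_fact("point", 0.9, xy=(200, 150))
print(rule_put_that_there())
# {'action': 'move', 'from': (10, 20), 'to': (200, 150)}
```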
  • 41. References
 R.A. Bolt, “Put-that-there”: Voice and Gesture at the Graphics Interface, Proceedings of SIGGRAPH 1980, 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, USA, July 1980
- http://dx.doi.org/10.1145/965105.807503
 L. Nigay and J. Coutaz, A Design Space for Multimodal Systems: Concurrent Processing and Data Fusion, Proceedings of CHI 1993, 3rd International Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, April 1993
- http://dx.doi.org/10.1145/169059.169143
  • 42. References …
 J. Coutaz et al., Four Easy Pieces for Assessing the Usability of Multimodal Interaction: The CARE Properties, Proceedings of Interact 1995, 5th International Conference on Human-Computer Interaction, Lillehammer, Norway, June 1995
- http://dx.doi.org/10.1007/978-1-5041-2896-4_19
 B. Dumas, D. Lalanne and S. Oviatt, Multimodal Interfaces: A Survey of Principles, Models and Frameworks, Human Machine Interaction, LNCS 5440, 2009
- http://dx.doi.org/10.1007/978-3-642-00437-7_1
  • 43. References …
 S. Oviatt, Ten Myths of Multimodal Interaction, Communications of the ACM 42(11), November 1999
- http://dx.doi.org/10.1145/319382.319398
 Put-that-there demo
- https://www.youtube.com/watch?v=RyBEUyEtxQo
 R. Belotti, C. Decurtins, M.C. Norrie, B. Signer and L. Vukelja, Experimental Platform for Mobile Information Systems, Proceedings of MobiCom 2005, 11th Annual International Conference on Mobile Computing and Networking, Cologne, Germany, August 2005
- http://dx.doi.org/10.1145/1080829.1080856
  • 44. References …
 L. Hoste and B. Signer, SpeeG2: A Speech- and Gesture-based Interface for Efficient Controller-free Text Entry, Proceedings of ICMI 2013, 15th International Conference on Multimodal Interaction, Sydney, Australia, December 2013
- http://beatsigner.com/publications/hoste_ICMI2013.pdf
 L. Hoste, B. Dumas and B. Signer, Mudra: A Unified Multimodal Interaction Framework, Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction, Alicante, Spain, November 2011
- http://beatsigner.com/publications/hoste_ICMI2011.pdf
  • 45. References …
 L. Hoste, A Declarative Approach for Engineering Multimodal Interaction, PhD thesis, Vrije Universiteit Brussel (VUB), Brussels, Belgium, June 2015
- http://beatsigner.com/theses/PhdThesisLodeHoste.pdf
 A. Stanciulescu, Q. Limbourg, J. Vanderdonckt, B. Michotte and F. Montero, A Transformational Approach for Multimodal Web User Interfaces Based on UsiXML, Proceedings of ICMI 2005, 7th International Conference on Multimodal Interfaces, Trento, Italy, October 2005
- http://dx.doi.org/10.1145/1088463.1088508
  • 46. References …
 SpeeG2 demo
- https://www.youtube.com/watch?v=ItKySNv8l90
  • 47. Next Lecture: Pen-based Interaction (2 December 2005)