Multimodal Detection of Affective States:
A Roadmap Through Diverse Technologies
Javier Gonzalez-Sanchez, Maria-Elena Chavez-Echeagaray
Robert Atkinson, Winslow Burleson
School of Computing, Informatics, and Decision Systems Engineering
Arizona State University
Tempe, Arizona, USA
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or
distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work
must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s).
CHI 2014, April 26–May 1, 2014, Toronto, Ontario, Canada. ACM 14/04
Advancing Next Generation Learning Environments Lab
&
Motivational Environments Group
School of Computing, Informatics, and Decision Systems Engineering
Arizona State University
About Us
This work was supported by the Office of Naval Research under Grant N00014-10-1-0143.
Robert Atkinson
Dr. Robert Atkinson is an Associate Professor in the Ira A. Fulton Schools of Engineering and in the Mary Lou Fulton Teachers College. His research explores the intersection of cognitive science, informatics, instructional design, and educational technology.
His scholarship involves designing instructional materials, including books and computer-based learning environments, informed by our understanding of human cognitive architecture and how to leverage its unique constraints and affordances.
His current research focuses on the study of engagement and flow in games.
Principal Investigator
Winslow Burleson
Dr. Winslow Burleson received his PhD from the MIT Media Lab, Affective Computing Group.
He joined ASU's School of Computing and Informatics and the Arts, Media, and Engineering graduate program in 2006. He has worked with the Entrepreneurial Management Unit at the Harvard Business School, Deutsche Telekom Laboratories, the SETI Institute, and IBM's Almaden Research Center, where he was awarded ten patents.
He holds an MSE from Stanford University's Mechanical Engineering Product Design Program and a BA in Biophysics from Rice University. He has been a co-Principal Investigator on the Hubble Space Telescope's Investigation of Binary Asteroids and a consultant to UNICEF and the World Scout Bureau.
Principal Investigator
Javier Gonzalez-Sanchez
Javier is a doctoral candidate in Computer Science at Arizona State University. His research interests are affect-driven adaptation, software architecture, affective computing, and educational technology.
He holds an MS in Computer Science from the Center for Research and Advanced Studies of the National Polytechnic Institute and a BS in Computer Engineering from the University of Guadalajara.
His experience includes 12+ years as a software engineer, 9+ years teaching undergraduate courses at Tecnologico de Monterrey, and graduate courses at Universidad de Guadalajara.
Doctoral Candidate
Helen Chavez-Echeagaray
Helen is a doctoral candidate in Computer Science at Arizona State University. Her interests are in the areas of affective computing, educational technology (including robotics), and learning processes.
She holds an MS in Computer Science and a BS in Computer Systems Engineering from Tecnologico de Monterrey, Guadalajara campus.
Before starting her PhD program she was a faculty member for 8+ years at Tecnologico de Monterrey. Her experience includes an administrative position and software development.
Doctoral Candidate
Outline of the Course
Preface
Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray
Context
Justification
A novel aspect of planning and designing the interaction between people and computers, related to securing user satisfaction, is the capability of systems to adapt to their individual users by showing empathy.

Being empathetic implies that the computer is able to recognize a user's affective states and to understand the implications of those states.

Detecting affective states is a step toward providing machines with the intelligence needed to identify and understand human emotions and then interact with humans appropriately. It is therefore necessary to equip computers with hardware and software that enable them to perceive users' affective states and use that understanding to create more harmonious interactions.
Motivation
This course provides a description and demonstration of the tools and methodologies necessary for automatically detecting affective states.

Automatic detection of affective states requires the computer:

(1) to gather information that is complex and diverse;

(2) to process and understand information by integrating several sources (senses), ranging from brain-wave signals and biofeedback readings to face-based emotion recognition, gesture recognition, and posture and pressure sensing; and

(3) once the data is integrated, to apply perception algorithms and data-processing tools to understand the user's state.

During the course we will review case examples (datasets) obtained from previous research studies.
Objectives
Attendees of this course will:

Learn about sensing devices used to detect affective states, including brain-computer interfaces, face-based emotion recognition systems, eye-tracking systems, and physiological sensors.

Understand the pros and cons of the sensing devices used to detect affective states.

Learn about the data gathered from each sensing device and understand its characteristics. Learn what it takes to manage (pre-process and synchronize) affective data.

Learn about approaches and algorithms used to analyze affective data and how it can be used to drive computer functionality or behavior.
Schedule | Session 1
1. Introduction to Affective Human-Computer Interfaces (20 minutes)

2. Sensing Devices (60 minutes)

2.1. Brain-computer interfaces
2.2. Face-based emotion recognition systems

Break

2.3. Eye-tracking systems
2.4. Physiological sensors (skin conductance, pressure, posture)
Schedule | Session 2
3. Data Filtering and Integration (20 minutes)

3.1. Gathered data
3.2. Data synchronization

4. Analyzing Data (20 minutes)

4.1. Regression
4.2. Clustering and Classification

Break

5. Sharing Experience and Group Discussion (20 minutes)

6. Conclusions and Q&A (20 minutes)
Session I
Sensors and Data
I. Introduction
Concepts
Multimodal Detection of Affective States: instinctual reactions to stimulation; feelings and emotions ("How do you feel?"); physiological and physical manifestations.
Concepts
Until recently, much of the affective data gathered by systems relied heavily on learners' self-reports of their affective state, on observation, or on software data logs [1].

Many systems have now started to include data from the physical manifestations of affective states through the use of sensing devices, applying novel machine learning and data mining algorithms to deal with the vast amounts of data the sensors generate [2][3].
[1] R. S. J. Baker, M. M. T. Rodrigo, and U. E. Xolocotzin, "The Dynamics of Affective Transitions in Simulation Problem-Solving Environments," Proc. Second International Conference on Affective Computing and Intelligent Interaction (ACII '07), A. Paiva, R. Prada, and R. W. Picard (Eds.), Springer-Verlag, Lecture Notes in Computer Science 4738, pp. 666-677.

[2] I. Arroyo, D. G. Cooper, W. Burleson, B. P. Woolf, K. Muldner, and R. Christopherson, "Emotion Sensors Go to School," Proc. Artificial Intelligence in Education (AIED '09), V. Dimitrova, R. Mizoguchi, B. du Boulay, and A. Graesser (Eds.), IOS Press, July 2009, Frontiers in Artificial Intelligence and Applications 200, pp. 17-24.

[3] R. W. Picard, Affective Computing, MIT Press, 1997.
Concepts
Detecting affective states is a measurement problem: identifying their presence from physiological measures, physical appearance, and self-report.
Model
Multimodal emotion recognition system: sensing devices capture the user's brainwaves, eye movements, facial expressions, and physiological signals as raw data; perception mechanisms turn the raw data into beliefs; and an integration algorithm combines the beliefs into an affective state.
Sensing Devices
Sensing devices are hardware devices that obtain data about a user's physiological responses and body reactions, collecting quantitative measures of the physiological signals of emotional change. We call the measures provided by the sensing devices raw data.

Our approach includes the use of brain-computer interfaces, eye-tracking systems, biofeedback sensors, and face-based emotion recognition systems [4].

Using several sensing devices, either to recognize a broad range of emotions or to improve the accuracy of recognizing one emotion, is referred to as a multimodal approach.
[4] J. Gonzalez-Sanchez, R. M. Christopherson, M. E. Chavez-Echeagaray, D. C. Gibson, R. Atkinson, and W. Burleson, "How to Do Multimodal Detection of Affective States?," Proc. IEEE 11th International Conference on Advanced Learning Technologies (ICALT '11), pp. 654-655, 2011.
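The multimodal approach described above ultimately requires aligning samples from devices that report at different rates (8 Hz EEG, 10 Hz eye tracking, 2 Hz skin conductance, and so on). A minimal Python sketch of nearest-timestamp fusion follows; the function and stream names are illustrative, not part of any of the toolkits shown in this course:

```python
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the sensor value whose timestamp is closest to t.

    timestamps must be sorted (e.g. milliseconds since a shared epoch).
    """
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

def fuse(streams, t):
    """Build one multimodal observation at time t.

    streams maps a modality name to a (timestamps, values) pair.
    """
    return {name: nearest_sample(ts, vs, t) for name, (ts, vs) in streams.items()}
```

For example, fusing an 8 Hz stream with a 2 Hz stream at t = 130 ms picks the 125 ms EEG sample and the 0 ms conductance sample.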
2. Sensing Devices | Brain-Computer Interfaces
BCI
Brain-Computer Interfaces (BCI)

A BCI is a particular type of physiological instrument that uses brainwaves as its information source (the electrical activity along the scalp produced by the firing of neurons within the brain).

The Emotiv® EPOC headset [5] will be used to show how to collect and work with this kind of data.

[5] Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com.
BCI
Wireless Emotiv® EEG Headset

The device reports data at intervals of 125 ms (8 Hz).

The raw data output includes 14 EEG values (7 channels on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values describing head movement (GyroX and GyroY).

The Affectiv Suite reports five emotions: engagement, boredom, excitement, frustration, and meditation.

The Expressiv Suite reports facial gestures: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh.
Electrodes are situated and labeled according to the CMS/DRL configuration [6][7].
BCI

[6] F. Sharbrough, G.-E. Chatrian, R. P. Lesser, H. Lüders, M. Nuwer, and T. W. Picton, "American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature," J. Clin. Neurophysiol. 8: 200-202.

[7] Electroencephalography. Retrieved November 14, 2010, from Electric and Magnetic Measurement of the Electric Activity of Neural Tissue: www.bem.fi/book/13/13.htm
Emotiv Systems headset ($299): emotions, EEG data, and facial gestures.
BCI
Demo
Wireless Emotiv® EEG Headset
BCI
Affectiv Suite
Expressiv Suite
Field Description Values
Timestamp
It is the timestamp (date and time) of the computer running
the system. It could be used to synchronize the data with
other sensors.
Format "yymmddhhmmssSSS"
(y - year, m - month, d - day, h - hour, m - minutes, s -
seconds, S - milliseconds).
UserID Identifies the user. An integer value.
Wireless Signal Status Shows the strength of the signal. The value is from 0 to 4, 4 being the best.
Blink, Wink Left and Right,
Look Left and Right, Raise
Brow, Furrow, Smile,
Clench, Smirk Left and
Right, Laugh
Part of the Expressiv Suite.
Values between 0 and 1, 1 being the highest power/probability for this gesture.
Short Term and Long
Term Excitement,
Engagement / Boredom,
Meditation, Frustration
Part of the Affectiv Suite.
Values between 0 and 1, 1 being the highest power/probability for this emotion.
AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4
Raw data coming from each of the 14 channels. The names of these fields were defined according to the CMS/DRL configuration [6][7].
Values of around 4000 and higher.
GyroX and GyroY
Information about how the head moves along the X and Y axes, respectively.
Varies.
BCI
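Every sensor log in this course shares the "yymmddhhmmssSSS" timestamp format, so it can be decoded with Python's standard datetime parsing, where %f soaks up the trailing millisecond digits. These helpers are an illustrative sketch, not part of the Emotiv SDK:

```python
from datetime import datetime

def parse_timestamp(ts):
    """Decode a "yymmddhhmmssSSS" sensor-log timestamp.

    %f reads the final digits as a fraction of a second,
    so "901" becomes 901 milliseconds.
    """
    return datetime.strptime(ts, "%y%m%d%H%M%S%f")

def millis_between(a, b):
    """Offset in milliseconds between two log timestamps, for synchronization."""
    return (parse_timestamp(b) - parse_timestamp(a)).total_seconds() * 1000.0
```

For example, "101116112544901" decodes to 2010-11-16 11:25:44.901, which is how records from different sensors can be lined up on a common clock.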
Timestamp AF3 F7 F3 FC5 T7 P7 O1 O2 P8 T8 FC6 F4 F8 AF4 GyroX GyroY
101116112544901 4542.05 4831.79 4247.18 4690.26 4282.56 4395.38 4591.79 4569.23 4360 4570.77 4297.44 4311.28 4282.56 4367.18 1660 2003
101116112544901 4536.92 4802.05 4243.08 4673.85 4272.31 4393.33 4592.82 4570.26 4354.87 4570.26 4292.31 4309.74 4277.95 4370.77 1658 2002
101116112545010 4533.33 4798.97 4234.87 4669.74 4301.03 4396.92 4592.31 4570.77 4351.28 4561.03 4281.54 4301.54 4271.28 4363.59 1659 2003
101116112545010 4549.23 4839.49 4241.03 4691.28 4333.85 4397.95 4596.41 4567.18 4355.9 4556.41 4286.15 4306.15 4277.95 4369.74 1659 2003
101116112545010 4580 4865.64 4251.79 4710.26 4340 4401.54 4603.59 4572.82 4360 4558.46 4298.97 4324.62 4296.41 4395.9 1657 2004
101116112545010 4597.44 4860 4252.82 4705.64 4350.26 4412.31 4603.59 4577.44 4357.44 4555.9 4295.38 4329.23 4296.41 4414.36 1656 2005
101116112545010 4584.62 4847.69 4246.67 4690.26 4360 4409.23 4597.44 4569.74 4351.79 4549.74 4278.97 4316.92 4272.82 4399.49 1656 2006
101116112545010 4566.15 4842.05 4238.46 4684.1 4322.05 4389.74 4592.82 4566.67 4351.79 4549.74 4274.36 4310.26 4262.05 4370.77 1655 2005
101116112545010 4563.59 4844.62 4231.79 4687.69 4267.69 4387.69 4594.36 4580 4361.03 4556.41 4278.97 4310.77 4274.36 4370.77 1653 2006
101116112545010 4567.18 4847.18 4233.33 4688.72 4285.13 4409.23 4602.05 4589.23 4368.21 4560 4280.51 4310.77 4281.54 4390.26 1655 2004
101116112545010 4570.26 4846.67 4234.87 4683.08 4323.08 4415.9 4604.1 4585.64 4366.67 4557.44 4277.95 4310.26 4273.33 4384.1 1652 2005
101116112545010 4569.23 4842.56 4233.85 4678.46 4310.77 4402.56 4598.97 4583.08 4364.1 4553.85 4277.44 4310.26 4271.28 4372.31 1654 2005
101116112545010 4558.46 4832.82 4234.87 4676.92 4301.03 4389.74 4595.38 4590.26 4368.72 4556.92 4280 4310.26 4276.92 4380 1653 2004
101116112545010 4555.9 4831.79 4233.33 4679.49 4314.36 4390.26 4597.95 4598.97 4374.87 4562.56 4280.51 4311.28 4280 4386.15 1653 2004
101116112545010 4569.74 4842.56 4232.82 4684.1 4303.59 4405.64 4609.74 4600 4378.46 4567.18 4278.97 4313.33 4280 4382.05 1653 2002
101116112545010 4574.36 4846.67 4235.38 4683.08 4293.33 4416.41 4619.49 4604.1 4382.56 4570.77 4280.51 4310.77 4282.05 4382.05 1652 2002
101116112545010 4562.05 4840.51 4227.18 4673.85 4300 4405.13 4611.28 4601.03 4376.41 4561.54 4280 4303.59 4279.49 4374.87 1652 2000
Raw Data
Timestamp ID Signal Blink Wink L Wink R Look L Look R Eyebrow Furrow Smile Clench Smirk L Smirk R Laugh
101116091145065 0 2 0 0 0 1 0 0 0 0 0 0 0.988563 0
101116091145190 0 2 0 0 0 1 0 0 0 0.45465 0 0 0 0
101116091145315 0 2 0 0 0 1 0 0 0 0.467005 0 0 0 0
101116091145440 0 2 0 0 0 1 0 0 0 0.401006 0 0 0 0
101116091145565 0 2 0 0 0 1 0 0 0 0.248671 0 0 0 0
101116091145690 0 2 0 0 0 1 0 0 0 0.173023 0 0 0 0
101116091145815 0 2 0 0 0 1 0 0 0 0.162788 0 0 0 0
101116091145940 0 2 0 0 0 1 0 0 0 0.156485 0 0 0 0
101116091146065 0 2 0 0 0 1 0 0 0 0 0 0.776925 0 0
101116091146190 0 2 0 0 0 1 0 0 0 0 0 0 0.608679 0
101116091146315 0 2 0 0 0 1 0 0 0 0 0 0 0.342364 0
101116091146440 0 2 0 0 0 1 0 0 0 0 0 0 0.149695 0
101116091146565 0 2 0 0 0 1 0 0 0 0 0 0 0.0864399 0
101116091146690 0 2 0 0 0 1 0 0 0 0 0 0 0.0733481 0
101116091146815 0 2 0 0 0 1 0 0 0 0 0 0 0.118965 0
101116091146941 0 2 0 0 0 1 0 0 0 0 0 0 0.259171 0
Expressiv Data
Timestamp Short Term Excitement Long Term Excitement Engagement Meditation Frustration
101116091145065 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145190 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145315 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145440 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145565 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145690 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145815 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145940 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146065 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146190 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146315 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146440 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146565 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146690 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146815 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146941 0.451885 0.541695 0.848087 0.368071 0.533919
Affectiv Data
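As a quick illustration (ours, not part of the Emotiv tooling), the dominant affective state in a row of Affectiv output can be read off by taking the channel with the largest value:

```python
def dominant_state(affectiv_row):
    """Return the Affectiv channel with the highest value in one log row."""
    return max(affectiv_row, key=affectiv_row.get)
```

Applied to the first row of the sample table above, it picks engagement (0.834476).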
2. Sensing Devices | Face-Based Emotion Recognition
Face-Based Recognition
Face-based emotion recognition systems

These systems infer affective states by capturing images of the users' facial expressions and head movements.

We are going to show the capabilities of face-based emotion recognition systems using a simple 30 fps USB webcam and software from the MIT Media Lab [8].

[8] R. E. Kaliouby and P. Robinson, "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures," Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), IEEE Computer Society, June 2004, Vol. 10, p. 154.
Face-Based Recognition
The MindReader API enables the real-time analysis, tagging, and inference of cognitive-affective mental states from facial video.

The framework combines vision-based processing of the face with predictions from mental-state models to interpret the meaning underlying head and facial signals over time.

It provides results at intervals of approximately 100 ms (10 Hz).

With this system it is possible to infer emotions such as agreeing, concentrating, disagreeing, interested, thinking, and unsure.

It builds on the Facial Action Coding System (Ekman and Friesen, 1978), which defines 46 facial actions (plus head movements).
Demo
MindReader Software from MIT Media Lab
Face-Based Recognition
Example facial action units: Lip Corner Depressor (AU 15), Jaw Drop (AU 26), Mouth Stretch (AU 27).
Face-Based Recognition
Mind Reader
Field Description Values
Timestamp
It is the timestamp (date and time) of the computer running
the system. It could be used to synchronize the data with
other sensors.
Format "yymmddhhmmssSSS"
(y - year, m - month, d - day, h - hour, m - minutes,
s - seconds, S - milliseconds).
Agreement,
Concentrating,
Disagreement,
Interested, Thinking,
Unsure
This value shows the probability of this emotion being present
on the user at a particular time (frame).
This value is between 0 and 1.
If the value is -1 it means it was not possible to define
an emotion.
This happens when the user's face is out of the
camera focus.
Mind Reader
Timestamp Agreement Concentrating Disagreement Interested Thinking Unsure
101116112838516 0.001836032 0.999917 1.79E-04 0.16485406 0.57114255 0.04595062
101116112838578 0.001447654 0.9999516 1.29E-04 0.16310683 0.5958921 0.042706452
101116112838672 5.97E-04 0 1.5E-04 0.44996294 0.45527613 0.00789697
101116112838766 2.46E-04 0 1.75E-04 0.77445686 0.32144752 0.001418217
101116112838860 1.01E-04 0 2.04E-04 0.93511915 0.21167138 2.53E-04
101116112838953 4.18E-05 0 2.38E-04 0.983739 0.13208677 4.52E-05
101116112839016 1.72E-05 0 2.78E-04 0.9960774 0.07941038 8.07E-06
101116112839110 7.1E-06 0 3.24E-04 0.99906266 0.046613157 1.44E-06
101116112839156 2.92E-06 0 3.77E-04 0.99977654 0.026964737 2.57E-07
101116112839250 1.21E-06 0 4.4E-04 0.9999467 0.015464196 4.58E-08
101116112839391 4.97E-07 0 5.12E-04 0.9999873 0.008824189 8.18E-09
101116112839438 2.05E-07 0 5.97E-04 0.999997 0.005020725 1.46E-09
101116112839547 8.43E-08 0 6.96E-04 0.9999993 0.002851939 2.6E-10
101116112839578 3.47E-08 0 8.11E-04 0.9999999 0.001618473 4.64E-11
101116112839688 1.43E-08 0 9.45E-04 0.99999994 9.18E-04 8.29E-12
101116112839781 5.9E-09 0 0.001101404 1 5.21E-04 1.48E-12
101116112839828 2.43E-09 0 0.001283521 1 2.95E-04 2.64E-13
Mind Reader
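Before analysis, frames where MindReader reported -1 (face out of the camera focus) should be discarded. A small sketch, assuming rows mirror the table layout above as label-to-value mappings:

```python
def valid_frames(rows):
    """Drop frames where any emotion channel is -1 (face not found)."""
    return [row for row in rows if all(v >= 0 for v in row.values())]
```

Only frames in which every channel holds a real probability survive the filter.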
2. Sensing Devices | Eye-Tracking Systems
Eye-tracking systems

These instruments measure eye position and eye movement in order to detect the zones in which the user shows particular interest at a specific moment.

Datasets from the Tobii® eye-tracking system [9] will be shown.

[9] Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://www.tobii.com.
Eye-Tracking System
Eye-Tracking System
Tobii® Eye Tracker

The device reports data at intervals of 100 ms (10 Hz).

The output provides data concerning attention direction (gaze-x, gaze-y), duration of fixation, and pupil dilation.
Demo
Tobii® Eye Tracker
Eye-Tracking System
Field Description Values
Timestamp
It is the timestamp (date and time) of the computer running
the system. It could be used to synchronize the data with
other sensors.
Format "yymmddhhmmssSSS"
(y - year, m - month, d - day, h - hour, m - minutes, s -
seconds, S - milliseconds).
GazePoint X
The horizontal screen position for either eye or the average
for both eyes. This value is also used for the fixation definition.
0 is the left edge, the maximum value of the horizontal
screen resolution is the right edge.
GazePoint Y
The vertical screen position for either eye or the average for
both eyes. This value is also used for the fixation definition.
0 is the bottom edge, the maximum value of the
vertical screen resolution is the upper edge.
Pupil Left Pupil size (left eye) in mm. Varies
Validity Left Validity of the gaze data.
0 to 4; 0 if the eye is found and the tracking quality is good. If the eye cannot be found by the eye tracker, the validity code will be 4.
Pupil Right Pupil size (right eye) in mm. Varies
Validity Right Validity of the gaze data.
0 to 4; 0 if the eye is found and the tracking quality is good. If the eye cannot be found by the eye tracker, the validity code will be 4.
FixationDuration Fixation duration. The time in milliseconds that a fixation lasts. Varies
Event Events, automatic and logged, will show up under Event. Varies
AOI
Areas Of Interest if fixations on multiple AOIs are to be written
on the same row.
Varies
Eye-Tracking System
Timestamp GPX GPY Pupil Left Validity L Pupil Right Validity R Fixation Event AOI
101124162405582 636 199 2.759313 0 2.88406 0 48 Content
101124162405599 641 207 2.684893 0 2.855817 0 48 Content
101124162405615 659 211 2.624458 0 2.903861 0 48 Content
101124162405632 644 201 2.636186 0 2.916132 0 48 Content
101124162405649 644 213 2.690685 0 2.831013 0 48 Content
101124162405666 628 194 2.651784 0 2.869714 0 48 Content
101124162405682 614 177 2.829281 0 2.899828 0 48 Content
101124162405699 701 249 2.780344 0 2.907665 0 49 Content
101124162405716 906 341 2.853761 0 2.916398 0 49 Content
101124162405732 947 398 2.829427 0 2.889944 0 49 Content
101124162405749 941 400 2.826602 0 2.881179 0 49 Content
101124162405766 938 403 2.78699 0 2.87948 0 49 KeyPress Content
101124162405782 937 411 2.803387 0 2.821803 0 49 Content
101124162405799 934 397 2.819166 0 2.871547 0 49 Content
101124162405816 941 407 2.811687 0 2.817927 0 49 Content
101124162405832 946 405 2.857419 0 2.857427 0 49 Content
101124162405849 0 0 -1 4 -1 4 49 Content
Eye-Tracking System
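The validity codes matter in practice: in the last row of the table above the eyes were lost (validity 4, pupil -1), and such samples must be excluded before computing statistics like pupil diameter. A hedged sketch, with the tuple layout assumed for illustration:

```python
def mean_pupil(samples):
    """Mean pupil diameter (mm) over samples where both eyes were tracked.

    Each sample is (pupil_left, validity_left, pupil_right, validity_right);
    validity 0 means the eye was found, 4 means it was lost.
    """
    tracked = [(pl + pr) / 2 for pl, vl, pr, vr in samples if vl == 0 and vr == 0]
    return sum(tracked) / len(tracked) if tracked else None
```

Returning None when no sample is valid keeps a lost-tracking interval from silently producing a bogus average.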
2. Sensing Devices | Galvanic Skin Conductance Sensor
Galvanic Skin Conductance
Galvanic skin conductance sensors provide information on the activity of an individual's physiological functions.

Arousal detection: the sensor measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic and parasympathetic nervous systems [10].

The hardware was designed by the MIT Media Lab.

It is a wireless Bluetooth device that reports data at intervals of approximately 500 ms (2 Hz).

[10] M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R. W. Picard, "The HandWave Bluetooth Skin Conductance Sensor," Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII '05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90.
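Conductance traces are noisy relative to the slow arousal trends of interest, so a common first step is a trailing sliding mean over the 2 Hz samples. A minimal sketch; the window size is an arbitrary choice for illustration, not from the course material:

```python
def moving_average(signal, window=5):
    """Smooth a conductance trace with a trailing sliding mean.

    At 2 Hz, a window of 5 samples spans roughly 2.5 seconds.
    """
    smoothed = []
    for i in range(len(signal)):
        start = max(0, i - window + 1)  # shorter window at the start of the trace
        smoothed.append(sum(signal[start:i + 1]) / (i + 1 - start))
    return smoothed
```

A trailing (rather than centered) window keeps the filter causal, so it can also run live while the sensor streams.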
Demo
Skin Electrical Conductance Sensor
Galvanic Skin Conductance
Field Description Values
Timestamp
It is the timestamp (date and time) of the computer running
the system. It could be used to synchronize the data with
other sensors.
Format "yymmddhhmmssSSS"
(y - year, m - month, d - day, h - hour, m - minutes, s -
seconds, S - milliseconds).
Battery Voltage Level of the battery voltage. 0 - 3 Volts
Conductance Level of arousal. 0 - 3 Volts
Galvanic Skin Conductance
Timestamp Voltage Conductance
101116101332262 2.482352941 1.030696176
101116101332762 2.482352941 1.023404165
101116101333262 2.482352941 1.019813274
101116101333762 2.482352941 1.041657802
101116101334247 2.482352941 0.998280273
101116101334747 2.482352941 0.991181142
101116101335247 2.482352941 0.980592229
101116101335747 2.482352941 0.998280273
101116101336247 2.482352941 1.012586294
101116101336762 2.482352941 1.012586294
101116101337231 2.482352941 1.012586294
101116101337747 2.482352941 1.009008251
101116101338247 2.482352941 0.998280273
101116101338747 2.482352941 0.991181142
101116101339247 2.482352941 0.987628521
101116101339731 2.482352941 0.987628521
101116101340231 2.482352941 0.980592229
Galvanic Skin Conductance
2. Sensing Devices | Pressure Sensor
Pressure Sensor
Pressure sensors provide information on the activity of an individual's physiological functions.

They are able to detect the increasing amount of pressure (correlated with levels of frustration) that the user puts on a mouse or any other controller (such as a game controller) [11].

The hardware was designed by the MIT Media Lab.

It is a serial device that reports data at intervals of approximately 150 ms (about 6 Hz).

[11] Y. Qi and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR '02), Aug. 2002, vol. 3, p. 30448, doi:10.1109/ICPR.2002.1047973.
Demo
Mouse Pressure Sensor
Pressure Sensor
Field Description Values
Timestamp
It is the timestamp (date and time) of the computer running
the system. It could be used to synchronize the data with
other sensors.
Format "yymmddhhmmssSSS"
(y - year, m - month, d - day, h - hour, m - minutes, s -
seconds, S - milliseconds).
Right Rear Sensor positioned in the right rear of the mouse. 0 - 1024. 0 being the highest pressure.
Right Front Sensor positioned in the right front of the mouse. 0 - 1024. 0 being the highest pressure.
Left Rear Sensor positioned in the left rear of the mouse. 0 - 1024. 0 being the highest pressure.
Left Front Sensor positioned in the left front of the mouse. 0 - 1024. 0 being the highest pressure.
Middle Rear Sensor positioned in the middle rear of the mouse. 0 - 1024. 0 being the highest pressure.
Middle Front Sensor positioned in the middle front of the mouse. 0 - 1024. 0 being the highest pressure.
Pressure Sensor
Timestamp Right Rear Right Front Left Rear Left Front Middle Rear Middle Front
110720113306312 1023 1023 1023 1023 1023 1023
110720113306468 1023 1023 1023 1023 1023 1023
110720113306625 1023 998 1023 1002 1023 1023
110720113306781 1023 1009 1023 977 1023 1023
110720113306937 1023 794 1023 982 1023 1023
110720113307109 1023 492 1022 891 1023 1023
110720113307265 1023 395 1021 916 1019 1023
110720113307421 1023 382 1021 949 1023 1023
110720113307578 1023 364 1022 983 1023 1023
110720113307734 1023 112 1021 1004 1023 1023
110720113307890 1023 204 1021 946 1023 1023
110720113308046 1023 465 1022 971 1023 1023
110720113308203 1023 404 1022 1023 1023 1023
110720113308359 1023 166 1021 1023 1023 1023
110720113308515 1023 145 1021 1023 1023 1023
110720113308687 1023 154 1021 1023 1023 1023
110720113308843 1023 126 1021 1023 1023 1023
Pressure Sensor
Pressure Sensor
The raw data from the mouse is processed to obtain meaningful information [12]. The result is a single normalized value combining the six different sensors on the mouse.

[12] D. Cooper, I. Arroyo, B. Woolf, K. Muldner, W. Burleson, and R. Christopherson, "Sensors Model Student Self Concept in the Classroom," Proc. User Modeling, Adaptation, and Personalization (UMAP '09), 2009, pp. 30-41.
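The sample table is consistent with a simple normalization: each of the six readings divided by the resting maximum (1023) and summed, so six unpressed pads give exactly 6.0 and harder presses (lower raw readings) push the value down. A sketch of that computation, inferred from the sample data rather than taken from [12]:

```python
def normalized_pressure(readings, resting=1023):
    """Combine six mouse pad readings into one normalized value.

    Lower raw readings mean more pressure, so the combined value drops
    from 6.0 (no pressure) as the user squeezes the mouse.
    """
    return sum(r / resting for r in readings)
```

For instance, readings of (1023, 998, 1023, 1002, 1023, 1023) reproduce the 5.955034213 in the third row of the sample table.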
Timestamp Right Rear Right Front Left Rear Left Front Middle Rear Middle Front Value
110720113306312 1023 1023 1023 1023 1023 1023 6
110720113306468 1023 1023 1023 1023 1023 1023 6
110720113306625 1023 998 1023 1002 1023 1023 5.955034213
110720113306781 1023 1009 1023 977 1023 1023 5.941348974
110720113306937 1023 794 1023 982 1023 1023 5.736070381
110720113307109 1023 492 1022 891 1023 1023 5.350928641
110720113307265 1023 395 1021 916 1019 1023 5.275659824
110720113307421 1023 382 1021 949 1023 1023 5.299120235
110720113307578 1023 364 1022 983 1023 1023 5.315738025
110720113307734 1023 112 1021 1004 1023 1023 5.088954057
110720113307890 1023 204 1021 946 1023 1023 5.122189638
110720113308046 1023 465 1022 971 1023 1023 5.402737048
110720113308203 1023 404 1022 1023 1023 1023 5.393939394
110720113308359 1023 166 1021 1023 1023 1023 5.160312805
110720113308515 1023 145 1021 1023 1023 1023 5.139784946
110720113308687 1023 154 1021 1023 1023 1023 5.1485826
110720113308843 1023 126 1021 1023 1023 1023 5.121212121
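The Value column above can be reproduced by summing the six sensor readings, each scaled by the 10-bit maximum of 1023. This formula is inferred from the table rows themselves, not quoted from [12]; a minimal sketch:

```python
def mouse_pressure_value(readings):
    """Combine the six mouse pressure readings into a single value.

    Each reading comes from a 10-bit ADC (0-1023); the combined value
    is the sum of the readings scaled to [0, 1], so it is 6.0 at the
    resting baseline (all readings at 1023) and decreases as readings drop.
    """
    return sum(r / 1023 for r in readings)

# Rows from the table above reproduce the Value column:
print(mouse_pressure_value([1023, 1023, 1023, 1023, 1023, 1023]))  # 6.0
print(mouse_pressure_value([1023, 492, 1022, 891, 1023, 1023]))    # ~5.350928641
```

Checked against several rows of the table, the sum matches the reported Value to at least six decimal places.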
Pressure Sensor
2. Sensing Devices | Posture Sensor
64
Session 1
65
Posture Sensor
Posture detection using a low-cost, low-resolution pressure-sensitive seat
cushion and back pad.
Developed at ASU based on experience using a more expensive, high-resolution
unit from the MIT Media Lab [13].
[13] S. Mota, and R. W. Picard, "Automated Posture Analysis for Detecting Learners Interest Level," Proc. Computer Vision and Pattern Recognition Workshop
(CVPRW 03), IEEE Press, June 2003, vol. 5, p. 49, doi:10.1109/CVPRW.2003.10047.
Posture Sensor
66
Demo
Chair Posture Sensor
67
Posture Sensor
68
Posture Sensor
Field | Description | Values
Timestamp | The timestamp (date and time) of the computer running the system; it can be used to synchronize the data with other sensors. | Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds).
AccX | Value of the X axis of the accelerometer. | Varies
AccY | Value of the Y axis of the accelerometer. | Varies
Right Seat | Sensor positioned on the right side of the seat cushion. | 0 - 1023, with 1023 being the highest pressure.
Middle Seat | Sensor positioned in the middle of the seat cushion. | 0 - 1023, with 1023 being the highest pressure.
Left Seat | Sensor positioned on the left side of the seat cushion. | 0 - 1023, with 1023 being the highest pressure.
Right Back | Sensor positioned on the right side of the back pad. | 0 - 1023, with 1023 being the highest pressure.
Middle Back | Sensor positioned in the middle of the back pad. | 0 - 1023, with 1023 being the highest pressure.
Left Back | Sensor positioned on the left side of the back pad. | 0 - 1023, with 1023 being the highest pressure.
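A raw log line in this layout can be split into named fields following the table above; the function and field names are illustrative, not part of the sensor's own software:

```python
from datetime import datetime

def parse_posture_line(line):
    """Parse one posture-sensor log line into named fields.

    Field order follows the table above: timestamp, AccX, AccY, then
    the six pressure readings (seat right/middle/left, back right/middle/left).
    """
    parts = line.split()
    # "yymmddhhmmssSSS": %f right-pads the 3 millisecond digits to microseconds.
    timestamp = datetime.strptime(parts[0], "%y%m%d%H%M%S%f")
    names = ["acc_x", "acc_y", "right_seat", "middle_seat", "left_seat",
             "right_back", "middle_back", "left_back"]
    record = dict(zip(names, (int(p) for p in parts[1:])))
    record["timestamp"] = timestamp
    return record

sample = parse_posture_line("110720074358901 980 -171 1015 1019 1012 976 554 309")
```

The parsed timestamp can then be used to align posture samples with the other sensor logs.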
Posture Sensor
Timestamp AccX AccY Right Seat Middle Seat Left Seat Right Back Middle Back Left Back
110720074358901 980 -171 1015 1019 1012 976 554 309
110720074359136 969 -169 1008 1004 1012 978 540 305
110720074359401 1008 -165 1015 1012 1008 974 554 368
110720074359636 993 -166 1001 1004 1016 975 548 306
110720074359854 994 -167 1015 1011 1003 967 559 418
110720074400120 970 -167 1011 1008 1001 968 620 358
110720074400354 977 -166 1011 1011 1013 968 541 413
110720074400589 985 -128 1012 1010 1006 974 565 314
110720074400839 996 -182 1016 1014 1012 972 668 290
110720074401089 991 -185 1012 1012 1004 858 108 2
110720074401526 1068 -310 1019 522 1001 32 0 421
110720074401557 937 -124 92 0 0 0 1 247
110720074401714 979 -104 103 0 0 0 0 87
110720074401745 957 -165 143 3 0 0 0 0
110720074402026 945 -171 126 0 0 0 1 3
110720074402339 948 -169 0 0 1 0 0 0
110720074402620 952 -166 3 0 1 0 0 0
Posture Sensor
Posture Sensor
The raw data from the six chair sensors is processed [12] to obtain net seat
change, net back change, and sit-forward values.
[12] Cooper, D., Arroyo, I., Woolf, B., Muldner, K., Burleson, W., and Christopherson, R. (2009). Sensors model student self concept in the classroom.
User Modeling, Adaptation, and Personalization, pp. 30-41.
Timestamp Right Seat Middle Seat Left Seat Right Back Middle Back Left Back NetSeatChange NetBackChange SitForward
110720074358901 1015 1019 1012 976 554 309 12 152 0
110720074359136 1008 1004 1012 978 540 305 22 20 0
110720074359401 1015 1012 1008 974 554 368 19 81 0
110720074359636 1001 1004 1016 975 548 306 30 69 0
110720074359854 1015 1011 1003 967 559 418 34 131 0
110720074400120 1011 1008 1001 968 620 358 9 122 0
110720074400354 1011 1011 1013 968 541 413 15 134 0
110720074400589 1012 1010 1006 974 565 314 9 129 0
110720074400839 1016 1014 1012 972 668 290 14 129 0
110720074401089 1012 1012 1004 858 108 2 14 962 0
110720074401526 1019 522 1001 32 0 421 500 1353 0
110720074401557 92 0 0 0 1 247 2450 207 1
110720074401714 103 0 0 0 0 87 11 161 1
110720074401745 143 3 0 0 0 0 43 87 1
110720074402026 126 0 0 0 1 3 20 4 1
110720074402339 0 0 1 0 0 0 127 4 1
110720074402620 3 0 1 0 0 0 3 0 1
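The NetSeatChange and NetBackChange columns are consistent with summing the absolute per-sensor differences between consecutive samples; this is an observation from the rows above, see [12] for the published feature definitions. A sketch:

```python
def net_change(previous, current):
    """Sum of absolute per-sensor differences between consecutive samples."""
    return sum(abs(c - p) for p, c in zip(previous, current))

# Seat readings from the first two rows of the table above:
print(net_change((1015, 1019, 1012), (1008, 1004, 1012)))  # 22 (NetSeatChange)
# Back readings from the same pair of rows:
print(net_change((976, 554, 309), (978, 540, 305)))        # 20 (NetBackChange)
```

The SitForward flag switches to 1 in rows where back-pad pressure collapses; the exact threshold used is not recoverable from the rows shown, so it is omitted here.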
Posture Sensor
Sensing Device (rate in Hz) | Legacy Software | Sensing (Input or Raw Data) | Physiological responses and/or emotions reported (output or sensed values)
Emotiv® EEG headset (128 Hz) | Emotiv® SDK | Brain Waves | EEG activity, reported in 14 channels [6]: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4. Face activity: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh. Emotions: excitement, engagement, boredom, meditation, and frustration.
Standard Webcam (10 Hz) | MIT Media Lab MindReader | Facial Expressions | Emotions: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
MIT skin conductance sensor (2 Hz) | USB driver | Skin Conductivity | Arousal.
MIT pressure sensor (6 Hz) | USB driver | Pressure | One pressure value per sensor embedded in the input/control device.
Tobii® Eye tracking (60 Hz) | Tobii® SDK | Eye Tracking | Gaze point (x, y).
MIT posture sensor (6 Hz) | USB driver | Pressure | Pressure values for the back and the seat (right, middle, and left zones) of a cushion chair.
Summary
Session 2
working with data and research experiences
74
Background
75
The datasets shown in the next slides correspond to the data collected in three
studies. The stimuli, protocol, and participants are briefly described in the
following paragraphs.
1. Study One: Subjects playing a video game.
2. Study Two: Subjects reading documents with and without pictures.
3. Study Three: Subjects solving tasks in a tutoring system.
Study One
76
Stimuli. The Guitar Hero® video game, with high-fidelity graphics and deeply
engaging gameplay. The game involves holding a guitar interface while listening
to music and watching a video screen.
Protocol. The study consists of a one-hour session: (1) 15 minutes of practice
so the user becomes familiar with the game controller and the environment, and
(2) 45 minutes in which users play four songs of their choice, one at each
level: easy, medium, hard, and expert.
Participants. The call for participation was an open call among Arizona State
University students. The experiment was run with 21 subjects, 67% men and 33%
women, aged 18 to 28 years. The study includes subjects with different
(self-reported) experience playing video games: 15% had never played before,
5% were not skilled, 28% were slightly skilled, 33% somewhat skilled, 14% very
skilled, and only 5% reported themselves as experts.
Study Two
77
Stimuli. On-screen reading material. Two types were used: one with either
on-task or off-task images, captions, and drawings; the other containing only
text.
Protocol. The study consists of one 60-minute session in which the subject is
presented with 10 pages from a popular Educational Psychology textbook and
asked to read for understanding. Each participant was asked to complete a
pre-test and a post-test.
Participants. The call for participation was also an open call among Arizona
State University students. The study was run with 28 subjects.
Study Three
78
Stimuli. A tutoring system for system dynamics. The system tutors students on
how to model system behavior using a graphical representation and algebraic
expressions. The model is represented as a directed graph that defines a
topology of nodes and links. Once the model is complete, students can execute
and debug it [14].
Protocol. Students were asked to perform a set of tasks (about system dynamics
modeling) using the tutoring system. While the student was working on these
tasks, the tutoring system collected emotional-state information with the
intention of generating better and more accurate hints and feedback.
Participants. Pilot test during Summer 2010 with 2 groups of 30 high-school
students.
[14] K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y. Hidalgo-Pontet, and L. Zhang. "The Affective Meta-Tutoring
Project: How to motivate students to use effective meta-cognitive strategies," Proceedings of the 19th International Conference on Computers in Education. Chiang
Mai, Thailand: Asia-Pacific Society for Computers in Education. October 2011. In press.
3. Data Filtering and Integration
79
Session 2
Filtering
80
Filtering the data includes cleaning, synchronizing, and averaging, as well as
applying thresholds or ranges.
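As an illustration of these filtering steps, a minimal sketch; the sample layout (tuples of sensor values) and function names are assumptions, not part of the course software:

```python
def clean(samples):
    """Cleaning: drop samples with missing readings."""
    return [s for s in samples if all(v is not None for v in s)]

def within_range(samples, index, low, high):
    """Range filter: keep samples whose value at `index` is inside [low, high]."""
    return [s for s in samples if low <= s[index] <= high]

def average(samples, index):
    """Average one column across a batch of samples."""
    return sum(s[index] for s in samples) / len(samples)

readings = [(1, 0.5), (None, 0.7), (2, 0.9)]
kept = within_range(clean(readings), 1, 0.6, 1.0)  # [(2, 0.9)]
```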
Filtering and Integration
81
Multimodal emotion recognition assumes different sources of data that
contribute to inferring the affective state of an individual.
Each sensing device has its own data type and sample rate.
The challenge is how to combine the data coming from these diverse sensing
devices, which have proven their functionality independently, to create an
improved, unified output.
Filtering and Integration
[Architecture diagram: sensor agents report to a centre agent; a data logger and a tutoring system consume the multimodal data.]
82
Framework
83
We developed a framework that follows the organizational strategy of an agent
federation [15].
The federation assigns one agent to collect raw data from each sensing device.
That agent implements the perception mechanism for its assigned sensing
device, mapping raw data into beliefs.
Each agent is autonomous and encapsulates one sensing device and its
perception mechanisms into an independent, individual, and intelligent
component. All the data is timestamped and independently identified per agent.
[15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion
Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).
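A minimal sketch of the federation's shape; class and method names here are illustrative and do not reflect the actual ABE API [15]:

```python
class CentreAgent:
    """Integrates agents' beliefs and publishes them to subscribers."""
    def __init__(self):
        self.beliefs = []
        self.subscribers = []

    def subscribe(self, callback):
        """Third parties register a callback (publish-subscribe style)."""
        self.subscribers.append(callback)

    def report(self, belief):
        self.beliefs.append(belief)
        for notify in self.subscribers:
            notify(belief)

class SensorAgent:
    """Encapsulates one sensing device; maps raw data into beliefs."""
    def __init__(self, name, centre):
        self.name = name
        self.centre = centre

    def perceive(self, timestamp, raw_value):
        belief = {"agent": self.name, "timestamp": timestamp, "value": raw_value}
        self.centre.report(belief)

centre = CentreAgent()
eeg = SensorAgent("eeg", centre)
eeg.perceive(110720113306625, 0.42)
```

Each belief carries its agent's name and a timestamp, matching the framework's requirement that all data be timestamped and identified per agent.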
Framework
84
When several agents need to interact, it is important to establish an
organizational strategy for them, which defines authority relationships, data
flows, and coordination protocols [16][17].
The agents' beliefs are reported to a central agent, which integrates them into
one affective-state report.
Third-party systems can obtain affective-state reports from the centre agent
using a publish-subscribe style.
[16] F. Tuijnman, and H. Afsarmanesh, "Distributed objects in a federation of autonomous cooperating agents," Proc. International Conference on
Intelligent and Cooperative Information Systems, May 1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763.
[17] M. Wood, and S. DeLoach, "An overview of the multiagent systems engineering methodology," Agent-Oriented Software Engineering (AOSE
2000), Springer-Verlag, 2001, pp. 207-221, doi:10.1007/3-540-44564-1_14.
Framework | Architecture
[15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal
Emotion Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).
85
Integration | sparse
86
[18] J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP.
timestamp fixationIndex gazePointX gazePointY mappedFixationPointX mappedFixationPointY fixationDuration ShortTermExcitement LongTermExcitement Engagement/Boredom Meditation Frustration Conductance agreement concentrating
4135755652 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628
4135755659 213 573 408 570 408 216
4135755668 0.436697 0.521059 0.550011 0.335825 0.498908
4135755676 213 566 412 570 408 216
4135755692 213 565 404 570 408 216
4135755709 213 567 404 570 408 216
4135755714
4135755726 213 568 411 570 408 216
4135755742 213 568 409 570 408 216
4135755759 213 563 411 570 408 216
4135755761
4135755776 213 574 413 570 408 216
4135755792 213 554 402 570 408 216
4135755809 214 603 409 696 405 216
4135755824
4135755826 214 701 407 696 405 216
4135755842 214 697 403 696 405 216
4135755859 214 693 401 696 405 216
4135755876 214 700 402 696 405 216
4135755892 214 701 411 696 405 216
4135755909 214 686 398 696 405 216
4135755918
4135755926 214 694 399 696 405 216
4135755942 214 694 407 696 405 216
4135755959 214 698 404 696 405 216
4135755964
4135755976 214 704 398 696 405 216
4135755992 214 693 415 696 405 216
4135756009 214 696 411 696 405 216
4135756025 215 728 406 804 387 183
4135756027 0.436697 0.521059 0.550011 0.335825 0.498908 1 1
Integration | state machine
87
timestamp fixationIndex gazePointX gazePointY mappedFixationPointX mappedFixationPointY fixationDuration ShortTermExcitement LongTermExcitement Engagement/Boredom Meditation Frustration Conductance agreement concentrating
4135755652 213 574 414 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755659 213 573 408 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755668 213 573 408 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755676 213 566 412 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755692 213 565 404 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755709 213 567 404 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755714 213 567 404 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755726 213 568 411 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755742 213 568 409 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755759 213 563 411 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755761 213 563 411 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755776 213 574 413 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755792 213 554 402 570 408 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755809 214 603 409 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755824 214 603 409 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755826 214 701 407 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755842 214 697 403 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755859 214 693 401 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755876 214 700 402 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755892 214 701 411 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755909 214 686 398 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755918 214 686 398 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755926 214 694 399 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755942 214 694 407 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755959 214 698 404 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755964 214 698 404 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755976 214 704 398 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135755992 214 693 415 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135756009 214 696 411 696 405 216 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135756025 215 728 406 804 387 183 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
4135756027 215 728 406 804 387 183 0.436697 0.521059 0.550011 0.335825 0.498908 0.401690628 1 1
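The state-machine strategy shown above holds the last known value of each slower stream until a fresh sample arrives, so every timestamp carries a complete record. A sketch (the column layout is illustrative):

```python
def hold_last_value(rows):
    """State-machine integration: fill gaps (None) with the last known
    value in each column, column by column."""
    state = [None] * len(rows[0])
    merged = []
    for row in rows:
        state = [v if v is not None else s for v, s in zip(row, state)]
        merged.append(list(state))
    return merged

# Gaze arrives at ~60 Hz, EEG-derived values only occasionally:
rows = [[574, 0.44], [573, None], [566, None], [565, 0.45]]
print(hold_last_value(rows))  # [[574, 0.44], [573, 0.44], [566, 0.44], [565, 0.45]]
```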
Integration | time window
88
timestamp NetSeatChange NetBackChange SitForward RightSeat MiddleSeat LeftSeat RightBack MiddleBack LeftBack MouseValue RightRear RightFront LeftRear LeftFront
11072009371 10 15 0 999.2886598 1012.041237 977.3505155 979.0103093 937.8762887 921.7731959 5 1022.952128 1022.962766 1022.957447 1022.968085
11072009372 16 29 0 1003.298969 1011.505155 970.257732 982.5463918 924.371134 929.8041237 5 1022.984043 1022.957447 1022.962766 1022.952128
11072009381 13 17 0 1005.802083 1011.427083 976.2395833 979.7083333 939.5208333 944.8958333 5 1022.957447 1022.978723 1022.994681 1022.968085
11072009382 14 20 0 1005.381443 1011.463918 973.8969072 982.0927835 938.5979381 946.6494845 5 1022.957447 1022.984043 1022.957447 1022.957447
11072009391 10 52 0 1005.134021 1012.020619 977.742268 972.2371134 672.3298969 923.5463918 5 1022.978836 1022.973545 1022.989418 1022.957672
11072009392 10 15 0 1004.360825 1012.494845 987.5051546 981.3814433 865.4020619 942.2268041 5 1022.989305 635.3368984 1021.850267 1022.962567
11072009401 10 21 0 1003.052083 1012.09375 988.375 984.46875 873 945.2708333 5 1022.978947 990.7421053 1022.915789 1022.989474
11072009402 9 9 0 997.6020408 1012.214286 989.5102041 982.6836735 868.4897959 942.9489796 5 1023 1020.547872 1023 1023
11072009411 10 73 0 999.0309278 1013.010309 984.556701 975.628866 791.1649485 938.0309278 5 1022.984043 1022.957447 1022.984043 1022.952128
11072009412 11 57 0 1002.494845 1012.618557 978.814433 970.2371134 868.4020619 948.6494845 5 1022.957672 1022.973545 1022.968254 1022.94709
11072009421 9 15 0 1000.948454 1013.85567 983.8556701 975.9484536 931.2680412 961.3298969 5 1022.973404 1022.962766 1022.962766 1022.957447
11072009422 10 14 0 1000.268041 1013.175258 983.4845361 980.7628866 935.7113402 962.0927835 5 1022.946809 1022.989362 1022.973404 1022.957447
11072009431 10 26 0 1000.927835 1013.237113 982.3608247 974.9587629 912.6907216 948.7216495 5 1022.983957 1022.983957 1022.957219 1022.946524
11072009432 10 25 0 1001.670103 1013.309278 981.8041237 976.5670103 938.2886598 955.7010309 5 1022.994681 1022.973404 1022.994681 1022.984043
11072009441 9 62 0 1002.525773 1013.515464 984.6082474 971.2474227 852.7835052 947.0927835 5 1023 1023 1023 1023
11072009442 12 30 0 1006.114583 1011.208333 988.5833333 973.7291667 846.1458333 948.5729167 5 1022.97861 1022.989305 1022.973262 1022.994652
11072009451 10 35 0 1006.625 1011.645833 995.875 968.9166667 809.46875 934.2291667 5 1022.973404 1022.989362 1022.984043 1022.978723
11072009452 10 37 0 1006.340206 1012.865979 996.0103093 972.4226804 807.9072165 933.5773196 5 1022.984043 1022.952128 1022.978723 1022.973404
11072009461 9 32 0 1005.864583 1012.708333 996.875 971.3229167 815.8229167 938.875 5 1022.994681 1022.968085 1022.973404 1022.978723
11072009462 8 21 0 1006.322917 1011.9375 997.4895833 970.875 852.8645833 942.09375 5 1022.967914 1022.935829 1022.930481 1022.946524
11072009471 10 25 0 1005.572917 1011.427083 997.0833333 972.0625 884.28125 943.8333333 5 1022.989362 1022.989362 1022.968085 1022.973404
11072009472 10 17 0 1005.072917 1011.40625 997.6145833 973.3333333 901.84375 944.4479167 5 1022.973404 1022.925532 1022.946809 1022.978723
11072009481 10 16 0 1005.793814 1012.329897 998.7525773 970.6701031 900.371134 946.5360825 5 1022.968085 1022.957447 1022.962766 1022.941489
11072009482 10 22 0 1005.9375 1012.333333 999.9791667 972.9791667 890.34375 945.3541667 5 1022.989362 1022.952128 1022.962766 1022.952128
11072009491 9 10 0 1007.458333 1012.739583 1000.229167 971.65625 900.2395833 945.96875 5 1022.989362 1023 1022.984043 1022.994681
11072009492 9 21 0 1007.304348 1013 1000.608696 964.6195652 869.6304348 949.2608696 5 1022.994709 1023 1022.957672 1022.962963
11072009501 9 16 0 1007.29 1013.14 998.94 961.98 870.65 950.76 5 1022.978723 1022.994681 1022.989362 1022.984043
11072009502 10 11 0 1006.21875 1012.34375 999.8854167 963.1145833 882.2708333 943.6770833 5 1022.983957 1023 1022.989305 1022.994652
11072009511 8 14 0 1006.822917 1012.979167 1000.15625 967.1041667 884.0729167 942.21875 5 1022.984127 1022.978836 1022.994709 1022.984127
11072009512 9 15 0 1006.90625 1013.427083 1001.041667 967.4895833 884.7083333 938.5208333 5 1023 1023 1023 1023
11072009521 9 24 0 1006.705263 1013.084211 1000.031579 966.7894737 866.5684211 940.1578947 5 1023 1023 1023 1023
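The time-window strategy aggregates every sample that falls inside the same window, which reconciles devices running at different rates. A sketch assuming integer-millisecond timestamps:

```python
def window_average(samples, width_ms):
    """Average all (timestamp, value) samples that share a time window."""
    windows = {}
    for ts, value in samples:
        windows.setdefault(ts // width_ms, []).append(value)
    return {w: sum(vs) / len(vs) for w, vs in sorted(windows.items())}

# Two samples in the first 100 ms window, one in the second:
print(window_average([(0, 2.0), (50, 4.0), (120, 6.0)], 100))  # {0: 3.0, 1: 6.0}
```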
4. Analyzing Data
89
Session 2
Tool | Eureqa
mathematical relationships in data
90
For reverse-engineering searches of the data, the Eureqa tool [19] is used to
discover mathematical expressions for the structural relationships in the data
records.
For example, if a record holds information about the physical and emotional
behavior of an individual who was engaged in a single experimental setting,
Eureqa can take all the available sources of data and reveal both how the
measure of engagement is calculated from specific data streams and how other
sensors may influence the proposed emotional construct.
[19] R. Dubcakova. Eureqa: software review. Genet. Program. Evol. Mach. (2010), online first. doi:10.1007/s10710-010-9124-z.
Tool | Eureqa
92
Tool | Eureqa
93
Tool | Eureqa
Tool | Weka
explore classification clustering
pre-processing visualization
94
For clustering and classification approaches we use Weka [20], a tool that
implements a collection of machine learning algorithms for data mining tasks.
It is used to explore the data's composition and relationships and to derive
useful knowledge from data records.
[20] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I.H. Witten, "The WEKA Data Mining Software: An Update," SIGKDD Explorations, 2009,
Volume 11, Issue 1.
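Weka itself is driven through its GUI or its Java API; as a language-neutral illustration of the classification step, here is a nearest-centroid sketch. This is a stand-in, not Weka code, and the feature values are invented:

```python
def centroid(points):
    """Mean vector of a list of equal-length feature vectors."""
    return [sum(xs) / len(xs) for xs in zip(*points)]

def classify(sample, centroids):
    """Assign `sample` to the label of the nearest class centroid."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(sample, centroids[label]))

# Hypothetical training windows with (engagement, conductance) features:
centroids = {
    "engaged": centroid([[0.8, 0.6], [0.9, 0.7]]),
    "bored": centroid([[0.2, 0.3], [0.1, 0.2]]),
}
print(classify([0.85, 0.65], centroids))  # engaged
```

Weka's actual algorithms (decision trees, k-means, and many others) follow the same train-then-classify workflow on labeled feature vectors.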
Tool | Weka
96
Tool | Weka
P0009 novice playing expert song
97
Tool | Weka
P0009 novice playing expert song
98
Tool | Weka
Documented API for Java Developers
5. Sharing Experiences and Group Discussion
99
Session 2
Experiences
100
Visualization
101
BCI and Gaze Points engagement
This figure shows the engagement fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point,
while the level of shading represents the intensity of the emotion.
Visualization
102
BCI and Gaze Points frustration
This figure shows the frustration fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point,
while the level of shading represents the intensity of the emotion.
Visualization
103
BCI and Gaze Points boredom
This figure shows the boredom fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point,
while the level of shading represents the intensity of the emotion.
Visualization
104
BCI and Gaze Points engagement
This figure shows the engagement gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons).
For this user, the text at the bottom of the first column was engaging.
Visualization
105
BCI and Gaze Points frustration
This figure shows the frustration gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons).
Looking at the cartoon is associated with a high frustration level.
Visualization
106
BCI and Gaze Points boredom
This figure shows the boredom gaze points (above a threshold of 0.5) of a user reading material with seductive details (i.e., cartoons).
Notice that the text in the middle of the second column of that page was boring.
Matching Values
107
Our hypothesis is that, to a great extent, it is possible to infer the values of one source from the other source.
BCI-based values Correlation with face-based model
excitement 0.284
engagement 0.282
meditation 0.188
frustration 0.275
Face-based values Correlation with BCI-based model
agreement 0.76
concentrating 0.765
disagreement 0.794
interested 0.774
thinking 0.78
unsure 0.828
BCI and Face-Based Emotion Recognition
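The correlations above can be reproduced with the standard Pearson coefficient; a self-contained sketch:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equally long value streams."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linearly related streams correlate at +1:
print(pearson([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```

In practice the BCI and face-based streams must first be synchronized (e.g., with the state-machine or time-window strategies) so that `xs` and `ys` line up sample by sample.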
Matching Values
108
[21] L. Breiman. Random forests. Machine Learning, 2001. Volume 45, Number 1. Pages 5–32.
Random Forest
Networks
Structural Equations, Adjacency Matrices, and Network Graphs
Brain schematic, showing the channels that contribute to engagement
BCI raw data | Engagement
109
Closed-loop
Emotion Adaptation | Games
110
[22] Bernays, R., Mone, J., Yau, P., Murcia, M., Gonzalez-Sanchez, J., Chavez-Echeagaray, M. E., Christopherson, R. M., Atkinson, R., and Yoshihiro, K. 2012. Lost
in the Dark: Emotion Adaption. In Adjunct proceedings of the 25th annual ACM symposium on User interface software and technology, 79–80. New York, NY, USA.
ACM. doi:10.1145/2380296.2380331.

Closed-loop
Emotion Mirroring | Virtual World
111
[23] Gonzalez-Sanchez, J., Chavez-Echeagaray, M. E., Gibson, D., and Atkinson, R. 2013. Multimodal Affect Recognition in Virtual Worlds: Avatars Mirroring User's
Affect. In ACII '13: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction. IEEE Computer Society.



Group Discussion
112
devices data inferences
6. Conclusions and Q&A
113
Session 2
Conclusion
This course aims to provide attendees with an overview of tools and dataset
exploration. While the course does not present an exhaustive list of all the
methods available for gathering, processing, analyzing, and interpreting
affective sensor data, it describes the basis of a multimodal approach that
attendees can use to launch their own research efforts.
[24] B. du Boulay, "Towards a Motivationally-Intelligent Pedagogy: How should an intelligent tutor respond to the unmotivated or the demotivated?," Proc. New
Perspectives on Affect and Learning Technologies, R. A. Calvo & S. D'Mello (Eds.), Springer-Verlag, in press.
114
References
1. R.S.J. Baker, M.M.T Rodrigo and U.E. Xolocotzin, “The Dynamics of Affective Transitions in Simulation
Problem-solving Environments,” Proc. Affective Computing and Intelligent Interaction: Second
International Conference (ACII ’07), A. Paiva, R. Prada & R. W. Picard (Eds.), Springer-Verlag, Vol.
Lecture Notes in Computer Science 4738, pp. 666-677.
2. I. Arroyo, D. G. Cooper, W. Burleson, F. P. Woolf, K. Muldner, and R. Christopherson, “Emotion Sensors
Go to School,” Proc. Artificial Intelligence in Education: Building Learning Systems that Care: from
Knowledge Representation to Affective Modelling, (AIED 09), V. Dimitrova, R. Mizoguchi, B. du Boulay
& A. Grasser (Eds.), IOS Press, July 2009, vol. Frontiers in Artificial Intelligence and Applications 200,
pp. 17-24.
3. R. W. Picard, Affective Computing, MIT Press, 1997.
4. J. Gonzalez-Sanchez, R. M. Christopherson, M. E. Chavez-Echeagaray, D. C. Gibson, R. Atkinson, W.
Burleson, "How to Do Multimodal Detection of Affective States?," Proc. 2011 IEEE 11th
International Conference on Advanced Learning Technologies (ICALT), pp. 654-655, 2011.
5. Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com.
6. F. Sharbrough, G.E. Chatrian, R.P. Lesser, H. Lüders, M. Nuwer, T.W. Picton. American
Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature. J. Clin.
Neurophysiol 8: 200-2.
7. Electroencephalography. Retrieved November 14th, 2010, from Electric and Magnetic Measurement of
the Electric Activity of Neural Tissue: http://www.bem.fi/book/13/13.htm
115
References
8. R. E. Kaliouby and P. Robinson, “Real-Time Inference of Complex Mental States from Facial
Expressions and Head Gestures,” Proc. Conference on Computer Vision and Pattern Recognition
Workshop (CVPRW ‘04), IEEE Computer Society, June 2004, Volume 10, p.154.
9. Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://
www.tobii.com.
10. M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R.W. Picard, “The HandWave
Bluetooth Skin Conductance Sensor,” Proc. First International Conference on Affective Computing
and Intelligent Interaction (ACII 05), Springer-Verlang, Oct. 2005, pp. 699-706, doi:
10.1007/11573548_90.
11. Y. Qi, and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure
Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR 02), Aug.
2002, vol 3, pp. 30448, doi:10.1109/ICPR.2002.1047973.
12. D. Cooper, I. Arroyo, B. Woolf, K. Muldner, W. Burleson, and R. Christopherson. (2009). Sensors
model student self concept in the classroom, User Modeling, Adaptation, and Personalization,
30-41.
13. S. Mota, and R. W. Picard, "Automated Posture Analysis for Detecting Learners Interest Level,"
Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003,
vol. 5, pp. 49, doi:10.1109/CVPRW.2003.10047.
116
References
14. K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y.
Hidalgo-Pontet, and L. Zhang. “The Affective Meta-Tutoring Project: How to motivate students to
use effective meta-cognitive strategies,” Proceedings of the 19th International Conference on
Computers in Education. Chiang Mai, Thailand: Asia- Pacific Society for Computers in Education.
October 2011. In press.
15. J. Gonzalez-Sanchez; M.E. Chavez-Echeagaray; R. Atkinson; and W. Burleson. (2011), "ABE: An
Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in
Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).
16. F. Tuijnman, and H. Afsarmanesh, "Distributed objects in a federation of autonomous cooperating
agents," Proc. International Conference on Intelligent and Cooperative Information Systems, May
1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763.
17. M. Wood, and S. DeLoach, “An overview of the multiagent systems engineering methodology,”
Agent-Oriented Software Engineering, (AOSE 2000), Springer-Verlag, 2001, pp. 207-221, doi:
10.1007/3-540-44564-1_14.
18. J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State
University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP.
19. R. Dubcakova. Eureqa: software review. Genet. Program. Evol. Mach. (2010), online first.
doi:10.1007/s10710-010-9124-z.
20. M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, "The WEKA Data Mining Software: An Update," SIGKDD Explorations, vol. 11, no. 1, 2009.
21. L. Breiman, "Random Forests," Machine Learning, vol. 45, no. 1, 2001, pp. 5-32.
22. R. Bernays, J. Mone, P. Yau, M. Murcia, J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, R. Christopherson, R. Atkinson, and K. Yoshihiro, "Lost in the Dark: Emotion Adaption," Adjunct Proc. 25th Annual ACM Symposium on User Interface Software and Technology (UIST 12), ACM, 2012, pp. 79-80, doi:10.1145/2380296.2380331.
23. J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, D. Gibson, and R. Atkinson, "Multimodal Affect Recognition in Virtual Worlds: Avatars Mirroring User's Affect," Proc. 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 13), IEEE Computer Society, 2013, pp. 724-725, doi:10.1109/ACII.2013.133.
24. B. du Boulay, "Towards a Motivationally Intelligent Pedagogy: How Should an Intelligent Tutor Respond to the Unmotivated or the Demotivated?," New Perspectives on Affect and Learning Technologies, R. A. Calvo and S. D'Mello (eds.), Springer-Verlag.
Questions | Answers
Acknowledgements
This research was supported by the Office of Naval Research under Grant N00014-10-1-0143, awarded to Dr. Robert Atkinson.
angle.lab.asu.edu
hci.asu.edu
{ javiergs, helenchavez } @ asu.edu
201404 Multimodal Detection of Affective States: A Roadmap Through Diverse Technologies

  • 1. Multimodal Detection of Affective States: A Roadmap Through Diverse Technologies Javier Gonzalez-Sanchez, Maria-Elena Chavez-Echeagaray Robert Atkinson, Winslow Burleson ! ! School of Computing, Informatics, and Decision Systems Engineering Arizona State University Tempe, Arizona, USA Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). CHI 2014, April 26–May 1, 2014, Toronto, Ontario, Canada. ACM 14/04 1
  • 2. Advancing Next Generation Learning Environments Lab & Motivational Environments Group ! School of Computing, Informatics, and Decision Systems Engineering Arizona State University 2 About Us
  • 3. This work was supported by Office of Naval Research under Grant N00014-10-1-0143 Robert Atkinson ! Dr. Robert Atkinson is an Associate Professor in the Ira A. Schools of Engineering and in the Mary Lou Fulton Teacher’s College. His research explores the intersection of cognitive science, informatics, instructional design, and educational technology. His scholarship involves the design of instructional material, including books, and computer-based learning environments according to our understanding of human cognitive architecture and how to leverage its unique constraints and affordances. His current research focus involves the study of engagement and flow in games. 3 Principal Investigator
  • 4. This work was supported by Office of Naval Research under Grant N00014-10-1-0143 Winslow Burleson ! Dr. Winslow Burleson received his PhD from the MIT Media Lab, Affective Computing Group. He joined ASU's School of Computing and Informatics and the Arts, Media, and Engineering graduate program at ASU in 2006. He has worked with the Entrepreneurial Management Unit at the Harvard Business School, Deutsche Telekom Laboratories, SETI Institute, and IBM's Almaden Research Center where he was awarded ten patents. He holds an MSE from Stanford University's Mechanical Engineering Product Design Program and a BA in Bio-Physics from Rice University. He has been a co-Principal Investigator on the Hubble Space Telescope's Investigation of Binary Asteroids and consultant to UNICEF and the World Scout Bureau. 4 Principal Investigator
  • 5. This work was supported by Office of Naval Research under Grant N00014-10-1-0143 Javier Gonzalez-Sanchez ! Javier is a doctoral candidate in Computer Science at Arizona State University. His research interests are affect-driven adaptation, software architecture, affective computing, and educational technology. He holds an MS in Computer Science from the Center for Research and Advanced Studies of the National Polytechnic Institute and a BS in Computer Engineering from the University of Guadalajara. His experience includes 12+ years as a software engineer and 9+ years teaching undergraduate courses at Tecnologico de Monterrey and graduate courses at Universidad de Guadalajara. 5 Doctoral Candidate
  • 6. This work was supported by Office of Naval Research under Grant N00014-10-1-0143 Helen Chavez-Echeagaray ! Helen is a doctoral candidate in Computer Science at Arizona State University. Her interests are in the areas of affective computing, educational technology (including robotics), and learning processes. The Tecnológico de Monterrey campus Guadalajara conferred upon her the degree of MS in Computer Science and the degree of BS in Computer Systems Engineering. Before starting her PhD program she was a faculty member for 8+ years at Tecnologico de Monterrey. Her experience includes an administrative position and software development. 6 Doctoral Candidate
  • 7. Outline of the Course 7 Preface
  • 8. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Context 8
  • 9. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Context 9
  • 10. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Justification One novel part of the planning and design of the interaction between people and computers, related to the facet of securing user satisfaction, is the capability of systems to adapt to their individual users by showing empathy. ! ! Being empathetic implies that the computer is able to recognize a user’s affective states and understand the implication of those states. ! ! Detection of affective states is a step forward to provide machines with the necessary intelligence to identify and understand human emotions and then appropriately interact with humans. Therefore, it is necessary to equip computers with hardware and software to enable them to perceive users’ affective states and then use this understanding to create more harmonic interactions. 10
  • 11. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Motivation This course provides a description and demonstration of tools and methodologies necessary for automatically detecting affective states. ! Automatic detection of affective states requires the computer ! (1) to gather information that is complex and diverse; ! (2) to process and understand information integrating several sources (senses) that could range from brain-wave signals and biofeedback readings, to face-based and gesture emotion recognition to posture and pressure sensing; and ! (3) once the data is integrated, to apply perceiving algorithms software and data processing tools to understand user’s status. ! During the course we will review case examples (datasets) obtained from previous research studies. 11
  • 12. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Objectives Attendees of this course will ! Learn about sensing devices used to detect affective states including brain-computer interfaces, face-based emotion recognition systems, eye- tracking systems and physiological sensors. ! Understand the pros and cons of the sensing devices used to detect affective states. ! Learn about the data that is gathered from each sensing device and understand its characteristics. Learn about what it takes to manage (pre- process and synchronize) affective data. ! Learn about approaches and algorithms used to analyze affective data and how it could be used to drive computer functionality or behavior. 12
  • 13. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Schedule | Session 1 1. Introduction to Affective Human-Computer Interfaces (20 minutes) ! 2. Sensing Devices (60 minutes) ! 2.1. Brain-computer interfaces ! 2.2. Face-based emotion recognition systems ! ! Break ! ! ! 2.3. Eye-tracking systems ! 2.4. Physiological sensors (skin conductance, pressure, posture) 13
  • 14. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Schedule | Session 2 3. Data Filtering and Integration (20 minutes) ! 3.1. Gathered data ! 3.2. Data synchronization ! 4. Analyzing data (20 minutes) ! 4.1. Regression ! 4.2. Clustering and Classification ! Break ! 5. Sharing Experience and Group Discussion (20 minutes) ! 6. Conclusions and Q&A (20 minutes) 14
  • 17. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Concepts Multimodal Detection of Affective States instinctual reaction to stimulation feelings, emotions How do you feel? physiological physical 17
  • 18. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Concepts 18 Until recently much of the affective data gathered by systems has relied heavily on learner’s self-report of their affective state, observation, or software data logs [1]. ! But now many systems have started to include data from the physical manifestations of affective states through the use of sensing devices and the application of novel machine learning and data mining algorithms to deal with the vast amounts of data generated by the sensors [2][3]. ! ! ! ! ! ! ! ! ! [1] R.S.J. Baker, M.M.T Rodrigo and U.E. Xolocotzin, “The Dynamics of Affective Transitions in Simulation Problem-solving Environments,” Proc. Affective Computing and Intelligent Interaction: Second International Conference (ACII ’07), A. Paiva, R. Prada & R. W. Picard (Eds.), Springer Verlag, Vol. Lecture Notes in Computer Science 4738, pp. 666-677. ! [2] I. Arroyo, D. G. Cooper, W. Burleson, F. P. Woolf, K. Muldner, and R. Christopherson, “Emotion Sensors Go to School,” Proc. Artificial Intelligence in Education: Building Learning Systems that Care: from Knowledge Representation to Affective Modelling, (AIED 09), V Dimitrova, R. Mizoguchi, B. du Boulay & A. Grasser (Eds.), IOS Press, July 2009, vol. Frontiers in Artificial Intelligence and Applications 200, pp. 17-24. ! [3] R. W. Picard, Affective Computing, MIT Press, 1997.
  • 19. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Concepts Multimodal Detection of Affective States measurement identify the presence of ... physiological measures physical appearance self-report 19
  • 20. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Model Mul$modal)emo$on)recogni$on)system) User) Brainwaves) Eye)movements) Facial)expressions) Physiological)signals) Sensing)) Devices) Percep$on) mechanisms) Integra$on) Algorithm) Raw)data) Beliefs) State) 20
  • 21. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Sensing Devices 21 Sensing devices obtain data from the user about his physiological responses and body reactions. Sensing devices are hardware devices that collect quantitative data as measures of physiological signals of emotional change. We call the measures provided by the sensing devices raw data. Our approach includes the use of brain-computer interfaces, eye-tracking systems, biofeedback sensors, and face-based emotion recognition systems [4]. The use of several sensing devices either to recognize a broad range of emotions or to improve the accuracy of recognizing one emotion is referred to as a multimodal approach. ! ! ! ! [4] J. Gonzalez-Sanchez, R. M. Christopherson, M. E. Chavez-Echeagaray, D. C. Gibson, R. Atkinson, W. Burleson, “ How to Do Multimodal Detection of Affective States?,” ICALT, pp.654-655, 2011 IEEE 11th International Conference on Advanced Learning Technologies, 2011
  • 22. 2. Sensing Devices | Brain-Computer Interfaces 22 Session I
  • 23. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray BCI Brain-Computer Interfaces (BCI) ! It is a particular type of a physiological instrument that uses brainwaves as information sources (electrical activity along the scalp produced by the firing of neurons within the brain). ! Emotiv® EPOC headset [5] device will be used to show how to collect and work with this kind of data. ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! [5] Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com. 23
  • 24. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray BCI Wireless Emotiv® EEG Headset. ! The device reports data with intervals of 125 ms (8 Hz). ! The raw data output includes 14 values (7 channels on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values of the acceleration of the head when leaning (GyroX and GyroY). ! The Affectiv Suite reports 5 emotions: engagement, boredom, excitement, frustration, and meditation. ! And the Expressiv Suite reports facial gestures: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh. ! ! 24
  • 25. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 25 Electrodes are situated and labeled according to the CMS/DRL configuration [6][7] BCI [6] Sharbrough F, Chatrian G-E, Lesser RP, Lüders H, Nuwer M, Picton TW. American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature. J. Clin. Neurophysiol 8: 200-2. ! [7] Electroencephalography. Retrieved November 14th, 2010, from Electric and Magnetic Measurement of the Electric Activity of Neural Tissue: www.bem.fi/book/13/13.htm
  • 26. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Emotiv Systems $299 emotions EEG data facial gestures 26 BCI
  • 27. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo Wireless Emotiv® EEG Headset 27 BCI
  • 30. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 30 Field Description Values Timestamp It is the timestamp (date and time) of the computer running the system. It could be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds). UserID Identifies the user. An integer value. Wireless Signal Status Shows the strength of the signal. The value is from 0 to 4, 4 being the best. Blink, Wink Left and Right, Look Left and Right, Raise Brow, Furrow, Smile, Clench, Smirk Left and Right, Laugh Part of the expressive suite. Values between 0 and 1, 1 being the value that represents the highest power/probability for this emotion. Short Term and Long Term Excitement, Engagement / Boredom, Meditation, Frustration Part of the Affective Suite. Values between 0 and 1, 1 being the value that represents the highest power/probability for this emotion. AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4. Raw data coming from each of the 14 channels. The name of these fields were defined according with the CMS/DRL configuration [XXX][XXXX]. Values of 4000 and higher. GyroX and GyroY Information about how the head moves/accelerates according to X and Y axis accordingly. BCI
  • 31. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 31 Raw Data
[Sample raw-data log, one row per sample: Timestamp, the 14 channel values (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4; readings roughly in the 4200-4900 range), and AccX/AccY (around 1650-1660 and 2000-2006). Several consecutive rows share the same logged millisecond timestamp.]
  • 32. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 32 Expressiv Data
Timestamp ID Signal Blink WinkL WinkR LookL LookR Eyebrow Furrow Smile Clench SmirkL SmirkR Laugh
101116091145065 0 2 0 0 0 1 0 0 0 0 0 0 0.988563 0
101116091145190 0 2 0 0 0 1 0 0 0 0.45465 0 0 0 0
101116091145315 0 2 0 0 0 1 0 0 0 0.467005 0 0 0 0
101116091145440 0 2 0 0 0 1 0 0 0 0.401006 0 0 0 0
101116091145565 0 2 0 0 0 1 0 0 0 0.248671 0 0 0 0
101116091145690 0 2 0 0 0 1 0 0 0 0.173023 0 0 0 0
101116091145815 0 2 0 0 0 1 0 0 0 0.162788 0 0 0 0
101116091145940 0 2 0 0 0 1 0 0 0 0.156485 0 0 0 0
101116091146065 0 2 0 0 0 1 0 0 0 0 0 0.776925 0 0
101116091146190 0 2 0 0 0 1 0 0 0 0 0 0 0.608679 0
101116091146315 0 2 0 0 0 1 0 0 0 0 0 0 0.342364 0
101116091146440 0 2 0 0 0 1 0 0 0 0 0 0 0.149695 0
101116091146565 0 2 0 0 0 1 0 0 0 0 0 0 0.0864399 0
101116091146690 0 2 0 0 0 1 0 0 0 0 0 0 0.0733481 0
101116091146815 0 2 0 0 0 1 0 0 0 0 0 0 0.118965 0
101116091146941 0 2 0 0 0 1 0 0 0 0 0 0 0.259171 0
  • 33. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 33 Affectiv Data
Timestamp ShortTermExcitement LongTermExcitement Engagement Meditation Frustration
101116091145065 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145190 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145315 0.447595 0.54871 0.834476 0.333844 0.536197
101116091145440 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145565 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145690 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145815 0.487864 0.546877 0.834146 0.339548 0.54851
101116091145940 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146065 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146190 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146315 0.521663 0.545609 0.839321 0.348321 0.558228
101116091146440 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146565 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146690 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146815 0.509297 0.544131 0.84401 0.358717 0.546771
101116091146941 0.451885 0.541695 0.848087 0.368071 0.533919
  • 34. 2. Sensing Devices | Face-Based Emotion Recognition 34 Session I
  • 35. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Face-Based Recognition Face-based emotion recognition systems ! These systems infer affective states by capturing images of the users’ facial expressions and head movements. ! We are going to show the capabilities of face-based emotion recognition systems using a simple 30 fps USB webcam and software from MIT Media Lab [8]. ! ! ! ! ! ! ! ! ! ! ! ! [8] R. E. Kaliouby and P. Robinson, “Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures,” Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW ‘04), IEEE Computer Society, June 2004, Volume 10, p. 154. 35
  • 36. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Face-Based Recognition The MindReader API enables the real-time analysis, tagging, and inference of cognitive-affective mental states from facial video. ! This framework combines vision-based processing of the face with predictions of mental state models to interpret the meaning underlying head and facial signals over time. ! It provides results at intervals of approximately 100 ms (10 Hz). ! ! With this system it is possible to infer mental states such as agreeing, concentrating, disagreeing, interested, thinking, and unsure. ! It builds on the Facial Action Coding System (Ekman and Friesen, 1978): 46 actions, plus head movements. 36
  • 37. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo MindReader Software from MIT Media Lab 37 Face-Based Recognition
  • 38. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 38 Face-Based Recognition Example action units: 19 Lip Corner Depressor, 26 Jaw Drop, 27 Mouth Stretch.
  • 40. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 40 Mind Reader | Output Fields
Timestamp: The date and time on the computer running the system; it can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y: year, m: month, d: day, h: hour, m: minutes, s: seconds, S: milliseconds).
Agreement, Concentrating, Disagreement, Interested, Thinking, Unsure: The probability that the mental state is present in the user at a particular time (frame). Values between 0 and 1. A value of -1 means it was not possible to infer a state; this happens when the user's face is out of the camera focus.
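Because -1 is a sentinel for "no inference" rather than a low probability, downstream processing should treat it as missing data instead of averaging it in. A small illustrative sketch (my own, not MindReader code):

```python
def clean(probs):
    """Replace the -1 'face not found' sentinel with None so the
    value is never mistaken for a real probability."""
    return [None if p == -1 else p for p in probs]

def valid_mean(probs):
    """Mean probability over valid frames only; None if no frame was valid."""
    vals = [p for p in probs if p != -1]
    return sum(vals) / len(vals) if vals else None

clean([0.93, -1, 0.88])      # [0.93, None, 0.88]
valid_mean([0.5, -1, 0.7])   # 0.6 (the -1 frame is ignored)
```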
  • 41. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 41 Mind Reader
Timestamp Agreement Concentrating Disagreement Interested Thinking Unsure
101116112838516 0.001836032 0.999917 1.79E-04 0.16485406 0.57114255 0.04595062
101116112838578 0.001447654 0.9999516 1.29E-04 0.16310683 0.5958921 0.042706452
101116112838672 5.97E-04 0 1.5E-04 0.44996294 0.45527613 0.00789697
101116112838766 2.46E-04 0 1.75E-04 0.77445686 0.32144752 0.001418217
101116112838860 1.01E-04 0 2.04E-04 0.93511915 0.21167138 2.53E-04
101116112838953 4.18E-05 0 2.38E-04 0.983739 0.13208677 4.52E-05
101116112839016 1.72E-05 0 2.78E-04 0.9960774 0.07941038 8.07E-06
101116112839110 7.1E-06 0 3.24E-04 0.99906266 0.046613157 1.44E-06
101116112839156 2.92E-06 0 3.77E-04 0.99977654 0.026964737 2.57E-07
101116112839250 1.21E-06 0 4.4E-04 0.9999467 0.015464196 4.58E-08
101116112839391 4.97E-07 0 5.12E-04 0.9999873 0.008824189 8.18E-09
101116112839438 2.05E-07 0 5.97E-04 0.999997 0.005020725 1.46E-09
101116112839547 8.43E-08 0 6.96E-04 0.9999993 0.002851939 2.6E-10
101116112839578 3.47E-08 0 8.11E-04 0.9999999 0.001618473 4.64E-11
101116112839688 1.43E-08 0 9.45E-04 0.99999994 9.18E-04 8.29E-12
101116112839781 5.9E-09 0 0.001101404 1 5.21E-04 1.48E-12
101116112839828 2.43E-09 0 0.001283521 1 2.95E-04 2.64E-13
  • 42. 2. Sensing Devices | Eye-Tracking Systems 42 Session I
  • 43. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Eye-tracking systems ! These are instruments that measure eye position and eye movement in order to detect the zones in which the user shows particular interest at a specific moment. ! Datasets from a Tobii® eye-tracking system [9] will be shown. ! ! ! ! ! ! ! ! ! ! ! [9] Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://www.tobii.com. Eye-Tracking System 43
  • 44. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Eye-Tracking System Tobii® Eye Tracker. ! The device reports data at intervals of approximately 17 ms (60 Hz), as reflected in the sample timestamps. ! The output provides data concerning attention direction (gaze-x, gaze-y), duration of fixation, and pupil dilation. 44
  • 45. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo Tobii® Eye Tracker 45 Eye-Tracking System
  • 46. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 46 Eye-Tracking System | Output Fields
Timestamp: The date and time on the computer running the system; it can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y: year, m: month, d: day, h: hour, m: minutes, s: seconds, S: milliseconds).
GazePoint X: The horizontal screen position for either eye or the average for both eyes; also used for the fixation definition. 0 is the left edge; the maximum value of the horizontal screen resolution is the right edge.
GazePoint Y: The vertical screen position for either eye or the average for both eyes; also used for the fixation definition. 0 is the bottom edge; the maximum value of the vertical screen resolution is the upper edge.
Pupil Left: Pupil size (left eye) in mm. Varies.
Validity Left: Validity of the gaze data, 0 to 4: 0 if the eye is found and the tracking quality is good; 4 if the eye cannot be found by the eye tracker.
Pupil Right: Pupil size (right eye) in mm. Varies.
Validity Right: Validity of the gaze data, 0 to 4: 0 if the eye is found and the tracking quality is good; 4 if the eye cannot be found by the eye tracker.
FixationDuration: The time in milliseconds that a fixation lasts. Varies.
Event: Automatic and logged events. Varies.
AOI: Areas of Interest, when fixations on multiple AOIs are written on the same row. Varies.
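The validity codes are what make the stream filterable: a sample is trustworthy only when both eyes were tracked. A hedged sketch of that filter (field names are my own, not the Tobii SDK's):

```python
def usable(sample):
    """Keep a gaze sample only when both eyes were tracked reliably
    (validity 0 is best; 4 means the eye was not found)."""
    return sample["validity_left"] == 0 and sample["validity_right"] == 0

def mean_pupil(sample):
    """Average left and right pupil diameter (mm) of a usable sample."""
    return (sample["pupil_left"] + sample["pupil_right"]) / 2

# Rows shaped like the sample data on the next slide
good = {"pupil_left": 2.759313, "pupil_right": 2.88406,
        "validity_left": 0, "validity_right": 0}
lost = {"pupil_left": -1, "pupil_right": -1,
        "validity_left": 4, "validity_right": 4}
```

Note how the lost-track row in the sample data (gaze 0,0 and pupil -1 with validity 4) would be dropped by this filter rather than skewing pupil statistics.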
  • 47. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 47 Eye-Tracking System
Timestamp GPX GPY PupilLeft ValidityL PupilRight ValidityR Fixation Event AOI
101124162405582 636 199 2.759313 0 2.88406 0 48 Content
101124162405599 641 207 2.684893 0 2.855817 0 48 Content
101124162405615 659 211 2.624458 0 2.903861 0 48 Content
101124162405632 644 201 2.636186 0 2.916132 0 48 Content
101124162405649 644 213 2.690685 0 2.831013 0 48 Content
101124162405666 628 194 2.651784 0 2.869714 0 48 Content
101124162405682 614 177 2.829281 0 2.899828 0 48 Content
101124162405699 701 249 2.780344 0 2.907665 0 49 Content
101124162405716 906 341 2.853761 0 2.916398 0 49 Content
101124162405732 947 398 2.829427 0 2.889944 0 49 Content
101124162405749 941 400 2.826602 0 2.881179 0 49 Content
101124162405766 938 403 2.78699 0 2.87948 0 49 KeyPress Content
101124162405782 937 411 2.803387 0 2.821803 0 49 Content
101124162405799 934 397 2.819166 0 2.871547 0 49 Content
101124162405816 941 407 2.811687 0 2.817927 0 49 Content
101124162405832 946 405 2.857419 0 2.857427 0 49 Content
101124162405849 0 0 -1 4 -1 4 49 Content
  • 48. 2. Sensing Devices | Galvanic Skin Conductance Sensor 48 Session I
  • 50. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Galvanic Skin Conductance Provides information on the activity of physiological functions of an individual. ! ! Arousal detection. Measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic nervous system. [10] ! ! Hardware designed by the MIT Media Lab. ! ! A wireless Bluetooth device that reports data at intervals of approximately 500 ms (2 Hz). ! ! ! ! [10] M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R.W. Picard, "The HandWave Bluetooth Skin Conductance Sensor," Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90. 50
  • 51. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo Skin Electrical Conductance Sensor 51 Galvanic Skin Conductance
  • 53. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 53 Galvanic Skin Conductance | Output Fields
Timestamp: The date and time on the computer running the system; it can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y: year, m: month, d: day, h: hour, m: minutes, s: seconds, S: milliseconds).
Battery Voltage: Level of the battery voltage. 0-3 volts.
Conductance: Level of arousal. 0-3 volts.
  • 54. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 54 Galvanic Skin Conductance
Timestamp Voltage Conductance
101116101332262 2.482352941 1.030696176
101116101332762 2.482352941 1.023404165
101116101333262 2.482352941 1.019813274
101116101333762 2.482352941 1.041657802
101116101334247 2.482352941 0.998280273
101116101334747 2.482352941 0.991181142
101116101335247 2.482352941 0.980592229
101116101335747 2.482352941 0.998280273
101116101336247 2.482352941 1.012586294
101116101336762 2.482352941 1.012586294
101116101337231 2.482352941 1.012586294
101116101337747 2.482352941 1.009008251
101116101338247 2.482352941 0.998280273
101116101338747 2.482352941 0.991181142
101116101339247 2.482352941 0.987628521
101116101339731 2.482352941 0.987628521
101116101340231 2.482352941 0.980592229
  • 55. 2. Sensing Devices | Pressure Sensor 55 Session I
  • 57. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Pressure Sensor Provides information on the activity of physiological functions of an individual. ! ! Pressure sensors detect the increasing amount of pressure (correlated with levels of frustration) that the user puts on a mouse or any other controller (such as a game controller) [11]. ! ! Hardware designed by the MIT Media Lab. ! ! A serial device that reports data at intervals of approximately 150 ms (6 Hz). ! ! ! ! [11] Y. Qi and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR 02), Aug. 2002, vol. 3, pp. 30448, doi:10.1109/ICPR.2002.1047973. 57
  • 58. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo Mouse Pressure Sensor 58 Pressure Sensor
  • 60. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 60 Pressure Sensor | Output Fields
Timestamp: The date and time on the computer running the system; it can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y: year, m: month, d: day, h: hour, m: minutes, s: seconds, S: milliseconds).
Right Rear: Sensor positioned at the right rear of the mouse. 0-1024, 0 being the highest pressure.
Right Front: Sensor positioned at the right front of the mouse. 0-1024, 0 being the highest pressure.
Left Rear: Sensor positioned at the left rear of the mouse. 0-1024, 0 being the highest pressure.
Left Front: Sensor positioned at the left front of the mouse. 0-1024, 0 being the highest pressure.
Middle Rear: Sensor positioned at the middle rear of the mouse. 0-1024, 0 being the highest pressure.
Middle Front: Sensor positioned at the middle front of the mouse. 0-1024, 0 being the highest pressure.
  • 61. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 61 Pressure Sensor
Timestamp RightRear RightFront LeftRear LeftFront MiddleRear MiddleFront
110720113306312 1023 1023 1023 1023 1023 1023
110720113306468 1023 1023 1023 1023 1023 1023
110720113306625 1023 998 1023 1002 1023 1023
110720113306781 1023 1009 1023 977 1023 1023
110720113306937 1023 794 1023 982 1023 1023
110720113307109 1023 492 1022 891 1023 1023
110720113307265 1023 395 1021 916 1019 1023
110720113307421 1023 382 1021 949 1023 1023
110720113307578 1023 364 1022 983 1023 1023
110720113307734 1023 112 1021 1004 1023 1023
110720113307890 1023 204 1021 946 1023 1023
110720113308046 1023 465 1022 971 1023 1023
110720113308203 1023 404 1022 1023 1023 1023
110720113308359 1023 166 1021 1023 1023 1023
110720113308515 1023 145 1021 1023 1023 1023
110720113308687 1023 154 1021 1023 1023 1023
110720113308843 1023 126 1021 1023 1023 1023
  • 62. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 62 Pressure Sensor The raw data from the mouse is processed to obtain meaningful information [12]. The result is a single value per sample: the sum of the six sensor readings, each normalized to the range 0-1. ! ! ! ! ! ! [12] Cooper, D., Arroyo, I., Woolf, B., Muldner, K., Burleson, W., and Christopherson, R. (2009). Sensors model student self concept in the classroom. User Modeling, Adaptation, and Personalization, 30-41.
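The sample table on the next slide is consistent with a simple sum of the six readings, each divided by the maximum reading of 1023 (two untouched rows give exactly 6). A sketch of that computation (my reading of the data, not the authors' code from [12]):

```python
MAX_READING = 1023  # a raw reading of 0 means the highest pressure

def mouse_value(sensors):
    """Sum of the six sensor readings normalized to [0, 1];
    6.0 means no pressure on any sensor."""
    return sum(s / MAX_READING for s in sensors)

mouse_value([1023, 998, 1023, 1002, 1023, 1023])  # ~5.955034, as in the sample
```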
  • 63. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 63 Pressure Sensor
Timestamp RightRear RightFront LeftRear LeftFront MiddleRear MiddleFront Value
110720113306312 1023 1023 1023 1023 1023 1023 6
110720113306468 1023 1023 1023 1023 1023 1023 6
110720113306625 1023 998 1023 1002 1023 1023 5.955034213
110720113306781 1023 1009 1023 977 1023 1023 5.941348974
110720113306937 1023 794 1023 982 1023 1023 5.736070381
110720113307109 1023 492 1022 891 1023 1023 5.350928641
110720113307265 1023 395 1021 916 1019 1023 5.275659824
110720113307421 1023 382 1021 949 1023 1023 5.299120235
110720113307578 1023 364 1022 983 1023 1023 5.315738025
110720113307734 1023 112 1021 1004 1023 1023 5.088954057
110720113307890 1023 204 1021 946 1023 1023 5.122189638
110720113308046 1023 465 1022 971 1023 1023 5.402737048
110720113308203 1023 404 1022 1023 1023 1023 5.393939394
110720113308359 1023 166 1021 1023 1023 1023 5.160312805
110720113308515 1023 145 1021 1023 1023 1023 5.139784946
110720113308687 1023 154 1021 1023 1023 1023 5.1485826
110720113308843 1023 126 1021 1023 1023 1023 5.121212121
  • 64. 2. Sensing Devices | Posture Sensor 64 Session I
  • 66. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Posture detection using a low-cost, low-resolution, pressure-sensitive seat cushion and back pad. ! Developed at ASU based on experience with a more expensive, high-resolution unit from the MIT Media Lab [13]. ! ! ! ! ! ! ! ! ! [13] S. Mota and R. W. Picard, "Automated Posture Analysis for Detecting Learner's Interest Level," Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003, vol. 5, p. 49, doi:10.1109/CVPRW.2003.10047. Posture Sensor 66
  • 67. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Demo Chair Posture Sensor 67 Posture Sensor
  • 69. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 69 Posture Sensor | Output Fields
Timestamp: The date and time on the computer running the system; it can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y: year, m: month, d: day, h: hour, m: minutes, s: seconds, S: milliseconds).
AccX: Value of the X axis of the accelerometer. Varies.
AccY: Value of the Y axis of the accelerometer. Varies.
Right Seat: Sensor positioned on the right side of the seat cushion. 0-1024, 1024 being the highest pressure.
Middle Seat: Sensor positioned in the middle of the seat cushion. 0-1024, 1024 being the highest pressure.
Left Seat: Sensor positioned on the left side of the seat cushion. 0-1024, 1024 being the highest pressure.
Right Back: Sensor positioned on the right side of the back pad. 0-1024, 1024 being the highest pressure.
Middle Back: Sensor positioned in the middle of the back pad. 0-1024, 1024 being the highest pressure.
Left Back: Sensor positioned on the left side of the back pad. 0-1024, 1024 being the highest pressure.
  • 70. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 70 Posture Sensor
Timestamp AccX AccY RightSeat MiddleSeat LeftSeat RightBack MiddleBack LeftBack
110720074358901 980 -171 1015 1019 1012 976 554 309
110720074359136 969 -169 1008 1004 1012 978 540 305
110720074359401 1008 -165 1015 1012 1008 974 554 368
110720074359636 993 -166 1001 1004 1016 975 548 306
110720074359854 994 -167 1015 1011 1003 967 559 418
110720074400120 970 -167 1011 1008 1001 968 620 358
110720074400354 977 -166 1011 1011 1013 968 541 413
110720074400589 985 -128 1012 1010 1006 974 565 314
110720074400839 996 -182 1016 1014 1012 972 668 290
110720074401089 991 -185 1012 1012 1004 858 108 2
110720074401526 1068 -310 1019 522 1001 32 0 421
110720074401557 937 -124 92 0 0 0 1 247
110720074401714 979 -104 103 0 0 0 0 87
110720074401745 957 -165 143 3 0 0 0 0
110720074402026 945 -171 126 0 0 0 1 3
110720074402339 948 -169 0 0 1 0 0 0
110720074402620 952 -166 3 0 1 0 0 0
  • 71. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 71 Posture Sensor The raw data from the six chair sensors is processed [12] to obtain net seat change, net back change, and sit-forward values. ! ! ! ! ! ! ! ! [12] Cooper, D., Arroyo, I., Woolf, B., Muldner, K., Burleson, W., and Christopherson, R. (2009). Sensors model student self concept in the classroom. User Modeling, Adaptation, and Personalization, 30-41.
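The net-change columns in the sample data on the next slide are consistent with summing the absolute differences of each sensor group between consecutive samples. A sketch of that reading (my interpretation of the data, not the authors' code from [12]):

```python
def net_change(prev, curr):
    """Sum of absolute per-sensor differences between two consecutive
    samples of one sensor group (the three seat or three back sensors)."""
    return sum(abs(c - p) for p, c in zip(prev, curr))

# Two consecutive posture samples from the table
seat_t0, seat_t1 = (1015, 1019, 1012), (1008, 1004, 1012)
back_t0, back_t1 = (976, 554, 309), (978, 540, 305)
net_change(seat_t0, seat_t1)  # 22, matching NetSeatChange in the sample
net_change(back_t0, back_t1)  # 20, matching NetBackChange in the sample
```

The sit-forward flag appears to fire when the back-pad readings collapse toward zero (the sitter has come off the back pad), but the exact rule is not derivable from the sample alone.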
  • 72. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 72 Posture Sensor
Timestamp RightSeat MiddleSeat LeftSeat RightBack MiddleBack LeftBack NetSeatChange NetBackChange SitForward
110720074358901 1015 1019 1012 976 554 309 12 152 0
110720074359136 1008 1004 1012 978 540 305 22 20 0
110720074359401 1015 1012 1008 974 554 368 19 81 0
110720074359636 1001 1004 1016 975 548 306 30 69 0
110720074359854 1015 1011 1003 967 559 418 34 131 0
110720074400120 1011 1008 1001 968 620 358 9 122 0
110720074400354 1011 1011 1013 968 541 413 15 134 0
110720074400589 1012 1010 1006 974 565 314 9 129 0
110720074400839 1016 1014 1012 972 668 290 14 129 0
110720074401089 1012 1012 1004 858 108 2 14 962 0
110720074401526 1019 522 1001 32 0 421 500 1353 0
110720074401557 92 0 0 0 1 247 2450 207 1
110720074401714 103 0 0 0 0 87 11 161 1
110720074401745 143 3 0 0 0 0 43 87 1
110720074402026 126 0 0 0 1 3 20 4 1
110720074402339 0 0 1 0 0 0 127 4 1
110720074402620 3 0 1 0 0 0 3 0 1
  • 73. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 73 Summary
Sensing Device (rate in Hz) | Legacy Software | Sensing (input or raw data) | Physiological responses and/or emotion reported (output or sensed values)
Emotiv® EEG headset (128 Hz) | Emotiv® SDK | Brain waves | EEG activity reported in 14 channels [16]: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4. Face activity: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh. Emotions: excitement, engagement, boredom, meditation, and frustration.
Standard webcam (10 Hz) | MIT Media Lab MindReader | Facial expressions | Mental states: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
MIT skin conductance sensor (2 Hz) | USB driver | Skin conductivity | Arousal.
MIT pressure sensor (6 Hz) | USB driver | Pressure | One pressure value per sensor allocated on the input/control device.
Tobii® eye tracker (60 Hz) | Tobii® SDK | Eye tracking | Gaze point (x, y).
MIT posture sensor (6 Hz) | USB driver | Pressure | Pressure values on the back and the seat (right, middle, and left zones) of a cushioned chair.
  • 74. Session 2 working with data and research experiences 74
  • 75. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Background 75 The datasets shown in the next slides correspond to the data collected in three studies. The stimuli, protocol, and participants are described briefly in the following paragraphs. ! 1. Study One: Subjects playing a video game. ! 2. Study Two: Subjects reading documents with and without pictures. ! 3. Study Three: Subjects solving tasks in a Tutoring System.
  • 76. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Study One 76 Stimuli. The high-fidelity, deeply engaging Guitar Hero® video game. The game involves holding a guitar interface while listening to music and watching a video screen. ! Protocol. The study consisted of a one-hour session with (1) 15 minutes of practice, so that the user became familiar with the game controller and the environment, and (2) 45 minutes in which users played four songs of their choice, one at each level: easy, medium, hard, and expert. ! Participants. The call for participation was an open call among Arizona State University students. The experiment was run with 21 subjects: 67% men and 33% women, aged 18 to 28 years. Subjects reported different levels of experience playing video games: 15% had never played before, 5% were not skilled, 28% were slightly skilled, 33% somewhat skilled, 14% very skilled, and only 5% reported themselves as experts.
  • 77. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Study Two 77 Stimuli. On-screen reading material. Two types of reading material were used: one with on-task or off-task images, captions, and drawings; the second containing only text. ! Protocol. The study consisted of one 60-minute session in which the subject was presented with 10 pages from a popular Educational Psychology textbook and asked to read for understanding. Each participant was asked to complete a pre-test and a post-test. ! Participants. The call for participation was also an open call among Arizona State University students. The study was run with 28 subjects. !
  • 78. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Study Three 78 Stimuli. A tutoring system for system dynamics. The system tutors students on how to model system behavior using a graphical representation and algebraic expressions. The model is represented using a directed-graph structure that defines a topology formed by nodes and links. Once the model is complete, students can execute and debug it [14]. ! Protocol. Students were asked to perform a set of tasks (about system dynamics modeling) using the tutoring system. While the student was working on these tasks, the tutoring system collected emotional-state information with the intention of generating better and more accurate hints and feedback. ! Participants. Pilot test during Summer 2010 with 2 groups of 30 high-school students. ! ! [14] K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y. Hidalgo-Pontet, and L. Zhang. "The Affective Meta-Tutoring Project: How to motivate students to use effective meta-cognitive strategies," Proceedings of the 19th International Conference on Computers in Education. Chiang Mai, Thailand: Asia-Pacific Society for Computers in Education. October 2011. In press.
  • 79. 3. Data Filtering and Integration 79 Session 2
  • 80. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Filtering 80 Filtering the data includes cleaning it, synchronizing the streams, and reducing it, either by averaging or by keeping only values that pass a threshold or fall within a range.
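As a concrete illustration, the averaging and thresholding steps can be sketched as follows (illustrative helpers, not part of the framework):

```python
def moving_average(samples, window=4):
    """Smooth a stream by averaging over a sliding window of samples."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

def above_threshold(samples, threshold):
    """Keep only samples that pass a threshold,
    e.g. to flag high-arousal conductance readings."""
    return [s for s in samples if s >= threshold]

moving_average([1, 2, 3, 4, 5], window=2)   # [1.5, 2.5, 3.5, 4.5]
above_threshold([0.2, 0.8, 0.5], 0.5)       # [0.8, 0.5]
```

Cleaning (dropping sentinel values) and synchronizing (aligning timestamps) complete the filtering pipeline before the streams are integrated.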
  • 81. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Filtering and Integration 81 Multimodal emotion recognition assumes the existence of different sources of data that contribute to inferring the affective state of an individual. Each sensing device has its own type of data and sample rate. The challenge is to combine the data coming from these diverse sensing devices, each of which has proven its functionality independently, into a single improved output.
  • 82. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 82 [Diagram: sensor agents report to a centre agent, which feeds a data logger and the multimodal tutoring system.]
  • 83. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Framework 83 We developed a framework that follows the organizational strategy of an agent federation [15]. ! ! The federation assigns one agent to collect raw data from each sensing device. ! ! That agent implements the perception mechanism for its assigned sensing device to map raw data into beliefs. ! ! Each agent is autonomous and encapsulates one sensing device and its perception mechanisms into independent, individual, and intelligent components. All the data is timestamped and independently identified by agent. ! ! ! [15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011).
  • 84. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Framework 84 When several agents need to interact, it is important to establish an organizational strategy for them, which defines authority relationships, data flows and coordination protocols [16] [17]. ! ! These beliefs are reported to a central agent, which integrates them into one affective state report. ! ! Third-party systems are able to obtain affective state reports from the centre-agent using a publish-subscribe style. ! ! ! ! ! [16] F. Tuijnman, and H. Afsarmanesh, "Distributed objects in a federation of autonomous cooperating agents," Proc. International Conference on Intelligent and Cooperative Information Systems, May 1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763. ! [17] M. Wood, and S. DeLoach, “An overview of the multiagent systems engineering methodology,” Agent-Oriented Software Engineering, (AOSE 2000), Springer-Verlag, 2001, pp. 207-221, doi:10.1007/3-540-44564-1_14.
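The federation's data flow can be sketched as follows; the names and signatures are illustrative, not the ABE API from [15]:

```python
class CentreAgent:
    """Integrates beliefs reported by sensor agents and notifies
    subscribed third-party systems (publish-subscribe style)."""

    def __init__(self):
        self.beliefs = {}      # latest belief per sensor agent
        self.subscribers = []  # third-party callbacks

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def report(self, agent_id, belief):
        """Called by a sensor agent: store its belief and publish
        the merged affective-state report to all subscribers."""
        self.beliefs[agent_id] = belief
        for cb in self.subscribers:
            cb(dict(self.beliefs))

centre = CentreAgent()
states = []
centre.subscribe(states.append)            # a tutoring system would subscribe here
centre.report("emotiv", {"frustration": 0.55})
centre.report("gsr", {"arousal": 0.40})
# states[-1] now merges the beliefs of both sensor agents
```

Each sensor agent would map its raw device data into such beliefs before reporting, keeping the perception mechanism encapsulated per device.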
  • 85. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Framework | Architecture [15] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of Working IEEE/IFIP Conference on Software Architecture (June 2011). 85
  • 86. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Integration | sparse 86 [18] J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP. [Sample of sparse integration: rows are ordered by timestamp, and each row carries values only in the columns of the sensor that reported at that instant (eye-tracking columns on most rows; Emotiv, conductance, and MindReader columns only on their own, sparser rows); all other cells are left empty.]
  • 87. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Integration | state machine 87 [Same sample after state-machine integration: the last reported value of each sensor is carried forward, so every timestamped row holds a complete vector of gaze point, fixation data, Emotiv affective values, conductance, and MindReader states.]
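The state-machine view can be produced by carrying each sensor's last known value forward until it reports again. A minimal sketch (illustrative, not the framework's code):

```python
def integrate(events):
    """State-machine integration: replay timestamped (sensor, value) events,
    carrying each sensor's last value forward so every output row
    holds the complete current state."""
    state, rows = {}, []
    for timestamp, sensor, value in sorted(events):
        state[sensor] = value
        rows.append((timestamp, dict(state)))  # snapshot the merged state
    return rows

events = [(1, "gaze", (574, 414)),
          (2, "frustration", 0.4989),
          (3, "gaze", (573, 408))]
integrate(events)[-1]  # (3, {'gaze': (573, 408), 'frustration': 0.4989})
```

This is why the dense table repeats the slower Emotiv and conductance values across many fast eye-tracking rows.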
  • 88. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Integration | time window 88 [Sample of the time-windowed pressure log. Each row holds a timestamp, the net seat-pressure and back-pressure change, a sit-forward flag, averaged readings for the six chair regions (right/middle/left × seat/back), a mouse-pressure value, and the four base sensors (right/left × rear/front).]
  • 90. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Tool | Eureqa mathematical relationships in data 90
  • 91. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 91 For reverse-engineering searches of the data, the Eureqa tool [19] is used to discover mathematical expressions for the structural relationships in the data records. ! For example, if a record holds information about the physical and emotional behavior of an individual engaged in a single experimental setting, Eureqa can take all the available sources of data and reveal both how the measure of engagement is calculated from specific data streams and how other sensors may influence the proposed emotional construct. ! [19] R. Dubcakova. Eureqa software review. Genet. Program. Evol. Mach. (2010), online first. doi:10.1007/s10710-010-9124-z. Tool | Eureqa
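Eureqa performs symbolic regression: it searches a space of candidate expressions for the one that best reproduces an observed signal. As a toy illustration of that idea (not Eureqa's actual algorithm, which uses evolutionary search), the sketch below scores a few hand-written candidate formulas against invented engagement samples and keeps the best fit:

```python
import math

def symbolic_search(xs, ys, candidates):
    """Brute-force miniature of symbolic regression: score each candidate
    expression by squared error against the target and keep the best one."""
    best, best_err = None, math.inf
    for name, f in candidates.items():
        err = sum((f(x) - y) ** 2 for x, y in zip(xs, ys))
        if err < best_err:
            best, best_err = name, err
    return best, best_err

# Hypothetical samples that happen to follow y = 2x + 1 exactly.
xs = [0.0, 0.5, 1.0, 1.5]
ys = [1.0, 2.0, 3.0, 4.0]
candidates = {
    "2x+1": lambda x: 2 * x + 1,
    "x^2":  lambda x: x ** 2,
    "sin":  math.sin,
}
best, err = symbolic_search(xs, ys, candidates)
```

Eureqa automates the hard part this sketch skips: generating and evolving the candidate expressions themselves rather than scoring a fixed list.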
  • 94. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Tool | Weka explore classification clustering pre-processing visualization 94
  • 95. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray 95 For clustering and classification approaches we use Weka [20], a tool that implements a collection of machine learning algorithms for data mining tasks; it is used to explore the composition of and relationships within the data and to derive useful knowledge from the data records. ! [20] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I.H. Witten, "The WEKA Data Mining Software: An Update," SIGKDD Explorations, 2009, Volume 11, Issue 1. Tool | Weka
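To make the classification idea concrete, here is a minimal nearest-neighbour classifier over labelled feature vectors, the same family of algorithm Weka exposes (as IBk). This stdlib sketch is for illustration only; the features, labels, and values are invented, and real work would use Weka's implementations:

```python
import math

def knn_predict(train, query, k=1):
    """Classify a query point by majority vote among its k nearest
    labelled neighbours (Euclidean distance over the feature vector)."""
    neighbours = sorted(train, key=lambda row: math.dist(row[0], query))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)   # majority vote

# Hypothetical (engagement, frustration) feature vectors labelled by state.
train = [((0.8, 0.2), "flow"), ((0.7, 0.3), "flow"),
         ((0.2, 0.9), "stuck"), ((0.3, 0.8), "stuck")]
label = knn_predict(train, (0.75, 0.25), k=3)
```

In Weka the equivalent workflow is loading the integrated log as an ARFF file and running IBk (or any other classifier) from the Explorer or the Java API.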
  • 96. 96 Tool | Weka P0009 novice playing expert song
  • 97. 97 Tool | Weka P0009 novice playing expert song
  • 98. 98 Tool | Weka Documented API for Java Developers
  • 99. 5. Sharing Experiences and Group Discussion 99 Session 2
  • 100. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Experiences 100
  • 101. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 101 BCI and Gaze Points engagement This figure shows the engagement fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
  • 102. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 102 BCI and Gaze Points frustration This figure shows the frustration fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
  • 103. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 103 BCI and Gaze Points boredom This figure shows the boredom fixation points of an expert player playing in expert mode. The size of each circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
  • 104. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 104 BCI and Gaze Points engagement This figure shows the engagement gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). For this user, the text in the bottom part of the first column was engaging.
  • 105. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 105 BCI and Gaze Points frustration This figure shows the frustration gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). Looking at the cartoon is associated with a high frustration level.
  • 106. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Visualization 106 BCI and Gaze Points boredom This figure shows the boredom gaze points (above a threshold of 0.5) of a user reading material with seductive details (i.e., cartoons). Notice that the text in the middle part of the second column of that page was boring.
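The thresholded visualizations on slides 104–106 all reduce to the same filtering step: keep only the gaze points whose attached emotion value exceeds a cutoff. A minimal sketch, with invented sample values:

```python
def hotspots(samples, threshold=0.6):
    """Keep the gaze coordinates whose attached emotion value exceeds the
    threshold; these are the points plotted in a thresholded gaze map."""
    return [(x, y) for x, y, value in samples if value > threshold]

# Hypothetical (gazeX, gazeY, frustration) samples from the aligned log.
samples = [(120, 340, 0.72), (480, 90, 0.41), (130, 355, 0.65)]
pts = hotspots(samples)
```

Varying the threshold (0.6 for engagement and frustration, 0.5 for boredom in the slides above) trades coverage against how strongly each plotted point expresses the emotion.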
  • 107. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Matching Values 107 Our hypothesis is that, to a great extent, it is possible to infer the values of one source from the other source. BCI-based values, correlation with the face-based model: excitement 0.284, engagement 0.282, meditation 0.188, frustration 0.275. Face-based values, correlation with the BCI-based model: agreement 0.760, concentrating 0.765, disagreement 0.794, interested 0.774, thinking 0.780, unsure 0.828. BCI and Face-Based Emotion Recognition
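The figures on slide 107 are correlation coefficients between one modality's readings and a model built from the other modality. For reference, the Pearson correlation that such tables report can be computed from two aligned value streams as follows (the sample streams here are invented):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two aligned value streams:
    covariance divided by the product of the standard deviations."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical aligned streams: BCI frustration vs. a face-based estimate.
bci_frustration = [0.1, 0.4, 0.5, 0.9]
face_estimate = [0.2, 0.5, 0.6, 1.0]
r = pearson(bci_frustration, face_estimate)
```

A value near ±1 means one stream is nearly a linear function of the other; the sub-0.3 BCI-to-face correlations on the slide indicate a much weaker relationship in that direction than the reverse.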
  • 108. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Matching Values 108 [21] L. Breiman. Random forests. Machine Learning, 2001. Volume 45, Number 1. Pages 5–32. Random Forest
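Slide 108 applies random forests [21] to the matching-values problem. As a hedged illustration of the underlying idea only (not the authors' implementation, and simplified to depth-1 trees), the sketch below trains decision stumps on bootstrap resamples and combines them by majority vote; the feature values and labels are invented:

```python
import random
from collections import Counter

def train_stump(data):
    """Fit the best single-feature threshold split (a depth-1 tree) on
    (features, label) pairs, where labels are 'pos'/'neg'."""
    best = None  # (accuracy, feature, threshold, sign)
    for f in range(len(data[0][0])):
        for x, _ in data:
            for sign in (1, -1):
                t = x[f]
                acc = sum(
                    ("pos" if sign * (v[f] - t) > 0 else "neg") == y
                    for v, y in data) / len(data)
                if best is None or acc > best[0]:
                    best = (acc, f, t, sign)
    _, f, t, sign = best
    return lambda v: "pos" if sign * (v[f] - t) > 0 else "neg"

def random_forest(data, n_trees=15, seed=0):
    """Bagging in miniature: train each stump on a bootstrap resample of
    the data, then predict by majority vote across the ensemble [21]."""
    rng = random.Random(seed)
    stumps = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_trees)]
    return lambda v: Counter(s(v) for s in stumps).most_common(1)[0][0]

# Hypothetical one-feature samples: an engagement reading and its state.
data = [((0.9,), "pos"), ((0.8,), "pos"), ((0.7,), "pos"),
        ((0.2,), "neg"), ((0.1,), "neg"), ((0.3,), "neg")]
forest = random_forest(data)
```

Real random forests additionally subsample features at each split and grow full trees; Weka's RandomForest or scikit-learn's RandomForestClassifier would be used in practice.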
  • 109. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Networks Structural Equations, Adjacency Matrices, and Network Graphs. Brain schematic showing the channels that contribute to engagement. BCI raw data | Engagement 109
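One common way to derive the adjacency matrices and network graphs mentioned on slide 109 is to connect two EEG channels whenever their signals are strongly correlated. This is a generic sketch of that construction, not necessarily the authors' exact method; the channel names and time series are invented:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length signals."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def adjacency(signals, threshold=0.8):
    """Build an unweighted adjacency matrix: channels i and j are linked
    when |correlation| of their signals reaches the threshold."""
    n = len(signals)
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(pearson(signals[i], signals[j])) >= threshold:
                A[i][j] = A[j][i] = 1
    return A

# Hypothetical per-channel time series (e.g. AF3, F7, O1, F8).
signals = [[1, 2, 3, 4], [2, 4, 6, 8], [4, 3, 2, 1], [1, -1, 1, -1]]
A = adjacency(signals)
```

The resulting matrix can be drawn directly as a network graph over the electrode layout, highlighting which channels co-vary with the engagement signal.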
  • 110. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Closed-loop Emotion Adaptation | Games 110 [22] Bernays, R., Mone, J., Yau, P., Murcia, M., Gonzalez-Sanchez, J., Chavez-Echeagaray, M. E., Christopherson, R. M., Atkinson, R., and Yoshihiro, K. 2012. Lost in the Dark: Emotion Adaption. In Adjunct proceedings of the 25th annual ACM symposium on User interface software and technology, 79–80. New York, NY, USA. ACM. doi:10.1145/2380296.2380331.

  • 111. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Closed-loop Emotion Mirroring | Virtual World 111 [23] Gonzalez-Sanchez, J., Chavez-Echeagaray, M. E., Gibson, D., and Atkinson, R. 2013. Multimodal Affect Recognition in Virtual Worlds: Avatars Mirroring User's Affect. In ACII '13: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction. IEEE Computer Society.
 

  • 112. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Group Discussion 112 devices data inferences
  • 113. 6. Conclusions and Q&A 113 Session 2
  • 114. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Conclusion This course aims to provide attendees with an introduction to tools and dataset exploration. While the course does not present an exhaustive list of all the methods available for gathering, processing, analyzing, and interpreting affective sensor data, it describes the basis of a multimodal approach that attendees can use to launch their own research efforts. ! [24] B. du Boulay, "Towards a Motivationally-Intelligent Pedagogy: How should an intelligent tutor respond to the unmotivated or the demotivated?," Proc. New Perspectives on Affect and Learning Technologies, R. A. Calvo & S. D'Mello (Eds.), Springer-Verlag, in press. 114
  • 115. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray References 1. R.S.J. Baker, M.M.T. Rodrigo and U.E. Xolocotzin, “The Dynamics of Affective Transitions in Simulation Problem-solving Environments,” Proc. Affective Computing and Intelligent Interaction: Second International Conference (ACII ’07), A. Paiva, R. Prada & R. W. Picard (Eds.), Springer-Verlag, Lecture Notes in Computer Science, vol. 4738, pp. 666-677. 2. I. Arroyo, D. G. Cooper, W. Burleson, B. P. Woolf, K. Muldner, and R. Christopherson, “Emotion Sensors Go to School,” Proc. Artificial Intelligence in Education: Building Learning Systems that Care: from Knowledge Representation to Affective Modelling (AIED 09), V. Dimitrova, R. Mizoguchi, B. du Boulay & A. Graesser (Eds.), IOS Press, July 2009, Frontiers in Artificial Intelligence and Applications, vol. 200, pp. 17-24. 3. R. W. Picard, Affective Computing, MIT Press, 1997. 4. J. Gonzalez-Sanchez, R. M. Christopherson, M. E. Chavez-Echeagaray, D. C. Gibson, R. Atkinson, W. Burleson, “How to Do Multimodal Detection of Affective States?,” Proc. 2011 IEEE 11th International Conference on Advanced Learning Technologies (ICALT), 2011, pp. 654-655. 5. Emotiv - Brain Computer Interface Technology. Retrieved April 26, 2011, from http://www.emotiv.com. 6. F. Sharbrough, G.E. Chatrian, R.P. Lesser, H. Lüders, M. Nuwer, T.W. Picton. American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature. J. Clin. Neurophysiol. 8: 200-202. 7. Electroencephalography. Retrieved November 14, 2010, from Electric and Magnetic Measurement of the Electric Activity of Neural Tissue: http://www.bem.fi/book/13/13.htm 115
  • 116. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray References 8. R. E. Kaliouby and P. Robinson, “Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures,” Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW ‘04), IEEE Computer Society, June 2004, Volume 10, p. 154. 9. Tobii Technology - Eye Tracking and Eye Control. Retrieved April 26, 2011, from http://www.tobii.com. 10. M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R.W. Picard, “The HandWave Bluetooth Skin Conductance Sensor,” Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90. 11. Y. Qi, and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR 02), Aug. 2002, vol. 3, pp. 30448, doi:10.1109/ICPR.2002.1047973. 12. D. Cooper, I. Arroyo, B. Woolf, K. Muldner, W. Burleson, and R. Christopherson. (2009). Sensors model student self concept in the classroom, User Modeling, Adaptation, and Personalization, 30-41. 13. S. Mota, and R. W. Picard, "Automated Posture Analysis for Detecting Learners Interest Level," Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003, vol. 5, p. 49, doi:10.1109/CVPRW.2003.10047. 116
  • 117. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray References 14. K. VanLehn, W. Burleson, M.E. Chavez-Echeagaray, R. Christopherson, J. Gonzalez-Sanchez, Y. Hidalgo-Pontet, and L. Zhang. “The Affective Meta-Tutoring Project: How to motivate students to use effective meta-cognitive strategies,” Proceedings of the 19th International Conference on Computers in Education. Chiang Mai, Thailand: Asia-Pacific Society for Computers in Education. October 2011. In press. 15. J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, R. Atkinson, and W. Burleson. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of the Working IEEE/IFIP Conference on Software Architecture (June 2011). 16. F. Tuijnman, and H. Afsarmanesh, "Distributed objects in a federation of autonomous cooperating agents," Proc. International Conference on Intelligent and Cooperative Information Systems, May 1993, pp. 256-265, doi:10.1109/ICICIS.1993.291763. 17. M. Wood, and S. DeLoach, “An overview of the multiagent systems engineering methodology,” Agent-Oriented Software Engineering (AOSE 2000), Springer-Verlag, 2001, pp. 207-221, doi:10.1007/3-540-44564-1_14. 18. J. Liu, S. Ji, and J. Ye. SLEP: Sparse Learning with Efficient Projections. Arizona State University, 2009. http://www.public.asu.edu/~jye02/Software/SLEP. 19. R. Dubcakova. Eureqa software review. Genet. Program. Evol. Mach. (2010), online first. doi:10.1007/s10710-010-9124-z. 117
  • 118. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray References 20. M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I.H. Witten, “The WEKA Data Mining Software: An Update,” SIGKDD Explorations, 2009, Volume 11, Issue 1. 21. L. Breiman. Random forests. Machine Learning, 2001. Volume 45, Number 1. Pages 5–32. 22. R. Bernays, J. Mone, P. Yau, M. Murcia, J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, R. Christopherson, R. Atkinson, and K. Yoshihiro. 2012. Lost in the Dark: Emotion Adaption. In Adjunct proceedings of the 25th annual ACM symposium on User interface software and technology, 79–80. New York, NY, USA. ACM. doi:10.1145/2380296.2380331. 23. J. Gonzalez-Sanchez, M.E. Chavez-Echeagaray, D. Gibson, and R. Atkinson. 2013. Multimodal Affect Recognition in Virtual Worlds: Avatars Mirroring User's Affect. In ACII '13: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction. IEEE Computer Society. 724-725. doi:10.1109/ACII.2013.133. 24. B. du Boulay, "Towards a Motivationally-Intelligent Pedagogy: How should an intelligent tutor respond to the unmotivated or the demotivated?," Proc. New Perspectives on Affect and Learning Technologies, R. A. Calvo & S. D'Mello (Eds.), Springer-Verlag. 118
  • 119. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Questions | Answers 119
  • 120. Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray Acknowledgements This research was supported by Office of Naval Research under Grant N00014-10-1-0143 awarded to Dr. Robert Atkinson 120