Because the human element is crucial in designing and implementing interactive intelligent systems, this tutorial provides a description and hands-on demonstration of the detection of affective states, covering devices, methodologies, and data processing, as well as their impact on instructional design. The information a computer senses in order to automate the detection of affective states spans an extensive set of data, ranging from brain-wave signals and biofeedback readings to face- and gesture-based emotion recognition and posture or pressure sensing. This tutorial is not about developing the algorithms or hardware that make this work; our concern is the encapsulation of preexisting systems (all of which we actually use) that implement those algorithms and use that hardware to improve learning.
1. How to Do Multimodal Detection
of Affective States?
Javier Gonzalez-Sanchez, Maria-Elena Chavez-Echeagaray,
David Gibson, Robert Atkinson, Winslow Burleson
Learning Science Research Lab
School of Computing, Informatics, and Decision Systems Engineering
Arizona State University
This work was supported by Office of Naval Research under Grant N00014-10-1-0143
6. Concepts
How to Do Multimodal Detection
of Affective States?
physiological physical instinctual reaction to stimulation
feelings, emotions
How do you feel?
Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray
7. Concepts
physical appearance
measurement
physiological measures
identify the presence of ...
self-report
How to Do Multimodal Detection
of Affective States?
11. BCI
Wireless Emotiv® EEG headset.
The device reports data at intervals of 125 ms (8 Hz).
The output includes 14 channel values (7 per brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values for the acceleration of the head when leaning (gyro-x and gyro-y).
It reports Engagement, Boredom, Excitement, Frustration, and Meditation.
It also reports facial activity: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh.
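As a minimal sketch of how one might model this stream (the class and names below are our own, not the Emotiv SDK's), each 125 ms reading carries the 14 channel values plus the two gyro values:

```python
from dataclasses import dataclass

# The 14 electrode positions listed on the slide (10-20 system).
CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

@dataclass
class EmotivSample:
    """One 125 ms reading: 14 channel values plus head-leaning gyros."""
    eeg: dict            # channel name -> reading
    gyro_x: float = 0.0  # head leaning, horizontal axis
    gyro_y: float = 0.0  # head leaning, vertical axis

def samples_per_second(interval_ms: float = 125.0) -> float:
    """The 8 Hz rate follows directly from the 125 ms reporting interval."""
    return 1000.0 / interval_ms
```

A consumer polling the headset would construct one `EmotivSample` every 125 ms and forward it to the logging framework described later.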
12. Demo
emotions | EEG data | facial gestures
Wireless Emotiv® EEG Headset (Emotiv Systems, $299)
16. EEG data
17. emotions
18. Sensing Devices
Tobii® Eye Tracker
The device reports data at intervals of 100 ms (10 Hz).
The output provides data concerning attention direction (gaze-x, gaze-y), time of focus, and pupil dilation.
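A crude way to turn the 10 Hz (gaze-x, gaze-y) stream into a "time of focus" measure is to count the longest run of consecutive samples that stay near one point. This is a simplified sketch of our own, not the Tobii SDK:

```python
import math

def fixation_time_ms(gaze, radius=30.0, interval_ms=100.0):
    """Given (gaze_x, gaze_y) points sampled every 100 ms, return the
    duration of the longest run of points that stay within `radius`
    pixels of the run's first point (a crude fixation measure)."""
    best = run = 0
    anchor = None
    for x, y in gaze:
        if anchor and math.hypot(x - anchor[0], y - anchor[1]) <= radius:
            run += 1
        else:
            anchor, run = (x, y), 1
        best = max(best, run)
    return best * interval_ms
```

Real fixation filters are more sophisticated (velocity- or dispersion-based), but the input and output shapes match what the device reports.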
19. About
Tobii® Eye Tracker
20. Sensing Devices
MindReader software from the MIT Media Lab.
It infers affective states from head gestures and facial expressions in a video stream in real time, at data intervals of approximately 100 ms (10 Hz).
With this system it is possible to infer: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
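A consumer of this 10 Hz stream typically reduces the per-frame state probabilities to a single label. A minimal sketch (the function and threshold are our own assumptions, not MindReader's API):

```python
def dominant_state(probs, min_confidence=0.5):
    """probs: mapping of MindReader-style state names (agreeing,
    concentrating, disagreeing, interested, thinking, unsure) to
    probabilities. Return the most likely state, or None when no
    state is confident enough to act on."""
    state = max(probs, key=probs.get)
    return state if probs[state] >= min_confidence else None
```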
21. Demo
MindReader Software from MIT Media Lab
23. Sensing Devices
Hardware designed by the MIT Media Lab.
It measures arousal.
It is a skin-conductance sensor: it measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic nervous system.
It is a wireless Bluetooth device that reports data at intervals of approximately 500 ms (2 Hz).
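Skin-conductance processing commonly splits the 2 Hz signal into a slow tonic baseline and a fast phasic residual, whose positive spikes are the usual arousal indicator. A sketch under that assumption (the smoothing constant is illustrative, not a value from the device):

```python
def tonic_phasic(conductance, alpha=0.1):
    """Split a skin-conductance series into a slow-moving tonic
    baseline (exponential moving average) and the phasic residual.
    Positive phasic spikes suggest arousal events."""
    tonic, phasic, level = [], [], conductance[0]
    for c in conductance:
        level = alpha * c + (1 - alpha) * level  # EMA update
        tonic.append(level)
        phasic.append(c - level)
    return tonic, phasic
```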
27. Sensing Devices
Hardware designed by the MIT Media Lab.
Pressure sensing.
It is a serial device that reports data at intervals of approximately 150 ms (6 Hz).
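A common heuristic for a pressure-sensitive mouse is to flag samples where grip pressure rises well above the subject's resting baseline. This is our own illustrative rule, not a published calibration:

```python
def pressure_alert(readings, baseline, factor=1.5):
    """Return the indices of ~150 ms pressure samples that exceed the
    subject's resting baseline by `factor`, a simple stress heuristic."""
    return [i for i, r in enumerate(readings) if r > factor * baseline]
```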
28. Demo
Mouse Pressure Sensor
32. Framework
[Architecture diagram: sensing agents feed a central Data Logger and a Data Visualizer around a Multimodal Tutoring System; used with 40 students independently and 37 students concurrently.]
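The agent arrangement above can be sketched as a publish/subscribe hub: each sensor agent publishes timestamped readings, and consumers such as the data logger or visualizer subscribe. This toy `Hub` class is an assumption for illustration, not the ABE framework's actual API:

```python
class Hub:
    """Toy agent federation: sensor agents publish timestamped readings
    to a central hub; consumers (logger, visualizer) subscribe."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, source, timestamp_ms, value):
        for cb in self.subscribers:
            cb(source, timestamp_ms, value)

log = []
hub = Hub()
hub.subscribe(lambda s, t, v: log.append((s, t, v)))  # the Data Logger role
hub.publish("eeg", 125, {"engagement": 0.6})
hub.publish("eye", 200, {"gaze_x": 512, "gaze_y": 384})
```

Decoupling producers from consumers this way is what lets devices with different rates (8 Hz, 10 Hz, 2 Hz, 6 Hz) feed one logger.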
33. Framework
Agent
Federation
* B. Horling and V. Lesser, "A survey of multi-agent organizational paradigms," The Knowledge Engineering Review, Cambridge University Press, 2005, vol. 19, pp. 281-316, doi: 10.1017/S0269888905000317.
34. Framework
Agent
Federation
Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; and Burleson, W. (2011), "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," in Proceedings of the Working IEEE/IFIP Conference on Software Architecture (June 2011).
37. Tool: Weka
explore | classification | clustering | pre-processing | visualization
Mark Hall, Eibe Frank, Geoffrey Holmes, Bernhard Pfahringer, Peter Reutemann, and Ian H. Witten (2009); The WEKA Data Mining Software: An Update; SIGKDD Explorations, Volume 11, Issue 1.
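Weka's role here is to classify multimodal feature vectors into affective labels. As a stand-in sketch of that idea (Weka itself is Java; the nearest-neighbor rule below is our own minimal analogue, not Weka code):

```python
def nearest_neighbor(train, query):
    """1-NN classification on feature vectors.
    train: list of (features, label) pairs; query: a feature vector.
    Returns the label of the closest training example."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda row: dist(row[0], query))[1]
```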
38. Tool: Weka
The experiment was run with 21 subjects, undergraduate and graduate students at Arizona State University, aged 18 to 25. For the purposes of our experiment we considered all levels of Guitar Hero expertise, from novice to expert; we also considered both regular and non-regular gamers, and both genders.
42. Techniques: Nets
engagement and EEG raw
Structural equations, adjacency matrices, and network graphs | Brain schematics showing the channels that contribute to engagement
* Dr. David C. Gibson
43. Tool: ABE
gaze-x, gaze-y, time
frustration
threshold = 0.75
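The rule on this slide is a straightforward threshold check: when sensed frustration crosses 0.75, the tutoring system reacts (using gaze-x, gaze-y, and time as context). A direct sketch, with the function name our own:

```python
def should_intervene(frustration, threshold=0.75):
    """The ABE rule from the slide: trigger tutoring feedback when the
    sensed frustration level exceeds the 0.75 threshold."""
    return frustration > threshold
```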
47. Experiences
mobility | interference
data analysis
third-party systems
[Slide thumbnails repeating the Weka experiment description (21 subjects) and the ABE rule example (gaze-x, gaze-y, time; frustration threshold = 0.75).]
50. Reference A
Virtual Worlds Best Practices in Education 2011 Conference
http://javiergs.com?p=1317
51. Reference B
8th Annual Games for Change Festival
http://javiergs.com?p=1433
52. http://lsrl.lab.asu.edu
{javiergs, helenchavez}@asu.edu