Keynote talk given by Mark Billinghurst at the VSMM 2016 conference on October 19th, 2016. This talk was about how AR and VR can be used to create Empathic Computing experiences.
5. Augmented Reality
• Defining Characteristics
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
7. Shared Space (1999)
• Face-to-face interaction, Tangible AR metaphor
• Easy collaboration with strangers
• Users acted the same as if handling real objects
14. Pokemon GO Effect
• Fastest App to reach $500 million in Revenue
• Only 63 days after launch, > $1 Billion in 6 months
• Over 500 million downloads, > 25 million DAU
• Nintendo stock price up by 50% (gain of $9 Billion USD)
16. “Do you want to sell sugar
water for the rest of your life
or do you want to come with
me and change the world?"
Steve Jobs to
John Sculley 1983
17. Mark’s Midlife Crisis
• Get Married
• Sabbatical at Google
• Looking for new opportunities
• Resigned my job
• Moved to a new country
• Now creating a new research group
34. Empathy
“Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another."
Alfred Adler
36. Empathic Computing
1. Understanding: Systems that can
understand your feelings and emotions
2. Experiencing: Systems that help you
better experience the world of others
3. Sharing: Systems that help you better
share the experience of others
Sensors
VR
AR
40. 2. Experiencing: Virtual Reality
"Virtual reality offers a whole different
medium to tell stories that really connect
people and create an empathic connection."
Nonny de la Peña
http://www.emblematicgroup.com/
41. Using VR for Empathy
• USC Project Syria (2014)
• Experience of Terrorism
• Project Homeless (2015)
• Experience of Homelessness
50. Movies are like a
machine that
generates Empathy
Roger Ebert
51. Technical Requirements
• Basic Requirements
• Make the technology transparent
• Wearable, unobtrusive
• Technology for transmitting
• Sights, Sounds, Feelings of another
• Audio, video, physiological sensors
52. Wearable AR for Empathic Interfaces
• Wearable AR can:
• Be unobtrusive
• Capture emotion
• Share sights and sounds
• Provide two-way communication
• Enhance interaction in the real world
53. Changing Perspective
• CamNet (1992)
• British Telecom
• Wearable Teleconferencing
• audio, video
• Remote collaboration
• Sends task space video
• Similar CMU study (1996)
• cut performance time in half
61. Lessons Learned
• Good
• Communication easy and natural
• Users enjoy having view independence
• Very natural capturing panorama on Glass
• Sharing panorama enhances the shared experience
• Bad
• Difficult to support equal input
• Need to provide awareness cues
62. JackIn – Live Immersive Video Streaming
• Jun Rekimoto – University of Tokyo/Sony CSL
73. Lessons Learned
• Good
• System was wearable
• Sender and receiver mirrored emotion
• Minimal cues provided best experience
• Bad
• System delays
• Need for good stimulus
• Difficult to represent emotion
74. Gaze and Video Conferencing
• Gaze tracking
• Implicit communication cue
• Shows intent
• Task space collaboration
• HMD + camera + gaze tracker
• Expected Results
• Gaze cues reduce need for communication
• Allow remote collaborator to respond faster
80. Key Results
• Both the pointer and eye tracking visual cues
helped participants to perform significantly faster
• The pointer cue significantly improved perceived
quality of collaboration and co-presence
• Eye-tracking improved the collaboration quality,
and sense of being focused for the local users, and
enjoyment for the remote users
• The Both condition was ranked as the best in user
experience, while the None condition was worst.
81. Empathy Glasses (CHI 2016)
• Combine together eye-tracking, display, face expression
• Implicit cues – eye gaze, face expression
• Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
82. AffectiveWear – Emotion Glasses
• Photo sensors to recognize facial expressions
• User calibration
• Machine learning
• Recognizes 8 facial expressions
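The calibrate-then-classify pipeline above can be sketched in a few lines. This is a hypothetical illustration, not the AffectiveWear implementation: the sensor count, readings, and expression labels are invented, and a simple nearest-centroid rule stands in for the actual machine-learning model.

```python
# Hypothetical sketch of the AffectiveWear idea: classify facial expressions
# from photo-reflective sensor readings using per-user calibration data.
# Sensor values and labels are invented for illustration.
from statistics import mean

def calibrate(samples):
    """Average a user's labelled sensor readings into per-expression centroids."""
    centroids = {}
    for label, readings in samples.items():
        centroids[label] = [mean(col) for col in zip(*readings)]
    return centroids

def classify(reading, centroids):
    """Return the expression whose centroid is closest to the reading."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(reading, centroids[label]))

# Per-user calibration step: a few readings from four (hypothetical) sensors.
calibration = {
    "neutral": [[0.20, 0.20, 0.20, 0.20], [0.25, 0.20, 0.15, 0.20]],
    "smile":   [[0.80, 0.70, 0.30, 0.20], [0.75, 0.70, 0.35, 0.25]],
}
centroids = calibrate(calibration)
print(classify([0.78, 0.72, 0.30, 0.20], centroids))  # smile
```

Per-user calibration matters here because sensor readings depend on how the glasses sit on each face; the centroids are rebuilt for every wearer.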
86. Ranking Results
"I ranked the (A) condition best, because I could easily
point to communicate, and when I needed it I could check
the facial expression to make sure I was being understood.”
[Charts: Q2 Communication and Q3 Understanding partner; HMD (local user) vs. computer (remote user)]
87. Lessons Learned
• Pointing really helps in remote collaboration
• Makes remote user feel more connected
• Gaze looks promising
• Shows context of what a person is talking about
• Establishes shared understanding/awareness
• Face expression
• Used as an implicit cue to show comprehension
• Limitations
• Limited implicit cues
• Task was a poor emotional trigger
• AffectiveWear needs improvement
88. Empathic VR Environments
• Player and Viewer
• Viewer slaved to player
• Share emotional signals
• Heart rate, GSR
• Remote affect measuring
90. Heart Rate for Different VR Apps.
• Games: theBlu (quiet scene), BrookHaven (scary scene), Night Café (peaceful scene)
• Devices: HTC Vive, Empatica E4
• Mean heart rate: theBlu 84.89 BPM, BrookHaven 98.23 BPM, Night Café 73.01 BPM
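Per-app means like those above come from averaging a stream of BPM samples recorded during each session. A minimal sketch; the raw sample values here are invented, and only the resulting means match the figures reported in the talk.

```python
from statistics import mean

# Hypothetical raw BPM samples per app; only the reported means are real.
samples = {
    "theBlu":     [82.10, 85.00, 87.56],
    "BrookHaven": [95.00, 99.50, 100.19],
    "Night Café": [71.00, 73.00, 75.03],
}

# Average each session's samples to get a per-app mean heart rate.
means = {app: round(mean(bpm), 2) for app, bpm in samples.items()}
for app, m in means.items():
    print(f"{app}: {m} BPM")
# theBlu: 84.89 BPM
# BrookHaven: 98.23 BPM
# Night Café: 73.01 BPM
```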
91. Sharing VR Experiences
• Player controls viewer position (not view)
• Measure and share physiological cues
Viewer
Player
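The player/viewer split described above — the player's position and physiological cues are streamed to the viewer, who keeps control of their own view direction — can be sketched as a simple state update. All field names and values are invented for illustration; the actual system's protocol is not described in the talk.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of player-to-viewer sharing: the player streams position
# plus physiological cues; the viewer adopts the position but keeps an
# independent view direction.
@dataclass
class PlayerUpdate:
    position: tuple      # player's location in the scene (x, y, z)
    heart_rate: float    # BPM from a wearable sensor
    gsr: float           # galvanic skin response

@dataclass
class ViewerState:
    position: tuple = (0.0, 0.0, 0.0)
    view_yaw: float = 0.0  # viewer-controlled, never overwritten by the player

    def apply(self, update):
        # Follow the player's position...
        self.position = tuple(update.position)
        # ...but leave view_yaw alone: the viewer controls their own view.

update = PlayerUpdate(position=(1.0, 1.7, -2.0), heart_rate=98.2, gsr=4.1)
packet = json.dumps(asdict(update))  # what would travel over the network
viewer = ViewerState(view_yaw=45.0)
viewer.apply(PlayerUpdate(**json.loads(packet)))
print(viewer.position, viewer.view_yaw)
```

Keeping the view direction out of the shared state is the design point: sharing position but not view avoids the discomfort of having one's head turned by someone else.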
93. AR and VR for Empathic Computing
• VR systems are ideal for trying experiences:
• Strong story telling medium
• Provide total immersion/3D experience
• Easy to change virtual body scale and representation
• AR systems are ideal for live sharing:
• Allow overlay on real world view/can share viewpoints
• Support remote annotation/communication
• Enhance real world task
96. AR + Smart Sensors + Social Networks
• Track population at city scale (mobile networks)
• Match population data to external sensor data
• Mine data for applications
100. Research Challenges
• How to capture emotion?
• How to measure empathy?
• Interface/interaction models?
• How to communicate emotion?
• How to create strong empathic bonds?
• How to scale up to city/country scale?
103. Harvard Grant Study
• $20 million, 75-year study
• 268 Harvard graduates
• 456 disadvantaged people
• Led by George Vaillant
• What makes us happy?
• warmth of relationships throughout
life has the greatest positive
impact on "life satisfaction".
104. “The seventy-five years and twenty million
dollars expended on the Grant Study
points to a straightforward five-word
conclusion: Happiness is love. Full stop.”
George Vaillant
105. Conclusions
• Empathic Computing
• Sharing what you see, hear and feel
• AR/VR Enables Empathic Experiences
• Removing technology
• Changing perspective
• Sharing space/experience
• Many directions for future research