
Empathic Computing: Developing for the Whole Metaverse


A keynote speech given by Mark Billinghurst at the Centre for Design and New Media at IIIT-Delhi on June 16th 2022. This presentation is about how Empathic Computing can be used to develop for the entire range of the Metaverse.


  1. EMPATHIC COMPUTING: DEVELOPING FOR THE WHOLE METAVERSE Mark Billinghurst mark.billinghurst@unisa.edu.au June 16th 2022
  2. “Only through communication can Human Life hold meaning.” Paulo Freire Philosopher
  3. Teleconferencing Today
  4. My Workplace in 2020, 2021…
  5. Remote Conferencing
  6. Communication Seam Task Space Communication Space
  7. Limitations with Current Technology • Lack of spatial cues • Person blends with background • Poor communication cues • Limited gaze, gesture, non-verbal communication • Separation of task/communication space • Can’t see person and workspace at same time
  8. Connecting at a Distance with AR/VR • Restore spatial cues • Sharing non-verbal cues • Creating shared spaces
  9. Early Experiments (1994 - 2003) Greenspace (1994) AR conferencing (1999) 3D Live (2003)
  10. https://youtu.be/rhtgTX_fPmk
  11. Collaboration in Virtual Reality Facebook Spaces AltspaceVR
  12. https://www.youtube.com/watch?v=lgj50IxRrKQ
  13. Collaboration in Augmented Reality Magic Leap Avatar Chat
  14. https://www.youtube.com/watch?v=PG3tQYlZ6JQ
  15. Metaverse
  16. The Metaverse • Neal Stephenson’s “Snow Crash” (1992) • The Metaverse is the convergence of: • 1) virtually enhanced physical reality • 2) physically persistent virtual space
  17. Metaverse Taxonomy • Four Key Components • Virtual Worlds • Augmented Reality • Mirror Worlds • Lifelogging • Metaverse Roadmap • http://metaverseroadmap.org/
  18. Sensing Immersing Augmenting Capturing
  19. Communication Trends • 1. Experience Capture • Move from sharing faces to sharing places • 2. Natural Collaboration • Faster networks support more natural collaboration • 3. Implicit Understanding • Systems that recognize behaviour and emotion
  20. Natural Collaboration Implicit Understanding Experience Capture
  21. Natural Collaboration Implicit Understanding Experience Capture Empathic Computing
  22. “Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.” Alfred Adler
  23. Empathic Computing Research Focus Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
  24. Remote Communication
  25. MiniMe Virtual Cues Enhanced Emotion Brain Synchronization Emotion Recognition Scene Capture AI
  26. Changing Perspective • View from remote user’s perspective • Wearable Teleconferencing • audio, video, pointing • send task space video • CamNet (1992) • British Telecom • Similar CMU study (1996) • cut performance time in half
  27. AR View Remote Expert View
  28. Changing Perspective - Empathy Glasses • Combine eye-tracking, display, and face expression • Implicit cues – eye gaze, face expression + + Pupil Labs Epson BT-200 AffectiveWear Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  29. Remote Collaboration • Eye gaze pointer and remote pointing • Face expression display • Implicit cues for remote collaboration
  30. https://youtu.be/M6-z9qdbVIg
  31. Adding in Sensor Input • Using sensors to enhance collaboration • Sharing heart rate • Gaze cues • Face expression
  32. Shared Sphere – 360 Video Sharing Shared Live 360 Video Host User Guest User Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
  33. https://www.youtube.com/watch?v=q_giuLot76k
  34. Connecting between Spaces • Augmented Reality • Bringing remote people into your real space • Virtual Reality • Bringing elements of the real world into VR • AR/VR for sharing communication cues • Sharing non-verbal communication cues
  35. 3D Live Scene Capture • Use cluster of RGBD sensors • Fuse together 3D point cloud
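The capture pipeline on slide 35 (a cluster of RGBD sensors fused into one point cloud) can be sketched in general terms: each depth image is back-projected through the pinhole camera model, then each sensor's cloud is mapped into a shared world frame by its extrinsic pose. The function names, intrinsics, and the naive concatenation step below are illustrative assumptions, not the 3D Live implementation.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into camera-space 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

def fuse_clouds(clouds, poses):
    """Map each sensor's cloud into a shared world frame using its 4x4
    extrinsic pose, then concatenate (naive fusion: no de-noising/meshing)."""
    fused = []
    for pts, pose in zip(clouds, poses):
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        fused.append((homo @ pose.T)[:, :3])
    return np.vstack(fused)
```

A real multi-sensor rig would also calibrate the extrinsic poses and filter overlapping or noisy points before rendering the fused cloud to the remote user.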
  36. Platform Features
  37. https://youtu.be/owzUWszAckE
  38. Scene Capture and Sharing Scene Reconstruction Remote Expert Local Worker
  39. AR View Remote Expert View https://youtu.be/-UpojQ7lz3k
  40. View Sharing Evolution • Increased immersion • Improved scene understanding • Better collaboration 2D 360 3D
  41. Sharing: Virtual Communication Cues (2019) • Using AR/VR to share communication cues • Gaze, gesture, head pose, body position • Sharing same environment • Virtual copy of real world • Collaboration between AR/VR • VR user appears in AR user’s space Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
  42. Sharing Virtual Communication Cues • Collaboration between AR and VR • Gaze Visualization Conditions • Baseline, FoV, Head-gaze, Eye-gaze
  43. https://youtu.be/Mk64B-oCd2Q
  44. Results • Predictions • Eye/Head pointing better than no cues • Eye/head pointing could reduce need for pointing • Results • No difference in task completion time • Head-gaze/eye-gaze gave a greater mutual gaze rate • Head-gaze had greater ease of use than baseline • All cues provide higher co-presence than baseline • Pointing gestures reduced in cue conditions • But • No difference between head-gaze and eye-gaze
  45. Multi-Scale Collaboration • Changing the user’s virtual body scale
  46. https://youtu.be/SD1afNoAHEs
  47. Sharing a View
  48. Sharing VR Experiences • HTC Vive HMD • Empathic glove • Empatica E4 Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of sharing physiological states of players in a collaborative virtual reality gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
  49. VR Environments • Butterfly World: calm scene, collect butterflies • Zombie Attack: scary scene, fighting zombies
  50. Changing the Other Person’s Heart Rate? • Follow-on study: artificially changing and sharing heart rate (-20%, 0%, +20%) • Key findings • Manipulated heart rate affects perceived valence and arousal levels of another person • No change in actual heart rate, but trend towards significance (p = 0.08) • Significant environment effect – active has higher HR than passive A. Dey, H. Chen, A. Hayati, M. Billinghurst and R. W. Lindeman, "Sharing Manipulated Heart Rate Feedback in Collaborative Virtual Environments," 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 2019, pp. 248-257.
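The manipulation on slide 50 is conceptually simple: only the heart rate *shared with the partner* is scaled by one of the three study factors, while the wearer's real signal is untouched. The sketch below is an illustrative reading of that design, with assumed function and condition names, not the study's actual code.

```python
def shared_heart_rate(actual_bpm, condition):
    """Heart rate displayed to the partner under the three study
    conditions (-20%, 0%, +20%); the wearer's own signal is never
    altered, only the value that is shared."""
    factors = {"decreased": -0.20, "unchanged": 0.0, "increased": 0.20}
    return actual_bpm * (1.0 + factors[condition])

def beat_interval_s(shared_bpm):
    """Interval between displayed pulse beats driven by the shared rate."""
    return 60.0 / shared_bpm
```

Decoupling the displayed rate from the sensed rate is what lets the study test whether perceived valence and arousal track the *shared* signal rather than the partner's true physiology.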
  51. Sharing: Separating Cues from Body • What happens when you can’t see your colleague/agent? Collaborating Collaborator out of View Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).
  52. Mini-Me Communication Cues in MR • When the collaborator moves out of view, a Mini-Me avatar appears • Miniature avatar in real world • Mini-Me points to shared objects, shows communication cues • Redirected gaze, gestures
  53. https://www.youtube.com/watch?v=YrdCg8zz57E
  54. User Study (16 participants) • Collaboration between user in AR, expert in VR • Hololens, HTC Vive • Two tasks: • (1) asymmetric, (2) symmetric • Key findings • Mini-Me significantly improved performance time (task 1) • Mini-Me significantly improved Social Presence scores • 63% (task 2) – 75% (task 1) of users preferred Mini-Me “The ability to see the small avatar … enhanced the speed of solving the task”
  55. Technology Trends • Advanced displays • Wide FOV, high resolution • Real time space capture • 3D scanning, stitching, segmentation • Natural gesture interaction • Hand tracking, pose recognition • Robust eye-tracking • Gaze points, focus depth • Emotion sensing/sharing • Physiological sensing, emotion mapping
  56. Sensor Enhanced HMDs • HP Omnicept: eye tracking, heart rate, pupillometry, and face camera • Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
  57. Brain Synchronization
  58. Pre-training (Finger Pointing) Session Start
  59. Post-Training (Finger Pointing) Session End
  60. Brain Synchronization in VR Gumilar, I., Sareen, E., Bell, R., Stone, A., Hayati, A., Mao, J., ... & Billinghurst, M. (2021). A comparative study on inter-brain synchrony in real and virtual environments using hyperscanning. Computers & Graphics, 94, 62-75.
  61. NeuralDrum • Using brain synchronicity to increase connection • Collaborative VR drumming experience • Measure brain activity using 3 EEG electrodes • Use PLV (phase-locking value) to calculate synchronization • More synchronization increases graphics effects/immersion Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
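The PLV measure on slide 61 has a standard definition: take the instantaneous phase of each signal from its analytic signal, then compute the magnitude of the mean phase-difference vector. The NumPy-only sketch below (Hilbert transform built from the FFT) illustrates that calculation; the test signals are synthetic sinusoids, not NeuralDrum's three-electrode EEG pipeline.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the standard FFT construction
    (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    gain = np.zeros(n)
    gain[0] = 1.0
    if n % 2 == 0:
        gain[n // 2] = 1.0
        gain[1:n // 2] = 2.0
    else:
        gain[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * gain)

def phase_locking_value(sig_a, sig_b):
    """PLV between two equal-length signals: the magnitude of the mean
    phase-difference vector. 1.0 = perfectly phase locked, ~0 = unrelated."""
    phase_a = np.angle(analytic_signal(sig_a))
    phase_b = np.angle(analytic_signal(sig_b))
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))
```

In a system like NeuralDrum, a value like this (computed per electrode pair over a sliding window) could drive the graphics effects, so that higher inter-brain synchrony produces a more immersive shared scene.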
  62. Set Up • HTC Vive HMD • OpenBCI • 3 EEG electrodes
  63. https://www.youtube.com/watch?v=aG261GfiR90
  64. Results "It’s quite interesting, I actually felt like my body was exchanged with my partner." Poor Player Good Player
  65. Empathic Tele-Existence • Advanced displays • Real time space capture • Natural gesture interaction • Robust eye-tracking • Emotion sensing/sharing
  66. Empathic Tele-Existence • Covering the entire Metaverse • AR, VR, Lifelogging, Mirror Worlds • Based on Empathic Computing • Creating shared understanding • Transforming collaboration • Observer to participant • Feeling of doing things together • Supporting implicit collaboration
  67. Conclusions • Communication Trends • Natural collaboration, Experience capture, Implicit understanding • Empathic Computing • Systems that enhance understanding • Combining AR, VR, Physiological sensors • Research directions • Sharing communication cues, brain synchronization • Empathic Tele-Existence
  68. Metaverse Components • Four Key Components • Virtual Worlds • Augmented Reality • Mirror Worlds • Lifelogging
  69. Delivering the Entire Metaverse
  70. Metaverse Boundaries
  71. Possible Research Directions • Lifelogging to VR • Bringing real world actions into VR, VR to experience lifelogging data • AR to Lifelogging • Using AR to view lifelogging data in everyday life, Sharing physiological data • Mirror Worlds to VR • VR copy of the real world, Mirroring real world collaboration in VR • AR to Mirror Worlds • Visualizing the past in place, Asymmetric collaboration • And more…
  72. www.empathiccomputing.org @marknb00 mark.billinghurst@auckland.ac.nz
