Collaborative Immersive Analytics
Mark Billinghurst, Director at HIT Lab NZ, University of South Australia
Presentation by Mark Billinghurst on Collaborative Immersive Analytics at the BDVA conference, 7 November 2017. This talk provides an overview of the topic of Collaborative Immersive Analytics.
3. Collaborative Visualization
"The shared use of computer-supported, (interactive,) visual representations of data by more than one person with the common goal of contribution to joint information processing activities."
Isenberg, P., Elmqvist, N., Scholtz, J., Cernea, D., Ma, K. L., & Hagen, H. (2011). Collaborative visualization: Definition, challenges, and research agenda. Information Visualization, 10(4), 310-326.
5. Multi-disciplinary Research
• Collaborative Visualisation related to:
• Scientific Visualisation
• Information Visualisation
• Visual Analytics
• All well-established fields over the last 20 years
• But there is little existing research in Collaborative Visualisation
• E.g. from 1990 – 2010, over 1,600 papers appeared in the main Viz conferences
• Only 34 papers were published on Collaborative Visualisation (~2%)
6. Collaborative Immersive Analytics (CIA)
The shared use of new immersive interaction and display technologies by more than one person for supporting collaborative analytical reasoning and decision making.
• Key properties
• Use of immersive technologies
• Computer supported collaboration
• Analytical reasoning and decision making
8. Milgram’s Reality-Virtuality continuum
[Figure: the Reality-Virtuality (RV) Continuum, running from the Real Environment through Augmented Reality (AR) and Augmented Virtuality (AV) to the Virtual Environment, with Mixed Reality covering everything between the two extremes]
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
10. CIA Example: EVL CAVE2
Marai, G. E., Forbes, A. G., & Johnson, A. (2016, March). Interdisciplinary immersive analytics at the Electronic Visualization Laboratory: Lessons learned and upcoming challenges. In 2016 Workshop on Immersive Analytics (IA) (pp. 54-59). IEEE.
11. CAVE2 Hardware and Software
• Hardware
• 36 computers drive 72 LCD panels – a 320-degree display
• 74-megapixel 2D / 37-megapixel passive 3D hybrid reality environment
• 14 Vicon tracking cameras – tracking up to 6 people/objects; 20.2-channel audio
• Software
• OmegaLib – open source software driving CAVE2 (rendering, input, etc.)
• SAGE2 – browser-based collaboration/interaction platform
• Support for hybrid devices + multi-user input
13. ENDURANCE Case Study (2 Days)
• Working with a NASA team
• Explored Lake Bonney in the McMurdo Dry Valleys of Antarctica
• The lake is ice-covered; an Autonomous Underwater Vehicle was used to collect sonar data
• CAVE2 used as a hybrid CIA system
• CAVE walls – shared data representation
• Laptops on tables – private workspaces, individual data analysis
• Shared VR data exploration – virtually swimming through the data
• Subgroups form at the large screen to analyze data
"... the team got more done in 2 days than in 6 months of email, Skype, and Google Hangout."
14. Types of CIA systems
• Classify systems using the CSCW space-time taxonomy (see the sketch below)
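As an illustration of the taxonomy, here is a minimal Python sketch (the class names and example labels are my own assumptions, not from the talk) that tags a CIA system with its space and time dimensions and reports which of the four quadrants, described on the following slides, it falls into.

```python
from dataclasses import dataclass
from enum import Enum

class Space(Enum):
    CO_LOCATED = "co-located"    # same place
    DISTRIBUTED = "distributed"  # different place

class Time(Enum):
    SYNCHRONOUS = "synchronous"    # same time
    ASYNCHRONOUS = "asynchronous"  # different time

@dataclass
class CIASystem:
    name: str
    space: Space
    time: Time

    def quadrant(self) -> str:
        """Name of the CSCW space-time quadrant this system falls into."""
        return f"{self.space.value} {self.time.value} collaboration"

# Example classifications, using systems mentioned in the talk
systems = [
    CIASystem("CAVE2 shared wall", Space.CO_LOCATED, Time.SYNCHRONOUS),
    CIASystem("Social VR (e.g. AltspaceVR)", Space.DISTRIBUTED, Time.SYNCHRONOUS),
    CIASystem("VR message annotations", Space.DISTRIBUTED, Time.ASYNCHRONOUS),
    CIASystem("Public AR annotations", Space.CO_LOCATED, Time.ASYNCHRONOUS),
]

for s in systems:
    print(f"{s.name}: {s.quadrant()}")
```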
15. 1. Co-Located Synchronous Collaboration
• Same time/Same place collaboration
• E.g. CAVE2, shared tables, interactive walls
• Advantages
• Shared awareness, use external tools (laptop, notes)
• Easy to move between individual and group work
17. The MagicBook
• Using AR to transition along Milgram’s continuum
• Moving seamlessly from Reality to AR to VR
• Support for Collaboration
• Face to Face, Shared AR/VR, Multi-scale
• Natural interaction
• Handheld AR and VR viewer
Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional
AR interface. Computers & Graphics, 25(5), 745-753.
19. 2. Distributed Synchronous Collaboration
• Remote people working at same time
• E.g. Collaborative VR, remote tabletops, shared browsers
• Advantages
• Remote users in same collaborative space, spatial cues
• But: cannot convey the same cues as face-to-face interaction
20. Example: Social VR
• Facebook Spaces, AltspaceVR
• Bringing Avatars into VR space
• Natural social interaction
22. 3. Distributed Asynchronous Collaboration
• Collaboration at different time and different place
• E.g. messages in VR, web annotation tools, doc. markup
• Advantages
• Time for a more considered response; work whenever convenient
• Combine information from many sources; better discussions
23. 4. Co-Located Asynchronous Collaboration
• Collaborating at the same location but different times
• E.g. Public displays, shared physical message walls, AR annotations
• Not well studied for information visualisation
• Advantages
• Collaborators viewing same physical space
• Can use external objects to support collaboration (pens, notes)
24. Example: Hydrological Data Visualization
Hydrosys uses AR to display the locations of stations in a global sensor network, as well as interpolated temperature plotted as geodesic contours. It supports asynchronous annotation.
Image: Eduardo Veas and Ernst Kruijff
Veas, E., Kruijff, E., & Mendez, E. (2009). HYDROSYS: First approaches towards on-site monitoring and management with handhelds. In Towards e-environment (EENVI2009), Prague, Czech Republic.
26. Examples
• Many examples
• Mixed Presence tabletops with multiple people at each end
• CAVE VR connecting multiple people
• Advantages
• Support for distributed collaboration, with the benefits of face-to-face groups
• Challenges
• Support for mutual awareness, representation of remote users
27. Lessons Learned
• In co-located systems the following are important:
• supporting different independent viewpoints
• enabling the use of different tools for different data
• supporting face-to-face group work
• supporting different data representations
Marai, G. E., Forbes, A. G., & Johnson, A. (2016, March). Interdisciplinary immersive analytics at the Electronic Visualization Laboratory: Lessons learned and upcoming challenges. In 2016 Workshop on Immersive Analytics (IA) (pp. 54-59). IEEE.
28. General Guidelines
• In general, collaborative systems should support:
• Shared context – knowledge/context around the data
• Awareness of others – being aware of others' actions
• Negotiation and communication – easy conversation
• Flexible and multiple viewpoints – depending on roles
Churchill, E. F., Snowdon, D. N., & Munro, A. J. (Eds.). (2012). Collaborative virtual
environments: digital places and spaces for interaction. Springer Science & Business Media.
29. Importance of Roles
• Asymmetric/symmetric problem solving
• Teacher/student vs. equal collaborators
• Three different levels of engagement [Isenberg 2011]:
• Viewing: people consume a data presentation without interacting with the data, such as in a lecture.
• Interacting/exploring: people have the means to choose alternate views or explore the data.
• Sharing/creating: people are able to create and distribute new datasets and visualizations to be explored.
• Need to design the interface differently for each role
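As a hedged illustration of how these roles might shape an interface, the following minimal Python sketch (the capability names are illustrative assumptions, not from the talk or from Isenberg's paper) maps the three engagement levels to the actions a collaborator can perform.

```python
from enum import Enum

class EngagementLevel(Enum):
    """Isenberg's (2011) three levels of engagement."""
    VIEWING = 1                # consuming a presentation, no interaction
    INTERACTING_EXPLORING = 2  # choosing alternate views, exploring data
    SHARING_CREATING = 3       # creating and distributing new datasets/views

# Illustrative capability sets per level; a real system would refine these.
CAPABILITIES = {
    EngagementLevel.VIEWING: {"view"},
    EngagementLevel.INTERACTING_EXPLORING: {"view", "filter", "change_view"},
    EngagementLevel.SHARING_CREATING: {"view", "filter", "change_view",
                                       "create_visualization", "share_dataset"},
}

def allowed(level: EngagementLevel, action: str) -> bool:
    """Check whether a collaborator at this engagement level may perform an action."""
    return action in CAPABILITIES[level]

# A viewer (e.g. a student in a lecture) cannot create, but a sharing/creating role can.
print(allowed(EngagementLevel.VIEWING, "create_visualization"))          # False
print(allowed(EngagementLevel.SHARING_CREATING, "create_visualization")) # True
```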
30. Methods for Interacting in CIAs
• Goal: Natural interaction that supports collaboration
• Techniques used
• Pointing and gestures – hand or full body
• Dedicated devices – e.g. handheld tablet
• Multimodal – touch + speech
• Tangible interfaces – physical objects
• Collaborative actions – working together
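As a toy example of the multimodal (speech + pointing) technique listed above, the sketch below fuses a speech command with a recent pointing event in the "put-that-there" style; the event types and time window are hypothetical and not tied to any particular CIA platform.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PointingEvent:
    target_id: str     # e.g. the data item a collaborator is pointing at
    timestamp: float

@dataclass
class SpeechEvent:
    command: str       # e.g. "filter", "annotate", "show details"
    timestamp: float

def fuse(speech: SpeechEvent, pointing: Optional[PointingEvent],
         window: float = 1.5) -> Optional[dict]:
    """Combine a speech command with a pointing gesture that occurred
    within `window` seconds, resolving the deictic reference ("that")."""
    if pointing is None or abs(speech.timestamp - pointing.timestamp) > window:
        return None  # no recent gesture to resolve the reference
    return {"action": speech.command, "target": pointing.target_id}

# Saying "annotate that" while pointing at a data cluster
print(fuse(SpeechEvent("annotate", 10.2), PointingEvent("cluster_7", 9.8)))
# -> {'action': 'annotate', 'target': 'cluster_7'}
```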
31. Opportunities for Research
• Many opportunities for research
• Using VR for CIA
• HMD vs. CAVE performance
• Next generation collaboration
• Using AR/VR for face-to-face and remote collaboration
• Evaluation of CIA systems
• Subjective/objective measures, cognitive evaluation
• Methods for Asynchronous collaboration
• Especially remote asynchronous systems
• Novel interaction methods
• Multimodal input, gaze-based systems, etc.
• Exploring the CIA design space
• Interaction metaphors, design patterns
32. Example: Holoportation (2016)
• Augmented Reality + 3D capture + high bandwidth
• http://research.microsoft.com/en-us/projects/holoportation/
34. Example: Empathy Glasses (CHI 2016)
• Combines eye-tracking, a display, and facial expression sensing
• Implicit cues – eye gaze, facial expression
Hardware: Pupil Labs eye tracker + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of
the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
35. AffectiveWear – Emotion Glasses
• Photo sensors to recognize facial expressions
• User calibration
• Machine learning
• Recognizes 8 facial expressions
36. Empathy Glasses in Use
• Eye gaze pointer and remote pointing
• Face expression display
• In the future: integrated eye-tracking and display
39. Conclusion
• Need for research on Collaborative Visualisation
• Less than 5% of Visualisation papers
• New area: Collaborative Immersive Analytics
• Visual Analytics + Mixed Reality + CSCW
• Using Immersive Technologies
• Early promising results – e.g. CAVE2 case studies
• Different classes of CIA systems
• Classify according to space/time taxonomy
• Many directions for future research
• Interaction, evaluation, asynchronous collaboration, etc.