Novel Interfaces for AR Systems

Mark Billinghurst
mark.billinghurst@unisa.edu.au
September 2022
THE EMPATHIC COMPUTING LABORATORY
Two Amazing Locations
• University of South Australia, Adelaide
• Highest number of AR research outputs in the world
• University of Auckland, Auckland
• Top-ranked university in New Zealand
One Amazing Team
• Staff (6)
• 2 faculty, 3 post docs, 1 engineer
• Students (39)
• 17 PhD (+2), 2 Masters, 6 undergraduate, 14 interns
“Only through communication can Human Life hold meaning.”
Paulo Freire
Modern Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
[Diagram: Natural Collaboration + Implicit Understanding + Experience Capture → Empathic Computing]
“Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.”
Alfred Adler
Empathic Computing Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Key Elements of Empathic Systems
• Understanding
• Emotion recognition, physiological sensors
• Experiencing
• Content/environment capture, VR
• Sharing
• Communication cues, AR
Example Projects
• Remote collaboration in Wearable AR
• Sharing of non-verbal cues (gaze, pointing, face expression, emotion)
• Shared Empathic VR experiences
• Use VR to put a viewer inside the player’s view
• Measuring emotion
• Detecting emotion from heart rate, GSR, eye gaze, etc.
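As a rough illustration of the last point, the sketch below shows how simple arousal-related features could be pulled from raw GSR and PPG traces: tonic skin conductance level, phasic response rate, and heart rate. This is a minimal sketch for illustration only, not the lab's actual pipeline; sample rates and thresholds are assumptions.

```python
# Minimal sketch: crude arousal features from GSR and PPG signals.
# Illustrative only; sample rate, thresholds and names are assumptions.
import numpy as np
from scipy.signal import find_peaks

def arousal_features(gsr, ppg, fs):
    """Return simple arousal-related features from GSR and PPG traces."""
    # Tonic skin conductance level (crude proxy via the median of the trace).
    tonic_level = float(np.median(gsr))
    # Phasic responses: count prominent peaks above the tonic baseline.
    scr_peaks, _ = find_peaks(gsr - tonic_level, prominence=0.05, distance=fs)
    # Heart rate from PPG: detect pulse peaks, convert mean interval to BPM.
    pulse_peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))
    ibi = np.diff(pulse_peaks) / fs                 # inter-beat intervals (s)
    heart_rate = 60.0 / ibi.mean() if len(ibi) else float("nan")
    return {"scl": tonic_level,
            "scr_per_min": len(scr_peaks) / (len(gsr) / fs / 60.0),
            "heart_rate_bpm": heart_rate}

# Example: 60 s of synthetic data at 128 Hz (pulse wave of ~72 BPM).
fs = 128
t = np.arange(0, 60, 1 / fs)
gsr = 2.0 + 0.1 * np.sin(0.05 * 2 * np.pi * t) + 0.02 * np.random.randn(len(t))
ppg = np.sin(2 * np.pi * 1.2 * t)
print(arousal_features(gsr, ppg, fs))
```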
Empathy Glasses (2016)
• Combines eye-tracking, a display, and face expression sensing
• Implicit cues – eye gaze, face expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing
• Face expression display
• Implicit cues for remote collaboration
Shared Sphere – 360 Video Sharing
[Figure: Host User and Guest User connected through a shared live 360 video]
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
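To make the sharing concrete, here is a small sketch of how a remote user's view or pointing direction can be mapped onto a shared equirectangular 360 frame (for example, to overlay a pointer on the panorama). This is an illustrative sketch of the general technique, not the actual Shared Sphere implementation; the frame size and function names are assumptions.

```python
# Sketch (not the actual Shared Sphere code): map a 3D view/point direction
# to pixel coordinates in an equirectangular 360 frame, e.g. to draw a
# remote user's pointer onto the shared panorama.
import numpy as np

def direction_to_equirect(d, width, height):
    """d: direction in camera space (x right, y up, z forward)."""
    x, y, z = d / np.linalg.norm(d)
    yaw = np.arctan2(x, z)            # -pi..pi, 0 = straight ahead
    pitch = np.arcsin(y)              # -pi/2..pi/2
    u = (yaw / (2 * np.pi) + 0.5) * width
    v = (0.5 - pitch / np.pi) * height
    return int(u) % width, int(np.clip(v, 0, height - 1))

# A guest looking 30 degrees right and 10 degrees up in a 3840x1920 frame:
yaw, pitch = np.radians(30), np.radians(10)
d = np.array([np.sin(yaw) * np.cos(pitch), np.sin(pitch), np.cos(yaw) * np.cos(pitch)])
print(direction_to_equirect(d, 3840, 1920))
```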
3D Live Scene Capture
• Use a cluster of RGBD sensors
• Fuse the depth data into a single 3D point cloud
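A minimal sketch of the fusion step, assuming calibrated sensors with known intrinsics and world poses; the calibration values below are placeholders, not the real rig's parameters, and the function names are illustrative.

```python
# Minimal sketch: back-project each depth image to 3D and transform into a
# shared world frame, then concatenate the resulting point clouds.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, T_world_cam):
    """depth: HxW array in metres; T_world_cam: 4x4 camera-to-world pose."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0                                   # drop missing depth
    x = (u.ravel() - cx) * z / fx
    y = (v.ravel() - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)[:, valid]
    return (T_world_cam @ pts_cam)[:3].T            # Nx3 points in world frame

def fuse(depth_maps, intrinsics, poses):
    """Concatenate point clouds from several RGBD sensors into one cloud."""
    clouds = [depth_to_points(d, *k, T) for d, k, T in zip(depth_maps, intrinsics, poses)]
    return np.vstack(clouds)

# Example: two 240x320 sensors, identity pose and a 1 m translated pose.
K = (300.0, 300.0, 160.0, 120.0)
T0, T1 = np.eye(4), np.eye(4)
T1[0, 3] = 1.0
cloud = fuse([np.full((240, 320), 2.0)] * 2, [K, K], [T0, T1])
print(cloud.shape)   # (2 * 240 * 320, 3)
```

In practice a live system would also downsample, align the sensors (e.g. with ICP), and stream only the changed regions.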
Scene Capture and Sharing
[Figure panels: Scene Reconstruction – Remote Expert – Local Worker; AR View vs. Remote Expert View]
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360° → 3D
NOVEL AR INTERFACES
Current Generation of AR Glasses
• Trend towards lightweight thin displays
• Offload processing onto a second device
• High bandwidth connectivity to cloud services
The Challenge of AR Interaction
AR HMD INTERACTION
Several Techniques
• Controllers
• Gestures
• Touch input
Limitations
• Imprecise input
• Simple graphics
The Challenge of AR Interaction
HANDHELD AR INTERACTION
Advantages
• Precise touch input
• High resolution display
Limitations
• Only viewed on the phone screen
• Narrow field of view
The Opportunity
AR HMDs tethered to handheld devices/phones
○ New opportunities for interaction and display
○ Wide field of view of the HMD plus precise input of the HHD
Motivation
Complementary nature of HMDs/HHDs
Previous Work
Our previous work
• Use touch input on phone to interact with AR HMD content
• Use tablet to provide 2D view of 3D AR conferencing space
Bleeker 2013
Budhiraja 2013
Secondsight
A prototyping platform for rapidly testing cross-device interfaces
• Enables an AR HMD to "extend" the screen of a smartphone
Key Features
• Can simulate a range of HMD Field of View (see the sketch below)
• Enables World-fixed or Device-fixed content placement
• Supports touch screen input, free-hand gestures, head-pose selection
Reichherzer, C., Fraser, J., Rompapas, D. C., & Billinghurst, M. (2021, May). Secondsight: A framework for cross-device augmented
reality interfaces. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-6).
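One simple way to simulate a narrower field of view on the Meta2's 82° display is to mask everything outside the screen fraction that subtends the target angle. The sketch below illustrates that idea under a flat-projection assumption; it is not necessarily how Secondsight implements FOV simulation.

```python
# Sketch: what fraction of an 82-degree display should stay visible when
# simulating a narrower FOV (assumes a simple planar projection).
import math

def fov_mask_fraction(simulated_fov_deg, native_fov_deg=82.0):
    """Fraction of the screen width visible when simulating a smaller FOV."""
    half = math.radians(simulated_fov_deg) / 2
    native_half = math.radians(native_fov_deg) / 2
    return math.tan(half) / math.tan(native_half)

for fov in (15, 30, 55, 82):
    print(f"{fov:>2} deg -> show central {fov_mask_fraction(fov):.0%} of screen width")
```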
Implementation
Hardware
• Meta2 AR Glasses (82° FOV)
• Samsung Galaxy S8 phone
• OptiTrack motion capture system
Software
• Unity game engine
• Mirror networking library
Input
Content Placement
Field of View Simulation
Map application
Puzzle application
Pilot user study (4 users)
Goal: Explore effect of AR HMD FOV
Task: Data visualisation task
● Find correspondences in health data
Conditions: Simulated Field of View
● 15, 30, 55, 82 degrees
Measures
● Subjective feedback on a 5-point Likert scale
○ How easy was it to use the system? (5 = very easy)
● Observing virtual window placement
● Condition ranking
Pilot study results
• Users felt wider FOV was easiest to use
• Most users had similar window placement
• Useful feedback on how to improve user interface
Typical Window Placement
Average Ease of Use Rating (mean ± SD):
FOV      Ease of Use
15 deg   1.00 ± 0.00
30 deg   1.75 ± 0.50
55 deg   2.75 ± 1.25
82 deg   3.75 ± 1.25
COLLABORATIVE SYSTEMS
Social Panoramas
• Capture and share social spaces in real time
• Supports independent views into the panorama
Reichherzer, C., Nassani, A., & Billinghurst, M. (2014, September). [Poster] Social panoramas using wearable
computers. In 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 303-304). IEEE.
Implementation
• Google Glass
• Capture live image panorama (compass + camera)
• Remote device (tablet)
• Immersive viewing, live annotation
Interface
Glass View
Tablet View
Social Panorama
https://www.youtube.com/watch?v=vdC0-UV3hmY
Lessons Learned
• Good
• Communication easy and natural
• Users enjoyed having view independence
• Very natural capturing panorama on Glass
• Sharing panorama enhances the shared experience
• Bad
• Difficult to support equal input
• Need to provide awareness cues
Sharing Virtual Communication Cues (2019)
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
Sharing Virtual Communication Cues
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
Sharing Gaze Cues
Investigating how sharing gaze behavioural cues can improve remote collaboration in a Mixed Reality environment.
➔ Developed eyemR-Vis, a 360 panoramic Mixed Reality remote collaboration system
➔ Showed gaze behavioural cues as bi-directional spatial virtual visualisations shared
between a local host (AR) and a remote collaborator (VR).
Jing, A., May, K. W., Naeem, M., Lee, G., & Billinghurst, M. (2021). eyemR-Vis: Using Bi-Directional Gaze Behavioural Cues to Improve Mixed
Reality Remote Collaboration. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
System Design
➔ 360 Panoramic Camera + Mixed Reality View
➔ Combination of HoloLens2 + Vive Pro Eye
➔ 4 gaze behavioural visualisations:
browse, focus, mutual, fixated circle
[Figure: the four gaze visualisations – Browse, Focus, Mutual, and Fixed Circle-map]
Multi-Scale Collaboration
• Changing the user’s virtual body scale
Sharing: Communication Cues (2018)
• What happens when you can’t see your colleague/agent?
Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M.
(2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the
2018 CHI conference on human factors in computing systems (pp. 1-13).
[Figure panels: Collaborating | Collaborator out of View]
Mini-Me Communication Cues in MR
• When the user loses sight of their collaborator, a Mini-Me avatar appears
• Miniature avatar in real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
Results from User Evaluation
• Collaboration between user in AR, expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric, symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of Presence
• Users preferred Mini-Me
• People felt the task was easier to complete
• 60-75% preference
“I feel like I am talking to my partner”
Technology Trends
• Advanced displays
• Wide FOV, high resolution
• Real time space capture
• 3D scanning, stitching, segmentation
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Sensor Enhanced HMDs
• HP Omnicept: eye tracking, heart rate, pupillometry, and face camera
• Project Galea: EEG, EMG, EDA, PPG, EOG, eye gaze, etc.
Multiple Physiological Sensors into HMD
• Incorporate a range of sensors on the HMD faceplate and over the head
• EMG – muscle movement
• EOG – eye movement
• EEG – brain activity
• EDA, PPG – heart rate
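As an illustration of the kind of low-level signal processing such sensors enable, the sketch below computes per-channel EEG alpha band power with Welch's method. Channel count, sample rate, and amplitudes are assumptions for the example, not Omnicept or Galea specifications.

```python
# Sketch: per-channel EEG band power (alpha, 8-12 Hz) via Welch's PSD,
# the kind of low-level feature an HMD-integrated EEG sensor could expose.
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band=(8.0, 12.0)):
    """eeg: (channels, samples) array; returns power in `band` per channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(2 * fs, eeg.shape[-1]))
    sel = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the PSD over the band (rectangle rule over the frequency bins).
    return psd[:, sel].sum(axis=-1) * (freqs[1] - freqs[0])

# Example: 10 s of synthetic 4-channel EEG at 256 Hz with a 10 Hz component.
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = 5e-6 * np.sin(2 * np.pi * 10 * t) + 1e-6 * np.random.randn(4, t.size)
print(band_power(eeg, fs))
```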
Brain Synchronization
[Figure panels: Pre-training (Finger Pointing) – Session Start; Post-Training (Finger Pointing) – Session End]
Brain Synchronization in VR
Gumilar, I., Sareen, E., Bell, R., Stone, A., Hayati, A., Mao, J., ... & Billinghurst, M. (2021). A comparative study on inter-
brain synchrony in real and virtual environments using hyperscanning. Computers & Graphics, 94, 62-75.
NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase-locking value) to calculate synchronization (see the sketch below)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
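The phase-locking value between two channels is PLV = |(1/N) Σ exp(i·Δφ)|, where Δφ is the per-sample phase difference of the band-passed signals and phases come from the analytic (Hilbert) signal. Below is a minimal sketch in the spirit of NeuralDrum's synchrony measure; the exact filtering band and pipeline in the paper may differ.

```python
# Sketch of a phase-locking value (PLV) computation between two EEG channels.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band=(8.0, 12.0)):
    """PLV between two equal-length signals in the given frequency band."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    # PLV = |mean over time of exp(i * phase difference)|, in [0, 1].
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Example: two noisy 10 Hz signals with a constant phase offset -> PLV near 1.
fs = 250
t = np.arange(0, 20, 1 / fs)
sig_a = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
sig_b = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(plv(sig_a, sig_b, fs))
```

A higher PLV between the two players' electrodes can then be mapped directly onto the strength of the shared visual effects.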
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
[Figure panels: Poor Player | Good Player]
Fabric Electrodes
• Can use conductive threads to create soft fabric EEG/GSR electrodes
• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
→ Empathic Tele-Existence
Empathic Tele-Existence
• Move from Observer to Participant
• Explicit to Implicit communication
• Experiential collaboration – doing together
CONCLUSIONS
Conclusions
• Empathic Computing Laboratory
• 50 people split between Auckland/Adelaide
• Empathic Computing Research Focus
• Systems that enhance understanding
• Natural collaboration + Experience capture + Implicit understanding
• Combining AR, VR, Physiological sensing
• Research directions
• Hybrid AR, Novel collaboration, brain synchronization, etc.
ARIVE
• Australasian Researchers in Interactive and Virtual Environments
• Bringing together the best AR/VR researchers in Australia and NZ
• 8 institutions, 260 researchers, > 30 million USD in funding
• Looking for Industry Partners for research consortium
www.empathiccomputing.org
@marknb00
mark.billinghurst@auckland.ac.nz