Mark Billinghurst, Director at HIT Lab NZ / University of South Australia
6 Oct 2021
Comp4010 Lecture 9: VR Input and Systems
Lecture 9 of the COMP 4010 course in AR/VR from the University of South Australia. This was taught by Mark Billinghurst on October 5th, 2021. This lecture describes VR input devices, VR systems and rapid prototyping tools.
3. Reality vs. Virtual Reality
• In a VR system there are input and output devices between human perception and action
4. Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to stimulate the senses
• HMD stimulates eyes
[Diagram: Example of visual simulation. Visual Simulation software drives the 3D Graphics System; the HMD stimulates Vision, which reaches the Brain. The Human-Machine Interface sits between system and user.]
5. Key Technologies for VR Systems
• Display (Immersion)
• Stimulate senses
• visual, auditory, tactile, etc.
• Tracking (Independence)
• Changing viewpoint
• independent movement
• Input Devices (Interaction)
• Supporting user interaction
• User input
13. Projection/Large Display Technologies
• Room Scale Projection
• CAVE, multi-wall environment
• Dome projection
• Hemisphere/spherical display
• Head/body inside
• Vehicle Simulator
• Simulated visual display in windows
14. CAVE
• Developed in 1992 at EVL, University of Illinois Chicago
• Multi-walled stereo projection environment
• Head tracked active stereo
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
15. Vehicle Simulators
• Combine VR displays with vehicle
• Visual displays on windows
• Motion base for haptic feedback
• Audio feedback
• Physical vehicle controls
• Steering wheel, flight stick, etc
• Full vehicle simulation
• Emergencies, normal operation, etc
• Weapon operation
• Training scenarios
16. Audio Displays
• Spatialization vs. Localization
• Spatialization is the processing of sound signals to make them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the source position of a sound
• This is a human topic, i.e., some people are better at it than others.
17. Stereo Sound
• Seems to come from inside the user's head
• Follows head motion as the user moves their head
18. 3D Spatial Sound
• Seems to be external to the head
• Stays fixed in space when the user moves their head
• Has reflected sound properties
19. Head-Related Transfer Functions (HRTFs)
• A set of functions that model how sound from a source at a known location reaches the eardrum
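A minimal sketch of how HRTF-based spatialization is applied in practice (illustrative, not a specific audio API): the mono source signal is convolved with the left- and right-ear head-related impulse responses (HRIRs) measured for the source direction. In a real system the HRIRs come from a measured database and are re-selected as the source or head moves.

```cpp
#include <cstddef>
#include <vector>

// Illustrative HRTF spatializer: convolve a mono signal with the left and
// right head-related impulse responses (HRIRs) for one source direction.
struct StereoBuffer {
    std::vector<float> left, right;
};

std::vector<float> convolve(const std::vector<float>& x,
                            const std::vector<float>& h) {
    std::vector<float> y(x.size() + h.size() - 1, 0.0f);
    for (std::size_t n = 0; n < x.size(); ++n)
        for (std::size_t k = 0; k < h.size(); ++k)
            y[n + k] += x[n] * h[k];   // direct-form FIR convolution
    return y;
}

StereoBuffer spatialize(const std::vector<float>& mono,
                        const std::vector<float>& hrirLeft,
                        const std::vector<float>& hrirRight) {
    // Each ear hears the source filtered by its own transfer function, which
    // encodes the interaural time/level differences and pinna filtering that
    // let listeners localize the sound outside the head.
    return { convolve(mono, hrirLeft), convolve(mono, hrirRight) };
}
```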
21. Haptic Feedback
• Greatly improves realism
• Hands and wrists are most important
• High density of touch receptors
• Two kinds of feedback:
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight and inertia
• Actively resists contact motion
22. Active Haptics
• Actively resists motion
• Key properties
• Force resistance
• Frequency Response
• Degrees of Freedom
• Latency
23. Passive Haptics
• Not controlled by system
• Use real props (Styrofoam for walls)
• Pros
• Cheap
• Large scale
• Accurate
• Cons
• Not dynamic
• Limited use
29. Magnetic Tracker (Active)
• Idea: measure the magnetic field generated by a transmitter at a receiver
• ++: 6DOF, robust
• -- : wired, sensitive to metal, noisy, expensive
• -- : error increases with distance
Flock of Birds (Ascension)
30. Inertial Tracker (Passive)
• Idea: measure linear acceleration and angular rotation rates (accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• -- : drift, hysteresis, only 3DOF
IS300 (Intersense)
Wii Remote
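Why inertial trackers drift: orientation comes from integrating the gyroscope rate over time, so even a tiny constant sensor bias accumulates into an ever-growing angle error. A single-axis sketch with made-up numbers:

```cpp
#include <cstdio>

// Single-axis sketch of inertial drift: integrating a gyro rate that
// carries a small constant bias accumulates a growing angle error even
// when the sensor is perfectly still. Numbers are illustrative.
int main() {
    const double dt = 0.001;      // 1 kHz sample rate (assumed)
    const double bias = 0.01;     // constant gyro bias, deg/s (assumed)
    double angle = 0.0;           // integrated orientation estimate, deg
    for (int i = 1; i <= 60000; ++i) {
        const double measuredRate = 0.0 + bias;  // true rate is zero
        angle += measuredRate * dt;              // naive integration
        if (i % 20000 == 0)
            std::printf("t = %2d s, drift = %.2f deg\n", i / 1000, angle);
    }
    // After a minute the estimate is 0.6 deg off with no motion at all,
    // which is why IMU orientation is usually fused with another sensor.
}
```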
39. Motivation
• Mouse and keyboard are good for desktop UI tasks
• Text entry, selection, drag and drop, scrolling, rubber banding, …
• 2D mouse for 2D windows
• What devices are best for 3D input in VR?
• Use multiple 2D input devices?
• Use new types of devices?
40. Input Device Characteristics
• Size and shape, encumbrance
• Degrees of Freedom
• Integrated (mouse) vs. separable (Etch A Sketch)
• Direct vs. indirect manipulation
• Relative vs. Absolute input
• Relative: measure difference between current and last input (mouse)
• Absolute: measure input relative to a constant point of reference (tablet)
• Rate control vs. position control
• Isometric vs. Isotonic
• Isometric: measure pressure or force with no actual movement
• Isotonic: measure free movement of the device (e.g. mouse)
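The relative/absolute distinction is easy to see in code: a relative device reports differences that the application accumulates, while an absolute device reports a pose against a fixed reference that simply overwrites the state. A hypothetical sketch:

```cpp
// Hypothetical sketch contrasting relative and absolute input mapping.
struct Cursor { float x = 0.0f, y = 0.0f; };

// Relative device (mouse): each report is a difference from the last
// reading, so the application accumulates deltas.
void applyRelative(Cursor& c, float dx, float dy) {
    c.x += dx;
    c.y += dy;
}

// Absolute device (tablet, 6DOF tracker): each report is measured against
// a constant reference frame, so it simply overwrites the state.
void applyAbsolute(Cursor& c, float x, float y) {
    c.x = x;
    c.y = y;
}
```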
41. Hand Input Devices
• Devices that integrate hand input into VR
• World-Grounded input devices
• Devices fixed in real world (e.g. joystick)
• Non-Tracked handheld controllers
• Devices held in hand, but not tracked in 3D (e.g. Xbox controller)
• Tracked handheld controllers
• Physical device with 6 DOF tracking inside (e.g. Vive controllers)
• Hand-Worn Devices
• Gloves, EMG bands, rings, or devices worn on hand/arm
• Bare Hand Input
• Using technology to recognize natural hand input
42. World Grounded Devices
• Devices constrained or fixed in real world
• Not ideal for VR
• Constrains user motion
• Good for VR vehicle metaphor
• Used in location based entertainment (e.g. Disney Aladdin ride)
Disney Aladdin Magic Carpet VR Ride
43. Eight360 - Nova
• World grounded interface
• Mounted in sphere
• Full 360-degree range of motion
• Vehicle motion simulator
• https://www.eight360.com/
45. Non-Tracked Handheld Controllers
• Devices held in hand
• Buttons, joysticks, game controllers, etc.
• Traditional video game controllers
• Xbox controller
46. Tracked Handheld Controllers
• Handheld controller with 6 DOF tracking
• Combines button/joystick input plus tracking
• One of the best options for VR applications
• Physical prop enhancing VR presence
• Providing proprioceptive, passive haptic touch cues
• Direct mapping to real hand motion
HTC Vive Controllers / Oculus Touch Controllers
47. Example: WMR Handheld Controllers
• Windows Mixed Reality Controllers
• Left and right hand
• Combine computer vision + IMU tracking
• Track both in and out of view
• Button input, Vibration feedback
49. Cubic Mouse
• Plastic box
• Polhemus Fastrack inside (magnetic 6 DOF tracking)
• 3 translating rods, 6 buttons
• Two handed interface
• Supports object rotation, zooming, cutting plane, etc.
Fröhlich, B., & Plate, J. (2000). The cubic mouse: a new device for three-dimensional input. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 526-531). ACM.
51. Hand Worn Devices
• Devices worn on hands/arms
• Glove, EMG sensors, rings, etc.
• Advantages
• Natural input with potentially rich gesture interaction
• Hands can be held in comfortable positions – no line of sight issues
• Hands and fingers can fully interact with real objects
53. Data Gloves
• Bend sensing gloves
• Passive input device
• Detecting hand posture and gestures
• Continuous raw data from bend sensors
• Fibre optic, resistive ink, strain-gauge
• Large DOF output, natural hand output
• Pinch gloves
• Conductive material at fingertips
• Determine if fingertips touching
• Used for discrete input
• Object selection, mode switching, etc.
54. How Pinch Gloves Work
• Contact between conductive fabric completes a circuit
• Each finger receives voltage in turn (T3 – T7)
• Look for output voltage at different times
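A hedged, Arduino-style sketch of the time-multiplexed scan described above: each fingertip pad is driven high in turn, and reading the remaining pads shows which fingertips are in contact. Pin numbers and the pull-down wiring are assumptions for illustration.

```cpp
// Arduino-style sketch of time-multiplexed pinch detection. Pin numbers
// are made up; assumes an external pull-down resistor on each pad so an
// untouched pad reads LOW.
const int PAD[5] = {3, 4, 5, 6, 7};   // one digital pin per fingertip pad

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int drive = 0; drive < 5; ++drive) {
    for (int p = 0; p < 5; ++p) pinMode(PAD[p], INPUT);  // high impedance
    pinMode(PAD[drive], OUTPUT);
    digitalWrite(PAD[drive], HIGH);    // put voltage on one pad at a time
    for (int sense = 0; sense < 5; ++sense) {
      if (sense == drive) continue;
      // Contact between conductive fingertips completes the circuit, so
      // the sensed pad reads HIGH only while touching the driven pad.
      if (digitalRead(PAD[sense]) == HIGH) {
        Serial.print("pinch: ");
        Serial.print(drive);
        Serial.print("-");
        Serial.println(sense);
      }
    }
  }
}
```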
57. Bare Hands
• Using computer vision to track bare hand input
• Creates compelling sense of Presence, natural interaction
• Challenges that need to be solved
• No sense of touch
• Line of sight required to sensor
• Fatigue from holding hands in front of sensor
58. Oculus Quest 2 – Hand Tracking
https://www.youtube.com/watch?v=uztFcEA6Rf0
59. Non-Hand Input Devices
• Capturing input from other parts of the body
• Head Tracking
• Use head motion for input
• Eye Tracking
• Largely unexplored for VR
• Face Tracking
• Use for lip syncing
• Microphones
• Audio input, speech
• Full-Body tracking
• Motion capture, body movement
60. Eye Tracking
• Technology
• Shine IR light into eye and look for reflections
• Advantages
• Provides natural hands-free input
• Gaze provides cues as to user attention
• Can be combined with other input technologies
61. HTC Vive Pro Eye
• HTC Vive Pro with integrated eye-tracking
• Tobii systems eye-tracker
• Easy calibration and set-up
• Auto-calibration software compensates for HMD motion
65. Full Body Tracking
• Adding full-body input into VR
• Creates illusion of self-embodiment
• Significantly enhances sense of Presence
• Technologies
• Motion capture suit, camera based systems
• Can track large number of significant feature points
66. Camera Based Motion Capture
• Use multiple cameras
• Reflective markers on body
• E.g. OptiTrack (www.optitrack.com)
• 120 – 360 fps, < 10ms latency, < 1mm accuracy
72. Omnidirectional Treadmills
• Infinadeck
• 2 axis treadmill, flexible material
• Tracks user to keep them in centre
• Limitless walking input in VR
• www.infinadeck.com
78. Creating a Good VR Experience
• Creating a good experience requires good system design
• Integrating multiple hardware, software, interaction, content elements
79. Example: Shard VR Slide
Ride down the Shard at 100 mph - Multi-sensory VR
https://www.youtube.com/watch?v=HNXYoEdBtoU
80. Key Components to Consider
• Five key components:
• Inputs
• Outputs
• Computation/Simulation
• Content/World database
• User interaction
From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality: Interface, application, and design. Morgan Kaufmann.
81. Typical VR System
• Combining multiple technology elements for good user experience
• Input devices, output modality, content databases, networking, etc.
82. From Content to User
[Diagram: From content to user. Content (3D models, textures; CAD data via translation) feeds application programming and a dynamics generator (Software); renderers (3D graphics, sound) drive output devices (HMD, audio, haptics); input devices (gloves, mic, trackers) capture user actions (speak, grab). Layers: Content, Software, User I/O.]
83. Types of VR Graphics Content
• Panoramas
• 360 images/video
• Captured 3D content
• Scanned objects/spaces
• Modelled Content
• Hand created 3D models
• Existing 3D assets
84. Capturing Panoramas
• Stitching individual photos together
• Image Composite Editor (Microsoft)
• AutoPano (Kolor)
• Using a 360° camera
• Ricoh Theta-S
• Insta360
88. Stereo Video Capture
• Use camera pairs to capture stereo 360° video
• Vuze+ VR camera
• 8 lenses, 4K stereoscopic 3D 360° video and photo, $999 USD
89. 3D Scanning
• A range of products support 3D scanning
• Create point cloud or mesh model
• Typically combine RGB cameras with depth sensing
• Captures texture plus geometry
• Multi-scale
• Object Scanners
• Handheld, Desktop
• Body Scanners
• Rotating platform, multi-camera
• Room scale
• Mobile, tripod mounted
90. Example: Matterport
• Matterport Pro2 3D scanner
• Room scale scanner, panorama and 3D model
• 360° (left-right) x 300° (vertical) field of view
• Structured light (infrared) 3D sensor
• 15 ft (4.5 m) maximum range
• 4K HDR images
92. Handheld/Desktop Scanners
• Capture people/objects
• Sense 3D scanner
• accuracy of 0.90 mm, colour resolution of 1920×1080 pixels
• Occipital Structure sensor
• Add-on to iPad, mesh scanning, IR light projection, 60 Hz
94. 3D Modelling
• A variety of 3D modelling tools can be used
• Export in VR compatible file format (.obj, .fbx, etc)
• Especially useful for animation - difficult to create from scans
• Popular tools
• Blender (free), 3DS max, Maya, etc.
• Easy to Use
• Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
95. Modelling in VR
• Several tools for modelling in VR
• Natural interaction, low polygon count, 3D object viewing
• Low end
• Google Blocks
• High end
• Quill, Tilt Brush – 3D painting
• Gravity Sketch – 3D CAD
97. Download Existing VR Content
• Many locations for 3D objects, textures, etc.
• Sketchfab, Sketchup, Free3D (www.free3d.com), etc.
• Asset stores - Unity, Unreal
• Provide 3D models, materials, code, etc..
99. Traditional 3D Graphics Pipeline
• Low-level code for loading models and showing them on screen
• Using shaders and low-level GPU programming to improve graphics
100. Graphics Challenges with VR
• Higher data throughput (> 7x desktop requirement)
• Lower latency requirements (from 150ms/frame to 20ms)
• HMD Lens distortion
101. Lens Distortion
• HMD may have cheap lenses
• Creates chromatic aberration and a distorted image
• Warp graphics images to create an undistorted view
• Use low-level shader programming
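The standard correction is a radial pre-warp applied per pixel: each texture coordinate is scaled away from the lens centre by a polynomial in the squared radius, so the rendered image is barrel-distorted in exactly the way the lens undoes. In practice this runs in a fragment shader; below is a CPU-side C++ sketch of the same math, with illustrative coefficient values.

```cpp
struct Vec2 { float x, y; };

// Radial distortion pre-warp: the per-pixel math a correction fragment
// shader runs. Coefficients are lens specific; these values are
// illustrative only.
Vec2 warpUV(Vec2 uv, Vec2 lensCenter) {
    const float k0 = 1.0f, k1 = 0.22f, k2 = 0.24f, k3 = 0.0f; // assumed
    const float dx = uv.x - lensCenter.x;
    const float dy = uv.y - lensCenter.y;
    const float r2 = dx * dx + dy * dy;    // squared radius from lens centre
    const float scale = k0 + r2 * (k1 + r2 * (k2 + r2 * k3));
    // Sampling further from the centre as radius grows barrel-distorts the
    // rendered image, cancelling the lens's pincushion distortion.
    return Vec2{ lensCenter.x + dx * scale, lensCenter.y + dy * scale };
}
```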
103. Perception Based Graphics
• Eye Physiology
• Cones in eye centre (fovea) = colour vision; rods in periphery = motion, B+W
• Foveated Rendering
• Use eye tracking to draw highest resolution where user looking
• Reduces graphics throughput
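A sketch of the central decision in foveated rendering: choose a shading resolution for each screen region from its angular distance to the tracked gaze point. The eccentricity thresholds here are illustrative assumptions.

```cpp
// Pick a shading-rate level for a screen tile from its angular distance
// to the tracked gaze direction. Thresholds are illustrative assumptions.
enum class Detail { Full, Half, Quarter };

Detail shadingRate(float eccentricityDeg) {  // angle from gaze to tile
    if (eccentricityDeg < 5.0f)  return Detail::Full;    // fovea
    if (eccentricityDeg < 15.0f) return Detail::Half;    // near periphery
    return Detail::Quarter;                              // far periphery
}
```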
105. Scene Graphs
• Tree-like structure for organising VR graphics
• e.g. VRML, OSG, X3D
• Hierarchy of nodes that define:
• Groups (and Switches, Sequences etc…)
• Transformations
• Projections
• Geometry
• …
• And states and attributes that define:
• Materials and textures
• Lighting and blending
• …
106. Example Scene Graph
• Car model with four wheels
• Only need one wheel geometry object in scene graph
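A minimal OpenSceneGraph sketch of this idea (file names assumed): the wheel geometry is loaded once and shared by four transform parents, and, as the next slide notes, the whole car moves by updating one group transform.

```cpp
#include <osg/MatrixTransform>
#include <osgDB/ReadFile>

// One wheel model is loaded once and instanced under four transforms; the
// whole car moves by updating the single root transform. File names are
// assumptions for this sketch.
osg::ref_ptr<osg::Node> buildCar() {
    osg::ref_ptr<osg::Node> body  = osgDB::readNodeFile("body.obj");
    osg::ref_ptr<osg::Node> wheel = osgDB::readNodeFile("wheel.obj");

    osg::ref_ptr<osg::MatrixTransform> car = new osg::MatrixTransform;
    car->addChild(body.get());

    const osg::Vec3 offsets[4] = {
        osg::Vec3( 1.0f,  2.0f, 0.0f), osg::Vec3(-1.0f,  2.0f, 0.0f),
        osg::Vec3( 1.0f, -2.0f, 0.0f), osg::Vec3(-1.0f, -2.0f, 0.0f)
    };
    for (const osg::Vec3& o : offsets) {
        osg::ref_ptr<osg::MatrixTransform> t = new osg::MatrixTransform;
        t->setMatrix(osg::Matrix::translate(o));
        t->addChild(wheel.get());   // the same geometry node, shared
        car->addChild(t.get());
    }
    car->setMatrix(osg::Matrix::translate(0.0f, 10.0f, 0.0f)); // move whole car
    return car;
}
```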
107. More Complex
• Everything off root node
• Parent/child node relationships
• Can move car by transforming group node
108. Adding Cameras and Lights
• Scene graph includes:
• Cameras
• Lighting
• Material properties
• Etc..
• All passed to renderer
109. Benefits of Using a Scene Graph
• Performance
• Structuring data facilitates optimization
• Culling, state management, etc…
• Hardware Abstraction
• Underlying graphics pipeline is hidden
• No Low-level programming
• Think about objects, not polygons
• Supports Behaviours
• Collision detection, animation, etc..
110. Scene Graph in the Rendering Pipeline
• Scene graph used to optimize scene creation in pipeline
111. Scene Graph Libraries
• VRML/X3D
• descriptive text format, ISO standard
• OpenInventor
• based on C++ and OpenGL
• originally Silicon Graphics, 1988
• now supported by VSG3d.com
• Java3D
• provides 3D data structures in Java
• not supported anymore
• Open Scene Graph (OSG)
• Various Game Engines
• e.g. JMonkey 3 (scene graph based game engine for Java)
112. OpenSceneGraph
• http://www.openscenegraph.org/
• Open-source scene graph implementation
• Based on OpenGL
• Object-oriented C++ following design pattern principles
• Used for simulation, games, research, and industrial projects
• Active development community
• mailing list, documentation (www.osgbooks.com)
• Uses the OSG Public License (similar to LGPL)
113. OpenSceneGraph Features
• Plugins for loading and saving
• 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)…
• 2D: .png, .jpg, .bmp, QuickTime movies
• NodeKits to extend functionality
• osgTerrain - terrain rendering
• osgAnimation - character animation
• osgShadow - shadow framework
• Multi-language support
• C++, Java, Lua and Python
• Cross-platform support:
• Windows, Linux, MacOS, iOS, Android, etc.
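To give a feel for the API, a minimal complete OSG program (the model file name is an assumption): the plugin system picks a loader from the file extension, and osgViewer supplies the window and render loop.

```cpp
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

// Minimal OpenSceneGraph application: the osgDB plugin matching the file
// extension loads the model, and osgViewer creates a window and runs the
// render loop. "cow.osg" is a sample model name, assumed to be available.
int main() {
    osgViewer::Viewer viewer;
    viewer.setSceneData(osgDB::readNodeFile("cow.osg"));
    return viewer.run();
}
```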
114. OpenSceneGraph Architecture
• Core: scene graph and rendering functionality
• Plugins: read and write 2D image and 3D model files
• NodeKits: extend core functionality, exposing higher-level node types
115. OpenSceneGraph and Virtual Reality
• Need to create VR wrapper on top of OSG
• Add support for HMDs, device interaction, etc..
• Several viewer nodes available with VR support
• OsgOpenVRViewer: viewing on VR devices compatible with OpenVR/SteamVR
• OsgOculusViewer: OsgViewer with support for the Oculus Rift
120. Typical System Delays
• Total Delay = 50 + 2 + 33 + 17 = 102 ms
• 1 ms delay = 1/3 mm error for an object drawn at arm's length
• So ~34 mm of error from when the user begins moving to when the object is drawn
[Diagram: Application loop. Tracking (20 Hz = 50 ms) -> Calculate Viewpoint / Simulation (500 Hz = 2 ms) -> Render Scene (30 Hz = 33 ms) -> Draw to Display (60 Hz = 17 ms); the tracker supplies position (x, y, z) and orientation (r, p, y).]
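The budget above is simply one period per pipeline stage; a small sketch reproducing the arithmetic:

```cpp
#include <cstdio>

// Reproduce the slide's latency budget: one period per pipeline stage,
// then the head-motion error at 1/3 mm per ms of delay at arm's length.
int main() {
    const double rateHz[4] = {20.0, 500.0, 30.0, 60.0}; // tracking, viewpoint,
                                                        // render, display
    double totalMs = 0.0;
    for (double r : rateHz) totalMs += 1000.0 / r;      // 50 + 2 + 33 + 17
    std::printf("total delay: %.0f ms\n", totalMs);               // ~102 ms
    std::printf("error at arm's length: ~%.0f mm\n", totalMs / 3.0); // ~34 mm
}
```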
121. Living with High Latency (1/3 sec – 3 sec)
https://www.youtube.com/watch?v=_fNp37zFn9Q
122. Effects of System Latency
• Degraded Visual Acuity
• Scene still moving when head stops = motion blur
• Degraded Performance
• As latency increases it’s difficult to select objects etc.
• If latency > 120 ms, training doesn’t improve performance
• Breaks-in-Presence
• If system delay is high the user doesn't believe they are in VR
• Negative Training Effects
• Users train to operate in a world with delay
• Simulator Sickness
• Latency is greatest cause of simulator sickness
124. What Happens When Senses Don’t Match?
• 20-30% VR users experience motion sickness
• Sensory Conflict Theory
• Visual cues don’t match vestibular cues
• Eyes – “I’m moving!”, Vestibular – “No, you’re not!”
125. Avoiding Motion Sickness
• Better VR experience design
• More natural movements
• Improved VR system performance
• Less tracking latency, better graphics frame rate
• Provide a fixed frame of reference
• Ground plane, vehicle window
• Add a virtual nose
• Provides a peripheral cue
• Eat ginger
• Reduces upset stomach
126. Many Causes of Simulator Sickness
• 25-40% of VR users get Simulator Sickness, due to:
• Latency
• Major cause of simulator sickness
• Tracking accuracy/precision
• Seeing world from incorrect position, viewpoint drift
• Field of View
• Wide field of view creates more peripheral vection = sickness
• Refresh Rate/Flicker
• Flicker/low refresh rate creates eye fatigue
• Vergence/Accommodation Conflict
• Creates eye strain over time
• Eye separation
• If IPD doesn't match the inter-image distance then discomfort
128. The VR Book
• The VR Book: Human-Centered
Design for Virtual Reality
• Jason Jerald
• https://thevrbook.net/
129. System Design Guidelines - I
• Hardware
• Choose HMDs with fast pixel response time, no flicker
• Choose trackers that are accurate, with high update rates and no drift
• Choose HMDs that are lightweight, comfortable to wear
• Use hand controllers with no line-of-sight requirements
• System Calibration
• Have virtual FOV match actual FOV of HMD
• Measure and set the user's IPD
• Latency Reduction
• Minimize overall end-to-end system delay
• Use displays with fast response time and low persistence
• Use latency compensation to reduce perceived latency
Jason Jerald, The VR Book, 2016
130. System Design Guidelines - II
• General Design
• Design for short user experiences
• Minimize visual stimuli closer to eye (vergence/accommodation)
• For binocular displays, do not use 2D overlays/HUDs
• Design for sitting, or provide physical barriers
• Show virtual warning when user reaches end of tracking area
• Motion Design
• Move virtual viewpoint with actual motion of the user
• If latency is high, avoid tasks requiring fast head motion
• Interface Design
• Design input/interaction for user’s hands at their sides
• Design interactions to be non-repetitive to reduce strain injuries
Jason Jerald, The VR Book, 2016
148. Gravity Sketch
•Intuitive immersive 3D design platform
•Move from sketch to final 3D model render
•Natural 3D UI manipulation
•Two handed input, 3D menus, etc
•Multi-platform
•HMD (Quest, Steam), tablet, etc
•Support for collaboration
150. Scene Assembly
•Assemble assets into 3D scene
•Create high-fidelity view
•Collect user feedback
•Immersive Scene Assembly
•Microsoft Maquette: https://www.maquette.ms/
•Sketchbox: https://www.sketchbox3d.com/
151. Example: Mocking up a Scene - Twitch VR
https://www.notion.so/Twitch-VR-Prototype-Workflow-5cf65c7bfcd84226a87b1c0db07e46f2
152. Prototyping Process
1. Collect tools needed
2. Know/imagine the user
3. Storyboard/sketch the experience
4. Create assets needed
5. Create the interface scenes
6. View in VR/record video
158. 3D Scene Layout
•Scene layout in VR
•Sketchbox
•https://www.sketchbox3d.com/
•Fast and simple VR prototyping tool
•Collaborative VR design tool
•Export to SketchFab/fbx
163. Digital Authoring Tools for VR
• Support visual authoring of 3D
scene graphs with VR previews
• Basic interactions can be
implemented without coding
• Advanced interactions require
JavaScript, C#, or C++
Amazon Sumerian
Unity Editor
164. Immersive Authoring Tools for VR
• Enable visual authoring of 3D
content in VR
• Make it possible to edit while
previewing VR experience
• Focus on 3D modeling rather than
animation & scripting
• Typically support export to common
3D model formats and asset sharing
platforms like Google Poly,
Sketchfab, or 3D Warehouse
Google Blocks
Oculus Quill
165. Interactive 360 Prototyping for VR
•Create 360 images and add interactive elements
•Many possible tools
•InstaVR
•http://www.instavr.co/
•Free, fast panorama VR
•Drag and drop web interface
•Deploy to multiple platforms (Quest, Vive, phone, etc)
•VR Direct
•https://www.vrdirect.com/
•Connect multiple 360 scenes
•Instant content update
•EasyVR
•https://www.360easyvr.com/
167. VR Visual Programming
• Drag and drop VR development
• Visual Programming for Unity
• VR Easy - http://blog.avrworks.com/
• Key VR functionality (navigation, etc)
• HMD and VR controller support
• Bolt
• Rich visual flow
• Integrated with Unity
• Playmaker - https://hutonggames.com/
• Popular game authoring tool
• Can be combined with VR toolkits
168. Video Demo - VR Easy
https://www.youtube.com/watch?v=L49ENduYgac