INTRODUCTION TO VR
COMP 4010 Lecture Seven
Mark Billinghurst
September 8th 2022
mark.billinghurst@unisa.edu.au
LECTURE 6 REVIEW
Design in Interaction Design
Key Prototyping Steps
AR Design Considerations
• 1. Design for Humans
• Use Human Information Processing model
• 2. Design for Different User Groups
• Different users may have unique needs
• 3. Design for the Whole User
• Social, cultural, emotional, physical, cognitive
• 4. Use UI Best Practices
• Adapt known UI guidelines to AR/VR
• 5. Use of Interface Metaphors/Affordances
• Decide best metaphor for AR/VR application
1. Design for Human Information Processing
• High level staged model from Wickens and Carswell (1997)
• Relates perception, cognition, and physical ergonomics
Perception → Cognition → Ergonomics
Design for Perception
• Need to understand perception to design AR
• Visual perception
• Many types of visual cues (stereo, oculomotor, etc.)
• Auditory system
• Binaural cues, vestibular cues
• Somatosensory
• Haptic, tactile, kinesthetic, proprioceptive cues
• Chemical Sensing System
• Taste and smell
Improving Depth Perception
Cutaways
Occlusion
Shadows
Design for Cognition
• Design for Working and Long-term memory
• Working memory
• Short-term storage, limited capacity (~5-9 items)
• Long term memory
• Memory recall triggered by associative cues
• Situational Awareness
• Model of current state of user’s environment
• Used for wayfinding, object interaction, spatial awareness, etc..
• Provide cognitive cues to help with situational awareness
• Landmarks, procedural cues, map knowledge
• Support both ego-centric and exo-centric views
Design for Physical Ergonomics
• Design for the human motion range
• Consider human comfort and natural posture
• Design for hand input
• Coarse and fine scale motions, gripping and grasping
• Avoid “Gorilla arm syndrome” from holding arm pose
Gorilla Arm in AR
• Design interface to reduce mid-air gestures
2. Designing for Different User Groups
• Design for Different Ages
• Children require different interface design than adults
• Older users have different needs than younger
• Prior Experience with AR systems
• Familiar with HMDs, AR input devices
• People with Different Physical Characteristics
• Height and arm reach, handedness
• Perceptual, Cognitive and Motor Abilities
• Colour perception varies between people
• Spatial ability, cognitive or motor disabilities
3. Design for the Whole User
4. Use UI Best Practices
• General UI design principles can be applied to AR
• E.g. Shneiderman’s UI guidelines from 1998
• Providing interface feedback
• Mixture of reactive, instrumental and operational feedback
• Maintain spatial and temporal correspondence
• Use constraints
• Specify relations between variables that must be satisfied
• E.g. physical constraints reduce freedom of movement
• Support Two-Handed control
• Use Guiard’s framework of bimanual manipulation
• Dominant vs. non-dominant hands
• Interface Components
• Physical components
• Display elements
• Visual/audio
• Interaction metaphors
[Figure: interface components: physical elements (input) → interaction metaphor → display elements (output)]
5. Use Interface Metaphors
AR Design Space
[Figure: design space continuum from Reality (physical design) through Augmented Reality to Virtual Reality (virtual design)]
• AR design is a mixture of physical affordances and virtual affordances
• Physical
• Tangible controllers and objects
• Virtual
• Virtual graphics and audio
Affordances in AR
• Design AR interface objects to show how they are used
• Use visual and physical cues to show possible affordances
• Perceived affordances should match actual affordances
• Physical and virtual affordances should match
Merge Cube Tangible Molecules
AR Chemistry Input Devices
Summary
•When designing AR interfaces, think of:
• Physical Components
• Physical affordances
• Virtual Components
• Virtual affordances
• Interface Metaphors
• Tangible AR or similar
INTRODUCTION TO VR
From Reality to Virtual Reality
[Figure: continuum from the Real World to the Virtual World: Internet of Things → Augmented Reality → Virtual Reality]
Virtual Reality (VR)
• Users immersed in Computer Generated environment
• HMD, gloves, 3D graphics, body tracking
Goal of Virtual Reality
“.. to make it feel like you’re actually in a place that
you are not.”
Palmer Luckey
Co-founder, Oculus
Virtual Reality Definition
•Defining Characteristics
• Immersion
• User feels immersed in computer generated scene
• Interaction
• The virtual content can be interacted with
• Independence
• User can have independent view and react to environment
From Immersion to Presence
• Immersion: describes the extent to which technology is capable of
delivering a vivid illusion of reality to the senses of a human participant.
• Presence: a state of consciousness, the (psychological) sense of being
in the virtual environment.
• So Immersion, defined in technical terms, is capable of producing a
sensation of Presence
• Goal of VR: Create a high degree of Presence
• Make people believe they are really in Virtual Environment
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role
of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
Presence ..
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
Why do people behave like this?
• Presence can be decomposed into two dimensions (Slater 2009):
• “Place Illusion” (PI): being in the place depicted in the VR environment
• perception in VR matches natural sensorimotor input
• Plausibility Illusion (Psi): the events in the VR environment are actually occurring
• VR environment responds to user actions
• When both PI and Psi are high, people respond realistically to events in the VR
Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual
environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557.
Presence = PI + Psi + ??
Slater, M., Banakou, D., Beacco, A., Gallego, J., Macia-Varela, F., & Oliva, R. (2022). A Separate Reality: An
Update on Place Illusion and Plausibility in Virtual Reality. Frontiers in Virtual Reality, 81.
Four Illusions of Presence (Slater 2022)
• Place Illusion: being in the place
• Plausibility Illusion: events are real
• Body Ownership: seeing your body in VR
• Copresence/Social Presence: other people are in VR
Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to stimulate the senses
• HMD stimulates eyes
Example: Visual Simulation
[Figure: visual simulation pipeline: 3D graphics → HMD → vision system → brain]
Human-Machine Interface
Key Technologies for VR Systems
• Display (Immersion)
• Stimulate senses
• visual, auditory, tactile sense, etc..
• Tracking (Independence)
• Changing viewpoint
• independent movement
• Input Devices (Interaction)
• Supporting user interaction
• User input
DISPLAY TECHNOLOGY
VR Display Taxonomy
Creating an Immersive Experience
•Head Mounted Display
• Immerse the eyes
•Projection/Large Screen
• Immerse the head/body
•Future Technologies
• Neural implants
• Contact lens displays, etc
VR Head Mounted Displays
• Wide range of HMDs available
Key Properties of HMDs
• Lens
• Focal length, Field of View
• Ocularity, Interpupillary distance
• Eye relief, Eye box
• Display
• Resolution, contrast
• Power, brightness
• Refresh rate
• Ergonomics
• Size, weight
• Wearability
HMD Basic Principles
• Use display with optics to create illusion of virtual screen
Simple Magnifier HMD Design
[Figure: simple magnifier layout: the eye looks through an eyepiece (one or more lenses) at a display (image source) placed at distance p; a magnified virtual image appears at distance q]
1/p + 1/q = 1/f where
p = object distance (distance from image source to eyepiece)
q = image distance (distance of image from the lens)
f = focal length of the lens
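To make the magnifier equation concrete, here is a minimal worked example in Python; the display distance and focal length are illustrative values, not the specs of any particular HMD.

```python
def virtual_image_distance(p: float, f: float) -> float:
    """Solve 1/p + 1/q = 1/f for q (metres)."""
    return 1.0 / (1.0 / f - 1.0 / p)

p = 0.035   # display 35 mm from the eyepiece (assumed)
f = 0.040   # 40 mm focal length eyepiece (assumed)
q = virtual_image_distance(p, f)
print(f"q = {q:.2f} m")   # ~ -0.28 m: negative sign = virtual image,
                          # magnified and apparently ~28 cm from the lens
```

Placing the display just inside the focal length (p < f) is what makes the lens act as a magnifier and produce a distant virtual image.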
Distortion in Lens Optics
A rectangle → maps to this
HTC Vive Optics
To Correct for Distortion
• Must pre-distort the image
• This is a pixel-based distortion
• Use shader programming (a sketch of the UV warp follows below)
VR Distorted Image
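A minimal sketch of the pre-distortion idea, in Python rather than shader code: warp the texture coordinates with an inverse radial model before sampling. The coefficients k1 and k2 are illustrative stand-ins for values that would come from lens calibration.

```python
import numpy as np

def predistort_uv(uv: np.ndarray, k1: float = 0.22, k2: float = 0.24) -> np.ndarray:
    """Radially warp texture coordinates (uv in 0..1, lens axis at 0.5)."""
    centred = uv - 0.5                                 # origin at lens centre
    r2 = np.sum(centred ** 2, axis=-1, keepdims=True)  # squared radius
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2               # radial polynomial
    return centred * scale + 0.5                       # warp, shift back

uv = np.array([[0.5, 0.5], [0.9, 0.9], [0.1, 0.5]])
print(predistort_uv(uv))   # centre pixel unchanged, edge pixels pushed out
```

In a real HMD renderer the same per-pixel remap runs in a fragment shader, so the barrel pre-warp cancels the lens's pincushion distortion.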
Interpupillary Distance (IPD)
• Horizontal distance between a user's eyes
• Distance between the two optical axes in a HMD
• Typical IPD ~ 63mm
Field of View
• Monocular FOV is the angular subtense of the displayed image as measured from the pupil of one eye.
• Total FOV is the total angular size of the displayed image visible to either eye.
• Binocular (or stereoscopic) FOV refers to the part of the displayed image visible to both eyes.
• FOV may be measured horizontally, vertically or diagonally.
Typical VR HMD FOV
Vive HMD (Released 2016)
• Field of View
• 110 degrees horizontal
• Resolution
• 1080×1200 per eye
• Refresh rate
• 90 Hz
• Other features
• Adjustable lens separation
• Over the ear headphones
• Integrated tracking
• Tethered to PC
Foveated Displays
• Combine high resolution center
with low resolution periphery
Varjo Display
• Focus area (27° x 27°): 70 PPD, 1920 x 1920 px (1 uOLED panel in the centre)
• Peripheral/context display: 115° FOV, 30 PPD, 2880 x 2720 px (1 LCD for wide FOV)
Varjo XR-3 Demo – Threading a Needle
https://www.youtube.com/watch?v=5iEwlOEUQjI
Computer Based vs. Mobile VR Displays
Google Cardboard
• Released 2014 (Google 20% project)
• >10 million shipped/given away
• Easy to use developer tools
[Figure: smartphone + cardboard viewer = VR display]
Multiple Mobile VR Viewers Available
VR Head Mounted Displays (HMDs)
• HTC Vive Pro - $1100 USD
• 110 degree FOV, 2 handed input
• Tethered VR, room scale tracking
• Oculus Quest - $400 USD
• 110 degree FOV, 2 handed input
• Self contained VR, inside out tracking
• Google Cardboard - $10 USD
• 90 degree FOV, button input
• Mobile VR, 3 DOF tracking
HMD Design Trade-offs
• Resolution vs. field of view
• As FOV increases, resolution decreases for fixed pixels
• Eye box vs. field of view
• Larger eye box limits field of view
• Size, Weight and Power vs. everything else
Projection/Large Display Technologies
• Room Scale Projection
• CAVE, multi-wall environment
• Dome projection
• Hemisphere/spherical display
• Head/body inside
• Vehicle Simulator
• Simulated visual display in windows
Stereo Projection
• Active Stereo
• Active shutter glasses
• Time synced signal
• Brighter images
• More expensive
• Passive Stereo
• Polarized images
• Two projectors (one/eye)
• Cheap glasses (powerless)
• Lower resolution/dimmer
• Less expensive
CAVE
• Developed in 1992 at EVL, University of Illinois at Chicago
• Multi-walled stereo projection environment
• Head tracked active stereo
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio
visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
Typical CAVE Setup
• 4 sides, rear projected stereo images
Demo Video – Wisconsin CAVE
• https://www.youtube.com/watch?v=mBs-OGDoPDY
CAVE Variations
Caterpillar Demo
• https://www.youtube.com/watch?v=r9N1w8PmD1E
Multi-User CAVEs
• Limitation of CAVEs
• Stereo projection from only one user’s viewpoint
• Solution
• Higher frequency projectors and time slicing
Kulik, A., Kunert, A., Beck, S., Reichel, R., Blach, R., Zink, A., & Froehlich, B. (2011). C1x6: a
stereoscopic six-user display for co-located collaboration in shared virtual environments. ACM
Transactions on Graphics (TOG), 30(6), 188.
Multiuser Demo
https://www.uni-weimar.de/de/medien/professuren/vr/research/multi-user-virtual-reality/c1x6-a-stereoscopic-six-user-display/
Allosphere
• Univ. California Santa Barbara
• One of a kind facility
• Immersive Spherical display
• 10 m diameter
• Inside 3 story anechoic cube
• Passive stereoscopic projection
• 26 projectors, 146 speakers
• Visual tracking system for input
• See http://www.allosphere.ucsb.edu/
Kuchera-Morin, J., Wright, M., Wakefield, G.,
Roberts, C., Adderton, D., Sajadi, B., ... & Majumder,
A. (2014). Immersive full-surround multi-user system
design. Computers & Graphics, 40, 10-21.
Allosphere Demo
• https://www.youtube.com/watch?v=25Ch8eE0vJg
Allosphere Research
• Multi-disciplinary research
• Science, art, engineering
• Typical research projects
• Brain imaging
• fMRI imaging data
• Atomic bonding
• Bond simulation models
• Nano medicine
• Simulate chemotherapy
• Graph browser
• Mathematical visualization
• Etc
Brain Imaging
Hydrogen bond
Nano Medicine
Vehicle Simulators
• Combine VR displays with vehicle
• Visual displays on windows
• Motion base for haptic feedback
• Audio feedback
• Physical vehicle controls
• Steering wheel, flight stick, etc
• Full vehicle simulation
• Emergencies, normal operation, etc
• Weapon operation
• Training scenarios
Demo: Boeing 787 Simulator
• https://www.youtube.com/watch?v=3iah-blsw_U
Lexus Driving Simulator
https://www.youtube.com/watch?v=ljfKJskjw08
AUDIO DISPLAYS
Audio Displays
Definition: Computer interfaces that provide synthetic sound
feedback to users interacting with the virtual world.
The sound can be monaural (both ears hear the same sound) or binaural (each ear hears a different sound).
Burdea, Coiffet (2003)
Motivation
• Most of the focus in Virtual Reality is on the visuals
• GPUs continue to drive the field
• Users want more
• More realism, More complexity, More speed
• However, sound can significantly enhance realism
• Example: Mood music in horror games
• Sound can provide valuable user interface feedback
• Example: Alert in training simulation
360 Video + Spatial Audio (wear headphones)
• https://www.youtube.com/watch?v=G8pABGosD38
Types of Audio Recordings
• Monaural: Recording with one microphone – no positioning
• Stereo Sound: Recording with two microphones placed several feet
apart. Perceived sound position as recorded by microphones.
• Binaural: Recording microphones embedded in a dummy head. Audio
filtered by head shape.
• 3D Sound: Using tiny microphones in the ears of a real person.
Generate HRTF based on ear shape and audio response.
Capturing 3D Audio for Playback
• Binaural recording
• 3D Sound recording, from microphones in simulated ears
• Hear some examples (use headphones)
• http://binauralenthusiast.com/examples/
Example (Use Headphones)
• https://www.youtube.com/watch?v=tNEUObTs1kg
Synthetic Sounds
• Complex sounds can be built from simple waveforms (e.g., sawtooth, sine)
and combined using operators
• Waveform parameters (frequency, amplitude) could be taken from motion data, such as object velocity (see the sketch below)
• Can combine waveforms in various ways
• This is what classic synthesizers do
• Works well for many non-speech sounds
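A minimal sketch of the idea above, assuming an arbitrary velocity-to-pitch mapping: a sawtooth tone whose frequency and amplitude follow object speed. The mapping constants are illustrative.

```python
import numpy as np

SAMPLE_RATE = 44_100

def velocity_to_tone(velocity: float, duration_s: float = 0.5) -> np.ndarray:
    """Sawtooth whose pitch and loudness scale with object speed."""
    freq = 200.0 + 40.0 * velocity       # faster object -> higher pitch
    amp = min(1.0, 0.1 * velocity)       # faster object -> louder
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    phase = (t * freq) % 1.0             # sawtooth via wrapped phase
    return amp * (2.0 * phase - 1.0)

samples = velocity_to_tone(velocity=5.0)   # e.g. an object moving at 5 m/s
print(samples.shape, round(float(samples.max()), 2))
```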
Audio Display Properties
Presentation Properties
• Number of channels
• Sound stage
• Localization
• Masking
• Amplification
Logistical Properties
• Noise pollution
• User mobility
• Interface with tracking
• Integration
• Portability
• Throughput
• Safety
• Cost
Audio Displays: Head-worn
Ear buds, on-ear, open back, closed, bone conduction
Spatialization vs. Localization
• Spatialization is the processing of sound signals to make them
emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the source position
of a sound
• This is a human topic, some people are better at it than others.
Stereo Sound
• Seems to come from inside user’s head
• Follows head motion as user moves head
3D Spatial Sound
• Seems to be external to the head
• Fixed in space when user moves head
• Has reflected sound properties
Example: Sound Spatialisation
• Note: Use stereo speakers/headphones for best effect
Spatialized Audio Effects
• Naïve approach
• Simple left/right shift for lateral position
• Amplitude adjustment for distance
• Easy to produce using consumer hardware/software (a minimal sketch follows this list)
• Does not give us "true" realism in sound
• No up/down or front/back cues
• We can use multiple speakers for this
• Surround the user with speakers
• Send different sound signals to each one
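A minimal sketch of the naïve approach above: constant-power left/right panning plus simple 1/distance attenuation. The pan law and falloff are common generic choices, not any specific system's method, and as noted there are no elevation or front/back cues.

```python
import numpy as np

def naive_spatialize(mono: np.ndarray, azimuth_deg: float,
                     distance_m: float) -> np.ndarray:
    """Stereo (N, 2) buffer: constant-power pan + 1/distance falloff."""
    pan = np.radians(np.clip(azimuth_deg, -90.0, 90.0))  # -90 left, +90 right
    theta = (pan + np.pi / 2) / 2        # map to 0..pi/2 for the pan law
    gain = 1.0 / max(distance_m, 1.0)    # naive amplitude-distance cue
    left = mono * np.cos(theta) * gain
    right = mono * np.sin(theta) * gain
    return np.stack([left, right], axis=-1)

mono = np.sin(2 * np.pi * 440 * np.arange(44_100) / 44_100)  # 1 s, 440 Hz
stereo = naive_spatialize(mono, azimuth_deg=45.0, distance_m=2.0)
print(stereo.shape)   # (44100, 2): louder in the right channel
```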
Example: The BoomRoom
• Use surround speakers to create spatial audio effects
• Gesture based interaction
• https://www.youtube.com/watch?time_continue=54&v=6RQMOyQ3lyg
Audio Localization
• Main cues used by humans to localize sound:
1. Interaural time differences: Time difference for
sound wave to travel between ears
2. Interaural level differences: For high frequency
sounds (> 1.5 kHz), volume difference between
ears used to determine source direction
3. Spectral filtering done by outer ears: Ear shape
changes frequency heard
Interaural Time Difference
• Sound takes a fixed time to travel around the head between the ears
• Can use the time difference to determine sound source location (see the sketch below)
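A minimal sketch of this cue, assuming the standard Woodworth spherical-head approximation; the head radius and speed of sound are typical textbook values.

```python
import math

HEAD_RADIUS_M = 0.0875     # average adult head radius (assumed)
SPEED_OF_SOUND = 343.0     # m/s at room temperature

def itd_seconds(azimuth_deg: float) -> float:
    """Woodworth approximation: ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:2d} deg -> {itd_seconds(az) * 1e6:4.0f} us")
# 0 us straight ahead, rising to ~650 us for a source at 90 degrees
```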
Spectral Filtering
Ear shape filters sound depending on direction it is coming from.
This change in frequency determines sound source elevation.
Head-Related Transfer Functions (HRTFs)
• A set of functions that model how sound from a
source at a known location reaches the eardrum
More About HRTFs
• Functions take into account,
• Individual ear shape
• Slope of shoulders
• Head shape
• So, each person has his/her own HRTF!
• Need to have parameterizable HRTFs
• Some sound cards/APIs allow specifying an HRTF
Measuring HRTFs
• Put microphones in manikin or human ears
• Play sounds from fixed positions
• Record the response (the sketch below shows how a measured HRIR pair is applied)
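A minimal sketch of how a measured HRTF pair is applied at playback: convolve the mono source with the left and right head-related impulse responses (HRIRs) for the desired direction. The three-sample HRIRs below are dummy placeholders, not real measurements.

```python
import numpy as np

def apply_hrir(mono: np.ndarray, hrir_l: np.ndarray,
               hrir_r: np.ndarray) -> np.ndarray:
    """Binaural stereo = mono signal convolved with each ear's HRIR."""
    left = np.convolve(mono, hrir_l)
    right = np.convolve(mono, hrir_r)
    return np.stack([left, right], axis=-1)

mono = np.random.randn(1000)              # stand-in source signal
hrir_l = np.array([0.9, 0.1, 0.0])        # dummy: strong, early at left ear
hrir_r = np.array([0.0, 0.0, 0.5])        # dummy: delayed, quieter at right
binaural = apply_hrir(mono, hrir_l, hrir_r)
print(binaural.shape)                     # (1002, 2) stereo buffer
```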
Environmental Effects
• Sound is also changed by objects in the
environment
• Can reverberate off of reflective objects
• Can be absorbed by objects
• Can be occluded by objects
• Doppler shift
• Moving sound sources
• Need to simulate environmental audio properties
• Takes significant processing power
Sound Reverberation
• Need to consider first and second order reflections
• Need to model material properties, objects in room, etc
Example: Sound Reflection
The Tough Part
• All of this takes a lot of processing
• Need to keep track of
• Multiple (possibly moving) sound sources
• Path of sounds through a dynamic environment
• Position and orientation of listener(s)
• Most sound cards only support a limited number of
spatialized sound channels
• Increasingly complex geometry increases load on
audio system as well as visuals
• That's why we fake it ;-)
• GPUs might change this too!
GPU Based Audio Acceleration
• Using GPU for audio physics calculations
• AMD TrueAudio Next - https://gpuopen.com/true-audio-next/
https://www.youtube.com/watch?v=Z6nwYLHG8PU
Audio Software SDKs
• Modern CPUs are fast enough that spatial audio can be generated without dedicated hardware
• Several 3D audio SDKs exist
• OpenAL
• www.openal.org
• Open source, cross platform
• Renders multichannel three-dimensional positional audio
• Google VR SDK
• Android, iOS, Unity
• https://developers.google.com/vr/concepts/spatial-audio
• Unity
• Unity Audio Spatializer SDK
• Microsoft DirectX, MRTK, etc
Google VR Spatial Audio Demo
• https://www.youtube.com/watch?v=I9zf4hCjRg0&feature=youtu.be
Demo: Spatial Audio In VR
• AltspaceVR spatial audio for speaker discrimination
• https://www.youtube.com/watch?v=dV3Qog44z6E
Designing Spatial Audio
• There are several tools available for designing 3D audio
• E.g. Facebook Spatial Workstation
• Audio tools for cinematic VR and 360 video
• https://facebook360.fb.com/spatial-workstation/
• Spatial Audio Designer
• Mixing of surround sound and 3D audio
• http://www.newaudiotechnology.com/en/products/spatial-audio-designer/
HAPTIC/TACTILE DISPLAYS
Haptic Feedback
• Greatly improves realism
• Hands and wrist are most important
• High density of touch receptors
• Two kinds of feedback:
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight and inertia.
• Actively resists contact motion
Active Haptics
• Actively resists motion
• Key properties
• Force resistance
• Frequency Response
• Degrees of Freedom
• Latency
Force Feedback Joysticks
• WingMan Force 3D
• Inexpensive ($60)
• Actuators that can move the
joystick given system
commands
• Max 3.3 N of force
• Force feedback driving wheel
Example: Phantom Omni
• Combined stylus input/haptic output
• 6 DOF haptic feedback
Phantom Omni Demo
• https://www.youtube.com/watch?v=REA97hRX0WQ
Haptic Glove
• Many examples of haptic gloves
• Typically use mechanical device to provide haptic feedback
HaptX Gloves
• Tactile + Haptic feedback
• Tactile actuators
• 130 feedback points/hand
• Force feedback exo-skeleton
• 8lbs force/finger
• Magnetic finger tracking
• 2 mm tracking accuracy
• https://haptx.com/
• https://www.youtube.com/watch?v=4K-MLVqD1_A
Homebrew Glove
• LucidVR Budget Haptic Glove
• Simple hand tracking, force feedback
• $22 in parts
• https://hackaday.io/project/178243-lucidvr-budget-haptic-glove
Passive Haptics
• Not controlled by system
• Use real props (Styrofoam for walls)
• Pros
• Cheap
• Large scale
• Accurate
• Cons
• Not dynamic
• Limited use
UNC Being There Project
Passive Haptic Paddle
• Using physical props to provide haptic feedback
• http://www.cs.wpi.edu/~gogo/hive/
Tactile Feedback Interfaces
• Goal: Stimulate skin tactile receptors
• Using different technologies
• Air bellows
• Jets
• Actuators (commercial)
• Micropin arrays
• Electrical (research)
• Neuromuscular stimulations (research)
Vibrotactile Cueing Devices
• Vibrotactile feedback has been incorporated into many devices
• Can we use this technology to provide scalable, wearable touch cues?
Vibrotactile Feedback Projects
Navy TSAS Project
TactaBoard and TactaVest
Teslasuit
• Full body haptic feedback - https://teslasuit.io/
• Electrical muscle stimulation
• https://www.youtube.com/watch?v=rFcbVrQWJSU
TRACKING TECHNOLOGY
Immersion and Tracking
• Motivation: For immersion, when the user changes
position in reality the VR view also needs to change
• Requires tracking of the user’s pose (position/orientation) in
the real world and mapping to the Virtual World
Tracking in VR
• Need for Tracking
• User turns their head and the VR graphics scene changes
• User wants to walk through a virtual scene
• User reaches out and grabs a virtual object
• The user wants to use a real prop in VR
• All of these require technology to track the user or object
• Continuously provide information about position and orientation
Head Tracking
Hand Tracking
Tracking and Rendering in VR
Tracking fits into the graphics pipeline for VR
Degrees of Freedom
• Degree of Freedom = independent movement about an axis
• 3 DoF Orientation = roll, pitch, yaw (rotation about x, y, or z axis)
• 3 DoF Translation = movement along x, y, z axes
• Different requirements
• User turns their head in VR -> needs 3 DoF orientation tracker
• Moving in VR -> needs a 6 DoF tracker (r,p,y) and (x, y, z)
Tracking Technologies
§ Active (device sends out signal)
• Mechanical, Magnetic, Ultrasonic
• GPS, Wifi, cell location
§ Passive (device senses world)
• Inertial sensors (compass, accelerometer, gyro)
• Computer Vision
• Marker based, Natural feature tracking
§ Hybrid Tracking
• Combined sensors (e.g. Vision + Inertial)
Key Tracking Performance Criteria
• Static Accuracy
• Dynamic Accuracy
• Latency
• Update Rate
• Tracking Jitter
• Signal to Noise Ratio
• Tracking Drift
Static vs. Dynamic Accuracy
• Static Accuracy
• Ability of tracker to determine
coordinates of a position in space
• Depends on sensor sensitivity, errors
(algorithm, operator), environment
• Dynamic Accuracy
• System accuracy as sensor moves
• Depends on static accuracy
• Resolution
• Minimum change sensor can detect
• Repeatability
• Same input giving same output
Tracker Latency, Update Rate
• Latency: Time between change
in object pose and time sensor
detects the change
• Large latency (> 10 ms) can cause
simulator sickness
• Larger latency (> 50 ms) can
reduce VR immersion
• Update Rate: Number of
measurements per second
• Typically > 30 Hz
Tracker Jitter, Signal to Noise Ratio
• Jitter: Change in tracker output
when tracked object is stationary
• Range of change is sensor noise
• Tracker with no jitter reports constant
value if tracked object stationary
• Makes tracker data change randomly about the average value
• Signal to Noise Ratio: Signal in
data relative to noise
• Found from calculating mean of
samples in known positions
Tracker Drift
• Drift: Steady increase in
tracker error over time
• Accumulative (additive) error
over time
• Relative to Dynamic sensitivity
over time
• Controlled by periodic recalibration (zeroing)
Mechanical Tracker (Active)
• Idea: mechanical arms with joint sensors
• ++: high accuracy, haptic feedback
• --: cumbersome, expensive
Microscribe Sutherland
Example: Fake Space Boom
• BOOM (Binocular Omni-Orientation Monitor)
• Counterbalanced arm with 100° FOV HMD mounted on it
• 6 DOF, 4mm position accuracy, 300Hz sampling, < 5 ms latency
Demo: Fake Space Tele Presence
• Using Boom with HMD to control robot view
• https://www.youtube.com/watch?v=QpTQTu7A6SI
Magnetic Tracker (Active)
• Idea: measure the field difference between a magnetic transmitter and a receiver
• ++: 6DOF, robust
• --: wired, sensitive to metal, noisy, expensive
• --: error increases with distance
Flock of Birds (Ascension)
Example: Razer Hydra
• Developed by Sixense
• Magnetic source + 2 wired controllers
• Short range (< 1 m), precision of 1mm and 1°
• 62Hz sampling rate, < 50 ms latency
• $600 USD
Razer Hydra Demo
• https://www.youtube.com/watch?v=jnqFdSa5p7w
Magnetic Tracking Error
Inertial Tracker (Passive)
• Idea: measure linear acceleration and angular rate (accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• --: drift, hysteresis, only 3DOF
IS300 (Intersense)
Wii Remote
Types of Inertial Trackers
• Gyroscopes
• The rate of change in object orientation or angular velocity is measured.
• Accelerometers
• Measure acceleration.
• Can be used to determine object position, if the starting point is known.
• Inclinometer
• Measures inclination, "level" position.
• Like carpenter’s level, but giving electrical signal.
Example: MEMS Sensor
• Uses spring-supported load
• Reacts to gravity and inertia
• Changes its electrical parameters
• < 5 ms latency, 0.01° accuracy
• Up to 1000Hz sampling
• Problems
• Rapidly accumulating errors
• Error in position increases with the square of time (see the sketch below)
• Cheap units can get position drift of 4 cm in 2 seconds
• Expensive units have the same error in 200 seconds
• Not good for measuring location
• Need to periodically reset the output
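A minimal sketch of why the error grows with the square of time: a constant accelerometer bias b double-integrates to a position error of ½bt². The bias values below are back-solved from the drift figures quoted above, purely for illustration.

```python
def position_error_m(bias_ms2: float, t_s: float) -> float:
    """Position error from double-integrating a constant accel bias."""
    return 0.5 * bias_ms2 * t_s ** 2

cheap_bias = 0.02      # m/s^2, back-solved from "4 cm in 2 seconds"
pricey_bias = 2e-6     # m/s^2, back-solved from "same error in 200 s"
print(f"cheap unit after 2 s:       {position_error_m(cheap_bias, 2.0) * 100:.1f} cm")
print(f"expensive unit after 200 s: {position_error_m(pricey_bias, 200.0) * 100:.1f} cm")
```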
Demo: MEMS Sensor Working
• https://www.youtube.com/watch?v=9eSnxebfuxg
MEMS Gyro Bias Drift
• Zero reading of MEMS Gyro drifts over time due to noise
Acoustic - Ultrasonic Tracker
• Idea: time-of-flight or phase-coherence of sound waves
• ++: small, cheap
• --: 3DOF, line of sight, low resolution, affected by environment (pressure, temperature), low sampling rate
Ultrasonic
Logitech IS600
Optical Tracker (Passive)
• Idea: image processing and computer vision
• Specialized
• Infrared, Retro-Reflective, Stereoscopic
• Monocular-based vision tracking
ART Hi-Ball
Outside-In vs. Inside-Out Tracking
Example: Vive Lighthouse Tracking
• Outside-in tracking system
• 2 base stations
• Each with 2 laser scanners, LED array
• Headworn/handheld sensors
• 37 photo-sensors in HMD, 17 in hand
• Additional IMU sensors (500 Hz)
• Performance
• Tracking server fuses sensor samples
• Sampling rate 250 Hz, 4 ms latency
• See http://doc-ok.org/?p=1478
Lighthouse Components
• Base station
• IR LED array
• 2 x scanned lasers
• Head Mounted Display
• 37 photo sensors
• 9 axis IMU
Lighthouse Setup
How Lighthouse Tracking Works
• Position tracking using IMU
• 500 Hz sampling
• But drifts over time
• Drift correction using optical tracking
• IR synchronization pulse (60 Hz)
• Laser sweep between pulses
• Photo-sensors recognize sync pulse, measure time to laser
• Know when sensor hit and which sensor hit
• Calculate position of sensor relative to base station
• Use 2 base stations to calculate pose
• Use IMU sensor data between pulses (500Hz); see the fusion sketch below
• See http://xinreality.com/wiki/Lighthouse
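A minimal sketch of this kind of fusion, using a simple 1-D complementary filter as a stand-in for the real tracking server's algorithm: dead-reckon from the fast IMU, then blend in each slower optical fix to cancel accumulated drift. The gain and rates are illustrative, not HTC's values.

```python
def fuse(imu_rates, optical_fixes, dt=0.002, alpha=0.2):
    """Estimate angle from gyro rates (every dt) + sparse optical fixes.

    imu_rates: angular rate (deg/s) sampled at 500 Hz (dt = 2 ms)
    optical_fixes: {sample index: absolute angle (deg) from laser sweep}
    """
    angle = 0.0
    estimates = []
    for i, rate in enumerate(imu_rates):
        angle += rate * dt                      # dead-reckon from the gyro
        if i in optical_fixes:                  # drift correction step
            angle += alpha * (optical_fixes[i] - angle)
        estimates.append(angle)
    return estimates

# A stationary headset with a biased gyro (0.5 deg/s) over 2 seconds,
# corrected by a ~60 Hz optical fix reporting the true angle of 0 deg.
rates = [0.5] * 1000
fixes = {i: 0.0 for i in range(0, 1000, 8)}
est = fuse(rates, fixes)
print(f"drift with gyro alone: {0.5 * 1000 * 0.002:.2f} deg")
print(f"drift with fusion:     {est[-1]:.3f} deg")
```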
Lighthouse Tracking
• Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y
• Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4
Example: Oculus Quest
• Inside out tracking
• Four cameras on the corners of the display
• Searching for visual features
• On setup creates map of room
Oculus Quest Tracking
• https://www.youtube.com/watch?v=2jY3B_F3GZk
Occipital Bridge Engine/Structure Core
• Inside out tracking
• Uses structured light
• Better than room scale tracking
• Integrated into bridge HMD
• https://structure.io/
Tracking Coordinate Frames
• There can be several coordinate frames to consider
• Head pose with respect to real world
• Coordinate frame of tracking system with respect to the HMD
• Position of hand in coordinate frame of hand tracker
Example: Finding your hand in VR
• Using Lighthouse and LeapMotion
• Multiple Coordinate Frames
• LeapMotion tracks the hand in the LeapMotion coordinate frame (H_LM)
• LeapMotion is fixed in the HMD coordinate frame (LM_HMD)
• The HMD is tracked in the VR coordinate frame (HMD_VR) (using Lighthouse)
• Where is your hand in the VR coordinate frame?
• Combine the transformations across the coordinate frames (sketch below):
• H_VR = H_LM · LM_HMD · HMD_VR
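A minimal sketch of this composition using 4x4 homogeneous transforms, assuming the column-vector convention (under which the chain above is applied right-to-left); the translations are illustrative placeholders, not real calibration data.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous transform that only translates (no rotation)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Illustrative placeholder poses (metres), not real calibration data:
H_LM = translation(0.0, 0.1, -0.3)     # hand in the LeapMotion frame
LM_HMD = translation(0.0, 0.0, -0.08)  # LeapMotion mounted on the HMD
HMD_VR = translation(1.0, 1.7, 2.0)    # HMD pose in VR (from Lighthouse)

# Hand pose in the VR frame: with column vectors the chain is composed
# right-to-left, innermost (hand) transform applied first.
H_VR = HMD_VR @ LM_HMD @ H_LM
print(H_VR[:3, 3])                     # hand position in VR coordinates
```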
INPUT TECHNOLOGY
VR Input Devices
• Physical devices that convey information into the application
and support interaction in the Virtual Environment
Mapping Between Input and Output
Motivation
• Mouse and keyboard are good for desktop UI tasks
• Text entry, selection, drag and drop, scrolling, rubber banding, …
• 2D mouse for 2D windows
• What devices are best for 3D input in VR?
• Use multiple 2D input devices?
• Use new types of devices?
Input Device Characteristics
• Size and shape, encumbrance
• Degrees of Freedom
• Integrated (mouse) vs. separable (Etch-a-sketch)
• Direct vs. indirect manipulation
• Relative vs. Absolute input
• Relative: measure difference between current and last input (mouse)
• Absolute: measure input relative to a constant point of reference (tablet)
• Rate control vs. position control
• Isometric vs. Isotonic
• Isometric: measure pressure or force with no actual movement
• Isotonic: measure deflection from a center point (e.g. mouse)
Hand Input Devices
• Devices that integrate hand input into VR
• World-Grounded input devices
• Devices fixed in real world (e.g. joystick)
• Non-Tracked handheld controllers
• Devices held in hand, but not tracked in 3D (e.g. xbox controller)
• Tracked handheld controllers
• Physical device with 6 DOF tracking inside (e.g. Vive controllers)
• Hand-Worn Devices
• Gloves, EMG bands, rings, or devices worn on hand/arm
• Bare Hand Input
• Using technology to recognize natural hand input
World Grounded Devices
• Devices constrained or fixed in real world
• Not ideal for VR
• Constrains user motion
• Good for VR vehicle metaphor
• Used in location based entertainment (e.g. Disney Aladdin ride)
Disney Aladdin Magic Carpet VR Ride
Non-Tracked Handheld Controllers
• Devices held in hand
• Buttons, joysticks, game controllers, etc.
• Traditional video game controllers
• Xbox controller
Tracked Handheld Controllers
• Handheld controller with 6 DOF tracking
• Combines button/joystick input plus tracking
• One of the best options for VR applications
• Physical prop enhancing VR presence
• Providing proprioceptive, passive haptic touch cues
• Direct mapping to real hand motion
HTC Vive Controllers Oculus Touch Controllers
Example: WMR Handheld Controllers
• Windows Mixed Reality Controllers
• Left and right hand
• Combine computer vision + IMU tracking
• Track both in and out of view
• Button input, Vibration feedback
Hand Worn Devices
• Devices worn on hands/arms
• Glove, EMG sensors, rings, etc.
• Advantages
• Natural input with potentially rich gesture interaction
• Hands can be held in comfortable positions – no line of sight issues
• Hands and fingers can fully interact with real objects
Facebook EMG Band
• https://www.youtube.com/watch?v=WmxLiXAo9ko
Data Gloves
• Bend sensing gloves
• Passive input device
• Detecting hand posture and gestures
• Continuous raw data from bend sensors
• Fibre optic, resistive ink, strain-gauge
• Large DOF output, natural hand output
• Pinch gloves
• Conductive material at fingertips
• Determine if fingertips touching
• Used for discrete input
• Object selection, mode switching, etc.
StretchSense Gloves
• Wearable motion capture sensors
• Capacitive sensors
• Measure stretch, pressure, bend, shear
• Many applications
• Garments, gloves, etc.
• http://stretchsense.com/
StretchSense Glove Demo
• https://www.youtube.com/watch?v=ZDq7fQguFPI
Bare Hands
• Using computer vision to track bare hand input
• Creates compelling sense of Presence, natural
interaction
• Challenges need to be solved
• Not having sense of touch
• Line of sight required to sensor
• Fatigue from holding hands in front of sensor
Oculus Quest 2 – Hand Tracking
• https://www.youtube.com/watch?v=uztFcEA6Rf0
Non-Hand Input Devices
• Capturing input from other parts of the body
• Head Tracking
• Use head motion for input
• Eye Tracking
• Largely unexplored for VR
• Microphones
• Audio input, speech
• Full-Body tracking
• Motion capture, body movement
Eye Tracking
• Technology
• Shine IR light into eye and look for reflections
• Advantages
• Provides natural hands-free input
• Gaze provides cues as to user attention
• Can be combined with other input technologies
HTC Vive Pro Eye
• HTC Vive Pro with integrated eye-tracking
• Tobii systems eye-tracker
• Easy calibration and set-up
• Auto-calibration software compensates for HMD motion
• https://www.youtube.com/watch?v=y_jdjjNrJyk
Full Body Tracking
• Adding full-body input into VR
• Creates illusion of self-embodiment
• Significantly enhances sense of Presence
• Technologies
• Motion capture suit, camera based systems
• Can track large number of significant feature points
Camera Based Motion Capture
• Use multiple cameras
• Reflective markers on body
• E.g. OptiTrack (www.optitrack.com)
• 120 – 360 fps, < 10ms latency, < 1mm accuracy
Optitrack Demo
• https://www.youtube.com/watch?v=tBAvjU0ScuI
Wearable Motion Capture: PrioVR
• Wearable motion capture system
• 8 – 17 inertial sensors + wireless data transmission
• 30 – 40m range, 7.5 ms latency, 0.09° precision
• Supports full range of motion, no occlusion
• https://yostlabs.com/priovr/
PrioVR Demo
• https://www.youtube.com/watch?v=q72iErtvhNc
Pedestrian Devices
• Pedestrian input in VR
• Walking/running in VR
• Virtuix Omni
• Special shoes
• http://www.virtuix.com
• Cyberith Virtualizer
• Socks + slippery surface
• http://cyberith.com
Virtuix Omni Demo
• https://www.youtube.com/watch?v=aOYHg8qdxTE
Omnidirectional Treadmills
• Infinadeck
• 2 axis treadmill, flexible material
• Tracks user to keep them in centre
• Limitless walking input in VR
• www.infinadeck.com
Infinadeck Demo
• https://www.youtube.com/watch?v=seML5CQBzP8
Comparison Between Devices
From Jerald (2015)
Interaction Trends
COMPLETE VR SYSTEMS
Creating a Good VR Experience
• Creating a good experience requires good system design
• Integrating multiple hardware, software, interaction, content elements
Example: Shard VR Slide
• Ride down the Shard at 100 mph - Multi-sensory VR
https://www.youtube.com/watch?v=HNXYoEdBtoU
Key Components to Consider
• Five key components:
• Inputs
• Outputs
• Computation/Simulation
• Content/World database
• User interaction
From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality:
Interface, application, and design. Morgan Kaufmann.
Typical VR System
• Combining multiple technology elements for good user
experience
• Input devices, output modality, content databases, networking, etc.
From Content to User
[Figure: pipeline from content to user]
• Content: 3D models, textures (modelling program); translation from CAD data
• Software: application programming, dynamics generator, renderers (3D, sound)
• User I/O: input devices (gloves, mic, trackers); output devices (HMD, audio, haptic); user actions (speak, grab)
Types of VR Graphics Content
• Panoramas
• 360 images/video
• Captured 3D content
• Scanned objects/spaces
• Modelled Content
• Hand created 3D models
• Existing 3D assets
Capturing Panoramas
• Stitching individual photos together
• Image Composite Editor (Microsoft)
• AutoPano (Kolor)
• Using 360 camera
• Ricoh Theta-S
• Fly360
Consumer 360 Capture Devices
Kodak 360, Fly 360, Gear 360, Theta S, Nikon, LG 360, PointGrey Ladybug, Panono 360, Bublcam
Example: Cardboard Camera
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Google Cardboard Viewer
Cardboard Camera
• https://www.youtube.com/watch?v=d5lUXZhWaZY
Stereo Video Capture
• Use camera pairs to capture stereo 360 video
• Samsung 360 Round
• 17 lenses, 4K 3D images, live video streaming, $10K USD
• Vuze+ VR camera
• 8 lenses, 4K stereoscopic 3D 360° video and photo, $999 USD
Samsung 360 Round
• https://www.youtube.com/watch?v=X_ytJJOmVF0
3D Scanning
• A range of products support 3D scanning
• Create point cloud or mesh model
• Typically combine RGB cameras with depth sensing
• Captures texture plus geometry
• Multi-scale
• Object Scanners
• Handheld, Desktop
• Body Scanners
• Rotating platform, multi-camera
• Room scale
• Mobile, tripod mounted
Example: Matterport
• Matterport Pro2 3D scanner
• Room scale scanner, panorama and 3D model
• 360° (left-right) x 300° (vertical) field of view
• Structured light (infrared) 3D sensor
• 15 ft (4.5 m) maximum range
• 4K HDR images
Matterport Pro2 Lite
• https://www.youtube.com/watch?v=SjHk0Th-j1I
Handheld/Desktop Scanners
• Capture people/objects
• Sense 3D scanner
• accuracy of 0.90 mm, colour resolution of 1920×1080 pixels
• Occipital Structure sensor
• Add-on to iPad, mesh scanning, IR light projection, 60 Hz
Structure Sensor
• https://www.youtube.com/watch?v=7j3HQxUGvq4
3D Modelling
• A variety of 3D modelling tools can be used
• Export in VR compatible file format (.obj, .fbx, etc)
• Especially useful for animation - difficult to create from scans
• Popular tools
• Blender (free), 3DS max, Maya, etc.
• Easy to Use
• Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
Modelling in VR
• Several tools for modelling in VR
• Natural interaction, low polygon count, 3D object viewing
• Low end
• Google Blocks
• High end
• Quill, Tilt brush – 3D painting
• Gravity Sketch – 3D CAD
Example: Google Blocks
• https://www.youtube.com/watch?v=1TX81cRqfUU
Example: Gravity Sketch
• https://www.youtube.com/watch?v=VK2DDnT_3l0
Download Existing VR Content
• Many locations for 3D objects, textures, etc.
• Sketchfab, Sketchup, Free3D (www.free3d.com), etc.
• Asset stores - Unity, Unreal
• Provide 3D models, materials, code, etc..
VR Graphics Architecture
• Application Layer
• User interface libraries
• Simulation/behaviour code
• User interaction specification
• Graphics Layer (CPU acceleration)
• Scene graph specification
• Object physics engine
• Specifying graphics objects
• Rendering Layer (GPU acceleration)
• Low level graphics code
• Rendering pixels/polygons
• Interface with graphics card/frame buffer
• Low level code for loading models and showing on screen
• Using shaders and low level GPU programming to improve graphics
Traditional 3D Graphics Pipeline
Graphics Challenges with VR
• Higher data throughput (> 7x desktop requirement)
• Lower latency requirements (from 150ms/frame to 20ms)
• HMD Lens distortion
• HMD may have cheap lens
• Creates chromatic aberration and distorted image
• Warp graphics images to create undistorted view
• Use low level shader programming
Lens Distortion
VR System Pipeline
• Using time warping and lens distortion
Perception Based Graphics
• Eye Physiology
• Cones in the eye's centre = colour vision; rods in the periphery = motion, B+W
• Foveated Rendering
• Use eye tracking to draw highest resolution where the user is looking
• Reduces graphics throughput (see the sketch below)
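A back-of-envelope sketch of the throughput saving, reusing the Varjo-style figures quoted earlier and assuming, for simplicity, a square 115° FOV; the numbers are approximate.

```python
# Pixel counts, foveated vs. uniform high resolution.
focus_px = 1920 * 1920           # 27 x 27 deg focus area at 70 PPD
periphery_px = 2880 * 2720       # 115 deg FOV context display at 30 PPD
uniform_px = (115 * 70) ** 2     # whole FOV rendered at 70 PPD everywhere

foveated_px = focus_px + periphery_px
print(f"foveated: {foveated_px / 1e6:.1f} Mpx")                  # ~11.5 Mpx
print(f"uniform:  {uniform_px / 1e6:.1f} Mpx")                   # ~64.8 Mpx
print(f"saving:   {100 * (1 - foveated_px / uniform_px):.0f}%")  # ~82%
```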
Foveated Rendering
• https://www.youtube.com/watch?v=lNX0wCdD2LA
Typical VR Simulation Loop
• User moves head, scene updates, displayed graphics change
• Need to synchronize system to reduce delays
System Delays
Typical Delay from Tracking to Rendering
Typical System Delays
• Pipeline: Tracking (20 Hz = 50ms) → Calculate Viewpoint (500 Hz = 2ms) → Simulation/Render Scene (30 Hz = 33ms) → Draw to Display (60 Hz = 17ms)
• The tracker feeds pose (x,y,z and r,p,y) into the application loop
• Total Delay = 50 + 2 + 33 + 17 = 102 ms
• 1 ms delay = 1/3 mm error for an object drawn at arm's length
• So a total of ~33mm error from when the user begins moving to when the object is drawn (see the sketch below)
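A minimal sketch of the delay budget above, also applying the ~1/3 mm-per-ms rule of thumb for an object at arm's length.

```python
# Delay budget from the pipeline above (stage -> update rate in Hz).
STAGE_RATES_HZ = {
    "tracking": 20,
    "calculate viewpoint": 500,
    "simulation/render scene": 30,
    "draw to display": 60,
}

total_ms = sum(1000.0 / hz for hz in STAGE_RATES_HZ.values())
error_mm = total_ms / 3.0   # ~1/3 mm of error per ms of delay
print(f"total delay: {total_ms:.0f} ms")             # -> ~102 ms
print(f"error at arm's length: ~{error_mm:.0f} mm")  # -> ~34 mm (~33 mm above)
```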
Living with High Latency (1/3 sec – 3 sec)
• https://www.youtube.com/watch?v=_fNp37zFn9Q
Effects of System Latency
• Degraded Visual Acuity
• Scene still moving when head stops = motion blur
• Degraded Performance
• As latency increases it’s difficult to select objects etc.
• If latency > 120 ms, training doesn’t improve performance
• Breaks-in-Presence
• If system delay is high, the user doesn't believe they are in VR
• Negative Training Effects
• Users train to operate in a world with delay
• Simulator Sickness
• Latency is the greatest cause of simulator sickness
Simulator Sickness
• Visual input conflicting with vestibular system
What Happens When Senses Don’t Match?
• 20-30% VR users experience motion sickness
• Sensory Conflict Theory
• Visual cues don’t match vestibular cues
• Eyes – “I’m moving!”, Vestibular – “No, you’re not!”
Avoiding Motion Sickness
• Better VR experience design
• More natural movements
• Improved VR system performance
• Less tracking latency, better graphics frame rate
• Provide a fixed frame of reference
• Ground plane, vehicle window
• Add a virtual nose
• Provides a peripheral cue
• Eat ginger
• Reduces upset stomach
Many Causes of Simulator Sickness
• 25-40% of VR users get Simulator Sickness, due to:
• Latency
• Major cause of simulator sickness
• Tracking accuracy/precision
• Seeing world from incorrect position, viewpoint drift
• Field of View
• Wide field of view creates more vection in the periphery = sickness
• Refresh Rate/Flicker
• Flicker/low refresh rate creates eye fatigue
• Vergence/Accommodation Conflict
• Creates eye strain over time
• Eye separation
• If the IPD does not match the inter-image distance, it causes discomfort
Motion Sickness
• https://www.youtube.com/watch?v=BznbIlW8iqE
System Design Guidelines - I
• Hardware
• Choose HMDs with fast pixel response time, no flicker
• Choose trackers with high update rates, good accuracy, and no drift
• Choose HMDs that are lightweight, comfortable to wear
• Use hand controllers with no line-of-sight requirements
• System Calibration
• Have virtual FOV match actual FOV of HMD
• Measure and set the user's IPD
• Latency Reduction
• Minimize overall end to end system delay
• Use displays with fast response time and low persistence
• Use latency compensation to reduce perceived latency
Jason Jerald, The VR Book, 2016
System Design Guidelines - II
• General Design
• Design for short user experiences
• Minimize visual stimuli closer to eye (vergence/accommodation)
• For binocular displays, do not use 2D overlays/HUDs
• Design for sitting, or provide physical barriers
• Show virtual warning when user reaches end of tracking area
• Motion Design
• Move virtual viewpoint with actual motion of the user
• If latency is high, avoid tasks requiring fast head motion
• Interface Design
• Design input/interaction for user’s hands at their sides
• Design interactions to be non-repetitive to reduce strain injuries
Jason Jerald, The VR Book, 2016
CONCLUSIONS
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Dernier

Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxfnnc6jmgwh
 
React Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkReact Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkPixlogix Infotech
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPathCommunity
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsRavi Sanghani
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxLoriGlavin3
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...itnewsafrica
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxLoriGlavin3
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Hiroshi SHIBATA
 
Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Farhan Tariq
 
Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024TopCSSGallery
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfIngrid Airi González
 
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesManik S Magar
 
Glenn Lazarus- Why Your Observability Strategy Needs Security Observability
Glenn Lazarus- Why Your Observability Strategy Needs Security ObservabilityGlenn Lazarus- Why Your Observability Strategy Needs Security Observability
Glenn Lazarus- Why Your Observability Strategy Needs Security Observabilityitnewsafrica
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfpanagenda
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxLoriGlavin3
 
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...Nikki Chapple
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 

Dernier (20)

Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
 
React Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App FrameworkReact Native vs Ionic - The Best Mobile App Framework
React Native vs Ionic - The Best Mobile App Framework
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to Hero
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and Insights
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024
 
Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...
 
Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdf
 
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
 
Glenn Lazarus- Why Your Observability Strategy Needs Security Observability
Glenn Lazarus- Why Your Observability Strategy Needs Security ObservabilityGlenn Lazarus- Why Your Observability Strategy Needs Security Observability
Glenn Lazarus- Why Your Observability Strategy Needs Security Observability
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...
Microsoft 365 Copilot: How to boost your productivity with AI – Part one: Ado...
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 

2022 COMP 4010 Lecture 7: Introduction to VR

  • 1. INTRODUCTION TO VR COMP 4010 Lecture Seven Mark Billinghurst September 8th 2022 mark.billinghurst@unisa.edu.au
  • 3. Design in Interaction Design • Key Prototyping Steps
  • 4. AR Design Considerations • 1. Design for Humans • Use Human Information Processing model • 2. Design for Different User Groups • Different users may have unique needs • 3. Design for the Whole User • Social, cultural, emotional, physical, cognitive • 4. Use UI Best Practices • Adapt known UI guidelines to AR/VR • 5. Use of Interface Metaphors/Affordances • Decide best metaphor for AR/VR application
  • 5. 1. Design for Human Information Processing • High level staged model from Wickens and Carswell (1997) • Relates perception, cognition, and physical ergonomics Perception Cognition Ergonomics
  • 6. Design for Perception • Need to understand perception to design AR • Visual perception • Many types of visual cues (stereo, oculomotor, etc.) • Auditory system • Binaural cues, vestibular cues • Somatosensory • Haptic, tactile, kinesthetic, proprioceptive cues • Chemical Sensing System • Taste and smell
  • 8. Design for Cognition • Design for Working and Long-term memory • Working memory • Short term storage, Limited storage (~5-9 items) • Long term memory • Memory recall triggered by associative cues • Situational Awareness • Model of current state of user’s environment • Used for wayfinding, object interaction, spatial awareness, etc. • Provide cognitive cues to help with situational awareness • Landmarks, procedural cues, map knowledge • Support both ego-centric and exo-centric views
  • 9. Design for Physical Ergonomics • Design for the human motion range • Consider human comfort and natural posture • Design for hand input • Coarse and fine scale motions, gripping and grasping • Avoid “Gorilla arm syndrome” from holding arm pose
  • 10. Gorilla Arm in AR • Design interface to reduce mid-air gestures
  • 11. 2. Designing for Different User Groups • Design for Different Ages • Children require different interface design than adults • Older users have different needs than younger • Prior Experience with AR systems • Familiar with HMDs, AR input devices • People with Different Physical Characteristics • Height and arm reach, handedness • Perceptual, Cognitive and Motor Abilities • Colour perception varies between people • Spatial ability, cognitive or motor disabilities
  • 12. 3. Design for the Whole User
  • 13. 4. Use UI Best Practices • General UI design principles can be applied to AR • E.g. Shneiderman’s UI guidelines from 1998 • Providing interface feedback • Mixture of reactive, instrumental and operational feedback • Maintain spatial and temporal correspondence • Use constraints • Specify relations between variables that must be satisfied • E.g. physical constraints reduce freedom of movement • Support Two-Handed control • Use Guiard’s framework of bimanual manipulation • Dominant vs. non-dominant hands
  • 14. •Interface Components • Physical components • Display elements • Visual/audio • Interaction metaphors Physical Elements Display Elements Interaction Metaphor Input Output 5. Use Interface Metaphors
  • 15. AR Design Space Reality Virtual Reality Augmented Reality Physical Design Virtual Design
  • 16. •AR design is mixture of physical affordance and virtual affordance •Physical •Tangible controllers and objects •Virtual •Virtual graphics and audio
  • 17. Affordances in AR • Design AR interface objects to show how they are used • Use visual and physical cues to show possible affordances • Perceived affordances should match actual affordances • Physical and virtual affordances should match Merge Cube Tangible Molecules
  • 19. Summary •When designing AR interfaces, think of: • Physical Components • Physical affordances • Virtual Components • Virtual affordances • Interface Metaphors • Tangible AR or similar
  • 21. From Reality to Virtual Reality Internet of Things Augmented Reality Virtual Reality Real World Virtual World
  • 22. Virtual Reality (VR) • Users immersed in Computer Generated environment • HMD, gloves, 3D graphics, body tracking
  • 23. Goal of Virtual Reality “.. to make it feel like you’re actually in a place that you are not.” Palmer Luckey Co-founder, Oculus
  • 24. Virtual Reality Definition •Defining Characteristics • Immersion • User feels immersed in computer generated scene • Interaction • The virtual content can be interacted with • Independence • User can have independent view and react to environment
  • 25. From Immersion to Presence • Immersion: describes the extent to which technology is capable of delivering a vivid illusion of reality to the senses of a human participant. • Presence: a state of consciousness, the (psychological) sense of being in the virtual environment. • So Immersion, defined in technical terms, is capable of producing a sensation of Presence • Goal of VR: Create a high degree of Presence • Make people believe they are really in Virtual Environment Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators and virtual environments, 6(6), 603-616.
  • 26. Presence .. “The subjective experience of being in one place or environment even when physically situated in another” Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
  • 27. Why do people behave like this? • Presence can be decomposed into two dimensions (Slater 2009): • “Place Illusion” (PI): being in the place depicted in the VR environment • perception in VR matches natural sensorimotor input • Plausibility Illusion (Psi): the events in the VR environment are actually occurring • VR environment responds to user actions • When both PI and Psi are high, people respond realistically to events in the VR Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549-3557. Presence = PI + Psi + ??
  • 28. Slater, M., Banakou, D., Beacco, A., Gallego, J., Macia-Varela, F., & Oliva, R. (2022). A Separate Reality: An Update on Place Illusion and Plausibility in Virtual Reality. Frontiers in Virtual Reality, 81. Four Illusions of Presence (Slater 2022) • Place Illusion: being in the place • Plausibility Illusion: events are real • Body Ownership: seeing your body in VR • Copresence/Social Presence: other people are in VR
  • 29. Reality vs. Virtual Reality • In a VR system there are input and output devices between human perception and action
  • 30. Using Technology to Stimulate Senses • Simulate output • E.g. simulate real scene • Map output to devices • Graphics to HMD • Use devices to stimulate the senses • HMD stimulates eyes • (Figure: example visual simulation pipeline in the human-machine interface: visual simulation → 3D graphics → HMD → vision system → brain)
  • 31. Key Technologies for VR Systems • Display (Immersion) • Stimulate senses • visual, auditory, tactile sense, etc.. • Tracking (Independence) • Changing viewpoint • independent movement • Input Devices (Interaction) • Supporting user interaction • User input
  • 34. Creating an Immersive Experience •Head Mounted Display • Immerse the eyes •Projection/Large Screen • Immerse the head/body •Future Technologies • Neural implants • Contact lens displays, etc
  • 35. VR Head Mounted Displays • Wide range of HMDs available
  • 36. Key Properties of HMDs • Lens • Focal length, Field of View • Ocularity, Interpupillary distance • Eye relief, Eye box • Display • Resolution, contrast • Power, brightness • Refresh rate • Ergonomics • Size, weight • Wearability
  • 38. HMD Basic Principles • Use display with optics to create illusion of virtual screen
  • 39. Simple Magnifier HMD Design • An eyepiece (one or more lenses) in front of the eye magnifies the display (image source) into a virtual image • 1/p + 1/q = 1/f, where p = object distance (distance from image source to eyepiece), q = image distance (distance of image from the lens), f = focal length of the lens
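A quick numeric sketch of the slide's thin-lens relation. The eyepiece focal length and display distance below are illustrative, not specs from any real HMD:

```python
# Thin-lens sketch for a simple magnifier HMD (values are illustrative).
# Slide formula: 1/p + 1/q = 1/f  =>  q = 1 / (1/f - 1/p).
# With the display inside the focal length (p < f), q comes out negative,
# i.e. a virtual image on the same side of the lens as the display.

def virtual_image_distance(p_mm: float, f_mm: float) -> float:
    """Image distance q (mm) for object distance p and focal length f."""
    return 1.0 / (1.0 / f_mm - 1.0 / p_mm)

# Hypothetical numbers: a 40 mm focal-length eyepiece with the
# microdisplay 38 mm away puts the virtual image about 0.76 m out.
q = virtual_image_distance(p_mm=38.0, f_mm=40.0)
print(f"q = {q:.0f} mm ({'virtual' if q < 0 else 'real'} image at {abs(q)/1000:.2f} m)")
```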
  • 40. Distortion in Lens Optics • (Figure: the lens optics map a rectangle to a pincushion-distorted shape)
  • 42. To Correct for Distortion • Must pre-distort image • This is a pixel-based distortion • Use shader programming
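A minimal sketch of that pixel-based pre-distortion in Python rather than shader code. The radial coefficients K1 and K2 are made-up stand-ins for values a real lens calibration would provide:

```python
import numpy as np

# Minimal sketch of pixel-based pre-distortion (the kind usually done in a
# fragment shader). K1, K2 are hypothetical radial distortion coefficients;
# real values come from calibrating the specific HMD lens.
K1, K2 = 0.22, 0.24

def predistort(u: float, v: float) -> tuple[float, float]:
    """Map an output pixel (in [-1,1] lens-centred coords) to the source
    texture coordinate, so the lens's pincushion undoes our barrel warp."""
    r2 = u * u + v * v
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    return u * scale, v * scale

# Sample a few points: the warp grows towards the edge of the eye buffer.
for u, v in [(0.0, 0.0), (0.5, 0.0), (0.9, 0.9)]:
    print((u, v), "->", tuple(round(c, 3) for c in predistort(u, v)))
```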
  • 44. Interpupillary Distance (IPD) • Horizontal distance between a user's eyes • Distance between the two optical axes in a HMD • Typical IPD ~ 63mm
  • 45. Field of View Monocular FOV is the angular subtense of the displayed image as measured from the pupil of one eye. Total FOV is the total angular size of the displayed image visible to both eyes. Binocular(or stereoscopic) FOV refers to the part of the displayed image visible to both eyes. FOV may be measured horizontally, vertically or diagonally.
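As a rough worked example, under the simple-magnifier approximation the monocular FOV follows from display width and eyepiece focal length (the numbers below are illustrative, and the eye is assumed close to the lens):

```python
import math

# Rough monocular FOV for a simple-magnifier design: a display of width w
# behind an eyepiece of focal length f subtends roughly 2*atan(w / 2f).
def monocular_fov_deg(display_width_mm: float, focal_length_mm: float) -> float:
    return math.degrees(2 * math.atan(display_width_mm / (2 * focal_length_mm)))

# Hypothetical 70 mm panel behind a 40 mm focal-length lens: ~82 degrees.
print(f"{monocular_fov_deg(display_width_mm=70, focal_length_mm=40):.0f} degrees")
```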
  • 47. Vive HMD (Released 2016) • Field of View • 110 degrees horizontal, 110 degrees vertical • Resolution • 1080×1200 per eye • Refresh rate • 90 Hz • Other features • Adjustable lens separation • Over the ear headphones • Integrated tracking • Tethered to PC
  • 49. Foveated Displays • Combine high resolution center with low resolution periphery
  • 50. Varjo Display • Focus area (27° x 27°), Varjo resolution: 70 PPD, 1920 x 1920 px, 1 uOLED panel (centre) • Peripheral area, non-Varjo resolution: 30 PPD, 2880 x 2720 px, 115° FOV, 1 LCD (wide FOV)
  • 51. Varjo XR-3 Demo – Threading a Needle https://www.youtube.com/watch?v=5iEwlOEUQjI
  • 52. Computer Based vs. Mobile VR Displays
  • 53. Google Cardboard • Released 2014 (Google 20% project) • >10 million shipped/given away • Easy to use developer tools + =
  • 54. Multiple Mobile VR Viewers Available
  • 55. VR Head Mounted Displays (HMDs) • HTC Vive Pro - $1100 USD • 110 degree FOV, 2 handed input • Tethered VR, room scale tracking • Oculus Quest - $400 USD • 110 degree FOV, 2 handed input • Self contained VR, inside out tracking • Google Cardboard - $10 USD • 90 degree FOV, button input • Mobile VR, 3 DOF tracking
  • 56. HMD Design Trade-offs • Resolution vs. field of view • As FOV increases, resolution decreases for fixed pixels • Eye box vs. field of view • Larger eye box limits field of view • Size, Weight and Power vs. everything else vs.
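The resolution-vs-FOV trade-off reduces to pixels per degree: for a fixed panel, widening the FOV spreads the same pixels over more angle. A small sketch with hypothetical panel/FOV combinations:

```python
# Angular resolution falls as FOV rises for a fixed pixel count per eye.
# (For comparison, the human fovea resolves roughly 60 pixels per degree.)
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    return h_pixels / h_fov_deg

for name, px, fov in [("1080 px / 110 deg", 1080, 110),
                      ("1920 px / 110 deg", 1920, 110),
                      ("1920 px / 140 deg", 1920, 140)]:
    print(f"{name}: {pixels_per_degree(px, fov):.1f} PPD")
```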
  • 57. Projection/Large Display Technologies • Room Scale Projection • CAVE, multi-wall environment • Dome projection • Hemisphere/spherical display • Head/body inside • Vehicle Simulator • Simulated visual display in windows
  • 58. Stereo Projection • Active Stereo • Active shutter glasses • Time synced signal • Brighter images • More expensive • Passive Stereo • Polarized images • Two projectors (one/eye) • Cheap glasses (powerless) • Lower resolution/dimmer • Less expensive
  • 59. CAVE • Developed in 1992, EVL University of Illinois Chicago • Multi-walled stereo projection environment • Head tracked active stereo Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
  • 60. Typical CAVE Setup • 4 sides, rear projected stereo images
  • 61. Demo Video – Wisconsin CAVE • https://www.youtube.com/watch?v=mBs-OGDoPDY
  • 64. Multi-User CAVEs • Limitation of CAVEs • Stereo projection from only one user’s viewpoint • Solution • Higher frequency projectors and time slicing Kulik, A., Kunert, A., Beck, S., Reichel, R., Blach, R., Zink, A., & Froehlich, B. (2011). C1x6: a stereoscopic six-user display for co-located collaboration in shared virtual environments. ACM Transactions on Graphics (TOG), 30(6), 188.
  • 66. Allosphere • Univ. California Santa Barbara • One of a kind facility • Immersive Spherical display • 10 m diameter • Inside 3 story anechoic cube • Passive stereoscopic projection • 26 projectors, 146 speakers • Visual tracking system for input • See http://www.allosphere.ucsb.edu/ Kuchera-Morin, J., Wright, M., Wakefield, G., Roberts, C., Adderton, D., Sajadi, B., ... & Majumder, A. (2014). Immersive full-surround multi-user system design. Computers & Graphics, 40, 10-21.
  • 68. Allosphere Research • Multi-disciplinary research • Science, art, engineering • Typical research projects • Brain imaging • fMRI imaging data • Atomic bonding • Bond simulation models • Nano medicine • Simulate chemotherapy • Graph browser • Mathematical visualization • Etc Brain Imaging Hydrogen bond Nano Medicine
  • 69. Vehicle Simulators • Combine VR displays with vehicle • Visual displays on windows • Motion base for haptic feedback • Audio feedback • Physical vehicle controls • Steering wheel, flight stick, etc • Full vehicle simulation • Emergencies, normal operation, etc • Weapon operation • Training scenarios
  • 70. Demo: Boeing 787 Simulator • https://www.youtube.com/watch?v=3iah-blsw_U
  • 73. Audio Displays Definition: Computer interfaces that provide synthetic sound feedback to users interacting with the virtual world. The sound can be monaural (both ears hear the same sound), or binaural (each ear hears a different sound) Burdea, Coiffet (2003)
  • 74. Motivation • Most of the focus in Virtual Reality is on the visuals • GPUs continue to drive the field • Users want more • More realism, More complexity, More speed • However, sound can significantly enhance realism • Example: Mood music in horror games • Sound can provide valuable user interface feedback • Example: Alert in training simulation
  • 75. 360 Video + Spatial Audio (wear headphones) • https://www.youtube.com/watch?v=G8pABGosD38
  • 76. Types of Audio Recordings • Monaural: Recording with one microphone – no positioning • Stereo Sound: Recording with two microphones placed several feet apart. Perceived sound position as recorded by microphones. • Binaural: Recording microphones embedded in a dummy head. Audio filtered by head shape. • 3D Sound: Using tiny microphones in the ears of a real person. Generate HRTF based on ear shape and audio response.
  • 77. Capturing 3D Audio for Playback • Binaural recording • 3D Sound recording, from microphones in simulated ears • Hear some examples (use headphones) • http://binauralenthusiast.com/examples/
  • 78. Example (Use Headphones) • https://www.youtube.com/watch?v=tNEUObTs1kg
  • 79. Synthetic Sounds • Complex sounds can be built from simple waveforms (e.g., sawtooth, sine) and combined using operators • Waveform parameters (frequency, amplitude) could be taken from motion data, such as object velocity • Can combine wave forms in various ways • This is what classic synthesizers do • Works well for many non-speech sounds
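A minimal sketch of this idea, using a hypothetical "whoosh" whose pitch and loudness are driven by object speed (all constants are illustrative):

```python
import numpy as np

# Sketch of building a simple synthetic sound: combine a sine and a sawtooth,
# with frequency/amplitude driven by a motion parameter (object speed here).
SR = 44100  # sample rate (Hz)

def whoosh(speed_m_s: float, duration_s: float = 0.5) -> np.ndarray:
    t = np.linspace(0, duration_s, int(SR * duration_s), endpoint=False)
    freq = 100 + 40 * speed_m_s            # faster object -> higher pitch
    amp = min(1.0, 0.1 * speed_m_s)        # faster object -> louder
    sine = np.sin(2 * np.pi * freq * t)
    saw = 2 * ((freq * t) % 1.0) - 1.0     # naive sawtooth wave
    return amp * (0.7 * sine + 0.3 * saw)

samples = whoosh(speed_m_s=5.0)
print(samples.shape, samples.min(), samples.max())
```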
  • 80. Audio Display Properties • Presentation Properties: number of channels, sound stage, localization, masking, amplification • Logistical Properties: noise pollution, user mobility, interface with tracking, integration, portability, throughput, safety, cost
  • 81. Audio Displays: Head-worn • Ear buds • On ear • Open back • Closed • Bone conduction
  • 82. Spatialization vs. Localization • Spatialization is the processing of sound signals to make them emanate from a point in space • This is a technical topic • Localization is the ability of people to identify the source position of a sound • This is a human topic, some people are better at it than others.
  • 83. Stereo Sound • Seems to come from inside user’s head • Follows head motion as user moves head
  • 84. 3D Spatial Sound • Seems to be external to the head • Fixed in space when user moves head • Has reflected sound properties
  • 85. Example: Sound Spatialisation • Note: Use stereo speakers/headphones for best effect
  • 86. Spatialized Audio Effects • Naïve approach • Simple left/right shift for lateral position • Amplitude adjustment for distance • Easy to produce using consumer hardware/software • Does not give us "true" realism in sound • No up/down or front/back cues • We can use multiple speakers for this • Surround the user with speakers • Send different sound signals to each one
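A sketch of the naïve approach just described: constant-power left/right panning for lateral position plus 1/r amplitude falloff for distance. Note it produces no elevation or front/back cues, which is exactly the limitation the slide points out:

```python
import numpy as np

# Naive spatialization: pan for azimuth, attenuate for distance.
def naive_spatialize(mono: np.ndarray, azimuth_deg: float, dist_m: float):
    pan = np.radians((azimuth_deg + 90) / 2)     # map [-90,90] deg -> [0,90]
    gain = 1.0 / max(dist_m, 1.0)                # clamp to avoid blow-up
    left = mono * gain * np.cos(pan)             # constant-power pan law
    right = mono * gain * np.sin(pan)
    return np.stack([left, right], axis=-1)

mono = np.sin(2 * np.pi * 440 * np.arange(4410) / 44100)
stereo = naive_spatialize(mono, azimuth_deg=45, dist_m=2.0)  # front-right, 2 m
print(stereo.shape)
```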
  • 87. Example: The BoomRoom • Use surround speakers to create spatial audio effects • Gesture based interaction • https://www.youtube.com/watch?time_continue=54&v=6RQMOyQ3lyg
  • 88. Audio Localization • Main cues used by humans to localize sound: 1. Interaural time differences: Time difference for sound wave to travel between ears 2. Interaural level differences: For high frequency sounds (> 1.5 kHz), volume difference between ears used to determine source direction 3. Spectral filtering done by outer ears: Ear shape changes frequency heard
  • 89. Interaural Time Difference • Takes fixed time to travel between ears • Can use time difference to determine sound location
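A common textbook approximation of the interaural time difference is Woodworth's spherical-head model; a sketch, with the head radius and azimuths as illustrative values:

```python
import math

# Woodworth's spherical-head approximation of interaural time difference:
# ITD = (a / c) * (theta + sin theta), with head radius a, speed of sound c,
# and source azimuth theta (0 = straight ahead, 90 = directly to one side).
def itd_seconds(azimuth_deg: float, head_radius_m: float = 0.0875) -> float:
    c = 343.0  # speed of sound in air, m/s
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:2d} deg -> {itd_seconds(az) * 1e6:.0f} microseconds")
```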
  • 90. Spectral Filtering Ear shape filters sound depending on direction it is coming from. This change in frequency determines sound source elevation.
  • 91. Head-Related Transfer Functions (HRTFs) • A set of functions that model how sound from a source at a known location reaches the eardrum
  • 92. More About HRTFs • Functions take into account: • Individual ear shape • Slope of shoulders • Head shape • So, each person has his/her own HRTF! • Need parameterizable HRTFs • Some sound cards/APIs allow specifying an HRTF
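At its core, HRTF rendering is convolution of the dry signal with a per-ear impulse response (HRIR) for the source direction. The sketch below uses random placeholder HRIRs purely to show the mechanics; a real system would look up measured responses by azimuth/elevation:

```python
import numpy as np

# Convolve a dry mono signal with left/right-ear impulse responses (HRIRs).
# The two 128-tap HRIRs below are random placeholders, not measured data.
rng = np.random.default_rng(0)
hrir_left = rng.normal(size=128) * np.exp(-np.arange(128) / 20)
hrir_right = np.roll(hrir_left, 25) * 0.8   # crude delay+attenuation stand-in

mono = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
left = np.convolve(mono, hrir_left)[: len(mono)]
right = np.convolve(mono, hrir_right)[: len(mono)]
binaural = np.stack([left, right], axis=-1)  # 2-channel output for headphones
print(binaural.shape)
```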
  • 94. Measuring HRTFs • Putting microphones in mannequin or human ears • Playing sound from fixed positions • Recording the response
  • 95. Environmental Effects • Sound is also changed by objects in the environment • Can reverberate off of reflective objects • Can be absorbed by objects • Can be occluded by objects • Doppler shift • Moving sound sources • Need to simulate environmental audio properties • Takes significant processing power
  • 96. Sound Reverberation • Need to consider first and second order reflections • Need to model material properties, objects in room, etc
  • 98. The Tough Part • All of this takes a lot of processing • Need to keep track of • Multiple (possibly moving) sound sources • Path of sounds through a dynamic environment • Position and orientation of listener(s) • Most sound cards only support a limited number of spatialized sound channels • Increasingly complex geometry increases load on audio system as well as visuals • That's why we fake it ;-) • GPUs might change this too!
  • 99. GPU Based Audio Acceleration • Using GPU for audio physics calculations • AMD TrueAudio Next - https://gpuopen.com/true-audio-next/ https://www.youtube.com/watch?v=Z6nwYLHG8PU
  • 100. Audio Software SDKs • Modern CPUs are fast enough that spatial audio can be generated without dedicated hardware • Several 3D audio SDKs exist • OpenAL • www.openal.org • Open source, cross platform • Renders multichannel three-dimensional positional audio • Google VR SDK • Android, iOS, Unity • https://developers.google.com/vr/concepts/spatial-audio • Unity • Unity Audio Spatializer SDK • Microsoft DirectX, MRTK, etc
  • 101. Google VR Spatial Audio Demo • https://www.youtube.com/watch?v=I9zf4hCjRg0&feature=youtu.be
  • 102. Demo: Spatial Audio In VR • AltspaceVR spatial audio for speaker discrimination • https://www.youtube.com/watch?v=dV3Qog44z6E
  • 103. Designing Spatial Audio • There are several tools available for designing 3D audio • E.g. Facebook Spatial Workstation • Audio tools for cinematic VR and 360 video • https://facebook360.fb.com/spatial-workstation/ • Spatial Audio Designer • Mixing of surround sound and 3D audio • http://www.newaudiotechnology.com/en/products/spatial-audio-designer/
  • 105. Haptic Feedback • Greatly improves realism • Hands and wrist are most important • High density of touch receptors • Two kinds of feedback: • Touch Feedback • information on texture, temperature, etc. • Does not resist user contact • Force Feedback • information on weight, and inertia. • Actively resists contact motion
  • 106. Active Haptics • Actively resists motion • Key properties • Force resistance • Frequency Response • Degrees of Freedom • Latency
  • 107. Force Feedback Joysticks • WingMan Force 3D • Inexpensive ($60) • Actuators that can move the joystick given system commands • Max 3.3 N of force • Force feedback driving wheel
  • 108. Example: Phantom Omni • Combined stylus input/haptic output • 6 DOF haptic feedback
  • 109. Phantom Omni Demo • https://www.youtube.com/watch?v=REA97hRX0WQ
  • 110. Haptic Glove • Many examples of haptic gloves • Typically use mechanical device to provide haptic feedback
  • 111. HaptX Gloves • Tactile + Haptic feedback • Tactile actuators • 130 feedback points/hand • Force feedback exo-skeleton • 8lbs force/finger • Magnetic finger tracking • 2 mm tracking accuracy • https://haptx.com/
  • 113. Homebrew Glove • LucidVR Budget Haptic Glove • Simple hand tracking, force feedback • $22 in parts • https://hackaday.io/project/178243-lucidvr-budget-haptic-glove
  • 114. Passive Haptics • Not controlled by system • Use real props (Styrofoam for walls) • Pros • Cheap • Large scale • Accurate • Cons • Not dynamic • Limited use
  • 115. UNC Being There Project
  • 116. Passive Haptic Paddle • Using physical props to provide haptic feedback • http://www.cs.wpi.edu/~gogo/hive/
  • 117. Tactile Feedback Interfaces • Goal: Stimulate skin tactile receptors • Using different technologies • Air bellows • Jets • Actuators (commercial) • Micropin arrays • Electrical (research) • Neuromuscular stimulations (research)
  • 118. Vibrotactile Cueing Devices • Vibrotactile feedback has been incorporated into many devices • Can we use this technology to provide scalable, wearable touch cues?
  • 119. Vibrotactile Feedback Projects Navy TSAS Project TactaBoard and TactaVest
  • 120. Teslasuit • Full body haptic feedback - https://teslasuit.io/ • Electrical muscle stimulation
  • 123. Immersion and Tracking • Motivation: For immersion, when the user changes position in reality the VR view also needs to change • Requires tracking of the user’s pose (position/orientation) in the real world and mapping to the Virtual World
  • 124. Tracking in VR • Need for Tracking • User turns their head and the VR graphics scene changes • User wants to walk through a virtual scene • User reaches out and grabs a virtual object • The user wants to use a real prop in VR • All of these require technology to track the user or object • Continuously provide information about position and orientation Head Tracking Hand Tracking
  • 125. Tracking and Rendering in VR Tracking fits into the graphics pipeline for VR
  • 126. • Degree of Freedom = independent movement about an axis • 3 DoF Orientation = roll, pitch, yaw (rotation about x, y, or z axis) • 3 DoF Translation = movement along x,y,z axis • Different requirements • User turns their head in VR -> needs 3 DoF orientation tracker • Moving in VR -> needs a 6 DoF tracker (r,p,y) and (x, y, z) Degrees of Freedom
  • 127. Tracking Technologies • Active (device sends out signal) • Mechanical, Magnetic, Ultrasonic • GPS, Wifi, cell location • Passive (device senses world) • Inertial sensors (compass, accelerometer, gyro) • Computer Vision • Marker based, Natural feature tracking • Hybrid Tracking • Combined sensors (e.g. Vision + Inertial)
  • 128. Key Tracking Performance Criteria • Static Accuracy • Dynamic Accuracy • Latency • Update Rate • Tracking Jitter • Signal to Noise Ratio • Tracking Drift
  • 129. Static vs. Dynamic Accuracy • Static Accuracy • Ability of tracker to determine coordinates of a position in space • Depends on sensor sensitivity, errors (algorithm, operator), environment • Dynamic Accuracy • System accuracy as sensor moves • Depends on static accuracy • Resolution • Minimum change sensor can detect • Repeatability • Same input giving same output
  • 130. Tracker Latency, Update Rate • Latency: Time between change in object pose and time sensor detects the change • Large latency (> 10 ms) can cause simulator sickness • Larger latency (> 50 ms) can reduce VR immersion • Update Rate: Number of measurements per second • Typically > 30 Hz
  • 131. Tracker Jitter, Signal to Noise Ratio • Jitter: Change in tracker output when tracked object is stationary • Range of change is sensor noise • A tracker with no jitter reports a constant value if the tracked object is stationary • Jitter makes tracker data change randomly about an average value • Signal to Noise Ratio: Signal in data relative to noise • Found by calculating mean of samples in known positions
  • 132. Tracker Drift • Drift: Steady increase in tracker error over time • Accumulative (additive) error over time • Relative to dynamic sensitivity over time • Controlled by periodic recalibration (zeroing)
  • 133. Mechanical Tracker (Active) • Idea: mechanical arms with joint sensors • ++: high accuracy, haptic feedback • --: cumbersome, expensive • Examples: Microscribe, Sutherland
  • 134. Example: Fake Space Boom • BOOM (Binocular Omni-Orientation Monitor) • Counterbalanced arm with 100° FOV HMD mounted on it • 6 DOF, 4mm position accuracy, 300Hz sampling, < 5 ms latency
  • 135. Demo: Fake Space Tele Presence • Using Boom with HMD to control robot view • https://www.youtube.com/watch?v=QpTQTu7A6SI
  • 136. Magnetic Tracker (Active) • Idea: measure difference between a magnetic transmitter and a receiver • ++: 6DOF, robust • --: wired, sensitive to metal, noisy, expensive • --: error increases with distance • Example: Flock of Birds (Ascension)
  • 137. Example: Razer Hydra • Developed by Sixense • Magnetic source + 2 wired controllers • Short range (< 1 m), Precision of 1 mm and 1° • 62Hz sampling rate, < 50 ms latency • $600 USD
  • 138. Razer Hydra Demo • https://www.youtube.com/watch?v=jnqFdSa5p7w
  • 140. Inertial Tracker (Passive) • Idea: measuring linear acceleration and angular rates (accelerometer/gyroscope) • ++: no transmitter, cheap, small, high frequency, wireless • --: drift, hysteresis, only 3DOF • Examples: IS300 (Intersense), Wii Remote
  • 141. Types of Inertial Trackers • Gyroscopes • The rate of change in object orientation or angular velocity is measured. • Accelerometers • Measure acceleration. • Can be used to determine object position, if the starting point is known. • Inclinometer • Measures inclination, "level" position. • Like a carpenter's level, but giving an electrical signal.
  • 142. Example: MEMS Sensor • Uses spring-supported load • Reacts to gravity and inertia • Changes its electrical parameters • < 5 ms latency, 0.01° accuracy • up to 1000Hz sampling • Problems • Rapidly accumulating errors • Error in position increases with the square of time • Cheap units can get position drift of 4 cm in 2 seconds • Expensive units have same error in 200 seconds • Not good for measuring location • Need to periodically reset the output
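The time-squared growth follows directly from double integration of a constant bias error. The sketch below reproduces the slide's "4 cm in 2 seconds" figure, which corresponds to a bias of roughly 0.02 m/s²:

```python
# A constant bias b in measured acceleration integrates twice to a
# position error of (1/2) * b * t^2, so drift grows with time squared.
def position_drift_m(bias_m_s2: float, t_s: float) -> float:
    return 0.5 * bias_m_s2 * t_s ** 2

bias = 0.02  # m/s^2, consistent with the cheap-unit figure on the slide
for t in (1, 2, 10, 60):
    print(f"t = {t:3d} s -> drift = {position_drift_m(bias, t) * 100:7.1f} cm")
```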
  • 143. Demo: MEMS Sensor Working • https://www.youtube.com/watch?v=9eSnxebfuxg
  • 144. MEMS Gyro Bias Drift • Zero reading of MEMS Gyro drifts over time due to noise
  • 145. Acoustic/Ultrasonic Tracker • Idea: Time of Flight or Phase-Coherence Sound Waves • ++: Small, Cheap • --: 3DOF, Line of Sight, Low resolution, Affected by Environment (pressure, temperature), Low sampling rate • Examples: Logitech Ultrasonic, IS600
  • 146. Optical Tracker (Passive) • Idea: Image Processing and Computer Vision • Specialized • Infrared, Retro-Reflective, Stereoscopic • Monocular-Based Vision Tracking • Examples: ART, Hi-Ball
  • 148. Example: Vive Lighthouse Tracking • Outside-in tracking system • 2 base stations • Each with 2 laser scanners, LED array • Headworn/handheld sensors • 37 photo-sensors in HMD, 17 in hand • Additional IMU sensors (500 Hz) • Performance • Tracking server fuses sensor samples • Sampling rate 250 Hz, 4 ms latency • See http://doc-ok.org/?p=1478
  • 149. Lighthouse Components • Base station: IR LED array, 2 x scanned lasers • Head Mounted Display: 37 photo sensors, 9-axis IMU
  • 151. How Lighthouse Tracking Works • Position tracking using IMU • 500 Hz sampling • But drifts over time • Drift correction using optical tracking • IR synchronization pulse (60 Hz) • Laser sweep between pulses • Photo-sensors recognize sync pulse, measure time to laser • Know when sensor hit and which sensor hit • Calculate position of sensor relative to base station • Use 2 base stations to calculate pose • Use IMU sensor data between pulses (500Hz) • See http://xinreality.com/wiki/Lighthouse
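A simplified model of the angle measurement (ignoring the interleaving of the two base stations and the two sweep axes): the rotor turns 360° in one 1/60 s period, so the time from sync flash to photo-sensor hit maps linearly to an angle:

```python
# Sketch of the lighthouse angle measurement: time between the IR sync
# pulse and the laser hitting a photo-sensor encodes the sensor's angle.
# Two sweeps (horizontal + vertical) give two angles per sensor per base.
SWEEP_PERIOD_S = 1.0 / 60.0  # one full rotation of the laser rotor

def sweep_angle_deg(t_hit_s: float) -> float:
    """Angle of the sensor, from time between sync pulse and laser hit."""
    return 360.0 * (t_hit_s / SWEEP_PERIOD_S)

# A sensor hit 4.63 ms after the sync pulse sits at ~100 degrees.
print(f"{sweep_angle_deg(0.00463):.0f} degrees")
```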
  • 152. Lighthouse Tracking • Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y • Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4
  • 153. Example: Oculus Quest • Inside out tracking • Four cameras on corners of display • Searching for visual features • On setup creates map of room
  • 154. Oculus Quest Tracking • https://www.youtube.com/watch?v=2jY3B_F3GZk
  • 155. Occipital Bridge Engine/Structure Core • Inside out tracking • Uses structured light • Better than room scale tracking • Integrated into bridge HMD • https://structure.io/
  • 157. Tracking Coordinate Frames • There can be several coordinate frames to consider • Head pose with respect to real world • Coordinate frame of tracking system wrt HMD • Position of hand in coordinate frame of hand tracker
  • 158. Example: Finding your hand in VR • Using Lighthouse and LeapMotion • Multiple Coordinate Frames • LeapMotion tracks hand in LeapMotion coordinate frame (HLM) • LeapMotion is fixed in HMD coordinate frame (LMHMD) • HMD is tracked in VR coordinate frame (HMDVR) (using Lighthouse) • Where is your hand in VR coordinate frame? • Combine transformations in each coordinate frame • HVR = HLM x LMHMD x HMDVR
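The same chain written as 4x4 homogeneous transforms. This sketch uses the column-vector convention, so the chain is applied right to left (headset pose last); all three poses below are illustrative stand-ins, not real calibration data:

```python
import numpy as np

# H_VR = H_LM composed with LM_HMD and HMD_VR, as 4x4 transforms. In a real
# system the hand pose comes from the LeapMotion, the headset pose from
# Lighthouse, and the sensor-on-headset offset from a fixed calibration.
def transform(rot_z_deg: float, t: tuple[float, float, float]) -> np.ndarray:
    a = np.radians(rot_z_deg)
    m = np.eye(4)
    m[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    m[:3, 3] = t
    return m

hand_in_lm = transform(0, (0.0, 0.1, -0.3))    # hand, LeapMotion frame
lm_in_hmd = transform(0, (0.0, 0.0, -0.08))    # sensor mounted on HMD front
hmd_in_vr = transform(90, (1.0, 1.6, 2.0))     # headset pose in VR frame

hand_in_vr = hmd_in_vr @ lm_in_hmd @ hand_in_lm
print(hand_in_vr[:3, 3])  # hand position in the VR coordinate frame
```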
  • 160. VR Input Devices • Physical devices that convey information into the application and support interaction in the Virtual Environment
  • 161. Mapping Between Input and Output Input Output
  • 162. Motivation • Mouse and keyboard are good for desktop UI tasks • Text entry, selection, drag and drop, scrolling, rubber banding, … • 2D mouse for 2D windows • What devices are best for 3D input in VR? • Use multiple 2D input devices? • Use new types of devices? vs.
  • 163. Input Device Characteristics • Size and shape, encumbrance • Degrees of Freedom • Integrated (mouse) vs. separable (Etch-a-sketch) • Direct vs. indirect manipulation • Relative vs. Absolute input • Relative: measure difference between current and last input (mouse) • Absolute: measure input relative to a constant point of reference (tablet) • Rate control vs. position control • Isometric vs. Isotonic • Isometric: measure pressure or force with no actual movement • Isotonic: measure deflection from a center point (e.g. mouse)
  • 164. Hand Input Devices • Devices that integrate hand input into VR • World-Grounded input devices • Devices fixed in real world (e.g. joystick) • Non-Tracked handheld controllers • Devices held in hand, but not tracked in 3D (e.g. xbox controller) • Tracked handheld controllers • Physical device with 6 DOF tracking inside (e.g. Vive controllers) • Hand-Worn Devices • Gloves, EMG bands, rings, or devices worn on hand/arm • Bare Hand Input • Using technology to recognize natural hand input
  • 165. World Grounded Devices • Devices constrained or fixed in real world • Not ideal for VR • Constrains user motion • Good for VR vehicle metaphor • Used in location based entertainment (e.g. Disney Aladdin ride) Disney Aladdin Magic Carpet VR Ride
  • 166. Non-Tracked Handheld Controllers • Devices held in hand • Buttons, joysticks, game controllers, etc. • Traditional video game controllers • Xbox controller
  • 167. Tracked Handheld Controllers • Handheld controller with 6 DOF tracking • Combines button/joystick input plus tracking • One of the best options for VR applications • Physical prop enhancing VR presence • Providing proprioceptive, passive haptic touch cues • Direct mapping to real hand motion HTC Vive Controllers Oculus Touch Controllers
  • 168. Example: WMR Handheld Controllers • Windows Mixed Reality Controllers • Left and right hand • Combine computer vision + IMU tracking • Track both in and out of view • Button input, Vibration feedback
  • 170. Hand Worn Devices • Devices worn on hands/arms • Glove, EMG sensors, rings, etc. • Advantages • Natural input with potentially rich gesture interaction • Hands can be held in comfortable positions – no line of sight issues • Hands and fingers can fully interact with real objects
  • 171. Facebook EMG Band • https://www.youtube.com/watch?v=WmxLiXAo9ko
  • 172. Data Gloves • Bend sensing gloves • Passive input device • Detecting hand posture and gestures • Continuous raw data from bend sensors • Fibre optic, resistive ink, strain-gauge • Large DOF output, natural hand output • Pinch gloves • Conductive material at fingertips • Determine if fingertips touching • Used for discrete input • Object selection, mode switching, etc.
  • 173. StretchSense Gloves • Wearable motion capture sensors • Capacitive sensors • Measure stretch, pressure, bend, shear • Many applications • Garments, gloves, etc. • http://stretchsense.com/
  • 174. StretchSense Glove Demo • https://www.youtube.com/watch?v=ZDq7fQguFPI
  • 175. Bare Hands • Using computer vision to track bare hand input • Creates compelling sense of Presence, natural interaction • Challenges need to be solved • Not having sense of touch • Line of sight required to sensor • Fatigue from holding hands in front of sensor
  • 176. Oculus Quest 2 – Hand Tracking • https://www.youtube.com/watch?v=uztFcEA6Rf0
  • 177. Non-Hand Input Devices • Capturing input from other parts of the body • Head Tracking • Use head motion for input • Eye Tracking • Largely unexplored for VR • Microphones • Audio input, speech • Full-Body tracking • Motion capture, body movement
  • 178. Eye Tracking • Technology • Shine IR light into eye and look for reflections • Advantages • Provides natural hands-free input • Gaze provides cues as to user attention • Can be combined with other input technologies
  • 179. HTC Vive Pro Eye • HTC Vive Pro with integrated eye-tracking • Tobii systems eye-tracker • Easy calibration and set-up • Auto-calibration software compensates for HMD motion
  • 181. Full Body Tracking • Adding full-body input into VR • Creates illusion of self-embodiment • Significantly enhances sense of Presence • Technologies • Motion capture suit, camera based systems • Can track large number of significant feature points
  • 182. Camera Based Motion Capture • Use multiple cameras • Reflective markers on body • E.g. OptiTrack (www.optitrack.com) • 120 – 360 fps, < 10ms latency, < 1mm accuracy
  • 184. Wearable Motion Capture: PrioVR • Wearable motion capture system • 8 – 17 inertial sensors + wireless data transmission • 30 – 40m range, 7.5 ms latency, 0.09° precision • Supports full range of motion, no occlusion • https://yostlabs.com/priovr/
  • 186. Pedestrian Devices • Pedestrian input in VR • Walking/running in VR • Virtuix Omni • Special shoes • http://www.virtuix.com • Cyberith Virtualizer • Socks + slippery surface • http://cyberith.com
  • 187. Virtuix Omni Demo • https://www.youtube.com/watch?v=aOYHg8qdxTE
  • 188. Omnidirectional Treadmills • Infinadeck • 2 axis treadmill, flexible material • Tracks user to keep them in centre • Limitless walking input in VR • www.infinadeck.com
  • 193. Creating a Good VR Experience • Creating a good experience requires good system design • Integrating multiple hardware, software, interaction, content elements
  • 194. Example: Shard VR Slide • Ride down the Shard at 100 mph - Multi-sensory VR https://www.youtube.com/watch?v=HNXYoEdBtoU
  • 195. Key Components to Consider • Five key components: • Inputs • Outputs • Computation/Simulation • Content/World database • User interaction From: Sherman, W. R., & Craig, A. B. (2018). Understanding virtual reality: Interface, application, and design. Morgan Kaufmann.
  • 196. Typical VR System • Combining multiple technology elements for good user experience • Input devices, output modality, content databases, networking, etc.
  • 197. From Content to User • (Figure: software/content/user I/O pipeline: program content (3D models, textures) and translated CAD data feed via application programming into a dynamics generator; input devices (gloves, mic, trackers) capture user actions such as speaking and grabbing; renderers (3D, sound) drive output devices (HMD, audio, haptic))
  • 198. Types of VR Graphics Content • Panoramas • 360 images/video • Captured 3D content • Scanned objects/spaces • Modelled Content • Hand created 3D models • Existing 3D assets
  • 199. Capturing Panoramas • Stitching individual photos together • Image Composite Editor (Microsoft) • AutoPano (Kolor) • Using 360 camera • Ricoh Theta-S • Fly360
  • 200. Consumer 360 Capture Devices • Kodak 360, Fly 360, Gear 360, Theta S, Nikon, LG 360, PointGrey Ladybug, Panono 360, Bublcam
  • 201. Example: Cardboard Camera • Capture 360 panoramas • Stitch together images on phone • View in VR on Google Cardboard Viewer
  • 203. • Use camera pairs to capture stereo 360 video • Samsung 360 round • 17 lenses, 4K 3D images, live video streaming, $10K USD • Vuze+ VR camera • 8 lenses, 4K Stereoscopic 3D 360⁰ video and photo, $999 USD Stereo Video Capture Vuze Samsung
  • 204. Samsung 360 Round • https://www.youtube.com/watch?v=X_ytJJOmVF0
  • 205. 3D Scanning • A range of products support 3D scanning • Create point cloud or mesh model • Typically combine RGB cameras with depth sensing • Captures texture plus geometry • Multi-scale • Object Scanners • Handheld, Desktop • Body Scanners • Rotating platform, multi-camera • Room scale • Mobile, tripod mounted
  • 206. Example: Matterport • Matterport Pro2 3D scanner • Room scale scanner, panorama and 3D model • 360° (left-right) x 300° (vertical) field of view • Structured light (infrared) 3D sensor • 15 ft (4.5 m) maximum range • 4K HDR images
  • 207. Matterport Pro2 Lite • https://www.youtube.com/watch?v=SjHk0Th-j1I
  • 208. Handheld/Desktop Scanners • Capture people/objects • Sense 3D scanner • accuracy of 0.90 mm, colour resolution of 1920×1080 pixels • Occipital Structure sensor • Add-on to iPad, mesh scanning, IR light projection, 60 Hz
  • 210. 3D Modelling • A variety of 3D modelling tools can be used • Export in VR compatible file format (.obj, .fbx, etc) • Especially useful for animation - difficult to create from scans • Popular tools • Blender (free), 3DS max, Maya, etc. • Easy to Use • Tinkercad, Sketchup Free, Meshmixer, Fusion 360, etc.
  • 211. Modelling in VR • Several tools for modelling in VR • Natural interaction, low polygon count, 3D object viewing • Low end • Google Blocks • High end • Quill, Tilt Brush – 3D painting • Gravity Sketch – 3D CAD
  • 212. Example: Google Blocks • https://www.youtube.com/watch?v=1TX81cRqfUU
  • 213. Example: Gravity Sketch • https://www.youtube.com/watch?v=VK2DDnT_3l0
  • 214. Download Existing VR Content • Many locations for 3D objects, textures, etc. • Sketchfab, Sketchup, Free3D (www.free3d.com), etc. • Asset stores - Unity, Unreal • Provide 3D models, materials, code, etc..
  • 215. VR Graphics Architecture • Application Layer • User interface libraries • Simulation/behaviour code • User interaction specification • Graphics Layer (CPU acceleration) • Scene graph specification • Object physics engine • Specifying graphics objects • Rendering Layer (GPU acceleration) • Low level graphics code • Rendering pixels/polygons • Interface with graphics card/frame buffer
  • 216. • Low level code for loading models and showing on screen • Using shaders and low level GPU programming to improve graphics Traditional 3D Graphics Pipeline
  • 217. Graphics Challenges with VR • Higher data throughput (> 7x desktop requirement) • Lower latency requirements (from 150ms/frame to 20ms) • HMD Lens distortion
  • 218. • HMD may have cheap lens • Creates chromatic aberration and distorted image • Warp graphics images to create undistorted view • Use low level shader programming Lens Distortion
  • 219. VR System Pipeline • Using time warping and lens distortion
  • 220. Perception Based Graphics • Eye Physiology • Cones in eye centre = colour vision; rods in periphery = motion, B+W • Foveated Rendering • Use eye tracking to draw highest resolution where user looking • Reduces graphics throughput
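A toy model of the throughput savings, assuming a full-resolution foveal window around the gaze point and a uniformly downscaled periphery (numbers are illustrative, not from any shipping headset):

```python
# Fraction of pixels a foveated renderer needs relative to full resolution:
# full resolution inside the foveal window, a scaled-down buffer outside.
def foveated_pixel_fraction(fovea_deg: float, fov_deg: float,
                            periphery_scale: float = 0.25) -> float:
    foveal_area = (fovea_deg / fov_deg) ** 2          # share of the frame
    return foveal_area + (1 - foveal_area) * periphery_scale ** 2

# A 20-degree foveal window in a 100-degree FOV with quarter-resolution
# periphery needs only ~10% of the pixels of full-resolution rendering.
print(f"{foveated_pixel_fraction(20, 100):.2f}")
```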
  • 222. Typical VR Simulation Loop • User moves head, scene updates, displayed graphics change
  • 223. • Need to synchronize system to reduce delays System Delays
  • 224. Typical Delay from Tracking to Rendering System Delay
  • 225. Typical System Delays • Total Delay = 50 + 2 + 33 + 17 = 102 ms • 1 ms delay = 1/3 mm error for object drawn at arm's length • So a total of ~34 mm error from when the user begins moving to when the object is drawn • Pipeline: Tracking (x,y,z / r,p,y) 20 Hz = 50ms → Calculate Viewpoint 500 Hz = 2ms → Simulation (Application Loop) 30 Hz = 33ms → Render Scene & Draw to Display 60 Hz = 17ms
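The same budget as plain arithmetic (the grouping of render and display into one stage is approximated from the slide's four rates):

```python
# Each pipeline stage contributes roughly one period of its update rate;
# at ~1/3 mm of registration error per millisecond (object at arm's
# length), total error follows directly from the summed delay.
stages_ms = {"tracking (20 Hz)": 50, "viewpoint calc (500 Hz)": 2,
             "simulation (30 Hz)": 33, "render + display (60 Hz)": 17}

total_ms = sum(stages_ms.values())
error_mm = total_ms / 3.0
print(f"total delay = {total_ms} ms, ~{error_mm:.0f} mm error at arm's length")
```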
  • 226. Living with High Latency (1/3 sec – 3 sec) • https://www.youtube.com/watch?v=_fNp37zFn9Q
  • 227. Effects of System Latency • Degraded Visual Acuity • Scene still moving when head stops = motion blur • Degraded Performance • As latency increases it's difficult to select objects etc. • If latency > 120 ms, training doesn't improve performance • Breaks-in-Presence • If system delay is high the user doesn't believe they are in VR • Negative Training Effects • Users train to operate in a world with delay • Simulator Sickness • Latency is the greatest cause of simulator sickness
  • 228. Simulator Sickness • Visual input conflicting with vestibular system
  • 229. What Happens When Senses Don't Match? • 20-30% of VR users experience motion sickness • Sensory Conflict Theory • Visual cues don't match vestibular cues • Eyes – "I'm moving!", Vestibular – "No, you're not!"
  • 230. Avoiding Motion Sickness • Better VR experience design • More natural movements • Improved VR system performance • Less tracking latency, better graphics frame rate • Provide a fixed frame of reference • Ground plane, vehicle window • Add a virtual nose • Provide peripheral cue • Eat ginger • Reduces upset stomach
  • 231. Many Causes of Simulator Sickness • 25-40% of VR users get Simulator Sickness, due to: • Latency • Major cause of simulator sickness • Tracking accuracy/precision • Seeing world from incorrect position, viewpoint drift • Field of View • Wide field of view creates more peripheral vection = sickness • Refresh Rate/Flicker • Flicker/low refresh rate creates eye fatigue • Vergence/Accommodation Conflict • Creates eye strain over time • Eye separation • If IPD not matched to inter-image distance then discomfort
  • 233. System Design Guidelines - I • Hardware • Choose HMDs with fast pixel response time, no flicker • Choose trackers with high update rates, accurate, no drift • Choose HMDs that are lightweight, comfortable to wear • Use hand controllers with no line-of-sight requirements • System Calibration • Have virtual FOV match actual FOV of HMD • Measure and set users IPD • Latency Reduction • Minimize overall end to end system delay • Use displays with fast response time and low persistence • Use latency compensation to reduce perceived latency Jason Jerald, The VR Book, 2016
  • 234. System Design Guidelines - II • General Design • Design for short user experiences • Minimize visual stimuli closer to eye (vergence/accommodation) • For binocular displays, do not use 2D overlays/HUDs • Design for sitting, or provide physical barriers • Show virtual warning when user reaches end of tracking area • Motion Design • Move virtual viewpoint with actual motion of the user • If latency high, no tasks requiring fast head motion • Interface Design • Design input/interaction for user’s hands at their sides • Design interactions to be non-repetitive to reduce strain injuries Jason Jerald, The VR Book, 2016