11. Virtual
Immersive
Block the real world
Teleport to a new world and time
Part of simulation
Augmented
Overlay information
View the real world
Present in time and space
Part of real world
17. Why should DESIGNERs pay attention ?
It’s still undefined - nascent and evolving
Not boxes and arrows on a flat screen - designing real space
It’s about real immersive interaction - the UX
Not just a window to the digital world but THE WORLD itself
No more fiction - it’s real and in your hands, like native mobile apps - and growing
Is it different from current UI design? We will find out
18. Why should DESIGNERS pay attention ?
If you currently own a smartphone, chances are you are already carrying a virtual reality device in your pocket.
39. Context Users Goals
Design Process
Experience
Immersion
Learning
Explore
Media
Gamers
Enterprise
Entertainment
Gaming
Healthcare
Automotive
Manufacturing
Design
user personas, conceptual flows, wireframes, an interaction model
42. Design elements
Units - not pixels but cm, inch, feet, m
Dimension - x,y,z - near and far
Peripheral vision
Typography - legibility
Shape vs volume
User Point of reference - stationary vs moving
Seated vs Standing interaction
Spatial sound/audio
Ergonomics - physiological, environment
contd...
52. Typography
Avoid showing text on white or translucent backgrounds
The minimum readable font height in VR is 2.32 cm at a distance of one meter
The recommended font height is 6.04 cm at a distance of one meter
Try
https://coob113.github.io/font_test/size.html
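Those two heights correspond to fixed angular sizes, so they scale linearly with viewing distance. A minimal sketch of that scaling, assuming the angles are simply back-derived from the slide's 2.32 cm / 6.04 cm figures (`font_height` is a hypothetical helper, not from any VR SDK):

```python
import math

def font_height(distance_m, angle_deg):
    """Physical letter height (m) that subtends `angle_deg` at `distance_m`."""
    return 2 * distance_m * math.tan(math.radians(angle_deg) / 2)

# Angles back-derived from the slide's numbers (2.32 cm and 6.04 cm at 1 m):
MIN_ANGLE = math.degrees(2 * math.atan(0.0232 / 2))   # ~1.33 degrees
REC_ANGLE = math.degrees(2 * math.atan(0.0604 / 2))   # ~3.46 degrees

near_min = font_height(1.0, MIN_ANGLE)   # 0.0232 m, i.e. 2.32 cm at 1 m
far_rec = font_height(3.0, REC_ANGLE)    # ~0.18 m: same angle at 3 m
```

So text pushed back to 3 m needs letters roughly three times taller to stay as readable as at 1 m.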
54. Point of reference
Stationary vs Moving
Controls placement
2d Layout vs 3d object placements
Interactive elements - near and far
Movement
Framing the content
Ground the user
56. Motion
Avoid lateral motion with close/large objects, and at high speeds
Avoid fast motion toward the user
Be careful about moving the user
Maintain a visible stable horizon line
Don’t show sudden acceleration or deceleration
Let players choose their own height
Give them the option to sit or stand
Fairvalley, Meantime, Meditation VR
57. Audio/Sound
Spatial audio is a key ingredient for true immersion,
making the design process around sound absolutely essential
66. Skybox and placement of information
Foreground - interactable 3D objects
Midground - distant messages or scenery that confer scale and depth
Skybox - wrapping the scene
Capturing Skybox images
68. Google’s design guidelines for VR
1. Using a Reticle
2. UI Depth & Eye Strain
3. Using Constant Velocity
4. Keeping the User Grounded
5. Maintaining Head Tracking
6. Guiding with Light
7. Leveraging Scale
8. Spatial Audio
9. Gaze Cues
10. Make it Beautiful
69. In the age of digital tools for design and prototyping, we are going to use paper wireframing
79. Warm up
1. Fail and learn - Draw anything and see
2. Draw basic shapes
3. Determine your comfortable viewing area
4. Add colors
5. Check your name in VR in 3D
A typical stop sign is 1.82 meters tall (around 6 feet).
A traffic light is 4.57 meters tall (around 15 feet).
80. Let’s create a paper wireframe
Product browser
Image gallery - browsing
Music player - playback
Personal Website
WebVR
Low Fi proto - cardboard
Users Goals Context
The red pill would free him from the enslaving control of the machine-generated dream world and allow him to escape into the real world, but living the "truth of reality" is harsher and more difficult.
On the other hand, the blue pill would lead him back to stay in the comfortable simulated reality of the Matrix
The central observation is that the amount by which light rays from an object diverge at the eye’s lens depends on the distance from the object to the eye. If the object is close, the rays diverge at a large angle (see Figure 1, left); if the object is at a medium distance, the rays diverge less (see Figure 1, center); if the object is infinitely far away, the rays are parallel and do not diverge at all. In other words: the closer an object is to the eye, the more the eye’s lens has to bend the incoming light rays to form a sharp image on the retina.
If we extend the light rays between the intermediate lens and the eye backwards (the fainter lines on the right of Figure 2), we find that they all intersect in a single point behind the real screen. To the eye, it looks exactly like a screen that is farther away and larger (the grey horizontal line on the right of Figure 2). This is called a virtual image. And that’s why viewers can comfortably focus on the screens in HMDs: they are not looking at the real screens, which are too close, but at virtual images of the real screens, which are at a much larger distance.
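This virtual-image effect can be sketched with the thin-lens equation, 1/f = 1/d_o + 1/d_i: when the screen sits inside the focal length, the image distance comes out negative, meaning a magnified virtual image on the same side as the screen. The focal length and screen distance below are illustrative assumptions, not any specific headset's optics:

```python
def image_distance(f, d_o):
    """Thin-lens equation solved for the image distance d_i.
    A negative result means a virtual image on the object's side."""
    return 1 / (1 / f - 1 / d_o)

f = 0.06    # assumed lens focal length: 6 cm
d_o = 0.05  # screen 5 cm from the lens, inside the focal length

d_i = image_distance(f, d_o)   # ~-0.30 m: virtual image 30 cm away
m = -d_i / d_o                  # ~6x: the virtual screen looks 6x larger
```

The eye then focuses comfortably on that 30 cm virtual image rather than on the 5 cm physical screen.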
Top-down view of how our left and right eyes see different images. Remember, binocular is what we see in reality and in virtual reality. The monocular perspective is what we see when we play video games or develop VR apps on a laptop.
Amazing collection of VR history
https://medium.com/inborn-experience/vr-ar-prototyping-for-everyone-ea6fb8f159b5
See how the picture below looks strange because we can’t move around in it
To create comfortable immersion it helps to understand depth perception.
Depth perception comes from an array of depth cues. These are split up into binocular and monocular cues. Binocular cues provide depth information when viewing a scene with both eyes. Monocular cues provide depth information when viewing a scene with one eye.
We use two binocular cues and 15 monocular cues to perceive depth - more than seven times as many with one eye as with both! Understanding how we see depth with one eye helps us build more comfortable VR scenes.
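One of the binocular cues, vergence, can be sketched numerically: the angle between the two eyes' lines of sight to a point straight ahead shrinks quickly with distance, which is why binocular depth information fades for far objects. The 6.4 cm interpupillary distance is an assumed average:

```python
import math

IPD = 0.064  # assumed average interpupillary distance, meters

def vergence_deg(distance_m):
    """Angle between the two eyes' lines of sight to a point straight ahead."""
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

near = vergence_deg(0.5)    # ~7.3 degrees at half a meter
mid = vergence_deg(2.0)     # ~1.8 degrees at two meters
far = vergence_deg(10.0)    # ~0.4 degrees at ten meters
```

Beyond a few meters the angle is nearly zero, so scenes at those distances must lean on the monocular cues listed below.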
Familiar size: Borrow from the real world, and size and scale objects so that they look and feel real to the player.
Texture gradient: Objects that are farther away or higher up, should have a lighter shader/color than those that appear closer.
Motion parallax and occlusion: Consider how to place one object in front of another to create a sense of depth.
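A rough sketch of why motion parallax reads as depth: the same sideways head movement sweeps a near object through a much larger visual angle than a far one. The distances and helper name here are illustrative, not from any SDK:

```python
import math

def angular_shift_deg(object_distance_m, head_shift_m):
    """Angular displacement of a point when the head moves sideways."""
    return math.degrees(math.atan(head_shift_m / object_distance_m))

near = angular_shift_deg(1.0, 0.1)    # ~5.7 degrees for an object 1 m away
far = angular_shift_deg(20.0, 0.1)    # ~0.3 degrees for an object 20 m away
```

The near object visibly slides across the farther one, and occlusion (near hides far) reinforces the same ordering.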
Imagine standing on a mountain, looking up and around with a clear view of the sky and clouds surrounding you. This is our real-world skybox. It’s what we perceive as far as the eye can see. It’s all around us, above and to the horizon in every direction. It’s our backdrop.
As a design element, the skybox is really just a sphere of imagery that can be made from a photo, texture, or rendered artwork. When it’s placed in a scene, it extends out to infinity without a perceivable edge, giving an illusion of depth and reality. The skybox is an essential component in virtual reality design — it’s like a panoramic wrapper that projects an entire background scene onto your interface.
https://medium.com/aol-alpha/making-sense-of-skyboxes-in-vr-design-3e9f8fe254d3
a skybox can really be anything that creates the outermost limits of a scene that you are designing. It could be a solid color that sits behind the menu in a static scene. It could also be a real photograph or a 360° video that’s playing. It could be the outer limits of a rendered 3D universe — you could be swimming underwater, floating in outer space, or walking around an animated experience.
In the foreground you might have interactable 3D objects: buttons the player can touch to deploy a menu, logs of wood they can pick up, or wild wolves that run up and try to attack them.
In the midground, you might have distant messages or scenery that confer scale and depth.
And lastly, the skybox resides in the background, expanding out to infinity and wrapping the scene