Lecture 6 of the COMP 4010 course on AR/VR. This lecture is about designing AR systems. This was taught by Mark Billinghurst at the University of South Australia on September 1st 2022.
Mark Billinghurst, Director at HIT Lab NZ, now at the University of South Australia
3. Typical Interaction Design Cycle
Develop alternative prototypes/concepts and compare them. And iterate, iterate, iterate...
4. Tom Chi’s Prototyping Rules
1. Find the quickest path to experience
2. Doing is the best kind of thinking
3. Use materials that move at the speed of thought to maximize your rate of learning
5. From Idea to Product (example: CityViewAR)
Define requirements, then:
1. Sketch Interface
2. Rough Wireframes
3. Interactive Prototype
4. High Fidelity Prototype
5. Developer Coding
6. User Testing
7. Deploy App
8. Develop and Iterate
8. From Sketch to Prototype
Sketch → Storyboard → Wireframe → Mock-up → Interactive Prototype
9. Buxton’s Key Attributes of Sketching
• Quick: work at the speed of thought
• Timely: always available
• Disposable: inexpensive, little investment
• Plentiful: easy to iterate
• A catalyst: evokes conversations
12. Key Elements
1. Scenario: Storyboards are based on a scenario or a user story. The
persona or role that corresponds to that scenario is clearly specified
2. Visuals: Each step in the scenario is represented visually in a sequence.
The steps can be sketches, illustrations, or photos.
3. Captions: Each visual has a corresponding caption. The caption
describes the user’s actions, environment, emotional state, device, etc.
13. Wireframes
It’s about
- Functional specs
- Navigation and interaction
- Functionality and layout
- How interface elements work together
- Defining the interaction flow/experience
Leaving room for the design to be created
15. Mockup
It’s about
- Look and feel
- Building on wireframe
- High fidelity visuals
- Putting together final assets
- Getting feedback on design
16. Designing AR in VR
https://www.youtube.com/watch?v=TfQJhSJQiaU
17. Vuforia Studio
• Author animated AR experiences
• Drag and drop content
• Add animations
• Import CAD models
• Combine with IOT sensors
• https://www.ptc.com/en/products/vuforia/vuforia-studio
18. Mock-up Guidelines
1. Generate final 2D/3D interface elements
2. Replace wireframe UI elements with high quality visuals
3. Use standard AR/VR UI elements
4. Simulate AR/VR views
5. Focus on visual/audio design
6. Collect feedback from target end-users
19. Sketch vs. Wireframe vs. Mock-up
• Sketch: low fidelity (IDEATE)
• Wireframe: low to medium fidelity (FLOW)
• Mock-up: medium to high fidelity (VISUALIZE)
24. Digital Authoring Tools for AR
Vuforia Studio
Lens Studio
• Support visual authoring of marker-based and/or marker-less AR apps
• Provide default markers and support for custom markers
• Typically enable AR previews through an emulator, but need to deploy to an AR device for testing
25. Immersive Authoring Tools for AR
• Enable visual authoring of 3D content in AR
• Make it possible to edit while previewing the AR experience in the environment
• Provide basic support for interactive behaviors
• Sometimes support export to WebXR
Apple Reality Composer
Adobe Aero
26. Interactive Sketching
• Pop App - https://marvelapp.com/pop
• Combining sketching and interactivity on mobiles
• Take pictures of sketches, link pictures together
27. Proto.io
• Web based prototyping tool
• Visual drag and drop interface
• Rich transitions
• Scroll, swipe, buttons, etc
• Deploy on device
• mobile, PC, browser
• Ideal for mobile interfaces
• iOS, Android template
• For low and high fidelity prototypes
28. AR Visual Programming
• Rapid prototype on desktop
• Deliver on mobile
• Simple interactivity
• Examples
• Zapworks Studio
• https://zap.works/studio/
• Snap Lens Studio
• https://lensstudio.snapchat.com/
• Facebook Spark AR Studio
• https://sparkar.facebook.com/ar-studio/
29. Creating On Device
•Adobe Aero
•Create AR on mobile devices
•Touch based interaction and authoring
•Only iOS support for now
•https://www.adobe.com/nz/products/aero.html
30. Apple Reality Composer
• Rapidly create 3D scenes and AR experiences
• Creation on device (iPhone, iPad)
• Drag and drop interface
• Loading 2D/3D content
• Adding simple interactivity
• Anchor content in real world (AR view)
• Planes (vertical, horizontal), faces, images
32. A-Frame
• Based on Three.js and WebGL
• New HTML tags for 3D scenes
• A-Frame Inspector (not editor)
• Asset management (img, video, audio, & 3D models)
• ECS architecture with many open-source components
• Cross-platform XR
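A-Frame's declarative approach can be seen in its standard "hello world" scene: the whole 3D scene is expressed as HTML tags (`a-box`, `a-sphere`, `a-sky` are A-Frame primitives; the version number in the CDN URL is only an example):

```html
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- The scene and every entity in it are plain HTML tags -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this page in a WebXR-capable browser renders the scene in 3D; no build step or engine install is required, which is why the slide recommends it for designers with a web background.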
33. Unity
• Started out as game engine
• Has integrated support for many
types of XR apps
• Powerful scene editor
• Asset management & store
• Basically all XR device vendors
provide Unity SDKs
34. Unity vs. A-Frame
Unity is a game engine and XR dev
platform
● De facto standard for XR apps
● Increasingly built-in support
● Most “XR people” will ask you about
your Unity skills :-)
Support for all XR devices
● Basically all AR and VR device
vendors provide Unity SDKs
A-Frame is a declarative WebXR
framework
● Emerging XR app development
framework on top of THREE.js
● Good for novice XR designers with
web dev background
Support for most XR devices
● Full WebXR support in Firefox,
Chrome, & Oculus Browser
40. AR Design Considerations
• 1. Design for Humans
• Use Human Information Processing model
• 2. Design for Different User Groups
• Different users may have unique needs
• 3. Design for the Whole User
• Social, cultural, emotional, physical, cognitive
• 4. Use UI Best Practices
• Adapt known UI guidelines to AR/VR
• 5. Use of Interface Metaphors/Affordances
• Decide best metaphor for AR/VR application
41. 1. Design for Human Information Processing
• High level staged model from Wickens and Carswell (1997)
• Relates perception, cognition, and physical ergonomics
(Figure: Perception → Cognition → Ergonomics)
42. Design for Perception
• Need to understand perception to design AR
• Visual perception
• Many types of visual cues (stereo, oculomotor, etc.)
• Auditory system
• Binaural cues, vestibular cues
• Somatosensory
• Haptic, tactile, kinesthetic, proprioceptive cues
• Chemical Sensing System
• Taste and smell
48. Design for Cognition
• Design for Working and Long-term memory
• Working memory
• Short-term storage, limited capacity (~5-9 items)
• Long term memory
• Memory recall triggered by associative cues
• Situational Awareness
• Model of current state of user’s environment
• Used for wayfinding, object interaction, spatial awareness, etc..
• Provide cognitive cues to help with situational awareness
• Landmarks, procedural cues, map knowledge
• Support both ego-centric and exo-centric views
50. Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
52. Design for Micro Interactions
▪ Design interaction for less than a few seconds
• Tiny bursts of interaction
• One task per interaction
• One input per interaction
▪ Benefits
• Use limited input
• Minimize interruptions
• Reduce attention fragmentation
53. NHTSA Guidelines - www.nhtsa.gov
For technology in cars:
• Any task by a driver should be interruptible at any time.
• The driver should control the pace of task interactions.
• Tasks should be completable with glances away from the road of under 2 seconds
• Cumulative time glancing away from the road should be ≤12 seconds
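These limits are straightforward to encode as an automated check; a minimal sketch (the function and its name are illustrative, not part of any NHTSA tooling):

```python
def glances_ok(glance_times_s, per_glance_limit=2.0, cumulative_limit=12.0):
    """Check a sequence of off-road glance durations (in seconds)
    against the NHTSA guidelines: every single glance must be under
    2 s, and cumulative off-road time must be at most 12 s."""
    return (all(g < per_glance_limit for g in glance_times_s)
            and sum(glance_times_s) <= cumulative_limit)
```

For example, `glances_ok([1.5, 1.0, 1.8])` passes, while a single 2.5 s glance fails the per-glance limit.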
54. Make it Glanceable
• Seek to rigorously reduce information density. Successful designs support recognition, not reading.
55. Reduce Information Chunks
You are designing for recognition, not reading. Reducing the total # of information
chunks will greatly increase the glanceability of your design.
Eye-movement cost (~230 ms per fixation):
• Design with 3 information chunks: 1-2 fixations on chunk 1 (460 ms), one each on chunks 2 and 3 (230 ms each), total ~920 ms
• Design with 5 (6) chunks: one fixation each on chunks 1-3, three on chunk 4 (690 ms), two on chunk 5 (460 ms), total ~1,840 ms
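The arithmetic behind these estimates, assuming roughly 230 ms per eye fixation as on the slide:

```python
FIXATION_MS = 230  # rough duration of a single eye fixation

def scan_time_ms(fixations_per_chunk):
    """Total time to visually scan a display: the number of fixations
    each information chunk needs, times ~230 ms per fixation."""
    return sum(fixations_per_chunk) * FIXATION_MS

three_chunk_design = scan_time_ms([2, 1, 1])       # 920 ms
five_chunk_design = scan_time_ms([1, 1, 1, 3, 2])  # 1840 ms
```

Halving the number of chunks roughly halves the scan time, which is the point of designing for glanceability.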
56. Ego-centric and Exo-centric views
• Combining ego-centric and exo-centric cue for better situational awareness
57. Cognitive Issues in Mobile AR
• Information Presentation
• Amount, Representation, Placement, View combination
• Physical Interaction
• Navigation, Direct manipulation, Content creation
• Shared Experience
• Social context, Bodily Configuration, Artifact manipulation, Display space
Li, N., & Duh, H. B. L. (2013). Cognitive issues in mobile augmented reality: an embodied perspective.
In Human factors in augmented reality environments (pp. 109-135). Springer, New York, NY.
58. Information Presentation
• Consider
• The amount of information
• Clutter, complexity
• The representation of information
• Navigation cues, POI representation
• The placement of information
• Head, body, world stabilized
• Using view combinations
• Multiple views
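The placement options differ in which reference frame the content is locked to; a minimal sketch with hypothetical function and parameter names (no real AR SDK is assumed; a body-stabilized mode would work the same way using the torso frame):

```python
def label_position(mode, world_anchor=(0.0, 0.0, 0.0),
                   head_pos=(0.0, 0.0, 0.0), head_forward=(0.0, 0.0, -1.0),
                   offset=1.5):
    """Illustrative AR label placement (a sketch, not a real SDK API).
    'world': label stays fixed at a point in the environment.
    'head':  label stays a fixed offset in front of the user's view."""
    if mode == "world":
        return world_anchor
    if mode == "head":
        # Normalize the view direction, then push the label out along it
        mag = sum(c * c for c in head_forward) ** 0.5
        return tuple(p + offset * f / mag
                     for p, f in zip(head_pos, head_forward))
    raise ValueError(f"unknown placement mode: {mode}")
```

World-stabilized content ignores head motion entirely; head-stabilized content must be recomputed every frame from the current head pose.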
59. Example: Twitter 360
• iPhone application
• See geo-located tweets in real world
• Twitter.com supports geo tagging
60. But: Information Clutter from Many Tweets
66. Design for Physical Ergonomics
• Design for the human motion range
• Consider human comfort and natural posture
• Design for hand input
• Coarse and fine scale motions, gripping and grasping
• Avoid “Gorilla arm syndrome” from holding arm pose
67. Gorilla Arm in AR
• Design interface to reduce mid-air gestures
68. XRgonomics
• Uses physiological model to calculate ergonomic interaction cost
• Difficulty of reaching points around the user
• Customizable for different users
• Programmable API, Hololens demonstrator
• GitHub Repository
• https://github.com/joaobelo92/xrgonomics
Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of
Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
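XRgonomics derives its cost from a physiological model; purely as an illustration of the idea (this is NOT the actual XRgonomics model), a toy cost function might penalize distance from the shoulder and raised-arm poses:

```python
import math

def reach_cost(point, shoulder=(0.0, 1.4, 0.0), arm_length=0.7):
    """Toy ergonomic cost of reaching a 3D point (x, y, z in metres).
    Cost grows with distance from the shoulder and with elevation
    above it; out-of-reach points cost infinity. Illustrative only."""
    dx, dy, dz = (p - s for p, s in zip(point, shoulder))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist > arm_length:
        return math.inf  # beyond arm's reach
    elevation_penalty = max(0.0, dy)  # holding the arm up tires faster
    return dist / arm_length + elevation_penalty
```

Placing UI elements at low-cost points, e.g. slightly below shoulder height and within comfortable reach, follows the earlier advice to avoid gorilla arm.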
70. 2. Designing for Different User Groups
• Design for Different Ages
• Children require different interface design than adults
• Older users have different needs than younger ones
• Prior Experience with AR systems
• Familiar with HMDs, AR input devices
• People with Different Physical Characteristics
• Height and arm reach, handedness
• Perceptual, Cognitive and Motor Abilities
• Colour perception varies between people
• Spatial ability, cognitive or motor disabilities
71. Designing for Children
• HMDs
• interpupillary distance, head fit, size and weight
• Tablets
• Poor dexterity, need to hold large tablet
• Content
• Reading ability, spatial perception
73. Consider Your User
• Consider context of user
• Physical, social, emotional, cognitive, etc.
• Mobile Phone AR User
• Probably Mobile
• One hand interaction
• Short application use
• Need to be able to multitask
• Use in outdoor or indoor environment
• Want to enhance interaction with real world
75. Whole User Needs
• Social
• Don’t make your user look stupid
• Cultural
• Follow local cultural norms
• Physical
• Can the user physically use the interface?
• Cognitive
• Can the user understand how the interface works?
• Emotional
• Make the user feel good and in control
76. Example: Social Acceptance
• People don’t want to look silly
• Only 12% of 4,600 adults would be willing to wear AR glasses
• 20% of mobile AR browser users experience social issues
• Acceptance more due to Social than Technical issues
• Needs further study (ethnographic, field tests, longitudinal)
80. 4. Use UI Best Practices
• General UI design principles can be applied to AR
• E.g. Shneiderman’s UI guidelines from 1998
• Providing interface feedback
• Mixture of reactive, instrumental and operational feedback
• Maintain spatial and temporal correspondence
• Use constraints
• Specify relations between variables that must be satisfied
• E.g. physical constraints reduce freedom of movement
• Support Two-Handed control
• Use Guiard’s framework of bimanual manipulation
• Dominant vs. non-dominant hands
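As a concrete (and hypothetical) instance of a constraint in the sense above, a placement rule can remove degrees of freedom by pinning a manipulated object to a tabletop plane and snapping it to a grid:

```python
def constrain_to_table(pos, table_height=0.75, grid=0.05):
    """Placement constraint sketch: keep a manipulated AR object on a
    horizontal table (fixing one translational DoF) and snap the
    remaining x/z coordinates to a 5 cm grid. Illustrative only."""
    def snap(v):
        return round(v / grid) * grid
    x, _, z = pos
    return (snap(x), table_height, snap(z))
```

Fewer free variables means less precision is demanded of the user, mirroring how physical constraints reduce freedom of movement.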
81. Follow Good HCI Principles
• Provide good conceptual model/Metaphor
• customers want to understand how UI works
• Make things visible
• if object has function, interface should show it
• Map interface controls to customer's model
• infix -vs- postfix calculator -- whose model?
• Provide feedback
• what you see is what you get!
82. Example: Guiard’s model of bimanual manipulation
Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486-517.
Non-Dominant: Leads, set spatial reference frame, performs coarse motions
Dominant: Follows, works in reference frame, performs fine motions
83. Adapting Existing Guidelines
• Mobile Phone AR
• Phone HCI Guidelines
• Mobile HCI Guidelines
• HMD Based AR
• 3D User Interface Guidelines
• VR Interface Guidelines
• Desktop AR
• Desktop UI Guidelines
84. Example: Apple iOS Interface Guidelines
• Make it obvious how to use your content.
• Avoid clutter, unused blank space, and busy backgrounds.
• Minimize required user input.
• Express essential information succinctly.
• Provide a fingertip-sized target for all controls.
• Avoid unnecessary interactivity.
• Provide feedback when necessary
From: https://developer.apple.com/ios/human-interface-guidelines/
85. Applying Principles to Mobile AR
• Clean
• Large Video View
• Large Icons
• Text Overlay
• Feedback
86. Interface Components
• Physical components
• Display elements
• Visual/audio
• Interaction metaphors
(Figure: Input → Physical Elements → Interaction Metaphor → Display Elements → Output)
5. Use Interface Metaphors
87. AR Interfaces (figure: interface classes ordered by expressiveness and intuitiveness)
• Browsing: simple input, viewpoint control
• 3D AR: 3D UI, dedicated controllers, custom devices
• Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction
• Tangible AR: tangible input, AR overlay, direct interaction
• Natural AR: freehand gesture, speech, gaze
88. AR Interfaces
Design for Layers
98. Design to Device Constraints
• Understand the platform and design for limitations
• Hardware, software platforms
• E.g. Handheld AR game with visual tracking
• Use large screen icons
• Consider screen reflectivity
• Support one-hand interaction
• Consider the natural viewing angle
• Do not tire users out physically
• Do not encourage fast actions
• Keep at least one tracking surface in view
Art of Defense Game
99. Handheld AR Constraints/Affordances
• Camera and screen are linked
• Fast motions a problem when looking at screen
• Intuitive “navigation”
• Phone in hand
• Two handed activities: awkward or intuitive
• Extended periods of holding phone tiring
• Awareness of surrounding environment
• Small screen
• Extended periods of looking at screen tiring
• In general, small awkward platform
• Vibration, sound
• Can provide feedback when looking elsewhere
100. Common Mobile AR Metaphors
• Tangible AR Lens Viewing
• Look through screen into AR scene
• Interact with screen to interact with AR content
• Touch screen input
• E.g. Invisible Train
• Metaphor – holding a window into the AR world
102. Common Mobile AR Metaphors
• Tangible AR Lens Manipulation
• Select AR object and attach to device
• Physically move phone to move AR object
• Use motion of device as input
• E.g. AR Lego
• Metaphor – phone as physical handle for the AR object
104. AR Interfaces
Design for Affordances
105. Tangible AR Metaphor
• AR overcomes limitations of TUIs
• enhance display possibilities
• merge task/display space
• provide public and private views
• TUI + AR = Tangible AR
• Apply TUI methods to AR interface design
106. Tangible AR Design Principles
• Tangible AR Interfaces use TUI principles
• Physical controllers for moving virtual content
• Support for spatial 3D interaction techniques
• Time and space multiplexed interaction
• Support for multi-handed interaction
• Match object affordances to task requirements
• Support parallel activity with multiple objects
• Allow collaboration between multiple users
108. Affordances
”… the perceived and actual properties of the thing, primarily
those fundamental properties that determine just how the
thing could possibly be used. [...]
Affordances provide strong clues to the operations of things.”
(Norman, The Psychology of Everyday Things 1988, p.9)
112. Physical vs. Virtual Affordances
• Physical Affordance
• Look and feel of real objects
• Shape, texture, colour, weight, etc.
• Industrial Design
• Virtual Affordance
• Look of virtual objects
• Copy real objects
• Interface Design
113. AR design is a mixture of physical affordance and virtual affordance
•Physical
•Tangible controllers and objects
•Virtual
•Virtual graphics and audio
114. Affordances in AR
• Design AR interface objects to show how they are used
• Use visual and physical cues to show possible affordances
• Perceived affordances should match actual affordances
• Physical and virtual affordances should match
Merge Cube Tangible Molecules
115. Case Study 1: 3D AR Lens
Goal: Develop a lens based AR interface
• MagicLenses
• Developed at Xerox PARC in 1993
• View a region of the workspace differently to the rest
• Overlap MagicLenses to create composite effects
117. AR Lens Design Principles
• Physical Components
• Lens handle
• Virtual lens attached to real object
• Display Elements
• Lens view
• Reveal layers in dataset
• Interaction Metaphor
• Physically holding lens
118. 3D AR Lenses: Model Viewer
• Displays models made up of multiple parts
• Each part can be shown or hidden through the lens
• Allows the user to peer inside the model
• Maintains focus + context
123. Case Study 2: LevelHead
• Physical Components
• Real blocks
• Display Elements
• Virtual person and rooms
• Interaction Metaphor
• Blocks are rooms
126. Case Study 3: AR Chemistry (Fjeld 2002)
• Tangible AR chemistry education
127. Goal: An AR application to teach molecular structure in chemistry
• Physical Components
• Real book, rotation cube, scoop, tracking markers
•Display Elements
• AR atoms and molecules
• Interaction Metaphor
• Build your own molecule
131. Case Study 4: Transitional Interfaces
Goal: An AR interface supporting transitions from reality to virtual reality
•Physical Components
• Real book
•Display Elements
• AR and VR content
• Interaction Metaphor
• Book pages hold virtual scenes
133. Transitions
• Interfaces of the future will need to support
transitions along the RV continuum
• Augmented Reality is preferred for:
• co-located collaboration
• Immersive Virtual Reality is preferred for:
• experiencing world immersively (egocentric)
• sharing views
• remote collaboration
137. Features
• Seamless transition from Reality to Virtuality
• Reliance on real decreases as virtual increases
• Supports egocentric and exocentric views
• User can pick appropriate view
• Computer becomes invisible
• Consistent interface metaphors
• Virtual content seems real
• Supports collaboration
138. Collaboration in MagicBook
• Collaboration on multiple levels:
• Physical Object
• AR Object
• Immersive Virtual Space
• Egocentric + exocentric collaboration
• multiple multi-scale users
• Independent Views
• Privacy, role division, scalability
139. Technology
• Reality
• No technology
• Augmented Reality
• Camera – tracking
• Switch – fly in
• Virtual Reality
• Compass – tracking
• Press pad – move
• Switch – fly out
140. Summary
•When designing AR interfaces, think of:
• Physical Components
• Physical affordances
• Virtual Components
• Virtual affordances
• Interface Metaphors
• Tangible AR or similar
151. Design Patterns
“Each pattern describes a problem which occurs
over and over again in our environment, and then
describes the core of the solution to that problem in
such a way that you can use this solution a million
times over, without ever doing it the same way twice.”
– Christopher Alexander et al.
Use Design Patterns to Address Reoccurring Problems
C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.
154. Design Patterns for Handheld AR
• Set of design patterns for Handheld AR
• Title: a short phrase that is memorable
• Definition: what experiences the pre-pattern supports
• Description: how and why the pre-pattern works, what aspects of game design it is based on
• Examples: illustrate the meaning of the pre-pattern
• Using the pre-patterns: reveal the challenges and context of applying the pre-patterns
Xu, Y., Barba, E., Radu, I., Gandy, M., Shemaka, R., Schrank, B., ... & Tseng, T.
(2011, October). Pre-patterns for designing embodied interactions in handheld
augmented reality games. In 2011 IEEE International Symposium on Mixed and
Augmented Reality-Arts, Media, and Humanities (pp. 19-28). IEEE.
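The pre-pattern fields above map naturally onto a small record type; a sketch with field names of my own choosing (following the slide) and a paraphrased example entry:

```python
from dataclasses import dataclass, field

@dataclass
class PrePattern:
    """One handheld-AR design pre-pattern (fields after Xu et al., 2011)."""
    title: str        # short, memorable phrase
    definition: str   # what experiences the pre-pattern supports
    description: str  # how and why the pre-pattern works
    examples: list = field(default_factory=list)  # illustrative apps/games

device_metaphors = PrePattern(
    title="Device Metaphors",
    definition="Using metaphor to suggest available player actions",
    description="Treats the handheld device as a familiar object "
                "(lens, handle, window) so its affordances suggest input",
    examples=["AR Lego (phone as handle)", "Invisible Train (phone as window)"],
)
```

A catalogue of such records makes the pattern set searchable and easy to extend during design reviews.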
155. Handheld AR Design Patterns
(Format: Title: meaning [embodied skills]; A&S = awareness and skills)
• Device Metaphors: using metaphor to suggest available player actions [Body A&S, naïve physics]
• Control Mapping: intuitive mapping between physical and digital objects [Body A&S, naïve physics]
• Seamful Design: making sense of and integrating the technological seams through game design [Body A&S]
• World Consistency: whether the laws and rules in the physical world hold in the digital world [Naïve physics, Environmental A&S]
• Landmarks: reinforcing the connection between digital-physical space through landmarks [Environmental A&S]
• Personal Presence: the way a player is represented in the game decides how much they feel like living in the digital game world [Environmental A&S, naïve physics]
• Living Creatures: game characters that are responsive to physical and social events, mimicking behaviours of living beings [Social A&S, Body A&S]
• Body Constraints: movement of one's body position constrains another player's action [Body A&S, Social A&S]
• Hidden Information: information that can be hidden and revealed can foster emergent social play [Social A&S, Body A&S]
157. Example: Seamless Design
• Design to reduce seams in the user experience
• E.g. AR tracking failure, change in interaction mode
• Paparazzi Game
• Change between AR tracking to accelerometer input
Xu, Y., et al. (2011). Pre-patterns for designing embodied interactions in handheld augmented reality games. In Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities, pp. 19-28, October 26-29, 2011.
159. Example: Living Creatures
• Virtual creatures should respond to real world events
• e.g. player motion, wind, light, etc.
• Creates illusion creatures are alive in the real world
• Sony EyePet
• Responds to player blowing on creature
161. ARCore Elements App
• Mobile AR app demonstrating
interface guidelines
• Multiple Interface Guidelines
• User interface
• User environment
• Object manipulation
• Off-screen markers
• Etc..
• Test on Device
• https://play.google.com/store/apps/details?id=com.google.ar.unity.ddelements
175. The Trouble with AR Design Guidelines
1) Rapidly evolving best practices
Still a moving target, lots to learn about AR design
Slowly emerging design patterns, but often change with OS updates
Already major differences between device platforms
2) Challenges with scoping guidelines
Often too high level, like “keep the user safe and comfortable”
Or, too application/device/vendor-specific
3) Best guidelines come from learning by doing
Test your designs early and often, learn from your own “mistakes”
Mind differences between VR and AR, but less so between devices