INTERACTION AND PROTOTYPING
COMP 4010 Lecture Five
Mark Billinghurst
August 24th 2021
mark.billinghurst@unisa.edu.au
REVIEW
OPTICAL TRACKING
Common AR Optical Tracking Types
• Marker Tracking
• Tracking known artificial markers/images
• e.g. ARToolKit square markers
• Markerless Tracking
• Tracking from known features in real world
• e.g. Vuforia image tracking
• Unprepared Tracking
• Tracking in unknown environment
• e.g. SLAM tracking
Marker Tracking
• Available for more than 20 years
• Several open-source solutions exist
• ARToolKit, ARTag, ATK+, etc
• Fairly simple to implement
• Standard computer vision methods
• A rectangle provides 4 corner points
• Enough for pose estimation!
Marker Based Tracking: ARToolKit
https://github.com/artoolkit
Natural Feature Tracking
• Use Natural Cues of Real Elements
• Edges
• Surface Texture
• Interest Points
• Model or Model-Free
• No visual pollution
Natural cues: contours, feature points, surfaces, texture
Detection and Tracking
• Detection (start state)
• Recognize target type, detect target, initialize camera pose
• Incremental tracking
• Fast; robust to blur, lighting changes, and tilt
• Transitions: when the tracking target is detected, switch to incremental tracking; when it is lost or not detected, return to detection
• Tracking and detection are complementary approaches.
• After successful detection, the target is tracked incrementally.
• If the target is lost, detection is activated again.
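The detection/tracking loop above is essentially a two-state machine. A minimal sketch (the function names are illustrative, not from any particular SDK):

```python
class TargetTracker:
    """Two-state loop: run full detection until the target is found,
    then cheap incremental tracking until it is lost again."""

    def __init__(self, detect_fn, track_fn):
        self.detect_fn = detect_fn    # slow: recognize target, init pose
        self.track_fn = track_fn      # fast: frame-to-frame pose update
        self.state = "DETECTING"
        self.pose = None

    def process_frame(self, frame):
        if self.state == "DETECTING":
            pose = self.detect_fn(frame)
            if pose is not None:             # tracking target detected
                self.state, self.pose = "TRACKING", pose
        else:
            pose = self.track_fn(frame, self.pose)
            if pose is None:                 # tracking target lost
                self.state, self.pose = "DETECTING", None
            else:                            # incremental tracking ok
                self.pose = pose
        return self.state, self.pose
```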
NFT – Real-time tracking
• Search for keypoints in the video image
• Create the descriptors
• Match the descriptors from the
live video against those in the database
• Remove the keypoints that are outliers
• Use the remaining keypoints
to calculate the pose of the camera
Camera Image → Keypoint detection → Descriptor creation and matching → Outlier removal → Pose estimation and refinement → Pose
Recognition
Example
Target Image → Feature Detection → AR Overlay
Tracking from an Unknown Environment
• What to do when you don’t know any features?
• Very important problem in mobile robotics - Where am I?
• SLAM
• Simultaneously Localize And Map the environment
• Goal: to recover both camera pose and map structure
while initially knowing neither.
• Mapping:
• Building a map of the environment which the robot is in
• Localisation:
• Navigating this environment using the map while keeping
track of the robot’s relative position and orientation
Parallel Tracking and Mapping
• Tracking thread
• Estimates camera pose for every frame
• Sends new keyframes to the mapping thread
• Mapping thread
• Extends and improves the map
• Slow update rate; sends map updates back to the tracker
Parallel tracking and mapping uses two
concurrent threads, one for tracking and one
for mapping, which run at different speeds
How SLAM Works
• Three main steps
1. Tracking a set of points through successive camera frames
2. Using these tracks to triangulate their 3D position
3. Simultaneously use the estimated point locations to calculate
the camera pose which could have observed them
• By observing a sufficient number of points, we can solve for both
structure and motion (camera path and scene structure).
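Step 2 above, triangulating a point's 3D position from its track in two camera views, reduces to a small linear (DLT) solve. A sketch with numpy, assuming the 3×4 projection matrices of both views are known:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the 3D point observed at
    pixel x1 in camera P1 and pixel x2 in camera P2 (3x4 matrices).
    Each observation contributes two rows to a homogeneous system
    A X = 0, solved by SVD."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean
```

SLAM systems then perform step 3 jointly over many such points and camera poses (bundle adjustment); this two-view solve is the basic building block.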
Hybrid Tracking Interfaces
• Combine multiple tracking technologies together
• Active-Passive: Magnetic, Vision
• Active-Inertial: Vision, Inertial
• Passive-Inertial: Compass, Inertial
Combining Sensors and Vision
• Sensors
• Produce noisy output (= jittering augmentations)
• Are not sufficiently accurate (= wrongly placed augmentations)
• Give us a first estimate of where we are in the world,
and what we are looking at
• Vision
• Is more accurate (= stable and correct augmentations)
• Requires choosing the correct keypoint database to track from
• Requires registering our local coordinate frame (online-
generated model) to the global one (world)
Example: Outdoor Hybrid Tracking
• Combines
• computer vision
• inertial gyroscope sensors
• Both correct for each other
• Inertial gyro
• provides frame to frame prediction of camera
orientation, fast sensing
• drifts over time
• Computer vision
• Natural feature tracking, corrects for gyro drift
• Slower, less accurate
ARKit – Visual Inertial Odometry
• Uses both computer vision + inertial sensing
• Tracking position twice
• Computer Vision – feature tracking, 2D plane tracking
• Inertial sensing – using the phone IMU
• Output combined via Kalman filter
• Determine which output is most accurate
• Pass pose to ARKit SDK
• Each system complements the other
• Computer vision – needs visual features
• IMU - drifts over time, doesn’t need features
ARKit – Visual Inertial Odometry
• Slow camera
• Fast IMU
• If camera drops out IMU takes over
• Camera corrects IMU errors
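The camera-corrects-IMU idea can be illustrated with a simple complementary filter, a deliberate simplification of the Kalman filtering ARKit actually uses (all names and numbers here are illustrative):

```python
class OrientationFuser:
    """Fast gyro integration with slow visual correction: the gyro
    predicts orientation at every IMU sample; whenever a camera frame
    yields a pose, it pulls the estimate back, cancelling gyro drift."""

    def __init__(self, vision_weight=0.1):
        self.angle = 0.0              # 1-D orientation, for illustration
        self.k = vision_weight        # how strongly vision corrects

    def on_imu(self, gyro_rate, dt):
        # Dead reckoning: fast, works without visual features, but drifts
        self.angle += gyro_rate * dt

    def on_camera(self, vision_angle):
        # Blend toward the drift-free (but slower, noisier) vision estimate
        self.angle += self.k * (vision_angle - self.angle)
```

If the camera drops out, `on_imu` keeps updating the pose on its own; each `on_camera` call removes the drift the gyro accumulated in between.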
AR INTERACTION
Different Types of AR Interaction
1. Browsing Interfaces
• simple (conceptually!), unobtrusive
2. 3D AR Interfaces
• expressive, creative, require attention
3. Tangible Interfaces
• embedded into conventional environments
4. Tangible AR
• combines TUI input + AR display
5. Natural AR Interfaces
• interact with gesture, speech and gaze
1. AR Interfaces as Data Browsers
• 2D/3D virtual objects are
registered in 3D
• “VR in Real World”
• Interaction
• 2D/3D virtual viewpoint control
• Applications
• Visualization, training
Example: Google Maps AR Mode
• AR Navigation Aid
• GPS + compass, 2D/3D object placement
Advantages and Disadvantages
• Important class of AR interfaces
• Wearable computers
• AR simulation, training
• Limited interactivity
• Modification of virtual
content is difficult
Rekimoto, et al. 1997
2. 3D AR Interfaces
• Virtual objects displayed in 3D
physical space and manipulated
• HMDs and 6DOF head-tracking
• 6DOF hand trackers for input
• Interaction
• Viewpoint control
• Traditional 3D user interface interaction:
manipulation, selection, etc.
Kiyokawa, et al. 2000
Advantages and Disadvantages
• Important class of AR interfaces
• Entertainment, design, training
• Advantages
• User can interact with 3D virtual
object everywhere in space
• Natural, familiar interaction
• Disadvantages
• Usually no tactile feedback
• User has to use different devices
for virtual and physical objects
Oshima, et al. 2000
AR INTERACTION (PART II)
3. Augmented Surfaces and Tangible Interfaces
• Basic principles
• Virtual images are projected
on a surface
• Physical objects are used as
controls for virtual objects
• Support for collaboration
Wellner, P. (1993). Interacting with paper on the
DigitalDesk. Communications of the ACM, 36(7), 87-96.
Augmented Surfaces
• Rekimoto, et al. 1999
• Front projection
• Marker-based tracking
• Multiple projection surfaces
• Object interaction
Rekimoto, J., & Saitoh, M. (1999, May). Augmented
surfaces: a spatially continuous work space for hybrid
computing environments. In Proceedings of the SIGCHI
conference on Human Factors in Computing
Systems (pp. 378-385).
Augmented Surfaces Demo (1999)
https://www.youtube.com/watch?v=r4g_fvnjVCA
Tangible User Interfaces (Ishii 97)
• Create digital shadows
for physical objects
• Foreground
• graspable UI
• Background
• ambient interfaces
Tangible Interfaces - Ambient
• Dangling String
• Jeremijenko 1995
• Ambient ethernet monitor
• Relies on peripheral cues
• Ambient Fixtures
• Dahley, Wisneski, Ishii 1998
• Use natural material qualities
for information display
Tangible Interface: ARgroove
• Collaborative Instrument
• Exploring Physically Based
Interaction
• Map physical actions to
Midi output
• Translation, rotation
• Tilt, shake
ARGroove Demo (2001)
ARgroove in Use
Visual Feedback
•Continuous Visual Feedback is Key
•Single Virtual Image Provides:
• Rotation
• Tilt
• Height
I/O Brush (Ryokai, Marti, Ishii) - 2004
Ryokai, K., Marti, S., & Ishii, H. (2004, April). I/O brush: drawing with everyday objects as ink.
In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 303-310).
I/O Brush Demo (2005)
https://www.youtube.com/watch?v=04v_v1gnyO8
Many Other Examples
• Triangles (Gorbert 1998)
• Triangular based story telling
• ActiveCube (Kitamura 2000-)
• Cubes with sensors
• Reactable (2007- )
• Cube based music interface
Lessons from Tangible Interfaces
• Physical objects make us smart
• Norman’s “Things that Make Us Smart”
• encode affordances, constraints
• Objects aid collaboration
• establish shared meaning
• Objects increase understanding
• serve as cognitive artifacts
But There are TUI Limitations
• Difficult to change object properties
• can’t tell state of digital data
• Limited display capabilities
• projection screen = 2D
• dependent on physical display surface
• Separation between object and display
• ARgroove – Interact on table, look at screen
Advantages and Disadvantages
•Advantages
• Natural - user’s hands are used for interacting
with both virtual and real objects.
• No need for special purpose input devices
•Disadvantages
• Interaction is limited only to 2D surface
• Full 3D interaction and manipulation is difficult
Orthogonal Nature of Interfaces
• Spatial gap
• 3D AR interfaces: No – interaction is everywhere
• Tangible interfaces: Yes – interaction is only on 2D surfaces
• Interaction gap
• 3D AR interfaces: Yes – separate devices for physical and virtual objects
• Tangible interfaces: No – same devices for physical and virtual objects
4. Tangible AR: Back to the Real World
• AR overcomes display limitation of TUIs
• enhance display possibilities
• merge task/display space
• provide public and private views
• TUI + AR = Tangible AR
• Apply TUI methods to AR interface design
Billinghurst, M., Kato, H., & Poupyrev, I. (2008). Tangible augmented reality. ACM Siggraph Asia, 7(2), 1-10.
Space vs. Time - Multiplexed
• Space-multiplexed
• Many devices each with one function
• Quicker to use, more intuitive, clutter
• Real Toolbox
• Time-multiplexed
• One device with many functions
• Space efficient
• mouse
Tangible AR: Tiles (Space Multiplexed)
• Tiles semantics
• data tiles
• operation tiles
• Operation on tiles
• proximity
• spatial arrangements
• space-multiplexed
Poupyrev, I., Tan, D. S., Billinghurst, M., Kato, H., Regenbrecht, H., & Tetsutani, N. (2001,
July). Tiles: A Mixed Reality Authoring Interface. In Interact (Vol. 1, pp. 334-341).
Space-multiplexed Interface
Data authoring in Tiles
Tiles Demo (2001)
Proximity-based Interaction
Tangible AR: Time-multiplexed Interaction
• Use of natural physical object manipulations to control
virtual objects
• VOMAR Demo
• Catalog book:
• Turn over the page
• Paddle operation:
• Push, shake, incline, hit, scoop
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000, October). Virtual object manipulation on a table-top AR
environment. In Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) (pp. 111-119). IEEE.
VOMAR Interface
VOMAR Demo (2001)
Advantages and Disadvantages
•Advantages
• Natural interaction with virtual and physical tools
• No need for special purpose input devices
• Spatial interaction with virtual objects
• 3D manipulation with virtual objects anywhere in space
•Disadvantages
• Requires Head Mounted Display
5. Natural AR Interfaces
• Goal:
• Interact with AR content the same
way we interact in the real world
• Using natural user input
• Body motion
• Gesture
• Gaze
• Speech
• Input recognition
• Natural gestures, gaze
• Multimodal input
FingARtips (2004)
Tinmith (2001)
External Fixed Cameras
• Overhead depth sensing camera
• Capture real time hand model
• Create point cloud model
• Overlay graphics on AR view
• Perform gesture interaction
Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in space: Gesture interaction with
augmented-reality interfaces. IEEE computer graphics and applications, 34(1), 77-80.
Examples
https://www.youtube.com/watch?v=7FLyDEdQ_vk
PhobiAR (2013)
https://www.youtube.com/watch?v=635PVGicxng
Google Glass (2013)
Head Mounted Cameras
• Attach cameras/depth sensor to HMD
• Connect to high end PC
• Computer vision capture/processing on PC
• Perform tracking/gesture recognition on PC
• Use custom tracking hardware
• Leap Motion (Structured IR)
• Intel RealSense (Stereo depth)
Project NorthStar (2018)
Meta2 (2016)
Project NorthStar Hand Interaction
Self Contained Systems
• Sensors and processors on device
• Fully mobile
• Customized hardware/software
• Example: HoloLens 2 (2019)
• 3D hand tracking
• 21 points/hand tracked
• Gesture driven interface
• Constrained set of gestures
• Multimodal input (gesture, gaze, speech)
HoloLens 2 Gesture Input Demo (MRTK)
https://www.youtube.com/watch?v=qfONlUCSWdg
Speech Input
• Reliable speech recognition
• Windows speech, Watson, etc.
• Indirect input with AR content
• No need for gesture
• Match with gaze/head pointing
• Look to select target
• Good for Quantitative input
• Numbers, text, etc.
• Keyword trigger
• “select”, “hey cortana”, etc.
https://www.youtube.com/watch?v=eHMkOpNUtR8
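The look-to-select, speak-to-confirm pairing described above can be sketched in a few lines (the helper names are hypothetical, not from any toolkit):

```python
def multimodal_select(gaze_target_fn, speech_events, keywords=("select",)):
    """Pair head/eye pointing with a spoken keyword trigger: the gaze
    ray picks the candidate object, and the keyword commits it.

    gaze_target_fn: callable returning whatever the user is currently
    looking at (or None); speech_events: recognized utterances in order.
    """
    selected = []
    for utterance in speech_events:
        if utterance.lower() in keywords:
            target = gaze_target_fn()   # sample gaze at trigger time
            if target is not None:
                selected.append(target)
    return selected
```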
Eye Tracking Interfaces
• Use IR light to find gaze direction
• IR sources + cameras in HMD
• Support implicit input
• Always look before interact
• Natural pointing input
• Multimodal Input
• Combine with gesture/speech
Camera • IR light • IR view • Processed image
HoloLens 2 Gaze Demo
https://www.youtube.com/watch?v=UPH4lk1jAWs
Evolution of AR Interfaces
• Browsing: simple input, viewpoint control
• 3D AR: 3D UI, dedicated controllers, custom devices
• Tangible UI: augmented surfaces, object interaction, familiar controllers, indirect interaction
• Tangible AR: tangible input, AR overlay, direct interaction
• Natural AR: freehand gesture, speech, gaze
(increasing expressiveness and intuitiveness)
DESIGNING AR INTERFACES
Interaction Design
“Designing interactive products to support
people in their everyday and working lives”
Preece, J., (2002). Interaction Design
• Design of User Experience with Technology
Bill Verplank on Interaction Design
https://www.youtube.com/watch?v=Gk6XAmALOWI
•Interaction Design involves answering three questions:
•What do you do? - How do you affect the world?
•What do you feel? – What do you sense of the world?
•What do you know? – What do you learn?
Bill Verplank
Typical Interaction Design Cycle
Develop alternative prototypes/concepts and compare them, and iterate, iterate, iterate...
PROTOTYPING
How Do You Prototype This?
Example: Google Glass
View Through Google Glass
Google Glass Prototyping
https://www.youtube.com/watch?v=d5_h1VuwD6g
Early Glass Prototyping
Tom Chi’s Prototyping Rules
1. Find the quickest path to experience
2. Doing is the best kind of thinking
3. Use materials that move at the speed of
thought to maximize your rate of learning
How can we quickly prototype
XR experiences with little or no
coding?
Prototyping in Interaction Design
Key Prototyping Steps
● Quick visual design
● Capture key interactions
● Focus on user experience
● Communicate design ideas
● “Learn by doing/experiencing”
Why Prototype?
Importance of Physical and Digital Prototypes
• Quick & dirty
• Explore interactions
• Get initial user feedback
• Avoid premature commitment
• Devise technical requirements
Typical Development Steps
● Sketching
● Storyboards
● UI Mockups
● Interaction Flows
● Video Prototypes
● Interactive Prototypes
● Final Native Application
Increased
Fidelity and
Interactivity
From Idea to Product
XR Prototyping Techniques
Lo-Fi / Easy → Hi-Fi / Hard:
• Digital Authoring
• Immersive Authoring
• Web-Based Development*
• Cross-Platform Development*
• Native Development*
* requires scripting and 3D programming skills
Sketching
Paper Prototyping
Video Prototyping
Wireframing
Bodystorming
Wizard of Oz
XR Prototyping Tools
Low Fidelity (Concept, visual design)
• Sketching
• Photoshop
• PowerPoint
• Video
High Fidelity (Interaction, experience design)
• Interactive sketching
• Desktop & on-device authoring
• Immersive authoring & visual scripting
• XR development toolkits
Advantages/Disadvantages
• Low-fidelity prototype
• Advantages: low development cost; can evaluate multiple design concepts
• Disadvantages: limited error checking; navigation and flow limitations
• High-fidelity prototype
• Advantages: fully interactive; look and feel of final product; clearly defined navigational scheme
• Disadvantages: more expensive to develop; time-consuming to build; developers are reluctant to change something they have crafted for hours
Content and Interaction
• Lo-Fi → Hi-Fi: placeholder content → polished content
• Simulated interactions (“click”) → implicit & explicit interactions
Your Most Valuable Prototyping Tool..
Interface Sketching
Sketching as Dialogue
“Sketching is about the Activity not the Result” - Bill Buxton
Buxton’s Key Attributes of Sketching
• Quick
• Work at speed of thought
• Timely
• Always available
• Disposable
• Inexpensive, little investment
• Plentiful
• Easy to iterate
• A catalyst
• Evokes conversations
Sketching Resources
AR/VR Interface Design Sketches
•Sketch out Design concept(s)
From Sketch to Product
•Sketches:
- early ideation stages of design
•Prototypes:
- detailing the actual design
Sketch to Prototype
Storyboarding - Describing the Experience
http://monamishra.com/projects/Argo
Wireframes
It’s about
- Functional specs
- Navigation and interaction
- Functionality and layout
- How interface elements work together
- Defining the interaction flow/experience
Leaving room for the design to be created
Task Flow
User flow with the application
Informs screen/interface elements
Example Wireframe
Mockup
It’s about
- Look and feel
- Building on wireframe
- High fidelity visuals
- Putting together final assets
- Getting feedback on design
Interface Sketching in VR
Using VR applications for rapid prototyping
- Intuitive sketching in immersive space
- Creating/testing at 1:1 scale
- Rapid UI design/layout
Examples
- Quill - https://quill.fb.com/
- Tilt Brush - https://www.tiltbrush.com/
Example: VR Concept Interface for SketchUp
https://www.youtube.com/watch?v=5FR3BLkgSPk
Designing AR in VR
https://www.youtube.com/watch?v=TfQJhSJQiaU
Microsoft Maquette
•Prototype AR/VR interfaces from inside VR
•3D UI for spatial prototyping
•Bring content into Unity with plug-in
•JavaScript support
Maquette Demo
https://www.youtube.com/watch?v=capxS6C6ooY
Scene Assembly In AR
• Many tools for creating AR scenes
• Drag and drop your assets
• Develop on web, publish to mobile
• Examples
• Catchoom - CraftAR
• Blippar - Blipbuilder
• ARloopa - Arloopa studio
• Wikitude - Wikitude studio
• Zappar - ZapWorks Designer
CraftAR
•Web-based AR marker tracking
•Add 3D models, video, images to real print content
•Simple drag and drop interface
•Cloud based image recognition
•https://catchoom.com/augmented-reality-craftar/
CraftAR Demo
https://www.youtube.com/watch?v=f42MqLF5Odw
Adding Transitions
Paper Prototyping
Basic Materials
● Post-its
● 5x8 in. index cards
● Scissors, X-acto knives
● Overhead transparencies
● Large, heavy, white paper (11 x 17)
● Tape, stick glue, correction tape
● Pens & markers (many colors & sizes)
Example: Mobile Paper Prototype
https://www.youtube.com/watch?v=jTytI1PkGFM
AR Prototyping with Layers
● Separate world-stabilized and head stabilized
○ Draw world stabilized on background paper
○ Draw head stabilized on transparent plastic
● Simulate Field of View of AR HMD
Lauber, F., Böttcher, C., & Butz, A. (2014, September). Papar: Paper prototyping for augmented reality. In Adjunct Proceedings
of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 1-6).
FOV in red • Head stabilized (foreground) • World stabilized (background)
Example: Mobile AR Prototyping
https://www.youtube.com/watch?v=Hrbflbct99o
AR Visualization with Props
https://www.youtube.com/watch?v=dXBXBHLqrLM
Case Study:
Higher Fidelity Prototype
https://medium.com/@dylanhongtech/augmented-reality-prototyping-tools-for-head-mounted-displays-c1ee8eaa0783
AR Prototyping for HMDs
•Tools
•Storyboarding - Concept
•Sketch – high fidelity 2D
•Cinema4D – 3D animation
•SketchBox – 3D layout
•Photoshop – Visual mockup
•PowerPoint - Interactivity
Sketch Design
•User Interface Layout
•Overlay AR elements on blurry background
Cinema4D
•3D animation tool
•Interface element mock-up
SketchBox
•3D user interface layout
Photoshop
•High end visual mock-ups
•Add AR content to real world views
Unity/MRTK Prototyping
•Viewing on the HoloLens 2, simple gesture interaction
Final PowerPoint Prototype
•Simple interactive demo
Video Sketching
• Process
• Capture elements of real world
• Use series of still photos/sketches in a movie format.
• Act out using the product
• Benefits
• Demonstrates the product experience
• Discover where concept needs fleshing out.
• Communicate experience and interface
• You can use whatever tools you want, e.g. iMovie.
Example: ToastAR (Apple)
View here: https://developer.apple.com/videos/play/wwdc2018/808/
AR Video Sketching
https://www.youtube.com/watch?v=vityu-IgHLQ
Google Glass Concept Movie
https://www.youtube.com/watch?v=5R1snVxGNVs
Tvori - Animating AR/VR Interfaces
• Animation based AR/VR prototyping tool
• https://tvori.co/
• Key features
• Model input, animation, etc
• Export 360 images and video
• Simulate AR views
• Multi-user support
• Present in VR
• Create VR executable
Tvori Demo
https://www.youtube.com/watch?v=XSI5_unAjCY
Vuforia Studio
• Author animated AR experiences
• Drag and drop content
• Add animations
• Import CAD models
• Combine with IOT sensors
• https://www.ptc.com/en/products/vuforia/vuforia-studio
https://www.youtube.com/watch?v=0kfIKMOqAPc
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au

 
De-mystifying Zero to One: Design Informed Techniques for Greenfield Innovati...
De-mystifying Zero to One: Design Informed Techniques for Greenfield Innovati...De-mystifying Zero to One: Design Informed Techniques for Greenfield Innovati...
De-mystifying Zero to One: Design Informed Techniques for Greenfield Innovati...
 
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...
 
Le nuove frontiere dell'AI nell'RPA con UiPath Autopilot™
Le nuove frontiere dell'AI nell'RPA con UiPath Autopilot™Le nuove frontiere dell'AI nell'RPA con UiPath Autopilot™
Le nuove frontiere dell'AI nell'RPA con UiPath Autopilot™
 
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024
 
Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........Bits & Pixels using AI for Good.........
Bits & Pixels using AI for Good.........
 
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo DiehlFuture Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
Future Visions: Predictions to Guide and Time Tech Innovation, Peter Udo Diehl
 
ODC, Data Fabric and Architecture User Group
ODC, Data Fabric and Architecture User GroupODC, Data Fabric and Architecture User Group
ODC, Data Fabric and Architecture User Group
 
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
Unsubscribed: Combat Subscription Fatigue With a Membership Mentality by Head...
 
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
GenAISummit 2024 May 28 Sri Ambati Keynote: AGI Belongs to The Community in O...
 
НАДІЯ ФЕДЮШКО БАЦ «Професійне зростання QA спеціаліста»
НАДІЯ ФЕДЮШКО БАЦ  «Професійне зростання QA спеціаліста»НАДІЯ ФЕДЮШКО БАЦ  «Професійне зростання QA спеціаліста»
НАДІЯ ФЕДЮШКО БАЦ «Професійне зростання QA спеціаліста»
 

Comp4010 Lecture5 Interaction and Prototyping

  • 1. INTERACTION AND PROTOTYPING COMP 4010 Lecture Five Mark Billinghurst August 24th 2021 mark.billinghurst@unisa.edu.au
  • 4. Common AR Optical Tracking Types • Marker Tracking • Tracking known artificial markers/images • e.g. ARToolKit square markers • Markerless Tracking • Tracking from known features in real world • e.g. Vuforia image tracking • Unprepared Tracking • Tracking in unknown environment • e.g. SLAM tracking
  • 5. Marker Tracking • Available for more than 20 years • Several open-source solutions exist • ARToolKit, ARTag, ATK+, etc • Fairly simple to implement • Standard computer vision methods • A rectangle provides 4 corner points • Enough for pose estimation!
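The "four corners are enough" point can be made concrete. The sketch below (NumPy only; the function names are my own, not ARToolKit's API) estimates a planar homography from the four detected corner pixels of a square marker and decomposes it into a camera pose [R | t], assuming the camera intrinsics K are known, in the spirit of ARToolKit-style trackers.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src (Nx2) onto dst (Nx2)
    with the direct linear transform; 4 point pairs are enough."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)          # null vector of the system
    return H / H[2, 2]

def pose_from_homography(H, K):
    """Decompose H = s * K [r1 r2 t] (marker plane at z = 0) into R, t."""
    A = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(A[:, 0])
    r1, r2, t = s * A[:, 0], s * A[:, 1], s * A[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Noise makes R only approximately a rotation; re-orthonormalise it.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

A production tracker would add corner refinement and nonlinear pose refinement, but this is the geometric core of the "rectangle gives pose" claim.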
  • 6. Marker Based Tracking: ARToolKit https://github.com/artoolkit
  • 7. Natural Feature Tracking • Use Natural Cues of Real Elements • Edges • Surface Texture • Interest Points • Model or Model-Free • No visual pollution Contours Feature Points Surfaces
  • 9. Detection and Tracking Detection Incremental tracking Tracking target detected Tracking target lost Tracking target not detected Incremental tracking ok Start + Recognize target type + Detect target + Initialize camera pose + Fast + Robust to blur, lighting changes + Robust to tilt • Tracking and detection are complementary approaches. • After successful detection, the target is tracked incrementally. • If the target is lost, the detection is activated again
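The detection/tracking loop on this slide can be sketched as a tiny state machine; the `detect` and `track` callables here are hypothetical stand-ins for a real tracker's detection and incremental-tracking stages.

```python
def run_tracker(frames, detect, track):
    """Alternate between detection and incremental tracking, per frame.
    detect(frame) -> pose or None; track(frame, pose) -> pose or None."""
    pose = None
    poses = []
    for frame in frames:
        if pose is None:              # state: Detection
            pose = detect(frame)      # recognise target, init camera pose
        else:                         # state: Incremental tracking
            pose = track(frame, pose)
            if pose is None:          # target lost -> re-activate detection
                pose = detect(frame)
        poses.append(pose)
    return poses
```

This mirrors the diagram: detection initialises the pose, tracking carries it frame to frame, and losing the target drops the system back into detection.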
  • 10. NFT – Real-time tracking • Search for keypoints in the video image • Create the descriptors • Match the descriptors from the live video against those in the database • Remove the keypoints that are outliers • Use the remaining keypoints to calculate the pose of the camera Keypoint detection Descriptor creation and matching Outlier Removal Pose estimation and refinement Camera Image Pose Recognition
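The descriptor-matching step, including rejection of ambiguous matches, might look like the sketch below. Plain float vectors and Lowe's ratio test stand in for real ORB/SIFT descriptors; geometric outlier removal (e.g. RANSAC over a homography) would follow before pose estimation.

```python
import numpy as np

def match_descriptors(query, database, ratio=0.8):
    """Brute-force nearest-neighbour matching with a ratio test.
    Returns (query_index, database_index) pairs that pass the test."""
    query = np.asarray(query, float)
    database = np.asarray(database, float)
    matches = []
    for i, d in enumerate(query):
        dists = np.linalg.norm(database - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # Keep the match only if the best neighbour is clearly better
        # than the runner-up; otherwise it is likely ambiguous.
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

A live system would run this between each camera frame's descriptors and the target database, then feed the surviving correspondences to pose estimation.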
  • 11. Example Target Image Feature Detection AR Overlay
  • 12. Tracking from an Unknown Environment • What to do when you don’t know any features? • Very important problem in mobile robotics - Where am I? • SLAM • Simultaneously Localize And Map the environment • Goal: to recover both camera pose and map structure while initially knowing neither. • Mapping: • Building a map of the environment which the robot is in • Localisation: • Navigating this environment using the map while keeping track of the robot’s relative position and orientation
  • 13. Parallel Tracking and Mapping Tracking Mapping New keyframes Map updates + Estimate camera pose + For every frame + Extend map + Improve map + Slow update rate Parallel tracking and mapping uses two concurrent threads, one for tracking and one for mapping, which run at different speeds
  • 14. How SLAM Works • Three main steps 1. Tracking a set of points through successive camera frames 2. Using these tracks to triangulate their 3D position 3. Simultaneously using the estimated point locations to calculate the camera pose which could have observed them • By observing a sufficient number of points, one can solve for both structure and motion (camera path and scene structure).
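Step 2 above, triangulating a tracked point's 3D position, can be illustrated with linear two-view triangulation (a sketch of the geometry, not a full SLAM system; real systems also refine these estimates by bundle adjustment).

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from its observations
    x1, x2 in two views with 3x4 projection matrices P1, P2."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector = homogeneous 3D point
    X = Vt[-1]
    return X[:3] / X[3]
```

Given enough such points, the same linear machinery runs in the other direction too: fixing the points and solving for the camera pose, which is exactly the structure-and-motion interleaving the slide describes.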
  • 15. Hybrid Tracking Interfaces • Combine multiple tracking technologies together • Active-Passive: Magnetic, Vision • Active-Inertial: Vision, inertial • Passive-Inertial: Compass, inertial
  • 16. Combining Sensors and Vision • Sensors • Produce noisy output (= jittering augmentations) • Are not sufficiently accurate (= wrongly placed augmentations) • Give us first information on where we are in the world, and what we are looking at • Vision • Is more accurate (= stable and correct augmentations) • Requires choosing the correct keypoint database to track from • Requires registering our local coordinate frame (online-generated model) to the global one (world)
  • 17. Example: Outdoor Hybrid Tracking • Combines • computer vision • inertial gyroscope sensors • Both correct for each other • Inertial gyro • provides frame to frame prediction of camera orientation, fast sensing • drifts over time • Computer vision • Natural feature tracking, corrects for gyro drift • Slower, less accurate
  • 18. ARKit – Visual Inertial Odometry • Uses both computer vision + inertial sensing • Tracking position twice • Computer Vision – feature tracking, 2D plane tracking • Inertial sensing – using the phone IMU • Output combined via Kalman filter • Determine which output is most accurate • Pass pose to ARKit SDK • Each system complements the other • Computer vision – needs visual features • IMU – drifts over time, doesn’t need features
  • 19. ARKit – Visual Inertial Odometry • Slow camera • Fast IMU • If camera drops out IMU takes over • Camera corrects IMU errors
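The camera-corrects-IMU idea can be shown with a toy 1-D complementary filter. ARKit itself fuses full 6-DOF poses with a Kalman filter; this is only an illustration of why a fast, drifting sensor plus a slow, absolute one beats either alone.

```python
def fuse(gyro_rates, vision_angles, dt=0.01, gain=0.5):
    """Integrate a fast gyro every step; blend in an absolute vision
    measurement (or None when unavailable) to pull drift back to zero."""
    angle = 0.0
    out = []
    for rate, vis in zip(gyro_rates, vision_angles):
        angle += rate * dt                 # fast inertial prediction (drifts)
        if vis is not None:                # slower vision fix corrects drift
            angle += gain * (vis - angle)
        out.append(angle)
    return out
```

With a biased gyro, the vision-corrected estimate stays bounded near the true angle while the gyro-only estimate drifts without limit, which is precisely the ARKit division of labour on this slide.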
  • 21. Different Types of AR Interaction 1. Browsing Interfaces • simple (conceptually!), unobtrusive 2. 3D AR Interfaces • expressive, creative, require attention 3. Tangible Interfaces • embedded into conventional environments 4. Tangible AR • combines TUI input + AR display 5. Natural AR Interfaces • interact with gesture, speech and gaze
  • 22. 1. AR Interfaces as Data Browsers • 2D/3D virtual objects are registered in 3D • “VR in Real World” • Interaction • 2D/3D virtual viewpoint control • Applications • Visualization, training
  • 23. Example: Google Maps AR Mode • AR Navigation Aid • GPS + compass, 2D/3D object placement
  • 24. Advantages and Disadvantages • Important class of AR interfaces • Wearable computers • AR simulation, training • Limited interactivity • Modification of virtual content is difficult Rekimoto, et al. 1997
  • 25. 2. 3D AR Interfaces • Virtual objects displayed in 3D physical space and manipulated • HMDs and 6DOF head-tracking • 6DOF hand trackers for input • Interaction • Viewpoint control • Traditional 3D user interface interaction: manipulation, selection, etc. Kiyokawa, et al. 2000
  • 26. Advantages and Disadvantages • Important class of AR interfaces • Entertainment, design, training • Advantages • User can interact with 3D virtual object everywhere in space • Natural, familiar interaction • Disadvantages • Usually no tactile feedback • User has to use different devices for virtual and physical objects Oshima, et al. 2000
  • 28. 3. Augmented Surfaces and Tangible Interfaces • Basic principles • Virtual images are projected on a surface • Physical objects are used as controls for virtual objects • Support for collaboration Wellner, P. (1993). Interacting with paper on the DigitalDesk. Communications of the ACM, 36(7), 87-96.
  • 29. Augmented Surfaces • Rekimoto, et al. 1999 • Front projection • Marker-based tracking • Multiple projection surfaces • Object interaction Rekimoto, J., & Saitoh, M. (1999, May). Augmented surfaces: a spatially continuous work space for hybrid computing environments. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 378-385).
  • 30. Augmented Surfaces Demo (1999) https://www.youtube.com/watch?v=r4g_fvnjVCA
  • 31. Tangible User Interfaces (Ishii 97) • Create digital shadows for physical objects • Foreground • graspable UI • Background • ambient interfaces
  • 32. Tangible Interfaces - Ambient • Dangling String • Jeremijenko 1995 • Ambient ethernet monitor • Relies on peripheral cues • Ambient Fixtures • Dahley, Wisneski, Ishii 1998 • Use natural material qualities for information display
  • 33. Tangible Interface: ARgroove • Collaborative Instrument • Exploring Physically Based Interaction • Map physical actions to Midi output • Translation, rotation • Tilt, shake
  • 36. Visual Feedback •Continuous Visual Feedback is Key •Single Virtual Image Provides: • Rotation • Tilt • Height
  • 37. i/O Brush (Ryokai, Marti, Ishii) - 2004 Ryokai, K., Marti, S., & Ishii, H. (2004, April). I/O brush: drawing with everyday objects as ink. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 303-310).
  • 38. i/O Brush Demo (2005) https://www.youtube.com/watch?v=04v_v1gnyO8
  • 39. Many Other Examples • Triangles (Gorbert 1998) • Triangular based story telling • ActiveCube (Kitamura 2000-) • Cubes with sensors • Reactable (2007- ) • Cube based music interface
  • 40. Lessons from Tangible Interfaces • Physical objects make us smart • Norman’s “Things that Make Us Smart” • encode affordances, constraints • Objects aid collaboration • establish shared meaning • Objects increase understanding • serve as cognitive artifacts
  • 41. But There are TUI Limitations • Difficult to change object properties • can’t tell state of digital data • Limited display capabilities • projection screen = 2D • dependent on physical display surface • Separation between object and display • ARgroove – Interact on table, look at screen
  • 42. Advantages and Disadvantages •Advantages • Natural - user’s hands are used for interacting with both virtual and real objects. • No need for special purpose input devices •Disadvantages • Interaction is limited only to 2D surface • Full 3D interaction and manipulation is difficult
  • 43. Orthogonal Nature of Interfaces 3D AR interfaces Tangible Interfaces Spatial Gap No – Interaction is Everywhere Yes – Interaction is only on 2D surfaces Interaction Gap Yes – separate devices for physical and virtual objects No – same devices for physical and virtual objects
  • 45. 4. Tangible AR: Back to the Real World • AR overcomes display limitation of TUIs • enhance display possibilities • merge task/display space • provide public and private views • TUI + AR = Tangible AR • Apply TUI methods to AR interface design Billinghurst, M., Kato, H., & Poupyrev, I. (2008). Tangible augmented reality. ACM Siggraph Asia, 7(2), 1-10.
  • 46. Space vs. Time - Multiplexed • Space-multiplexed • Many devices each with one function • Quicker to use, more intuitive, clutter • Real Toolbox • Time-multiplexed • One device with many functions • Space efficient • mouse
  • 47. Tangible AR: Tiles (Space Multiplexed) • Tiles semantics • data tiles • operation tiles • Operation on tiles • proximity • spatial arrangements • space-multiplexed Poupyrev, I., Tan, D. S., Billinghurst, M., Kato, H., Regenbrecht, H., & Tetsutani, N. (2001, July). Tiles: A Mixed Reality Authoring Interface. In Interact (Vol. 1, pp. 334-341).
  • 51. Tangible AR: Time-multiplexed Interaction • Use of natural physical object manipulations to control virtual objects • VOMAR Demo • Catalog book: • Turn over the page • Paddle operation: • Push, shake, incline, hit, scoop Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000, October). Virtual object manipulation on a table-top AR environment. In Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) (pp. 111-119). IEEE.
  • 54. Advantages and Disadvantages •Advantages • Natural interaction with virtual and physical tools • No need for special purpose input devices • Spatial interaction with virtual objects • 3D manipulation with virtual objects anywhere in space •Disadvantages • Requires Head Mounted Display
  • 55. 5. Natural AR Interfaces • Goal: • Interact with AR content the same way we interact in the real world • Using natural user input • Body motion • Gesture • Gaze • Speech • Input recognition • Natural gestures, gaze • Multimodal input FingARtips (2004) Tinmith (2001)
  • 56. External Fixed Cameras • Overhead depth sensing camera • Capture real time hand model • Create point cloud model • Overlay graphics on AR view • Perform gesture interaction Billinghurst, M., Piumsomboon, T., & Bai, H. (2014). Hands in space: Gesture interaction with augmented-reality interfaces. IEEE computer graphics and applications, 34(1), 77-80.
  • 58. Head Mounted Cameras • Attach cameras/depth sensor to HMD • Connect to high end PC • Computer vision capture/processing on PC • Perform tracking/gesture recognition on PC • Use custom tracking hardware • Leap Motion (Structured IR) • Intel RealSense (Stereo depth) Project NorthStar (2018) Meta2 (2016)
  • 59. Project NorthStar Hand Interaction
  • 60. Self Contained Systems • Sensors and processors on device • Fully mobile • Customized hardware/software • Example: Hololens 2 (2019) • 3D hand tracking • 21 points/hand tracked • Gesture driven interface • Constrained set of gestures • Multimodal input (gesture, gaze, speech)
  • 61. Hololens 2 Gesture Input Demo (MRTK) https://www.youtube.com/watch?v=qfONlUCSWdg
  • 62. Speech Input • Reliable speech recognition • Windows speech, Watson, etc. • Indirect input with AR content • No need for gesture • Match with gaze/head pointing • Look to select target • Good for Quantitative input • Numbers, text, etc. • Keyword trigger • “select”, ”hey cortana”, etc https://www.youtube.com/watch?v=eHMkOpNUtR8
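A keyword-trigger front end for such a system might be as simple as the sketch below (hypothetical names; a real system would feed this from a recogniser such as Windows speech or Watson).

```python
WAKE_WORD = "hey cortana"

def handle_utterance(text, commands):
    """Act on a recognised utterance only if it starts with the wake word;
    dispatch the remainder to a handler in the commands dict."""
    text = text.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None                       # no wake word: ignore utterance
    command = text[len(WAKE_WORD):].strip()
    handler = commands.get(command)
    return handler() if handler else None # unknown command: ignore
```

In an AR interface the command handlers would act on whatever object the user's gaze or head pointing currently selects, matching the speech-plus-pointing pairing described above.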
  • 63. Eye Tracking Interfaces • Use IR light to find gaze direction • IR sources + cameras in HMD • Support implicit input • Always look before interact • Natural pointing input • Multimodal Input • Combine with gesture/speech Camera IR light IR view Processed image Hololens 2
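"Look to select" reduces to intersecting the estimated gaze ray with a UI surface. A minimal sketch (illustrative names; the ray direction would come from the IR eye tracker):

```python
import numpy as np

def gaze_hit(origin, direction, plane_point, plane_normal):
    """Intersect the gaze ray origin + s * direction with a plane.
    Returns the 3D hit point, or None if the ray is parallel to the
    plane or points away from it."""
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    n = np.asarray(plane_normal, float)
    denom = d @ n
    if abs(denom) < 1e-9:                 # ray parallel to the plane
        return None
    s = ((np.asarray(plane_point, float) - o) @ n) / denom
    return None if s < 0 else o + s * d   # behind the eye: no hit
```

The hit point is then tested against UI element bounds; combined with a gesture or spoken "select", this gives the multimodal gaze-plus-trigger interaction the slide describes.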
  • 65. Evolution of AR Interfaces Tangible AR Tangible input AR overlay Direct interaction Natural AR Freehand gesture Speech, gaze Tangible UI Augmented surfaces Object interaction Familiar controllers Indirect interaction 3D AR 3D UI Dedicated controllers Custom devices Browsing Simple input Viewpoint control Expressiveness, Intuitiveness
  • 67. Interaction Design “Designing interactive products to support people in their everyday and working lives” Preece, J., (2002). Interaction Design • Design of User Experience with Technology
  • 68. Bill Verplank on Interaction Design https://www.youtube.com/watch?v=Gk6XAmALOWI
  • 69. •Interaction Design involves answering three questions: •What do you do? - How do you affect the world? •What do you feel? – What do you sense of the world? •What do you know? – What do you learn? Bill Verplank
  • 70. Typical Interaction Design Cycle Develop alternative prototypes/concepts and compare them, And iterate, iterate, iterate....
  • 72. How Do You Prototype This?
  • 76.
  • 77.
  • 78.
  • 80.
  • 81.
  • 82. Tom Chi’s Prototyping Rules 1. Find the quickest path to experience 2. Doing is the best kind of thinking 3. Use materials that move at the speed of thought to maximize your rate of learning
  • 83. How can we quickly prototype XR experiences with little or no coding?
  • 84. Prototyping in Interaction Design Key Prototyping Steps
  • 85. ● Quick visual design ● Capture key interactions ● Focus on user experience ● Communicate design ideas ● “Learn by doing/experiencing” Why Prototype?
  • 86. Importance of Physical and Digital Prototypes • Quick & dirty • Explore interactions • Get initial user feedback • Avoid premature commitment • Devise technical requirements
  • 87. Typical Development Steps ● Sketching ● Storyboards ● UI Mockups ● Interaction Flows ● Video Prototypes ● Interactive Prototypes ● Final Native Application Increased Fidelity and Interactivity
  • 88. From Idea to Product
  • 89. XR Prototyping Techniques Lo-Fi Hi-Fi Easy Hard Digital Authoring Immersive Authoring Web-Based Development* Cross-Platform Development* Native Development* * requires scripting and 3D programming skills Sketching Paper Prototyping Video Prototyping Wireframing Bodystorming Wizard of Oz
  • 90. XR Prototyping Tools Low Fidelity (Concept, visual design) • Sketching • Photoshop • PowerPoint • Video High Fidelity (Interaction, experience design) • Interactive sketching • Desktop & on-device authoring • Immersive authoring & visual scripting • XR development toolkits
  • 91. Advantages/Disadvantages • Low-fidelity prototype • Advantages: low development cost; can evaluate multiple design concepts • Disadvantages: limited error checking; navigation and flow limitations • High-fidelity prototype • Advantages: fully interactive; look and feel of the final product; clearly defined navigational scheme • Disadvantages: more expensive to develop; time consuming to build; developers are reluctant to change something they have crafted for hours
  • 92. Content and Interaction Polished Content Placeholder Content Implicit & Explicit Interactions “click” Lo-Fi Hi-Fi
  • 93. XR Prototyping Tools Low Fidelity (Concept, visual design) • Sketching • Photoshop • PowerPoint • Video High Fidelity (Interaction, experience design) • Interactive sketching • Desktop & on-device authoring • Immersive authoring & visual scripting • XR development toolkits
  • 94. Your Most Valuable Prototyping Tool..
  • 96.
  • 97. Sketching as Dialogue “Sketching is about the Activity not the Result” - Bill Buxton
  • 98. Buxton’s Key Attributes of Sketching • Quick • Work at speed of thought • Timely • Always available • Disposable • Inexpensive, little investment • Plentiful • Easy to iterate • A catalyst • Evokes conversations
  • 100. AR/VR Interface Design Sketches •Sketch out Design concept(s)
  • 101.
  • 102.
  • 103. From Sketch to Product •Sketches: - early ideation stages of design •Prototypes: - detailing the actual design
  • 105. Storyboarding - Describing the Experience http://monamishra.com/projects/Argo
  • 106. Wireframes It’s about - Functional specs - Navigation and interaction - Functionality and layout - How interface elements work together - Defining the interaction flow/experience Leaving room for the design to be created
  • 107. Task Flow User flow with the application Informs screen/interface elements
  • 109. Mockup It’s about - Look and feel - Building on wireframe - High fidelity visuals - Putting together final assets - Getting feedback on design
  • 110.
  • 111.
  • 112.
  • 113. Interface Sketching in VR Using VR applications for rapid prototyping - Intuitive sketching in immersive space - Creating/testing at 1:1 scale - Rapid UI design/layout Examples - Quill - https://quill.fb.com/ - Tilt Brush - https://www.tiltbrush.com/
  • 114. Example: VR Concept Interface for SketchUp https://www.youtube.com/watch?v=5FR3BLkgSPk
  • 115. Designing AR in VR https://www.youtube.com/watch?v=TfQJhSJQiaU
  • 117. Microsoft Maquette •Prototype AR/VR interfaces from inside VR •3D UI for spatial prototyping •Bring content into Unity with plug-in •Javascript support
  • 119. Scene Assembly In AR • Many tools for creating AR scenes • Drag and drop your assets • Develop on web, publish to mobile • Examples • Catchoom - CraftAR • Blippar - Blipbuilder • ARloopa - Arloopa studio • Wikitude - Wikitude studio • Zappar - ZapWorks Designer
  • 120. CraftAR •Web-based AR marker tracking •Add 3D models, video, images to real print content •Simple drag and drop interface •Cloud based image recognition •https://catchoom.com/augmented-reality-craftar/
  • 122.
  • 125. Basic Materials ● Post-its ● 5x8 in. index cards ● Scissors, X-acto knives ● Overhead transparencies ● Large, heavy, white paper (11 x 17) ● Tape, stick glue, correction tape ● Pens & markers (many colors & sizes)
  • 126. Example: Mobile Paper Prototype https://www.youtube.com/watch?v=jTytI1PkGFM
  • 127. AR Prototyping with Layers ● Separate world-stabilized and head stabilized ○ Draw world stabilized on background paper ○ Draw head stabilized on transparent plastic ● Simulate Field of View of AR HMD Lauber, F., Böttcher, C., & Butz, A. (2014, September). Papar: Paper prototyping for augmented reality. In Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 1-6). FOV in Red Head Stabilized - Foreground World Stabilized - Background
  • 128. Example: Mobile AR Prototyping https://www.youtube.com/watch?v=Hrbflbct99o
  • 129. AR Visualization with Props https://www.youtube.com/watch?v=dXBXBHLqrLM
  • 130. Case Study: Higher Fidelity Prototype https://medium.com/@dylanhongtech/augmented-reality-prototyping-tools-for-head-mounted-displays-c1ee8eaa0783
  • 131. AR Prototyping for HMDs •Tools •Storyboarding - Concept •Sketch – high fidelity 2D •Cinema4D – 3D animation •SketchBox – 3D layout •Photoshop – Visual mockup •PowerPoint - Interactivity
  • 132. Sketch Design •User Interface Layout •Overlay AR elements on blurry background
  • 135. Photoshop •High end visual mock-ups •Add AR content to real world views
  • 136. Unity/MRTK Prototyping •Viewing on the Hololens2, simple gesture interaction
  • 138. Video Sketching • Process • Capture elements of real world • Use series of still photos/sketches in a movie format. • Act out using the product • Benefits • Demonstrates the product experience • Discover where concept needs fleshing out. • Communicate experience and interface • You can use whatever tools you want, e.g. iMovie.
  • 139. Example ToastAR (Apple) View here: https://developer.apple.com/videos/play/wwdc2018/808/
  • 141.
  • 142. Google Glass Concept Movie https://www.youtube.com/watch?v=5R1snVxGNVs
  • 143. Tvori - Animating AR/VR Interfaces • Animation based AR/VR prototyping tool • https://tvori.co/ • Key features • Model input, animation, etc • Export 360 images and video • Simulate AR views • Multi-user support • Present in VR • Create VR executable
  • 145. Vuforia Studio • Author animated AR experiences • Drag and drop content • Add animations • Import CAD models • Combine with IOT sensors • https://www.ptc.com/en/products/vuforia/vuforia-studio