A talk from the Develop Track at AWE USA 2018, the World's #1 XR Conference & Expo, in Santa Clara, California, May 30 – June 1, 2018.
Neil Sarkar (AdHawk Microsystems): Ultra-Fast Eye Tracking Without Cameras for Mobile AR Headsets
This session showcases the first camera-free eye-tracking microsystem. A MEMS (microelectromechanical system) device on a tiny chip scans a beam of light across the eye 4,500 times every second. The latest specifications to be revealed at AWE are enabling foveated rendering in mobile platforms, endpoint prediction during saccades, and unprecedented insights into the state of the user.
http://AugmentedWorldExpo.com
Eye Tracking will be Critical for VR/AR to become the Next Major Computing Platform
[Chart: Global Mobile Data Traffic Growing Exponentially, PB/month, 2010–2022. Source: Barclays (October 4, 2017)]
VR/AR requires high bandwidth and low latency, creating strong mobile data demand.
Data demand is growing exponentially with unprecedented access to information → need to curate.
Themes
- Information needs to be curated
- Real world and data blending more seamlessly → AR/VR, IoT
- New input modalities required → first touch, then voice, now eye
- Displays moving to 20MP per eye → 10x more pixels requires foveated rendering
- Highest bandwidth into the human is through the visual cortex → input and output
- Eyes make over 100,000 movements/day → can reveal insight into the state of the user
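The foveated-rendering theme above can be sanity-checked with rough arithmetic. In the sketch below, the foveal region size, field of view, and peripheral downsampling factor are illustrative assumptions, not figures from the talk:

```python
# Rough pixel-budget estimate for foveated rendering on a 20 MP-per-eye display.
full_res_pixels = 20_000_000            # full display, per eye

# Assumption: the fovea covers ~5 degrees of a ~100-degree field of view and is
# rendered at full density; the periphery is rendered at 1/16 density (4x4 blocks).
foveal_fraction = (5 / 100) ** 2        # area fraction covered by the fovea
foveal_pixels = full_res_pixels * foveal_fraction
peripheral_pixels = full_res_pixels * (1 - foveal_fraction) / 16

rendered = foveal_pixels + peripheral_pixels
print(f"rendered: {rendered / 1e6:.2f} MP of {full_res_pixels / 1e6:.0f} MP "
      f"({full_res_pixels / rendered:.1f}x fewer pixels)")
```

Under these assumptions, gaze-directed rendering cuts the pixel workload by more than an order of magnitude, which is the "10x more pixels" argument in reverse.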
Saccades, Microsaccades, Intersaccadic Drifts
- Eye tracking reveals interest through fixations
- Eye tracking monitors fatigue through saccade velocity
- Eye tracking monitors moods through analysis of trajectories
- Eye tracking can predict the next fixation 100ms before it occurs
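Fixations and saccades like those listed above are commonly separated with a simple velocity threshold. The sketch below is a minimal I-VT-style classifier with an assumed 30°/s threshold and synthetic data; it is a textbook technique, not AdHawk's algorithm:

```python
# Minimal velocity-threshold (I-VT) classifier: label each gaze sample as part
# of a fixation or a saccade. Threshold and trace are illustrative assumptions.

def classify_ivt(angles_deg, rate_hz, threshold_deg_s=30.0):
    """Label samples 'fix' or 'sac' from 1-D gaze angles sampled at rate_hz."""
    labels = []
    for i in range(1, len(angles_deg)):
        velocity = abs(angles_deg[i] - angles_deg[i - 1]) * rate_hz  # deg/s
        labels.append("sac" if velocity > threshold_deg_s else "fix")
    return labels

# Synthetic 1 kHz trace: steady fixation, a fast saccade, then a new fixation.
trace = [0.0, 0.01, 0.02, 1.5, 3.0, 4.5, 5.0, 5.01]
print(classify_ivt(trace, rate_hz=1000))
```

At high sampling rates the velocity estimate is clean enough that even brief microsaccades cross the threshold for several consecutive samples, which is what makes robust classification possible.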
Medical Diagnostics
“The picture of the brainstem circuitry underlying microsaccade generation is almost complete.”
- Study of eye movement may predict the onset of neurodegenerative disease: an active research area
- Optometrists study eye movement anomalies: end-point nystagmus, saccade anomalies, pursuit anomalies, microsaccadic oscillations, image-stabilizing devices
- Conditions under study: concussion, Alzheimer's, schizophrenia, dyslexia, autism, Parkinson's
Market Research
- Heat maps of fixation (50Hz, 5 degrees) indicate the efficacy of ad placement and provide insight on user attention
- “Google pay per gaze advertisement revenue”
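A fixation heat map of the kind used to evaluate ad placement can be accumulated with a few lines of code. The grid size and fixation data below are hypothetical:

```python
# Sketch: accumulate fixation durations into a coarse heat map of a display.
# Grid dimensions and the fixation list are illustrative assumptions.

GRID_W, GRID_H = 4, 3  # coarse screen cells (columns x rows)

def heatmap(fixations, grid_w=GRID_W, grid_h=GRID_H):
    """fixations: iterable of (x, y, duration_ms) with x, y in [0, 1)."""
    grid = [[0.0] * grid_w for _ in range(grid_h)]
    for x, y, dur_ms in fixations:
        grid[int(y * grid_h)][int(x * grid_w)] += dur_ms
    return grid

# Two fixations on a top-left ad, one glance at the bottom-right corner.
fixes = [(0.1, 0.1, 250), (0.12, 0.15, 400), (0.8, 0.9, 120)]
for row in heatmap(fixes):
    print(row)
```

Real tools smooth the counts with a Gaussian kernel before rendering, but the underlying aggregation is this simple.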
Conventional Eye Tracking Methods and Background: Camera-Based
Camera-based eye tracking technology requires substantial computation and power.
[Images: research-grade example, market-research glasses example, Yarbus eye tracker]
Shortcomings:
- Intensive processing: algorithms must process hundreds of pictures per second
- Mediocre specifications: high power consumption, low speed and high latency, complex system integration
- Tethered devices: mobility is compromised (data must be recorded for post-processing when mobile)
- Expensive: research-grade systems cost >$50,000; other systems ~$15,000 in low volume
Camera-Free Eye Tracking Technology
AdHawk tracks the movement of the eye based on an infrared scan of the cornea's position; it is the first eye tracker created that does not require cameras.
Scanner Module Technology (MEMS): the module scans in both x and y and covers a 90-degree range.
[Diagram of the eye tracking system design: eye (r = 12mm), cornea (r = 8mm), 50mm eyeglass width; compact scanner module and photodiode module]
Link to tracking concept video; link to scanner video.
10x Improvements in Eye Tracking Specifications
AdHawk's specifications have order-of-magnitude improvements compared to camera-based solutions.

System Integration (per eye)
- CMOS-MEMS: flex cable with 6 wires. Scanner module: ~4.3 x 4.3 x 2.2 mm currently, ~2.0 x 1.5 x 1.0 mm in Q4'19. Photodiode (only 1 required): ~4.5 x 4.5 x 1.0 mm currently, ~2.0 x 1.2 x 0.45 mm in Q4'18. ASIC: ~1.6 x 1.6 x 0.5 mm.
- Camera-based: MIPI lines, hot mirrors (tight tolerances). Camera: ~3.0 x 3.0 x 2.5 mm. IR LEDs (10 required): ~1.5 x 1.5 x 1.5 mm each. ASIC: ~8 x 8 mm.

CMOS-MEMS vs Camera-Based, by metric:
- Speed: 750 Hz – 2250 Hz (x,y pairs) vs ~60 – 120 Hz
- Latency: ~1.8 ms (1) vs ~10 ms (2)
- Power Consumption (per eye): ~300 mW currently, <50 mW target (3) vs >1 W
- Resolution: ~0.25 degrees vs ~0.5 – 1.0 degree
- Communication Bandwidth: 100 Kbps vs 100,000 Kbps
- BOM at Volume (including licensing): ~2x less vs ~2x more

[Images: specifications vs camera-based; glasses integration; VR headset integration]

(1) Negative latency of up to -20ms is achieved through endpoint prediction (data available near the middle of a saccade)
(2) Time from mid-exposure to data available on the interface
(3) Target power consumption per eye includes MEMS, MEMS drive circuitry, USB bus interface, microcontroller, photodiode analog signal chain, VCSEL driver and VCSEL
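The bandwidth gap in the table can be checked with back-of-envelope arithmetic. The per-coordinate word size and eye-camera frame parameters below are assumptions, not figures from the talk:

```python
# Back-of-envelope check on the communication-bandwidth comparison.
# Sample word size and camera frame dimensions are illustrative assumptions.

# MEMS tracker: the interface carries only a stream of (x, y) pairs.
rate_hz = 2250          # upper end of the quoted x,y-pair rate
bits_per_sample = 16    # assumed bits per coordinate
mems_kbps = rate_hz * 2 * bits_per_sample / 1000
print(f"MEMS stream: ~{mems_kbps:.0f} Kbps")

# Camera tracker: raw frames must cross the interface before processing.
fps, w, h, bpp = 120, 320, 240, 8  # assumed eye-camera parameters
cam_kbps = fps * w * h * bpp / 1000
print(f"Camera stream: ~{cam_kbps:.0f} Kbps")
```

Even with generous assumptions for the camera, the two streams land on opposite sides of a roughly thousand-fold gap, consistent with the table's 100 Kbps vs 100,000 Kbps row.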
Camera-Free Eye Tracking Technology (Cont'd)
AdHawk can capture all types of eye movements with unprecedented resolution and bandwidth.
AdHawk's eye tracker produces ~10x more data within a saccade: eye position is measured 350 times per second, producing velocity data that may be used for endpoint prediction.
[Videos: eye tracking data, 60Hz vs 1kHz eye tracking; link to endpoint prediction video]
Endpoint Prediction
[Plots: raw data (uncalibrated); eye position vs time indicating location of prediction; velocity (after moving average)]
Uncalibrated dataset obtained while reading a paragraph of text.
Velocity-based endpoint prediction produces estimates well ahead of the next fixation.
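Saccade velocity profiles are roughly symmetric, so once peak velocity is observed near mid-flight, the remaining travel approximately mirrors the travel so far. The toy sketch below illustrates that idea on synthetic data; it is not AdHawk's predictor:

```python
# Toy endpoint predictor: detect the velocity peak in a partial saccade trace
# and extrapolate the landing position by assuming a symmetric profile.
# The trace is synthetic and the symmetry assumption is a simplification.

def predict_endpoint(positions):
    """Extrapolate the landing position from samples up to peak velocity."""
    velocities = [b - a for a, b in zip(positions, positions[1:])]
    peak_i = max(range(len(velocities)), key=lambda i: abs(velocities[i]))
    start, mid = positions[0], positions[peak_i + 1]
    return start + 2 * (mid - start)   # symmetric profile: double the travel

# Partial trace of a ~10-degree saccade, sampled up to its velocity peak (deg).
samples = [0.0, 0.5, 2.0, 5.0]
print(f"predicted endpoint: {predict_endpoint(samples):.1f} deg")
```

Because the peak occurs near the middle of a saccade, a prediction made at that moment arrives tens of milliseconds before the eye lands, which is the "negative latency" described earlier.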
Menu Navigation - Uncalibrated
- Eye dwelling is strenuous
- Relative movements improve robustness and reduce false positives
- A high sampling rate is required to capture saccades robustly
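The dwell-free selection idea above can be sketched as a relative-movement gesture detector: trigger on a quick flick away and back rather than a long stare. The gesture pattern and thresholds below are illustrative assumptions:

```python
# Sketch: detect a "flick right and return" gesture in a horizontal gaze trace,
# as an alternative to strenuous dwell-based selection. Thresholds are assumed.

def detect_flick(xs, rate_hz, min_deg=3.0, max_ms=300):
    """True if gaze moves right by >= min_deg and returns within max_ms."""
    window = int(max_ms / 1000 * rate_hz)
    for i, x0 in enumerate(xs):
        for j in range(i + 1, min(i + window, len(xs))):
            if xs[j] - x0 >= min_deg:                    # fast move right...
                for k in range(j + 1, min(i + window, len(xs))):
                    if abs(xs[k] - x0) < 1.0:            # ...and back near start
                        return True
    return False

# Synthetic 500 Hz trace: fixate, flick ~5 degrees right, return.
trace = [0.0] * 5 + [1.5, 3.5, 5.0, 5.0, 3.0, 1.0, 0.2] + [0.1] * 5
print(detect_flick(trace, rate_hz=500))
```

At 60 Hz a 300 ms gesture yields only a handful of samples, so detection is fragile; at hundreds of Hz the out-and-back shape is unambiguous, which is why a high sampling rate matters here.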
Augmented Reality Use Cases
Next generation AR devices depend on high speed eye tracking.
[Roadmap chart, Q3 2018 through Q2 2019, use cases arranged from least to most challenging: menu navigation; QR code scanning; warehouse management; remote assistance and guided surgery; understanding user interests; targeted advertising; ocular health examinations; incorporating the visual system into the display (focal plane, IPD, directing the display onto the fovea, simulated depth)]
Virtual Reality Use Cases
We are solving the most challenging use cases to create the most immersive VR experiences.
[Roadmap chart, 1Q18 through Q4+, use cases arranged from least to most challenging: menu navigation; object selection; targeting in games; engagement; IPD adjustment; assessing fatigue; simulated depth; foveated rendering; endpoint prediction; capturing intent; profiling the state of the user; targeted advertising. Legend: calibration required, low latency required, database and ML required]
AdHawk's Eye Tracking Solution
- HARDWARE: world's first camera-free eye tracker using an ultra-compact MEMS microsystem → 10x sustainable improvements
- FIRMWARE: algorithms capture eye movement data with unprecedented bandwidth → endpoint prediction, gestures
- APPLICATIONS: eye tracking use cases critical to next generation AR/VR devices → foveated rendering, depth of field
- CLOUD BASED MACHINE LEARNING: we are building a database of eye movements to reveal insight into the state of the user. Eye movement data (for optometrists, game developers): saccades, smooth pursuit, VOR. Correlations reveal: intent, state of user, medical indications.
VR Market Segmentation
2018 will see the first standalone VR headsets and greater triple-A content.
- Mobile VR ($80-$100 + smartphone): Samsung Gear, Google Daydream View
- Standalone VR ($200-$400): Oculus Go, Vive Focus, Lenovo Mirage (driving greater mass adoption)
- Low Cost PC VR ($400-$450 + standard PC): Samsung Odyssey, HP, Dell, Asus, Acer, Lenovo (driving enterprise VR)
- High End and Console ($500-$800 + high end PC): Vive Pro, Sony PSVR, LG (18MP prototype), Oculus Rift (driving high end gaming)
[Ecosystem: GPU developers, ODMs, reference designs]
AR Market Segmentation
AR is expected to undergo a shift from enterprise to consumer as form factors become sleeker, experiences become more immersive and greater content emerges (e.g. ARKit and ARCore).
- Smartglasses ($1000-$3000): Vuzix Blade, ODG R-9, Intel Vaunt, Google Glass (sleeker form factors helping to drive market acceptance)
- Holographic AR ($3000+): Magic Leap One, Meta, Avegant, Microsoft Hololens (more immersive holographic experiences starting to shift to consumer)
Summary
- AdHawk has developed the world's first camera-free eye tracker using an ultra-compact MEMS platform
- Specifications offer order-of-magnitude improvements in speed, latency, power consumption and resolution
- Manufacturing at the wafer scale with a robust supply chain of CMOS, MEMS and assembly partners
- Enabling immersive and untethered VR/AR experiences (eye gestures, endpoint prediction, foveated rendering)
- Enabling mobile diagnostic capabilities for researchers and practitioners
Ontario Research Fund