Smart Glasses and the Evolution of Human-Computing Interfaces
Within the emerging category of wearable computing, arguably the most distinctive product to
emerge is "smart glasses," which mesh the communications capabilities of smartphones with
additional visual and other sensory enhancements, including augmented reality. The primary selling
feature of smart glasses is their ability to display video, navigation, messaging, augmented reality
(AR) applications, and games on a large virtual screen, all completely hands-free. The current
poster child for smart glasses is Google’s "Glass" product, but there are more than 20 firms offering
smart glasses or planning to do so.
The hands-free nature of smart glasses opens up new possibilities for human-computer interfaces
(HCI), drawing from smartphones as well as from interfaces developed in other contexts (e.g., virtual
reality). Early smart glasses models lean on mature, low-cost technologies with notable
influence from smartphones; however, we see a gradual trend for smart glasses (and other
wearable computing devices) to be driven by more natural interface controls once these
technologies have had time to mature as well -- and they are getting remarkably close.
Touch: A Logical First Step
The most prominent smart glasses user interface in these earliest iterations is touch sensitivity,
either through a touchpad built into the glasses form factor itself or through a separate pocket-sized
tethered accessory that contains most of the actual componentry, including a touchpad.
Most of the prominent smart glasses offerings use a touch interface, notably Google Glass,
which is so far the mind- and market-share leader. This is influencing other OEMs and
suppliers to design their first smart glasses products around some kind of touch
interface. Borrowing mobile-device touch interfaces also captures the cost benefits of a mature
technology: adding touch functionality to low-cost, high-volume consumer products has
become remarkably inexpensive.
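To make the touch-driven interaction model concrete, here is a minimal sketch (in Python, purely illustrative and not based on any vendor's SDK) of how touchpad events from either a frame-mounted pad or a tethered pod might be dispatched to a card-style user interface; the event names and UI actions are our own assumptions.

```python
# Minimal sketch of a touch-driven UI dispatcher for smart glasses.
# Event names and actions are hypothetical, not tied to any real SDK.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass(frozen=True)
class TouchEvent:
    kind: str       # "tap", "swipe_forward", "swipe_back", "swipe_down"
    source: str     # "frame_touchpad" or "tethered_pod"

class GlassesUI:
    def __init__(self) -> None:
        self.cards = ["Home", "Navigation", "Messages", "Camera"]
        self.index = 0

    def show(self) -> None:
        print(f"Displaying card: {self.cards[self.index]}")

    def next_card(self) -> None:
        self.index = (self.index + 1) % len(self.cards)
        self.show()

    def prev_card(self) -> None:
        self.index = (self.index - 1) % len(self.cards)
        self.show()

    def select(self) -> None:
        print(f"Opening: {self.cards[self.index]}")

    def dismiss(self) -> None:
        print("Returning to Home")
        self.index = 0
        self.show()

def make_dispatcher(ui: GlassesUI) -> Callable[[TouchEvent], None]:
    # Both the frame touchpad and a tethered pod feed the same handler table.
    handlers: Dict[str, Callable[[], None]] = {
        "tap": ui.select,
        "swipe_forward": ui.next_card,
        "swipe_back": ui.prev_card,
        "swipe_down": ui.dismiss,
    }
    def dispatch(event: TouchEvent) -> None:
        handlers.get(event.kind, lambda: None)()
    return dispatch

if __name__ == "__main__":
    ui = GlassesUI()
    dispatch = make_dispatcher(ui)
    for ev in [TouchEvent("swipe_forward", "frame_touchpad"),
               TouchEvent("tap", "tethered_pod")]:
        dispatch(ev)
```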
NanoMarkets believes that touch will likely continue to play a role in the smart glasses user interface
over the next several years -- but a diminishing one, both as the technology moves
further away from its smartphone inspiration and establishes itself in its own right, and as other
supporting interface capabilities improve.
Voice Recognition: Seeking a Balance
The next step for smart-glasses HCI is voice command recognition, arguably a better fit for such a
hands-free wearable application. Google Glass already uses voice recognition; other smart glasses
developers incorporating or considering it include Brilliant Labs (Mirama One), Kopin (Golden-i
HMD), Vuzix (M-100), and Epson (BT-200). Kopin's Pupil design eschews touch entirely in favor of
voice recognition. Samsung is reportedly among several companies in talks with Nuance, the
company whose technology helped power Apple's Siri, with an eye toward a use in its own
future smart glasses.
This technology is fairly mature. Voice recognition applications such as Siri are proliferating
in the smartphone market. Single-chip solutions providing speech recognition, synthesis, and system
control can be procured for $1-$5 (depending on volume, packaging type, and memory
size), though at the component level another $10 or so is needed for the audio subsystem that any
voice recognition system requires. Speech-to-text may also have a future role to play in smart
glasses, though it does not yet appear to be widely used. Researchers at Oxford University
have already developed a smart glasses prototype with speech-to-text capability targeting the market
for the visually impaired. In the commercial arena, two applications (WatchMeTalk and SubtleGlass)
convert speech to text, and OrCam Technologies has developed a camera that attaches
to any glasses (not just "smart glasses") to "see" text on almost any substrate and read it aloud,
with future plans to add facial recognition and the ability to recognize places.
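As a rough illustration of how a voice-command layer might sit on top of a recognizer, the sketch below (Python, illustrative only) performs simple wake-word and keyword matching on recognized transcripts; the wake word, command phrases, and the canned transcript are all our own assumptions, not any OEM's actual design.

```python
# Minimal sketch of a voice-command layer for smart glasses.
# The transcript would normally come from whatever speech recognition
# engine or single-chip recognizer the device uses (assumed here).
from typing import Callable, Dict, Optional

WAKE_WORD = "ok glasses"   # hypothetical wake word

def route_command(transcript: str,
                  commands: Dict[str, Callable[[], str]]) -> Optional[str]:
    """Match a recognized utterance against known command phrases."""
    text = transcript.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None                       # ignore speech not addressed to the device
    text = text[len(WAKE_WORD):].lstrip(" ,:")
    for phrase, action in commands.items():
        if text.startswith(phrase):
            return action()
    return "Sorry, I didn't catch that."

if __name__ == "__main__":
    commands = {
        "take a picture": lambda: "Capturing photo...",
        "start navigation": lambda: "Starting turn-by-turn navigation...",
        "send a message": lambda: "Who should I send it to?",
    }
    # In a real device this string would come from the recognizer;
    # here we simply feed in a canned transcript.
    print(route_command("OK glasses, take a picture", commands))
```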
While voice is possibly the best balance (right now) between "naturalness" and technology maturity,
it is not really a selling feature per se, and our impression is that smart glasses OEMs are not
making much of a fuss about it yet. We expect more smart glasses OEMs to quietly adopt voice
recognition, and it will likely become a ubiquitous interface over the next several years, until the
next HCI technology matures and overtakes it: gesture control.
Gestural Control: The Future of Natural HCI
Neither touch nor voice, in our opinion, is the eventual winning user interface for smart glasses.
Touching one's glasses is not a commonly performed gesture, and a tactile interface action does not
align well with the hands-free paradigm. Meanwhile, talking out loud can be challenging in some
environments (e.g., a busy warehouse or metropolitan area), incompatible with privacy or
security needs (e.g., a hospital), and perhaps awkward in general consumer use. Instead, we
expect the eventual ascension of a genuinely more natural HCI for a wearable electronic device: gesture
recognition.
Several different technologies are currently available for tracking motions and gestures, but these
technologies were developed to support gaming, medical, and other systems that are not smart glasses in
any sense.
Technologies for tracking motion and gestures:

Inertial Sensors
Description: Inertial measurement unit (IMU) with 3-axis accelerometers and gyroscopes to detect precise 3D motion; a 9-axis option adds a magnetometer (compass).
Companies involved: Freescale (U.S.), Invensense (U.S.)
Commercial prospects: A likely solution for smartphones and wearables, in combination with optical or other approaches.

Ultrasonic
Description: Uses ultrasonic speakers and microphones to detect disturbances in sound waves.
Companies involved: Elliptic Labs (Norway), Chirp Microsystems (U.S.)
Commercial prospects: A reasonable chance of increased commercialization in the long term, especially if it can be expanded into more applications beyond laptops.

Electrical Field
Description: Detects disturbances in an electrical field caused by hand gestures.
Companies involved: Microchip Technologies (U.S.)
Commercial prospects: Facing an uphill battle against other technologies in consumer electronics, but compelling for smart home and other applications that are a few years out.

Magnetic Field
Description: Detects disturbances in a magnetic field caused by hand gestures.
Companies involved: Telekom Innovation Laboratories (Germany)
Commercial prospects: Not likely to make it commercially in high-volume products; possible for VR applications.

Muscle Movement
Description: Detects electrical signals generated by muscle movement.
Companies involved: Thalmic Labs (Canada)
Commercial prospects: Requires an armband, but compelling in medical applications; market demand remains to be seen.

Eye Tracking
Description: Tracks eye movements to note where the user is looking.
Companies involved: EyeSight (Israel), Tobii (Sweden), Quantum Interface (U.S.)
Commercial prospects: Likely to be combined with other types of gesture control.
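As a concrete illustration of the inertial-sensor row above, the sketch below fuses accelerometer and gyroscope readings with a simple complementary filter to estimate head pitch. This is a generic, textbook-style technique written for illustration; the sensor rates and sample data are invented, and it does not represent any listed vendor's implementation.

```python
# Sketch: estimating pitch from 3-axis accelerometer + gyroscope readings
# with a complementary filter -- a generic technique, not a vendor algorithm.
import math

def complementary_pitch(samples, dt: float, alpha: float = 0.98) -> float:
    """samples: iterable of (ax, ay, az, gyro_pitch_rate) tuples.
    ax/ay/az in g, gyro_pitch_rate in deg/s, dt in seconds."""
    pitch = 0.0
    for ax, ay, az, gy in samples:
        # Accelerometer gives an absolute (but noisy) pitch from gravity.
        accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
        # Gyroscope gives a smooth but drifting estimate by integration.
        gyro_pitch = pitch + gy * dt
        # Blend: trust the gyro short-term, the accelerometer long-term.
        pitch = alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
    return pitch

if __name__ == "__main__":
    # Invented data: head pitching at ~5 deg/s, sampled at 100 Hz for one second.
    fake_samples = [(-0.05, 0.0, 1.0, 5.0)] * 100
    print(f"Estimated pitch: {complementary_pitch(fake_samples, dt=0.01):.1f} deg")
```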
Among these gestural options, eye tracking takes on special importance in the context of a smart-
glasses interface, since the display sits directly in front of the user's eyes. Yet this technology has
some way to go before it is truly ready. In today's commercial smart glasses, head tracking is often used
instead. APX Labs, for example, claims to have provisional patents on gesture- and motion-based
input based on onboard sensors, tracking a user's head rather than eyes to interact with content
displayed on the screen. Improved accuracy and reliability are needed if eye tracking is to do more
than tell whether the user is looking at one of several specific locations.
We would note that eye tracking subsystems cost several thousand dollars not long ago,
but low-cost (and lower-performance) eye tracking subsystems can now be obtained. For example,
The Eye Tribe offers a subsystem for around $100, and U.K. researchers claim a design that could
deliver eye tracking for around $30 per unit.
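To illustrate the head-tracking style of pointing described above, the following sketch maps head yaw and pitch (as an onboard IMU might report them) to a cursor position on a virtual display. The screen resolution, angular ranges, and mapping are our own assumptions for illustration, not APX Labs' or anyone else's implementation.

```python
# Sketch: mapping head yaw/pitch (from onboard sensors) to a cursor position
# on a virtual display -- an illustration of head-pointer interaction only.

SCREEN_W, SCREEN_H = 640, 360        # assumed virtual display resolution
YAW_RANGE, PITCH_RANGE = 30.0, 20.0  # degrees of head motion mapped to the full screen

def head_to_cursor(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Map head orientation (degrees, 0/0 = looking straight ahead)
    to pixel coordinates, clamped to the screen."""
    nx = 0.5 + yaw_deg / (2.0 * YAW_RANGE)      # left/right
    ny = 0.5 - pitch_deg / (2.0 * PITCH_RANGE)  # up/down (up = toward top of screen)
    x = min(max(int(nx * SCREEN_W), 0), SCREEN_W - 1)
    y = min(max(int(ny * SCREEN_H), 0), SCREEN_H - 1)
    return x, y

if __name__ == "__main__":
    for yaw, pitch in [(0.0, 0.0), (15.0, 0.0), (-30.0, 10.0)]:
        print(f"yaw={yaw:+.0f}, pitch={pitch:+.0f} -> cursor {head_to_cursor(yaw, pitch)}")
```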
Making Gesture Work for Smart Glasses
At the moment, gestural recognition suffers from a lack of technological maturity. In commercial
applications it does not quite deliver the needed functionality, nor is it sufficiently reliable or robust to be
extended into a consumer electronics product. NanoMarkets believes a multitude of improvements
will be needed if gestural recognition is ever to become common, especially
for consumer-oriented smart glasses:
• Low power. Sensors used in gestural recognition have tended to be power hogs, but sensor
makers are coming up with new IMU sensors that consume almost no power in standby mode
and reasonably little when fully active. Higher-capacity batteries and other energy sources
could help here, though they trade off against other system requirements,
with cost as the ultimate bottom line.
• Better data capture. Subsystems must support the sophisticated movements that are a
normal part of the human gestural repertoire. For example, Atheer Labs is developing a
smart-glasses product that can track all hand orientations, both hands, and up to 10 fingers
identified and tracked separately. As part of this trend, NanoMarkets expects
more use of time-of-flight (ToF) sensors. Brilliant Labs is integrating ToF for gesture
recognition because it does not depend on ambient light (it has its own light source),
provides clear images, cancels out noise easily, delivers good depth measurements, and
can be used in most environments. This is seen as superior to RGB cameras, which have
difficulty tracking gestures and rejecting background clutter.
• Improved cameras. To the extent that cameras are used for gestural recognition in smart
glasses, one can assume that they will grow beyond off-the-shelf 2D cameras to stereo
cameras and then to 3D cameras with image sensors that capture image and depth
information at the same time. To this end, ToF cameras could very well be the next big thing
in optical gesture recognition. Instead of scanning a scene, a ToF camera emits IR light and
measures how long the light takes to travel to the object, be reflected, and travel back to the
image sensor; the depth calculation is sketched after this list. An array of pixels images every
point at once. While it is possible to obtain full 3D imaging by other methods, ToF promises
very fast response times because it collects data from an entire scene at once rather than
scanning the field of view. ToF systems are not cheap, however, which may limit their use in
consumer devices.
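To make the time-of-flight principle concrete, the short sketch below converts per-pixel round-trip times into distances using depth = c * t / 2; the timing values are invented purely for illustration.

```python
# Sketch: converting time-of-flight round-trip times into per-pixel depth.
# A ToF camera illuminates the whole scene and measures, for every pixel,
# how long the emitted IR light takes to return: depth = c * t_round_trip / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def depth_map(round_trip_ns):
    """round_trip_ns: 2D list of round-trip times in nanoseconds (invented data)."""
    return [[SPEED_OF_LIGHT * (t * 1e-9) / 2.0 for t in row] for row in round_trip_ns]

if __name__ == "__main__":
    # A tiny 2x3 "sensor": a ~3.3 ns round trip corresponds to roughly 0.5 m.
    times_ns = [[3.3, 3.4, 6.7],
                [3.3, 6.8, 6.7]]
    for row in depth_map(times_ns):
        print(["{:.2f} m".format(d) for d in row])
```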
Most smart glasses OEMs appear to view gesture recognition capability as still some way off, but
the majority of them are working on the problem. In some cases this work is pure R&D, but some
gestural recognition is already embodied in commercial smart glasses. Numerous
efforts, from lab projects (SixthSense) to large-company patents (Microsoft, Google) to product
development (Epson, Pivothead, Sony, Technical Illusions, Thalmic Labs, Vuzix), are exploring and
incorporating gesture control in smart glasses products, spanning hand-gesture, head-tracking, and eye-
tracking interfaces.
We believe that smart glasses will follow a trajectory towards more natural HCIs, so that the smart
glasses gradually "merge" with the body. The problem is that gestural recognition is still not quite
reliable and does not yet offer the cost/performance ratio necessary for smart glasses to transition
into a profitable consumer electronics item. While gestural recognition is not quite ready for prime
time, it is getting close.
Into the Future: The Brain-Computer Interface
Ultimately, the next step in the HCI paradigm for wearable electronics -- establishing the most natural
interface of all -- is to remove the proverbial middleman and develop a direct communication pathway
between the brain and an external device. The brain-computer interface (BCI) has long been a sci-
fi staple, and the first neuroprosthetic devices implanted in humans appeared in the mid-1990s, but
much R&D activity is still happening in this area. Most of it has focused on medical
applications such as restoring damaged hearing, sight, and movement.
Recently, however, lower-cost BCIs have begun to emerge, aimed at R&D or gaming applications.
These could be transferred to smart glasses at some point, and we would not be surprised to see
such brain-computer interfaces appear in future generations of smart glasses in a few years' time --
however, most smart glasses OEMs do not yet appear to be thinking about this.
The information contained in this article was drawn from the NanoMarkets report,
Smart Glasses: Component and Technology Markets: 2014
See more at: http://nanomarkets.net/market_reports/report/smart-glasses-component-and-technology-markets-2014