This document summarizes navigation assistance solutions for blind or visually impaired individuals. It discusses several mobile software programs that integrate GPS and audio cues to provide turn-by-turn navigation via smartphones, including The Loadstone Project, Mobile Geo, and BlindSquare. It also describes some research projects seeking to develop more advanced navigation technologies, such as Trinetra's public transportation information system and Drishti's wearable computer and voice guidance system for outdoor navigation. The document emphasizes developing cost-effective and independence-enhancing technologies to assist blind individuals in navigation.
11. • Cornea: A thin membrane with a refractive index of approx. 1.38. Protects the eye and refracts light as it enters the eye.
• Pupil: An opening in the middle of the eyeball. Appears black because the incident light is absorbed on the retina and does not exit the eye. The size of the pupil opening can be adjusted by the dilation of the iris.
• Iris: A coloured diaphragm capable of adjusting the size of the opening. In bright-light situations it reduces the pupil opening to limit the amount of light; in dim-light situations it maximizes the opening to increase the amount of light.
12. • Crystalline lens: Made of fibrous layers with a refractive index of roughly 1.40. Fine-tunes the vision process by changing its shape. The lens is attached to the ciliary muscles.
• Ciliary muscles: Relax and contract to change the shape of the lens, assisting the eye in producing an image on the back of the eyeball.
• Retina: Inner surface of the eye. Contains up to 120 million rods and 6 million cones that detect the intensity and frequency of incident light and send nerve impulses to the brain.
• Nerve cells: Nerve impulses travel through a nerve cell network; about one million neural pathways run from the rods and cones.
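The refractive indices quoted above (air ≈ 1.00, cornea ≈ 1.38, lens ≈ 1.40) determine how strongly light bends at each interface via Snell's law. As a minimal sketch (the 30-degree incident angle is an illustrative assumption, not from the slides):

```python
import math

def refraction_angle(n1, n2, incident_deg):
    """Snell's law: n1*sin(t1) = n2*sin(t2). Return the refracted angle in degrees."""
    s = n1 * math.sin(math.radians(incident_deg)) / n2
    return math.degrees(math.asin(s))

# A ray entering the cornea (n ~ 1.38) from air (n ~ 1.00) at 30 degrees
theta = refraction_angle(1.00, 1.38, 30.0)
print(f"{theta:.1f} degrees")  # bends toward the normal, ~21.2 degrees
```

Because the index step from air to cornea (1.00 to 1.38) is much larger than from aqueous humour to lens, most of the bending happens at the cornea, as slide 16 notes.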
13. The ultimate goal of this anatomy is to allow humans to focus images on the retina at the back of the eye.
16. Cornea - lens - ciliary muscles - retina
• The lens of the eye is not where all the refraction of incoming light rays takes place; most of the refraction occurs at the cornea.
• The refractive index of the cornea is significantly greater than that of the surrounding air.
• This difference in optical density, together with the double convex shape, is what enables the cornea to do most of the refraction.
• The shape of the crystalline lens is changed by the ciliary muscles. This induces small alterations in the amount of bulge and fine-tunes the additional refraction.
17. • For an object located at a point more than 2 focal lengths from the "lens," the image will be located between the F and the 2F point.
• The image will be inverted, reduced in size, and real.
• The cornea-lens system produces an image of the object on the retinal surface. The image is:
• Real - Vision is dependent upon stimulation of nerve impulses by incident light rays. Only real images are capable of producing such a stimulation.
• Reduced in size - Allows the entire image to "fit" on the retina.
• Inverted - The brain properly interprets the signal as originating from a right-side-up object.
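These three properties fall straight out of the thin-lens equation. A minimal sketch, using an illustrative eye-model focal length of 1.7 cm (an assumption, not a figure from the slides) and an object 3 m away, i.e. well beyond 2F:

```python
def thin_lens_image(f, d_o):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i. Return image distance and magnification."""
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)
    m = -d_i / d_o   # negative magnification => inverted image
    return d_i, m

# f = 1.7 cm, object at 300 cm (> 2f), all distances in cm
d_i, m = thin_lens_image(1.7, 300.0)
print(f"image at {d_i:.2f} cm, magnification {m:.4f}")
```

The positive image distance (real image) lands between F (1.7 cm) and 2F (3.4 cm), and the magnification is negative (inverted) with magnitude far below 1 (reduced), matching the bullets above.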
18. • The Power of Accommodation: Ciliary muscles of the
eye serve to contract and relax, thus changing the shape
of the lens. This allows the eye to change its focal length
and thus focus images of objects that are both close up
and far away.
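Accommodation can be illustrated with the same thin-lens relation: since the lens-to-retina distance is roughly fixed, focusing on nearer objects requires a shorter focal length. A hedged sketch, assuming a fixed image distance of about 1.7 cm (an illustrative model value):

```python
def required_focal_length(d_o, d_i=1.7):
    """Focal length (cm) needed to focus an object at d_o onto a retina d_i away."""
    return 1.0 / (1.0 / d_o + 1.0 / d_i)

for d_o in (25.0, 100.0, 10_000.0):  # near point, 1 m, effectively infinity (cm)
    print(f"object at {d_o:>7.0f} cm -> f = {required_focal_length(d_o):.3f} cm")
```

The nearer the object, the shorter the required focal length, which is exactly what the contracting ciliary muscles achieve by making the lens bulge.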
21. What makes the human eye
unique?
• Our eyes are able to look around a scene and
dynamically adjust based on subject matter.
• Our eyes can compensate as we focus on regions of
varying brightness, can look around to encompass a
broader angle of view, or can alternately focus on
objects at a variety of distances.
• The end result is akin to a video camera that compiles relevant snapshots to form a mental image.
• What we really see is our mind’s reconstruction of
objects based on input provided by the eyes, not the
actual light received by our eyes.
25. Angle of View
(Diagram: short vs. long focal length and the resulting angle of view at the image plane.)
Each eye individually has a 120-200 degree angle of view, depending on the definition of being "seen."
26. • Dual eye overlap region is around 130 degrees.
• Our central angle of view (around 40-60 degrees) is what most
impacts our perception.
27. • Too wide an angle of view - Relative sizes of objects are exaggerated.
• Too narrow an angle - Objects are all nearly the same relative size and the sense of depth is lost.
• Extremely wide angles - Tend to make objects near the edges of the frame appear stretched.
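The focal-length/angle-of-view trade-off in slides 25-27 follows from AOV = 2·atan(d / 2f) for a rectilinear lens. A minimal sketch, assuming a 36 mm full-frame sensor width as the illustrative "d" (the specific focal lengths are assumptions, not from the slides):

```python
import math

def angle_of_view(focal_mm, sensor_mm=36.0):
    """Horizontal angle of view (degrees) of a rectilinear lens."""
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * focal_mm)))

for f in (16, 50, 200):  # wide, "normal", telephoto
    print(f"{f:>3} mm -> {angle_of_view(f):.0f} degrees")
```

Shorter focal lengths give wider angles of view (the exaggerated-perspective regime), while longer ones compress the scene toward near-equal relative sizes.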
28. Resolution and Detail
• Our mind does not remember images pixel by pixel; it records
memorable textures, colour and contrast on an image by image
basis.
• To assemble a detailed mental image, our eyes focus on several
regions of interest in rapid succession.
29. • Asymmetry: Each eye is more capable of perceiving detail below our line of sight than above, and peripheral vision is much more sensitive in directions away from the nose than towards it. Cameras record images almost perfectly symmetrically.
• Low-Light Viewing: In extremely low light, our eyes begin to see in monochrome. Our central vision also begins to depict less detail than just off-center.
• Subtle Gradations: With a camera, enlarged detail is always easier to resolve, but to our eyes enlarged detail becomes less visible.
30. Sensitivity and Dynamic Range
• Our eye dynamically adjusts like a video camera
• After adjusting to low light, our eyes can see
anywhere from 10-14 f-stops of dynamic range.
• Dynamic range also depends on brightness and
subject contrast.
• Cameras can take longer exposures to bring out even
fainter objects, whereas our eyes do not see
additional detail after staring at something for
more than about 10-15 seconds.
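Each f-stop is a doubling of light, so the 10-14 stops quoted above correspond to a scene contrast ratio of 2^10 to 2^14. A quick sketch of that arithmetic:

```python
def contrast_ratio(stops):
    """Each f-stop doubles the light, so n stops span a 2**n intensity ratio."""
    return 2 ** stops

for stops in (10, 14):
    print(f"{stops} stops -> {contrast_ratio(stops):>6}:1")
# 10 stops -> 1024:1; 14 stops -> 16384:1
```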
31. Our mind is able to intelligently interpret the
information from our eyes, whereas with a camera,
all we have is the raw image. Even so, current digital
cameras fare surprisingly well, and surpass our own
eyes for several visual capabilities.
34. The Loadstone Project
• Open source mobile navigation software for blind/visually impaired users running the S60 Symbian operating system.
• A GPS receiver is connected to the cell phone by Bluetooth. The software lets users create and store their own waypoints for navigation and share them with others.
• Many blind people around the world use Nokia cell phones because of the availability of "Talks" from Nuance.
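Waypoint-based guidance of this kind boils down to computing the distance and initial bearing from the current GPS fix to a stored waypoint, then speaking them. A minimal sketch of that geometry (not Loadstone's actual code; the coordinates are illustrative):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Haversine distance (metres) and initial bearing (degrees) between GPS fixes."""
    R = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

# From a current fix to a stored waypoint a few hundred metres to the north-east
d, b = distance_and_bearing(60.1699, 24.9384, 60.1720, 24.9410)
print(f"waypoint: {d:.0f} m at {b:.0f} degrees")
```

A text-to-speech layer such as Talks would then read the resulting string aloud.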
35. Mobile Geo
• Mobile Geo is Code Factory’s GPS navigation software
for Windows Mobile-based smartphones.
• Mobile Geo is the first solution specifically designed to serve as a navigation aid for people with a visual impairment that works with a wide range of mainstream mobile devices.
• Mobile Geo is seamlessly integrated with Code
Factory’s popular screen readers, “Mobile Speak” for
Pocket PCs and “Mobile Speak” for Windows Mobile
smartphones.
36. BlindSquare
• BlindSquare is MIPsoft's GPS navigation software for iPhone and iPad.
• It differs from other navigation applications by using crowd-sourced data.
• It uses Foursquare for points of interest and OpenStreetMap for street information.
38. Trinetra
• Aims to develop cost-effective, independence-enhancing technologies to benefit blind people.
• Addresses accessibility concerns of blind people using public transportation systems.
• Using GPS receivers and infrared sensors, information is relayed to a centralized fleet management server via a cellular modem.
• Blind people, using common text-to-speech enabled cell phones, can query estimated time of arrival, locality, and current bus capacity using a web browser.
• Spearheaded by Professor Priya Narasimhan, Trinetra is an ongoing project in the Electrical and Computer Engineering Department at Carnegie Mellon University.
39. Drishti
• Drishti is a wireless pedestrian navigation system.
• Integrates several technologies including wearable
computers, voice recognition and synthesis, wireless
networks, GIS and GPS.
• Augments contextual information to the visually
impaired and computes optimized routes based on
user preference, temporal constraints (e.g. traffic
congestion), and dynamic obstacles (e.g. ongoing
ground work).
• Environmental conditions and landmark information, queried from a spatial database along the route, are provided on the fly through detailed explanatory voice cues.
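Route computation that accounts for dynamic obstacles can be sketched as a shortest-path search whose edge costs are inflated near reported obstacles; when a new obstacle appears, the route is simply recomputed. This is an illustrative sketch, not Drishti's implementation (the penalty value and the toy sidewalk network are assumptions):

```python
import heapq

def shortest_path(graph, start, goal, blocked=frozenset()):
    """Dijkstra over an adjacency dict {node: [(neighbour, cost), ...]}.
    Edges leading into a 'blocked' node (e.g. ongoing ground work) get a
    large assumed penalty so routes detour around them."""
    PENALTY = 1_000.0
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            cost = w + (PENALTY if v in blocked else 0.0)
            if d + cost < dist.get(v, float("inf")):
                dist[v] = d + cost
                prev[v] = u
                heapq.heappush(heap, (dist[v], v))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Toy sidewalk network: A-B-D is shortest until B is blocked by roadwork
g = {"A": [("B", 1.0), ("C", 2.0)], "B": [("D", 1.0)], "C": [("D", 2.0)], "D": []}
print(shortest_path(g, "A", "D"))                 # ['A', 'B', 'D']
print(shortest_path(g, "A", "D", blocked={"B"}))  # ['A', 'C', 'D']
```

Temporal constraints such as traffic congestion fit the same scheme: they become time-varying edge weights rather than hard blocks.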
40. NAVIG
• NAVIG is an innovative multidisciplinary project, with
fundamental and applied aspects.
• The main objective is to increase the autonomy of
blind people in their navigation capabilities.
• Reaching a destination while avoiding obstacles is one of the most difficult issues that blind individuals have to face. Achieving autonomous navigation will be pursued indoors and outdoors, in known and unknown environments.
41. • Two research centres are involved: one specialized in human-machine interaction for handicapped people (IRIT) and in auditory perception, spatial cognition, sound design and augmented reality (LIMSI); the other specialized in human and computer vision (CERCO).