Final Year IEEE Projects for BE, B.Tech, ME, M.Tech, M.Sc, MCA & Diploma Students. The latest Java, .Net, Matlab, NS2, Android, Embedded, Mechanical, Robotics, VLSI, and Power Electronics IEEE projects are delivered as complete working products with documentation, along with real-time Software & Embedded training.
JAVA, .NET, NS2, MATLAB PROJECTS:
Networking, Network Security, Data Mining, Cloud Computing, Grid Computing, Web Services, Mobile Computing, Software Engineering, Image Processing, E-Commerce, Games Applications, Multimedia, etc.
EMBEDDED SYSTEMS:
Embedded Systems, Microcontrollers, DSC & DSP, VLSI Design, Biometrics, RFID, Fingerprint, Smart Cards, IRIS, Bar Code, Bluetooth, Zigbee, GPS, Voice Control, Remote Systems, Power Electronics, etc.
ROBOTICS PROJECTS:
Mobile Robots, Service Robots, Industrial Robots, Defence Robots, Spy Robots, Artificial Robots, Automated Machine Control, Stair Climbing, Cleaning, Painting, Industry Security Robots, etc.
MOBILE APPLICATION (ANDROID & J2ME):
Android Applications, Web Services, Wireless Applications, Bluetooth Applications, WiFi Applications, Mobile Security, Multimedia Projects, E-Commerce, Games Applications, etc.
MECHANICAL PROJECTS:
Automobiles, Hydraulics, Robotics, Air Assisted Exhaust Braking System, Automatic Trolley for Material Handling System in Industry, Hydraulics and Pneumatics, CAD/CAM/CAE Projects, Special Purpose Hydraulics and Pneumatics, CATIA, ANSYS, 3D Model Animations, etc.
CONTACT US:
ECWAY TECHNOLOGIES
23/A, 2nd Floor, SKS Complex,
OPP. Bus Stand, Karur-639 001.
Tamil Nadu, India. Cell: +91 9894917187.
Website: www.ecwayprojects.com | www.ecwaytechnologies.com
Mail to: ecwaytechnologies@gmail.com.
A probabilistic approach to online eye gaze tracking without explicit personal calibration
OUR OFFICES @ CHENNAI / TRICHY / KARUR / ERODE / MADURAI / SALEM / COIMBATORE / BANGALORE / HYDERABAD
CELL: +91 9894917187 | 875487 1111 / 2111 / 3111 / 4111 / 5111 / 6111
ECWAY TECHNOLOGIES
IEEE SOFTWARE | EMBEDDED | MECHANICAL | ROBOTICS PROJECTS DEVELOPMENT
Visit: www.ecwaytechnologies.com | www.ecwayprojects.com Mail to: ecwaytechnologies@gmail.com
A PROBABILISTIC APPROACH TO ONLINE EYE GAZE TRACKING WITHOUT
EXPLICIT PERSONAL CALIBRATION
By
A
PROJECT REPORT
Submitted to the Department of Electronics & Communication Engineering in the
FACULTY OF ENGINEERING & TECHNOLOGY
In partial fulfillment of the requirements for the award of the degree
Of
MASTER OF TECHNOLOGY
IN
ELECTRONICS & COMMUNICATION ENGINEERING
APRIL 2016
CERTIFICATE
Certified that this project report titled “A Probabilistic Approach to Online Eye Gaze
Tracking Without Explicit Personal Calibration” is the bonafide work of
Mr. _____________, who carried out the research under my supervision. Certified further that,
to the best of my knowledge, the work reported herein does not form part of any other project
report or dissertation on the basis of which a degree or award was conferred on an earlier
occasion on this or any other candidate.
Signature of the Guide Signature of the H.O.D
Name Name
DECLARATION
I hereby declare that the project work entitled “A Probabilistic Approach to Online Eye Gaze
Tracking Without Explicit Personal Calibration” submitted to BHARATHIDASAN
UNIVERSITY in partial fulfillment of the requirements for the award of the Degree of
MASTER OF APPLIED ELECTRONICS is a record of original work done by me under the
guidance of Prof. A. Vinayagam M.Sc., M.Phil., M.E. To the best of my knowledge, the work
reported here is not part of any other thesis or work on the basis of which a degree or award
was conferred on an earlier occasion to me or any other candidate.
(Student Name)
(Reg.No)
Place:
Date:
ACKNOWLEDGEMENT
I am extremely glad to present my project “A Probabilistic Approach to Online Eye Gaze
Tracking Without Explicit Personal Calibration”, which is a part of my curriculum for the
third semester of the Master of Science in Computer Science. I take this opportunity to express
my sincere gratitude to those who helped me in bringing out this project work.
I would like to thank our Director, Dr. K. ANANDAN, M.A.(Eco.), M.Ed., M.Phil.(Edn.),
PGDCA., CGT., M.A.(Psy.), who gave me the opportunity to undertake this project.
I am highly indebted to the Co-Ordinator, Prof. Muniappan, Department of Physics, and thank
her wholeheartedly for the valuable comments I received throughout my project.
I wish to express my deep sense of gratitude to my guide,
Prof. A. Vinayagam M.Sc., M.Phil., M.E., for her immense help and encouragement towards the
successful completion of this project.
I also express my sincere thanks to all the staff members of Computer Science for their kind
advice.
Last, but not least, I express my deep gratitude to my parents and friends for their
encouragement and support throughout the project.
ABSTRACT:
Existing eye gaze tracking systems typically require an explicit personal calibration process in
order to estimate certain person-specific eye parameters. For natural human computer interaction,
such a personal calibration is often inconvenient and unnatural. In this paper, we propose a new
probabilistic eye gaze tracking system without explicit personal calibration. Unlike the
conventional eye gaze tracking methods, which estimate the eye parameter deterministically
using known gaze points, our approach estimates the probability distributions of the eye
parameter and eye gaze. Using an incremental learning framework, the subject does not need
personal calibration before using the system: the eye parameter and gaze estimates are
improved gradually as he/she naturally interacts with the system. Experimental results show
that the proposed system can achieve better than 3° accuracy for different people without
explicit personal calibration.
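The incremental estimation described above can be illustrated with a minimal sketch: a one-dimensional recursive Gaussian update of a person-specific eye parameter (for instance, an angular offset between the optical and visual axes). The function name and all numbers below are hypothetical illustrations, not the paper's actual multi-parameter model:

```python
# Minimal sketch (assumed, not the paper's actual model): maintain a
# Gaussian belief over a scalar person-specific eye parameter and refine
# it with each noisy observation obtained during natural interaction.

def gaussian_update(mu, var, obs, obs_var):
    """Conjugate update: fuse one noisy observation into a Gaussian belief."""
    gain = var / (var + obs_var)       # how much to trust the new observation
    new_mu = mu + gain * (obs - mu)    # shift the mean toward the observation
    new_var = (1.0 - gain) * var       # the belief always tightens
    return new_mu, new_var

# Broad population prior (in degrees) instead of per-user calibration.
mu, var = 5.0, 4.0
# Noisy per-frame parameter estimates implied by natural viewing.
for obs in [5.6, 5.2, 5.9, 5.4]:
    mu, var = gaussian_update(mu, var, obs, obs_var=1.0)
# mu has drifted toward the user's parameter; var has shrunk.
```

This captures the key property the abstract claims: no calibration session is needed up front, and the estimate simply sharpens the longer the user interacts.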
INTRODUCTION:
Gaze tracking is the procedure of determining the point-of-gaze on a monitor, or the visual axis
of the eye in 3D space. Gaze tracking systems are primarily used in human-computer
interaction (HCI) and in the analysis of visual scanning patterns. In HCI, the eye gaze can
serve as an advanced computer input, replacing traditional input devices such as the mouse;
the graphic display on the screen can also be controlled interactively by the eye gaze. Since
visual scanning patterns are closely related to attentional focus, cognitive scientists use gaze
tracking systems to study human cognitive processes.
Various video-based gaze estimation techniques have been proposed (a survey of gaze
estimation methods may be found in [3]), and gaze estimation systems have now evolved to
the point where the user is allowed free head movement while maintaining high accuracy (one
degree or better). However, most current gaze estimation systems require a personal
calibration procedure for each subject in order to estimate his/her specific eye parameters.
This calibration process can significantly limit the practical utility of gaze estimation. To
overcome this limitation, we propose a novel gaze estimation framework without any explicit
personal calibration. The main contributions of this work include:
• the introduction of a probabilistic approach (versus the existing deterministic approach) for 3D
eye gaze estimation without explicit personal calibration,
• development of an incremental learning approach to incrementally refine the eye parameters
when the subject is naturally viewing the screen without prompting,
• development of a gaze estimation algorithm using Gaussian prior probability when the subject
is naturally watching a video.
Our system adapts automatically online to each subject to improve eye gaze tracking accuracy
without user collaboration, bringing natural, non-intrusive, and non-collaborative eye gaze
tracking closer to reality. Video-based eye tracking is widely used, including in many
commercial eye trackers; by eliminating the explicit personal calibration these trackers
require, the proposed research has significant practical impact.
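The probabilistic fusion idea behind the contributions above can be sketched as the product of two Gaussian densities: the gaze point implied by the measured optical axis is combined with a gaze prior (here, a Gaussian centered on the screen). The covariances and coordinates are assumed for illustration; they are not the paper's actual values or algorithm:

```python
import numpy as np

# Hedged sketch: fuse a noisy optical-axis gaze measurement with a
# Gaussian screen-center prior. The product of two Gaussians is itself
# Gaussian, with precision equal to the sum of the two precisions.

def fuse_gaussians(mu_p, cov_p, mu_m, cov_m):
    """Posterior mean/covariance of the product of two Gaussian densities."""
    prec = np.linalg.inv(cov_p) + np.linalg.inv(cov_m)   # add precisions
    cov = np.linalg.inv(prec)
    mu = cov @ (np.linalg.inv(cov_p) @ mu_p + np.linalg.inv(cov_m) @ mu_m)
    return mu, cov

screen_center = np.array([960.0, 540.0])      # assumed 1920x1080 screen, pixels
prior_cov = np.diag([300.0**2, 200.0**2])     # broad center-bias prior
measured = np.array([1400.0, 300.0])          # gaze implied by the optical axis
meas_cov = np.diag([120.0**2, 120.0**2])      # assumed measurement noise
mu, cov = fuse_gaussians(screen_center, prior_cov, measured, meas_cov)
# The fused estimate lies between the measurement and the screen center,
# pulled more toward the (more certain) measurement.
```

The same fusion rule applies whether the prior comes from a saliency map approximated as a Gaussian or from a fixed screen-center assumption.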
CONCLUSION:
In this paper, we proposed a new probabilistic gaze estimation framework that combines a
gaze prior with the 3D eye gaze model. Compared to conventional gaze estimation methods,
our approach eliminates the explicit personal calibration procedure. It turns conventional
deterministic eye gaze tracking into probabilistic eye gaze tracking, allowing us to combine
the eye gaze prior with optical-axis estimates to simultaneously estimate the 3D gaze point
and the personal eye parameters in an incremental manner, without any cooperation from the
user. Compared with the most recent implicit personal calibration method, our system allows
natural head movement and achieves a gaze estimation accuracy of better than three degrees,
versus a reported accuracy of 3.5 degrees.
By using a novel incremental learning framework, our system does not need any training data
from the subject beforehand. It can adapt to the user quickly and improves its performance as
the subject naturally uses the system. Finally, we further extend our system to avoid
computing the saliency map by assuming the prior gaze distribution follows a Gaussian
distribution with its mean located at the center of the screen. This not only improves the speed
of our method (by removing the need to compute a saliency map), but also extends its
application scope. Our approach, however, is limited to free-viewing scenarios in which
subjects are naturally viewing images or videos. Studies in visual attention have shown that if
the user is performing a specific visual task, his/her gaze distribution is driven by the task. The
proposed gaze prior, whether saliency map or Gaussian, may not be applicable in this case. We
will study task-dependent gaze priors in our future work.
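The saliency-map variant of the gaze prior discussed above can be sketched as follows: a saliency map is normalized into a probability distribution over screen locations, whose mean then summarizes where a free-viewing user is likely to look. The map here is synthetic (a single Gaussian blob); the actual system would compute saliency from image content:

```python
import numpy as np

# Hedged sketch: turn a (synthetic) saliency map into a gaze prior by
# normalizing it into a probability distribution, then read off the
# expected fixation location. Grid size and blob position are assumed.

h, w = 60, 80
ys, xs = np.mgrid[0:h, 0:w]
# One salient blob near (x=55, y=20) stands in for real image saliency.
saliency = np.exp(-(((xs - 55) ** 2) / 200.0 + ((ys - 20) ** 2) / 150.0))
prior = saliency / saliency.sum()   # normalize to a probability map

# Expected fixation location under the prior (its mean):
ex = (prior * xs).sum()
ey = (prior * ys).sum()
# ex, ey fall near the salient blob, where free viewing concentrates gaze.
```

Under the paper's free-viewing assumption, such a prior stands in for the known gaze points that conventional calibration would require; under a task-driven viewing pattern it would no longer be valid, which is exactly the limitation noted above.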
REFERENCES:
[1] Y. Sugano, Y. Matsushita, and Y. Sato, “Calibration-free gaze sensing using saliency maps,”
in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), Jun. 2010, pp. 2667–2674.
[2] Y. Sugano, Y. Matsushita, and Y. Sato, “Appearance-based gaze estimation using visual
saliency,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 35, no. 2, pp. 329–341, Feb. 2013.
[3] D. W. Hansen and Q. Ji, “In the eye of the beholder: A survey of models for eyes and gaze,”
IEEE Trans. Pattern Anal. Mach. Intell., vol. 32, no. 3, pp. 478–500, Mar. 2010.
[4] F. Lu, T. Okabe, Y. Sugano, and Y. Sato, “A head pose-free approach for appearance-based
gaze estimation,” in Proc. Brit. Mach. Vis. Conf., 2011, pp. 126.1–126.11.
[5] J. Chen and Q. Ji, “Probabilistic gaze estimation without active personal calibration,” in Proc.
IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), Jun. 2011, pp. 609–616.
[6] Y. Sugano, Y. Matsushita, Y. Sato, and H. Koike, “An incremental learning method for
unconstrained gaze estimation,” in Proc. Eur. Conf. Comput. Vis. (ECCV), 2007, pp. 656–667.
[7] A. Borji and L. Itti, “Defending Yarbus: Eye movements reveal observers’ task,” J. Vis.,
vol. 14, no. 3, p. 29, Mar. 2014.
[8] T. Judd, F. Durand, and A. Torralba, “A benchmark of computational models of saliency to
predict human fixations,” Massachusetts Inst. Technol., Cambridge, MA, USA, Tech. Rep.
MIT-CSAIL-TR-2012-001, 2012.
[9] T. Judd, K. Ehinger, F. Durand, and A. Torralba, “Learning to predict where humans look,”
in Proc. IEEE Int. Conf. Comput. Vis. (ICCV), Sep./Oct. 2009, pp. 2106–2113.