A graduation project at the Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt.
Interactive Wall allows users to interact with the computer using hand gestures. The application uses an optical camera to detect and track the hands using image processing techniques. The desktop is projected on a wall using a projector, which lets the user interact with the computer freely.
3. Agenda: Introduction, Physical Environment and Framework, Project Modules and Applications, Challenges, Conclusion and Future Work, Tools and References
13. Controller Module: Detect Corners, Color Mapping, Search for hand in entry point, Segmentation, Construct the search window, Track the hand, Fire Event, Gesture Recognition
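The "Detect Corners" and "Color Mapping" stages imply mapping camera pixels onto the projected desktop. A standard way to do this, once the four wall corners are detected, is a planar homography; the sketch below solves for it from four point pairs. This is a generic technique shown for illustration, not necessarily the project's exact implementation, and all function names here are hypothetical.

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping four src points to dst points.

    src, dst: lists of four (x, y) pairs, e.g. detected wall corners and the
    corresponding desktop corners. Uses the standard DLT formulation: stack
    two linear constraints per point pair and take the SVD null space.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(A, dtype=float)
    # The homography is the null vector of A: last right-singular vector.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 3)

def map_point(H, x, y):
    """Map a camera pixel (x, y) into desktop coordinates via H."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With the corners detected once at startup, every tracked hand position can then be mapped into desktop coordinates with a single matrix multiply.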
Our goal is the development of a more natural interface: a camera-projector system using hand gesture analysis (optical camera, projector, framework, hand gestures, applications). The development framework gives any application built on it the ability to let the user interact through hand gestures. A video stream is captured with a low-cost webcam and processed in the framework to extract the hand position and recognize the gesture; the application then handles the fired events.
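The capture-process-fire flow described above can be sketched as an event-driven framework: the application subscribes to named gesture events, and the framework fires them after processing each frame. All names here (`GestureFramework`, `on_gesture`, the stubbed stages) are illustrative assumptions, not the project's actual API; the real stages would run image processing on the webcam stream.

```python
class GestureFramework:
    """Minimal sketch of the event flow: frame in, gesture events out."""

    def __init__(self):
        self.handlers = {}

    def on_gesture(self, name, handler):
        # The application subscribes to a named gesture event.
        self.handlers.setdefault(name, []).append(handler)

    def fire(self, name, position):
        # Deliver the fired event to every subscribed handler.
        for handler in self.handlers.get(name, []):
            handler(position)

    def process_frame(self, frame):
        # Extract the hand position, recognize the gesture, fire the event.
        position = self.extract_hand_position(frame)
        gesture = self.recognize_gesture(frame, position)
        if gesture:
            self.fire(gesture, position)

    def extract_hand_position(self, frame):
        return frame.get("hand")        # stub: hand centroid (x, y)

    def recognize_gesture(self, frame, position):
        return frame.get("gesture")     # stub: e.g. "click", "drag"
```

An application would register a handler once, e.g. `fw.on_gesture("click", open_icon)`, and then feed frames into `process_frame` in a loop.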
1. Initialization (k=0). We search for the object in the whole image, since we do not yet know its position; this yields the initial state estimate. Initially we can also allow a large error tolerance.
2. Prediction (k>0). In this stage the Kalman filter predicts the position of the object; this predicted position is taken as the search center for finding the object.
3. Correction (k>0). We locate the object (in the neighborhood of the point predicted in the previous stage) and use its real position (the measurement) to correct the state with the Kalman filter, producing the updated estimate.
Steps 2 and 3 repeat for as long as the object tracking runs. The equations for the Kalman filter fall into two groups: time update equations and measurement update equations. The time update equations are responsible for projecting the current state and error covariance estimates forward in time to obtain the a priori estimates for the next time step. The measurement update equations are responsible for the feedback, i.e. for incorporating a new measurement into the a priori estimate to obtain an improved a posteriori estimate.