Implementation of A Dynamic Gesture Recognition Based Indian Sign Language

Pushpendra Patel*, Sandeep B. Patil**
* PG Student, Department of Electronics and Communication Engineering, Shri Shankaracharya Technical Campus of Engineering and Technology, Bhilai, India.
** Associate Professor, Department of Electrical and Electronics Engineering, Shri Shankaracharya College of Engineering and Technology, Bhilai, India.
Periodicity: September - November 2016
DOI: https://doi.org/10.26634/jpr.3.3.12404

Abstract

In this proposed work, the authors have designed a framework for accurate hand gesture recognition using MATLAB. A computation-based hand classifier is recommended for dynamic gesture recognition; it provides a platform for people who are unable to communicate verbally. A Principal Component Analysis (PCA) based algorithm performs this task by projecting the gestures onto an invariant subspace, building a vocabulary in such a manner that a person with a hearing or speech impairment can learn and express themselves easily. To do so, the movement, shape, and position of the hands are extracted as the information to be recognized. Hidden Markov Models (HMMs) are one of the methods capable of recognizing these combinations.
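
To make the pipeline outlined above concrete, the snippet below projects gesture-frame features onto a PCA subspace and scores the resulting sequences with one Gaussian HMM per gesture class. It is a minimal illustrative sketch, not the authors' MATLAB implementation: the per-frame features (e.g., flattened hand-region images), the numbers of principal components and HMM states, and the use of the third-party hmmlearn library are all assumptions made for illustration.

import numpy as np
from hmmlearn import hmm  # assumed third-party HMM library, not part of the original work

def fit_pca_subspace(frames, n_components=20):
    """Learn an invariant subspace from stacked frame features of shape (n_frames, n_features)."""
    mean = frames.mean(axis=0)
    centered = frames - mean
    # Right singular vectors of the centered data are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(frames, mean, components):
    """Project frame features into the PCA subspace."""
    return (frames - mean) @ components.T

def train_gesture_models(sequences_by_label, mean, components, n_states=5):
    """Fit one Gaussian HMM per gesture class on PCA-projected sequences."""
    models = {}
    for label, sequences in sequences_by_label.items():
        projected = [project(seq, mean, components) for seq in sequences]
        X = np.vstack(projected)                 # concatenated observations
        lengths = [len(p) for p in projected]    # per-sequence lengths for hmmlearn
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(sequence, mean, components, models):
    """Return the gesture label whose HMM assigns the highest log-likelihood."""
    observations = project(sequence, mean, components)
    return max(models, key=lambda label: models[label].score(observations))

In a complete system, the per-frame features would come from the hand segmentation and tracking stage, the subspace would be learned from the training vocabulary, and each gesture in that vocabulary would then be modelled by its own HMM, as the abstract describes.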

Keywords

Indian Sign Language, DWT, Computer Vision, HMMs, Gesture Recognition

How to Cite this Article?

Patel, P., & Patil, S. B. (2016). Implementation of A Dynamic Gesture Recognition Based Indian Sign Language. i-manager’s Journal on Pattern Recognition, 3(3), 1-6. https://doi.org/10.26634/jpr.3.3.12404
