Static Devnagari Sign Language Recognition

Versha Verma*, Sandeep B. Patil**
* PG Student, Department of Electronics and Telecommunication Engineering, Shri Shankaracharya Technical Campus, Bhilai, India.
** Associate Professor, Department of Electrical and Electronics Engineering, Shri Shankaracharya Technical Campus, Bhilai, India.
Periodicity: September–November 2016
DOI : https://doi.org/10.26634/jpr.3.3.12406

Abstract

Hand gesture based sign language is a means of communication among Deaf-mute and physically impaired people, performed using specific hand gestures. Deaf-mute people struggle to express their feelings to others, which creates a communication gap between hearing and deaf-mute people. This paper presents a hand gesture based Devnagari Sign Language recognition approach that aims to provide the Deaf-mute community with a way to communicate with society; accordingly, the authors use static hand gesture based sign language recognition for a Deaf-mute communication system. Many researchers build their databases on a single sign language, such as American Sign Language or Indian Sign Language. This paper reviews recent sign language research based on manual communication and body language, and develops a hand gesture based Devnagari Sign Language system for recognizing Hindi characters from the hand gestures of Deaf-mute people. The Devnagari Sign Language recognition system is typically explained in five steps: acquisition, segmentation, pre-processing, feature extraction, and classification.

Keywords

Hand Gesture Recognition, Devnagari Sign Language Recognition, Human Computer Interaction, Feature Extraction, Edge Oriented Histogram
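The five-step pipeline named in the abstract (acquisition, segmentation, pre-processing, feature extraction with an Edge Oriented Histogram, classification) can be sketched in miniature. This is a minimal illustrative sketch, not the paper's implementation: the threshold segmentation, the 8-bin gradient-orientation histogram, the nearest-neighbour classifier, and the "KA"/"KHA" labels are all assumptions standing in for steps the abstract does not specify.

```python
import numpy as np

def segment(gray, thresh=0.5):
    """Segmentation: isolate the hand region by simple intensity
    thresholding (a stand-in for the paper's unspecified method)."""
    return (gray > thresh).astype(float)

def edge_orientation_histogram(img, bins=8):
    """Feature extraction: an Edge Oriented Histogram — gradient
    orientations over [0, pi), weighted by gradient magnitude."""
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    total = hist.sum()
    return hist / total if total > 0 else hist

def classify(feature, templates):
    """Classification: nearest neighbour against labelled template
    features (Euclidean distance)."""
    return min(templates, key=lambda label: np.linalg.norm(feature - templates[label]))

# Toy 32x32 "acquired" images standing in for hand-gesture frames;
# the Devnagari labels here are hypothetical examples.
vertical = np.tile((np.arange(32) % 4 < 2).astype(float), (32, 1))  # vertical stripes
horizontal = vertical.T                                             # horizontal stripes
templates = {
    "KA": edge_orientation_histogram(vertical),
    "KHA": edge_orientation_histogram(horizontal),
}

# Pre-processing here is just the segmentation of a noisy query frame.
rng = np.random.default_rng(0)
query = vertical + rng.normal(0, 0.05, vertical.shape)
label = classify(edge_orientation_histogram(segment(query)), templates)
print(label)  # -> KA
```

Vertical stripes produce horizontal gradients (orientation near 0), horizontal stripes produce vertical gradients (orientation near pi/2), so the two template histograms are well separated and the noisy query snaps back to its template after thresholding.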

How to Cite this Article?

Verma, V., and Patil, S. B. (2016). Static Devnagari Sign Language Recognition. i-manager’s Journal on Pattern Recognition, 3(3), 13-18. https://doi.org/10.26634/jpr.3.3.12406
