Real Time Sign Language: A Review

Nakul Nagpal *, Arun K. Mittra**, Pankaj Agrawal ***
* Assistant Professor, Department of Electronics and Telecommunication Engineering, Jhulelal Institute of Technology, Nagpur, Maharashtra, India.
** Professor, Department of Electronics Engineering, Manoharbhai Patel Institute of Engineering and Technology, Gondia, Maharashtra, India.
*** Professor, Department of Electronics and Communication Engineering, G.H. Raisoni Academy of Engineering and Technology, Nagpur, Maharashtra, India.
Periodicity: December - February '2018
DOI : https://doi.org/10.26634/jpr.4.4.14132

Abstract

To develop a versatile system, dual-channel communication must be achieved, i.e. from the deaf to the hearing and vice versa. A Sign Language Recognition System is, in essence, a system that helps its stakeholders, i.e. deaf and speech-impaired persons and those with normal hearing and speech, to communicate with each other quickly and efficiently without the need for conventional interpreters. In this review, the authors study the work carried out over the years and present the insights gained, along with the identified research gaps.
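The dual-channel idea above can be illustrated with a minimal sketch. The gesture-to-word and word-to-gloss mappings below are placeholder stubs invented for illustration; an actual system would replace them with trained recognizers, e.g. the HMM- or neural-network-based gesture-spotting approaches surveyed in this review.

```python
# Sketch of a two-way (dual-channel) sign-language communication pipeline.
# Channel 1 (deaf -> hearing): a spotted gesture is converted to text.
# Channel 2 (hearing -> deaf): text is converted to a sign gloss to render.

# Hypothetical lookup tables standing in for trained recognition models.
GESTURE_TO_WORD = {"gesture_01": "hello", "gesture_02": "thank you"}
WORD_TO_GLOSS = {"hello": "HELLO", "thank you": "THANK-YOU"}

def sign_to_text(gesture_id: str) -> str:
    """Deaf-to-hearing channel: map a spotted gesture to a word."""
    return GESTURE_TO_WORD.get(gesture_id, "<unknown>")

def text_to_sign(word: str) -> str:
    """Hearing-to-deaf channel: map a word to a sign gloss for rendering."""
    return WORD_TO_GLOSS.get(word.lower(), "<fingerspell>")

if __name__ == "__main__":
    print(sign_to_text("gesture_01"))   # -> hello
    print(text_to_sign("Thank you"))    # -> THANK-YOU
```

The point of the sketch is only the symmetry: both directions must be handled for the system to be versatile, which is why one-way recognition systems leave a communication gap.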

Keywords

Gesture Spotting, Recognition, Image Processing, Indian Sign Language.

How to Cite this Article?

Nagpal, N., Mittra, A. K., and Agrawal, P. (2018). Real Time Sign Language: A Review. i-manager's Journal on Pattern Recognition, 4(4), 39-50. https://doi.org/10.26634/jpr.4.4.14132

