Text Emotion Recognition using Fast Text Word Embedding in Bi-Directional Gated Recurrent Unit

Akalya Devi C.*, Karthika Renuka D.**, Harisudhan T.***, Jeevanantham V. K.****, Jhanani J.*****, Kavi Varshini S.******
*-****** Department of Information Technology, PSG College of Technology, Coimbatore, Tamil Nadu, India.
Periodicity: September - November 2022
DOI : https://doi.org/10.26634/jit.11.4.19119

Abstract

Emotions are states of readiness in the mind that result from evaluations of one's own thinking or events. Although almost all of the important events in our lives are marked by emotions, their nature, causes, and effects remain among the least understood aspects of human experience. Emotion recognition plays a promising role in the domains of human-computer interaction and artificial intelligence. A human's emotions can be detected through a variety of signals, including facial gestures, blood pressure, body movements, heart rate, and textual data. From an application standpoint, the ability to identify human emotions in text is becoming increasingly crucial in computational linguistics. In this work, we present a classification methodology based on deep neural networks. The Bi-directional Gated Recurrent Unit (Bi-GRU) employed here demonstrates its effectiveness on the Multimodal EmotionLines Dataset (MELD) when compared with Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks. For word encoding, three pre-trained word embeddings, namely GloVe, Word2Vec, and fastText, are compared. The findings on the MELD corpus support the conclusion that fastText is the best word embedding for the proposed Bi-GRU model. For GloVe, the experiment utilized the "glove.6B.300d" vector space; the fastText vectors comprise two million word representations in 300 dimensions, trained on Common Crawl (600 billion tokens) with sub-word information. The accuracy scores of GloVe, Word2Vec, and fastText (300 dimensions each) are tabulated and studied in order to highlight the improved results with fastText on the MELD dataset. The Bi-GRU with fastText word embedding outperforms GloVe and Word2Vec, achieving an accuracy of 79.7%.
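The pipeline the abstract describes, mapping tokens to pre-trained 300-dimensional vectors and feeding them through forward and backward GRU passes whose final states are concatenated, can be illustrated with a minimal NumPy forward pass. This is a sketch only: the weights are random, the toy vocabulary and embedding matrix are hypothetical stand-ins for the real fastText vectors, and the paper's actual layer sizes and training setup are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step. W: (3*d, e) input weights, U: (3*d, d) recurrent
    weights, b: (3*d,) biases, stacked as [update; reset; candidate]."""
    d = h.shape[0]
    z = sigmoid(W[:d] @ x + U[:d] @ h + b[:d])                # update gate
    r = sigmoid(W[d:2*d] @ x + U[d:2*d] @ h + b[d:2*d])       # reset gate
    hh = np.tanh(W[2*d:] @ x + U[2*d:] @ (r * h) + b[2*d:])   # candidate state
    return z * h + (1.0 - z) * hh

def bigru(seq, d):
    """Run a Bi-GRU over seq of shape (T, e); return the concatenated
    final forward and backward hidden states, shape (2*d,)."""
    e = seq.shape[1]
    params = [(rng.normal(0, 0.1, (3 * d, e)),
               rng.normal(0, 0.1, (3 * d, d)),
               np.zeros(3 * d)) for _ in range(2)]
    h_f = np.zeros(d)
    for x in seq:                      # forward direction
        h_f = gru_step(x, h_f, *params[0])
    h_b = np.zeros(d)
    for x in seq[::-1]:                # backward direction
        h_b = gru_step(x, h_b, *params[1])
    return np.concatenate([h_f, h_b])

# Hypothetical toy vocabulary; in the paper this lookup table would be
# initialized from the pre-trained fastText vectors (300 dimensions).
vocab = {"i": 0, "am": 1, "so": 2, "happy": 3}
emb = rng.normal(0, 0.1, (len(vocab), 300))
tokens = ["i", "am", "so", "happy"]
seq = emb[[vocab[t] for t in tokens]]   # (4, 300) sequence of word vectors
state = bigru(seq, d=64)
print(state.shape)                      # -> (128,)
```

In a full model, the concatenated state would feed a dense softmax layer over the MELD emotion classes; that classification head is omitted here for brevity.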

Keywords

Bi-GRU, Deep Neural Network, Emotion Recognition, fastText, GloVe, MELD, Word Embedding, Word2Vec.

How to Cite this Article?

Devi, C. A., Renuka, D. K., Harisudhan, T., Jeevanantham, V. K., Jhanani, J., and Varshini, S. K. (2022). Text Emotion Recognition using Fast Text Word Embedding in Bi-Directional Gated Recurrent Unit. i-manager’s Journal on Information Technology, 11(4), 1-8. https://doi.org/10.26634/jit.11.4.19119

