Facial Emotion Recognition using Hybrid Features-Novel Leaky Rectified Triangle Linear Unit Activation Function Based Deep Convolutional Neural Network

Anjani Suputri Devi D.*, Suneetha Eluri**
*-** Department of Computer Science and Engineering, Jawaharlal Nehru Technological University, Kakinada, Andhra Pradesh, India.
Periodicity: April - June 2022
DOI: https://doi.org/10.26634/jip.9.2.18968

Abstract

Facial Expression Recognition (FER), which categorizes facial expressions according to human emotions, is an important topic with applications in many areas. Many networks have been designed for facial emotion recognition, but they still suffer from problems such as performance degradation and low classification accuracy. To achieve greater classification accuracy, this paper proposes a new Leaky Rectified Triangle Linear Unit (LRTLU) activation function for a Deep Convolutional Neural Network (DCNN). The input images are pre-processed using the new Adaptive Bilateral Filter Contourlet Transform (ABFCT) filtering algorithm. The face is then detected in the filtered image using the Chehra face detector. From the detected face image, facial landmarks are extracted using a cascaded regression tree, and important features are extracted based on the detected landmarks. The extracted feature set is then passed as input to the Leaky Rectified Triangle Linear Unit Activation Function Based Deep Convolutional Neural Network (LRTLU-DCNN), which classifies the input image into one of six emotions: happiness, sadness, neutrality, anger, disgust, and surprise. The proposed method is evaluated on the Extended Cohn-Kanade (CK+) and Japanese Female Facial Expression (JAFFE) datasets, achieving classification accuracies of 99.67347% on CK+ and 99.65986% on JAFFE.
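The abstract does not reproduce the LRTLU formula or the network architecture, so the sketch below is only a minimal illustration, under stated assumptions, of how a custom piecewise-linear activation of this kind could be defined and dropped into a small convolutional classifier for the six listed emotions. The triangular positive branch, the peak parameter, the 48x48 greyscale input size, and all layer sizes are hypothetical choices for illustration, not the paper's reported design; the ABFCT filtering, Chehra face detection, and landmark-based feature extraction stages are omitted here.

import torch
import torch.nn as nn

class LRTLU(nn.Module):
    # Hypothetical piecewise-linear "leaky rectified triangle" activation.
    # The paper's exact LRTLU definition is not given in the abstract; this
    # stand-in keeps a small leaky slope for negative inputs and a triangular
    # ramp (up to an assumed apex, then back down to zero) for positive inputs.
    def __init__(self, negative_slope=0.01, peak=1.0):
        super().__init__()
        self.negative_slope = negative_slope
        self.peak = peak  # assumed apex of the triangular positive branch

    def forward(self, x):
        leaky = self.negative_slope * x                                      # branch for x < 0
        triangle = torch.clamp(self.peak - (x - self.peak).abs(), min=0.0)   # branch for x >= 0
        return torch.where(x < 0, leaky, triangle)

# Minimal six-class emotion classifier using the activation above
# (assumed 48x48 greyscale face crops; layer sizes are illustrative only).
model = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), LRTLU(),
    nn.MaxPool2d(2),                                   # 48x48 -> 24x24
    nn.Conv2d(32, 64, kernel_size=3, padding=1), LRTLU(),
    nn.MaxPool2d(2),                                   # 24x24 -> 12x12
    nn.Flatten(),
    nn.Linear(64 * 12 * 12, 128), LRTLU(),
    nn.Linear(128, 6),  # happiness, sadness, neutrality, anger, disgust, surprise
)

logits = model(torch.randn(1, 1, 48, 48))  # one dummy face crop -> six emotion scores

In the actual method, the reported LRTLU formula would replace the toy forward() above, and the pre-processed, landmark-based feature set described in the abstract would replace the raw pixel input used in this sketch.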

Keywords

Adaptive Bilateral Filter Contourlet Transform (ABFCT), Chehra Face Detector, Cascaded Regression Tree, Local Shearlet Tetra Pattern (LSTrP), Leaky Rectified Triangle Linear Unit Activation Function Based Deep Convolutional Neural Network (LRTLU-DCNN).

How to Cite this Article?

Devi, D. A. S., and Eluri, S. (2022). Facial Emotion Recognition using Hybrid Features-Novel Leaky Rectified Triangle Linear Unit Activation Function Based Deep Convolutional Neural Network. i-manager’s Journal on Image Processing, 9(2), 12-27. https://doi.org/10.26634/jip.9.2.18968
