Full Length Article
Fusion: Practice and Applications
Volume 2, Issue 1, PP: 22-30, 2020

Identification of Facial Expressions using Deep Neural Networks

Harsh Jain 1*, Parv Bharti 2, Arun Kumar Dubey 3, Preetika Soni 4

1  Information Technology Bharati Vidyapeeth's College of Engg, New Delhi, India
    (harshjain2525@gmail.com)

2  Information Technology Bharati Vidyapeeth's College of Engg, New Delhi, India
    (parv.bharti@gmail.com)

3  Information Technology Bharati Vidyapeeth's College of Engg, New Delhi, India
    (arudubey@gmail.com)

4  Information Technology Bharati Vidyapeeth's College of Engg, New Delhi, India
    (sonipreetika20@gmail.com)


DOI: https://doi.org/10.54216/FPA.020101

Received: March 13, 2020; Revised: April 18, 2020; Accepted: May 10, 2020

Abstract:

Detecting and analyzing emotions from human facial movements is a problem that has been defined and developed over many years because of the benefits it brings. As datasets and methods have grown more complex, both the achievable accuracy and the difficulty of the task have increased. In this paper, we use deep neural networks built on two architectures, VGG and ResNet50, to classify emotions from input images captured in complex environments. In addition, we use ensemble learning that combines several modern models to increase accuracy. Experimental results show that the two proposed methods outperform several modern methods on emotion recognition for complex input images, as well as some results reported in the literature. In particular, the ensemble method achieves a good accuracy of 66.15% on the FER2013 dataset.
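The ensemble step described in the abstract (combining the predictions of several trained models) is commonly implemented by averaging each model's per-class softmax probabilities and taking the argmax. The sketch below illustrates this idea only; the probability values and the two-model setup (one VGG-style, one ResNet-style) are hypothetical placeholders, not results from the paper:

```python
import numpy as np

# The 7 emotion classes of FER2013
CLASSES = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def ensemble_predict(prob_lists):
    """Average per-model softmax vectors and return the argmax class.

    prob_lists: list of arrays, one per model, each of shape (n_classes,).
    """
    avg = np.mean(np.stack(prob_lists), axis=0)
    return CLASSES[int(np.argmax(avg))], avg

# Illustrative softmax outputs from two hypothetical trained models
vgg_probs    = np.array([0.05, 0.02, 0.10, 0.60, 0.08, 0.05, 0.10])
resnet_probs = np.array([0.10, 0.01, 0.05, 0.55, 0.09, 0.10, 0.10])

label, avg = ensemble_predict([vgg_probs, resnet_probs])
print(label)  # → happy (the class with the highest averaged probability)
```

Averaging probabilities (rather than hard votes) lets a confident model outweigh an uncertain one, which is one reason probability-level ensembles often beat any single member.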

Keywords:

Facial expression; deep neural network; VGG; ResNet

References:

[1] A. T. Lopes, E. D. Aguiar, and T. Oliveirasantos. A facial expression recognition system using CNN. In Graphics, Patterns and Images, pages 273–280, 2015. 

[2] B. E. Bejnordi, J. Lin, B. Glass, M. Mullooly, G. L. Gierach, M. E. Sherman, N. Karssemeijer, J. V. D. Laak, and A. H. Beck. Deep learning-based assessment of tumor associated stroma for diagnosing breast cancer in histopathology images. In IEEE International Symposium on Biomedical Imaging, pages 929–932, 2017. 

[3] C. F. Bobis, R. C. González, J. Cancelas, I. Álvarez, and J. Enguita. Face recognition using binary thresholding for features extraction. In International Conference on Image Analysis and Processing, page 1077, 1999. 

[4] H. Li. Does ResNet learn good general-purpose features? In International Conference on Artificial Intelligence, Automation and Control Technologies, page 19, 2017. 

[5] I. J. Goodfellow, D. Erhan, P. L. Carrier, A. Courville, M. Mirza, B. Hamner, W. Cukierski, Y. Tang, D. Thaler, and D. H. Lee. Challenges in representation learning: A report on three machine learning contests. Neural Netw, 64:59–63, 2015. 

[6] M. A. Imran, M. S. U. Miah, and H. Rahman. Face recognition using eigenfaces. Proc Cvpr, 118(5):586–591, 2002. 

[7] Shen, Dinggang, Guorong Wu, and Heung-Il Suk. “Deep Learning in Medical Image Analysis.” Annual review of biomedical engineering 19 (2017): 221–248. PMC. Web. 25 June 2018. 

[8] Y. Tu, S. Li, and M. Wang. Intelligent facial expression recognition system R&C-FER. In Intelligent Control and Automation (WCICA 2008), World Congress on, pages 2501–2506, 2008. 

[9] Y. Zhang, F. Chang, N. Li, H. Liu, and Z. Gai. Modified AlexNet for dense crowd counting, 2017.

[10] Yijun Gan. Facial Expression Recognition Using Convolutional Neural Network, 2018

[11] S. Xie, R. Girshick, P. Dollár, Z. Tu, and K. He. Aggregated residual transformations for deep neural networks. arXiv preprint arXiv:1611.05431v1, 2016.

[12] Y. Zhou, F. Ren, S. Nishide, and X. Kang. Facial sentiment classification based on ResNet-18 model, pages 463–466, 2019. doi:10.1109/EEI48997.2019.00106. 

[13] K. He, X. Zhang, S. Ren, and J. Sun. Deep residual learning for image recognition. In CVPR, pages 770–778, 2016. doi:10.1109/CVPR.2016.90.

[14] K. He, X. Zhang, S. Ren, and J. Sun. Identity mappings in deep residual networks. In ECCV, pages 630–645, 2016. doi:10.1007/978-3-319-46493-0_38. 

[15] K. Simonyan and A. Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.

[16] S. Cheng and G. Zhou. Facial expression recognition method based on improved VGG convolutional neural network. International Journal of Pattern Recognition and Artificial Intelligence, 2019.

