This research implements a system for assessing student class attendance and interest through real-time facial expression detection with a webcam in the classroom. Students' facial expressions, classified against an emotion dataset with the labels "Neutral", "Happy", "Surprised", "Angry", "Sad" and "Fear", are interpreted as signs of attention or inattention. The system consists of two parts: a teacher section and a student section. The teacher section provides a dashboard showing the details of all students logged in to the course of study and each student's attention percentage in each subject. The student section allows students to join a class using a code given by the teacher and to check their own interest status. MobileFaceNet was selected as the face detector, and the mini-Xception model as the facial expression classifier; together they detect students' emotional state from the webcam stream. The detected expressions are divided into two groups: "neutral", "happy" and "surprised" indicate interest, while "angry", "sad" and "fearful" indicate indifference. At the end of a session, the system calculates each student's interest and disregard percentages and summarizes student attention in a report. A representative sample of four students was tested three times, for 30 minutes each, to measure attention and inattention in the classroom. The overall results show that the students were interested 82.81% of the time and disregarded the class 17.19% of the time.
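The grouping and scoring step described above can be sketched as follows. This is a minimal illustration (the function name `attention_summary` and the per-frame label list are hypothetical, not taken from the paper's implementation): per-frame emotion labels predicted by the expression classifier are mapped to the "interest" or "indifference" group, then summarized as session percentages.

```python
# Expression groups as defined in the study: three labels count as
# attention (interest), three as inattention (indifference).
INTEREST = {"neutral", "happy", "surprised"}
INDIFFERENCE = {"angry", "sad", "fearful"}

def attention_summary(frame_labels):
    """Return (interest_pct, disregard_pct) for a list of per-frame emotion labels."""
    # Ignore any label outside the six defined expressions.
    scored = [label for label in frame_labels if label in INTEREST | INDIFFERENCE]
    if not scored:
        return 0.0, 0.0
    interested = sum(1 for label in scored if label in INTEREST)
    interest_pct = 100.0 * interested / len(scored)
    return round(interest_pct, 2), round(100.0 - interest_pct, 2)

# Example: a 30-minute session sampled once per minute.
session = ["happy"] * 20 + ["neutral"] * 5 + ["sad"] * 3 + ["angry"] * 2
print(attention_summary(session))  # -> (83.33, 16.67)
```

In practice each label would come from running MobileFaceNet to crop the face in a webcam frame and mini-Xception to classify the crop; the sketch only shows how such per-frame results aggregate into the interest/disregard percentages reported in the summary.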