Title: Bioinformatics-enabled deep learning for video-based emotion recognition

Authors: S. Jothimani; K. Premalatha

Addresses: Department of Information Technology, Bannari Amman Institute of Technology, Sathyamangalam – 638401, Tamil Nadu, India; Department of Information Technology, Bannari Amman Institute of Technology, Sathyamangalam – 638401, Tamil Nadu, India

Abstract: Emotion recognition from videos improves human-computer interaction by identifying the user's emotions and motives. Video-based emotion recognition remains challenging because standard algorithms often fall short in accuracy and emotion classification. To address these issues, this study proposes a deep residual-inception (DeepRIS) model, a deep convolutional neural network (DCNN) that predicts emotions from facial expression video frames using inception and residual modules. The video is first split into keyframes for face recognition; keyframe extraction reduces the influence of emotion-neutral frames and speeds up feature extraction. DeepRIS extracts features automatically and uses residual connections to reduce overfitting. The study also proposes a customised activation function, Smish, which combines the Swish and Mish activation functions to improve emotion recognition. DeepRIS excels on the CK+ and RAVDESS datasets, achieving 94.46% and 99.49% accuracy, respectively, and surpasses state-of-the-art models with recall and precision values of 0.989 and 0.988, showcasing its superiority in emotion recognition.
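For reference, the two standard activations that the proposed Smish builds on are well documented in the literature; a minimal Python sketch is given below. The hybrid `smish` form shown here is an assumption for illustration only, since the abstract does not state the exact formula of the proposed function.

```python
import math


def swish(x: float) -> float:
    """Swish activation: x * sigmoid(x)."""
    return x / (1.0 + math.exp(-x))


def mish(x: float) -> float:
    """Mish activation: x * tanh(softplus(x)), softplus(x) = ln(1 + e^x)."""
    return x * math.tanh(math.log1p(math.exp(x)))


def smish(x: float) -> float:
    """One plausible Swish/Mish hybrid (assumption, not the paper's formula):
    x * tanh(ln(1 + sigmoid(x)))."""
    sigmoid = 1.0 / (1.0 + math.exp(-x))
    return x * math.tanh(math.log1p(sigmoid))
```

All three functions are smooth and non-monotonic near zero, which is the property such activations are typically chosen for.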

Keywords: deep convolutional neural network; DCNN; inception; L2 regulariser; Mish; residual; Smish; Swish; video emotion recognition.

DOI: 10.1504/IJBRA.2025.148133

International Journal of Bioinformatics Research and Applications, 2025 Vol.21 No.4, pp.436 - 462

Received: 06 Mar 2024
Accepted: 01 Aug 2024

Published online: 26 Aug 2025
