Title: Virtual representation of facial avatar through weighted emotional recognition

Authors: Yong-Hwan Lee

Addresses: Far East University, Gamgok, Chungbuk, Korea

Abstract: This paper proposes a novel multiple-emotion recognition scheme for facial expressions that can understand combinations of different emotions, using an active appearance model (AAM), k-nearest neighbour (k-NN) classification, and the proposed classification model in mobile environments. The proposed method classifies five basic emotions (normal, happy, sad, angry, and surprise) and, in real time, ambiguous facial emotions expressed as weighted combinations of the basic emotions. Whereas most previous research on emotion recognition identifies only a single emotion at a time, this work recognises multiple concurrent emotions as combinations of the five basic emotions. In the evaluation, the proposed method achieves up to 93% accuracy in recognising emotion. The experimental results show that the proposed method performs well on facial emotion recognition, and the obtained results indicate performance good enough to be applicable in mobile environments.
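The weighted-combination idea described in the abstract can be illustrated with a minimal sketch: distance-weighted k-NN voting over feature vectors (standing in for AAM shape/appearance parameters) yields a normalised weight for each basic emotion, so an ambiguous expression comes out as a mixture rather than a single label. All names and the feature representation here are hypothetical illustrations, not the paper's actual implementation.

```python
import math
from collections import Counter

# The five basic emotions named in the paper.
EMOTIONS = ["normal", "happy", "sad", "angry", "surprise"]

def knn_emotion_weights(query, samples, k=3):
    """Return a weighted mix of the five basic emotions for a query
    feature vector, via distance-weighted k-NN voting.

    `samples` is a list of (feature_vector, emotion_label) pairs; the
    vectors are a stand-in for AAM-derived facial features.
    """
    # Sort training samples by Euclidean distance to the query.
    ranked = sorted((math.dist(query, vec), label) for vec, label in samples)

    # Inverse-distance weighting over the k nearest neighbours.
    votes = Counter()
    for d, label in ranked[:k]:
        votes[label] += 1.0 / (d + 1e-6)

    # Normalise so the weights sum to 1; a mixed result such as
    # {happy: 0.7, normal: 0.3} models an ambiguous expression as a
    # weighted combination of basic emotions.
    total = sum(votes.values())
    return {e: votes.get(e, 0.0) / total for e in EMOTIONS}
```

For example, a query vector lying between "happy" and "normal" training samples would receive non-zero weight on both emotions, with the larger weight on the nearer class.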

Keywords: multiple emotions; emotion recognition; facial expression analysis; emotion classification models; facial avatars; virtual representation; modelling; facial expressions; facial emotions; active appearance model; AAM; k-nearest neighbour; kNN; mobile environments.

DOI: 10.1504/IJIPT.2017.083034

International Journal of Internet Protocol Technology, 2017 Vol.10 No.1, pp.30 - 35

Received: 15 Dec 2015
Accepted: 03 Jun 2016

Published online: 13 Mar 2017
