Virtual representation of facial avatar through weighted emotional recognition
Online publication date: Sat, 18-Mar-2017
by Yong-Hwan Lee
International Journal of Internet Protocol Technology (IJIPT), Vol. 10, No. 1, 2017
Abstract: This paper proposes a novel multiple-emotion recognition scheme for facial expressions that can identify combinations of different emotions in mobile environments, using the active appearance model (AAM), k-nearest neighbour (k-NN) classification, and a proposed classification model. The method classifies five basic emotions (normal, happy, sad, angry, and surprise) in real time, and recognises ambiguous facial emotions as weighted combinations of these basic emotions. Whereas most previous research on emotion recognition identifies a single emotion at a time, this work recognises multiple emotions as combinations of the five basic ones. In evaluation experiments, the proposed method achieved a best accuracy of 93% in emotion recognition. The experimental results show that the method performs well on facial emotion recognition, and that its performance is good enough to be applicable to mobile environments.
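The weighted-combination idea described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the toy 2-D points stand in for AAM feature vectors, and the vote-fraction weighting is an assumption, since the paper's actual features and weighting scheme are not given here.

```python
from collections import Counter
import math

EMOTIONS = ["normal", "happy", "sad", "angry", "surprise"]

def knn_emotion_weights(query, train, k=5):
    """Return a weight per basic emotion, taken as the fraction of
    votes among the k nearest labelled feature vectors."""
    dists = sorted((math.dist(query, feat), label) for feat, label in train)
    votes = Counter(label for _, label in dists[:k])
    return {e: votes.get(e, 0) / k for e in EMOTIONS}

# Hypothetical 2-D "AAM feature" points, purely illustrative.
train = [
    ((0.0, 0.0), "normal"),
    ((1.0, 1.0), "happy"),
    ((0.9, 1.1), "happy"),
    ((1.1, 0.9), "surprise"),
    ((-1.0, -1.0), "sad"),
]

weights = knn_emotion_weights((0.95, 1.0), train, k=3)
# A mixed expression then appears as a blend, e.g. mostly "happy"
# with a smaller "surprise" component, rather than a single label.
```

An ambiguous expression is thus reported as a weighted mixture over the five basic emotions instead of being forced into one class.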