Title: Unlabeled facial expression capture method in virtual reality system based on big data

Authors: Feng Gao

Addresses: School of Artificial Intelligence, Chongqing University of Arts and Sciences, Chongqing 402160, China; Lyceum of The Philippines University – Batangas, Batangas City, Philippines

Abstract: To address the high error rate and low efficiency of traditional unlabeled (markerless) facial expression capture methods, an unlabeled facial expression capture method based on big data is proposed. Haar features are used to determine the initial position of the human face, and an active shape model is used to extract unlabeled facial feature points. The extracted feature points and the generated triangle mesh are then tracked by an optical flow tracking method; the displacement of the facial feature points drives the overall deformation of the mesh and completes the unlabeled facial expression capture. Experimental results show that the error rate of the proposed method remains within 1.2%-1.7% and that it requires only 20 s-34 s to capture a facial expression, making it more practical and efficient.
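The following is a minimal sketch of the pipeline described in the abstract, written with OpenCV as a stand-in toolkit: a Haar cascade locates the initial face region, a pre-trained LBF facemark detector is used here in place of the paper's active shape model to obtain unlabeled feature points, and pyramidal Lucas-Kanade optical flow tracks those points so their per-frame displacement can drive the triangle mesh. The model file paths, the LBF substitution, and the video-reading loop are illustrative assumptions, not the authors' implementation.

import cv2
import numpy as np

# 1) Haar feature classifier to locate the initial face position.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# 2) Facemark detector standing in for the active shape model
#    (requires opencv-contrib and a pre-trained lbfmodel.yaml; path is hypothetical).
facemark = cv2.face.createFacemarkLBF()
facemark.loadModel("lbfmodel.yaml")

def detect_landmarks(gray):
    """Detect a face with Haar features, then fit unlabeled feature points."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    ok, landmarks = facemark.fit(gray, faces)
    if not ok:
        return None
    # landmarks[0] has shape (1, N, 2); reshape for optical flow tracking.
    return landmarks[0][0].astype(np.float32).reshape(-1, 1, 2)

def track_expression(video_path):
    """Track feature points frame to frame with pyramidal Lucas-Kanade optical
    flow; the per-point displacement would drive the mesh deformation."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    points = detect_landmarks(prev_gray)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, points, None)
        displacement = new_points - points
        # ... apply `displacement` to the triangle-mesh vertices here ...
        prev_gray, points = gray, new_points
    cap.release()

The mesh-update step is left as a comment because the abstract does not specify how the vertex displacements are propagated beyond stating that the feature-point displacement drives the overall change of the mesh.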

Keywords: big data; virtual reality system; facial feature points; unlabeled; facial expression capture.

DOI: 10.1504/IJICT.2021.114849

International Journal of Information and Communication Technology, 2021 Vol.18 No.3, pp.261 - 274

Received: 29 Nov 2019
Accepted: 26 Dec 2019

Published online: 10 May 2021
