Human interactive behaviour recognition method based on multi-feature fusion
by Qing Ye; Rui Li; Hang Yang; Xinran Guo
International Journal of Computational Science and Engineering (IJCSE), Vol. 25, No. 3, 2022

Abstract: The selection of overall and individual characteristics in interactive actions and the high-dimensional complexity of features remain important factors affecting recognition accuracy. In this paper, we propose a human interactive behaviour recognition method based on multi-feature fusion, comprising two parts: feature extraction and behaviour recognition. First, we use histogram feature descriptors to form a three-dimensional histogram of oriented gradients over local space-time features (3D-HOG) and a histogram of global optical flow features (HOF). The bag-of-words model is then used to reduce the feature dimensionality, and a classification matrix is obtained through multilayer perceptron (MLP) classifiers. In the second part, we use a recurrent neural network (RNN) to capture temporal dependencies. Because the information carried by interactive behaviour differs across stages, an improved Gaussian neural network is proposed for interactive behaviour recognition. Experimental results show that the algorithm effectively improves recognition accuracy on the UT-Interaction dataset.
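The bag-of-words step summarised in the abstract quantises local descriptors (such as 3D-HOG/HOF vectors) against a learned codebook and represents each clip as a fixed-length visual-word histogram, which is what reduces the feature dimensionality before classification. A minimal NumPy sketch of that quantisation step; the codebook size and descriptor dimensionality here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Assign each descriptor (n, d) to its nearest codeword (k, d)
    and return an L1-normalised k-bin visual-word histogram."""
    # Squared Euclidean distance from every descriptor to every codeword.
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)  # index of the nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()   # normalise so the histogram sums to 1

# Toy example: 100 random 16-D descriptors against an 8-word codebook.
rng = np.random.default_rng(0)
descs = rng.normal(size=(100, 16))
codebook = rng.normal(size=(8, 16))
h = bow_histogram(descs, codebook)
```

However many descriptors a clip produces, the output is always a k-dimensional histogram, so clips of different lengths become directly comparable inputs for the MLP classifiers.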

Online publication date: Mon, 30-May-2022
