Recognition method of dance rotation based on multi-feature fusion
by Yang Liu; Meiyan Fan; Wenfeng Xu
International Journal of Arts and Technology (IJART), Vol. 13, No. 2, 2021

Abstract: Traditional dance rotation recognition methods suffer from inaccurate contour superposition and low recognition rates. This paper proposes a dance rotation recognition method based on multi-feature fusion. Background subtraction is used to separate the human motion region in the foreground of the video data, and the contour features of each frame of the preprocessed dance video are superimposed to obtain histogram of oriented gradients (HOG) features of the dance movements. According to the optical flow constraint, the feature vectors of the histogram of optical flow (HOF) are normalised. Based on the shape and motion characteristics of the dancer in the video, a dance rotation recognition classifier is constructed to complete recognition through multi-feature fusion. Experimental results show that the proposed method achieves a higher recognition accuracy of 97% and a lower error rate of 0.7%.
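
The abstract outlines a concrete pipeline: background subtraction, HOG features from superimposed contours, normalised HOF features, and a classifier over the fused features. The Python sketch below illustrates that general pipeline under stated assumptions; it is not the authors' implementation. OpenCV's MOG2 subtractor stands in for the background-noise subtraction step, the default HOGDescriptor for the contour-superposition HOG features, Farnebäck dense flow for the HOF step, and a linear SVM for the unspecified rotation classifier. All parameter values are illustrative.

```python
# Minimal sketch of a multi-feature-fusion pipeline like the one the abstract
# describes. Assumptions (not from the paper): MOG2 background subtraction,
# default HOG window, Farneback optical flow, 9-bin HOF, linear SVM classifier.
import cv2
import numpy as np
from sklearn.svm import SVC

WIN = (64, 128)                        # default HOGDescriptor window (w, h)
hog = cv2.HOGDescriptor()              # histogram-of-oriented-gradients extractor
bg = cv2.createBackgroundSubtractorMOG2()

def frame_features(prev_gray, gray):
    """Fuse HOG (shape) and normalised HOF (motion) features for one frame pair."""
    mask = bg.apply(gray)                                  # foreground motion region
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadow pixels
    fg = cv2.bitwise_and(gray, gray, mask=mask)            # suppress background noise
    hog_vec = hog.compute(cv2.resize(fg, WIN)).ravel()

    # Dense optical flow between consecutive frames -> 9-bin HOF histogram,
    # weighted by flow magnitude and L2-normalised.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hof_vec, _ = np.histogram(ang, bins=9, range=(0, 2 * np.pi), weights=mag)
    hof_vec = hof_vec / (np.linalg.norm(hof_vec) + 1e-6)

    return np.concatenate([hog_vec, hof_vec])              # multi-feature fusion

def video_features(path):
    """Average the fused per-frame features over a whole clip."""
    cap = cv2.VideoCapture(path)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    feats = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        feats.append(frame_features(prev, gray))
        prev = gray
    cap.release()
    return np.mean(feats, axis=0)

# Stand-in classifier trained on labelled clips (paths/labels are placeholders):
# X = np.stack([video_features(p) for p in clip_paths])
# clf = SVC(kernel="linear").fit(X, labels)   # rotation vs. non-rotation segments
```

Concatenating the shape (HOG) and motion (HOF) vectors before classification is the simplest form of feature-level fusion; the paper may well fuse or weight the features differently.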

Online publication date: Wed, 26-Jan-2022
