Tempo and beat tracking for audio signals with music genre classification
by Mao-Yuan Kao, Chang-Biau Yang, Shyue-Horng Shiau
International Journal of Intelligent Information and Database Systems (IJIIDS), Vol. 3, No. 3, 2009
Abstract: People often hum along with music or tap to its rhythm. A musical style may be interpreted or felt differently by different listeners, so without a music notation there is no single explicit answer. Tempo and beats are essential elements of music perception, and tempo estimation and beat tracking are therefore fundamental techniques in automatic audio processing, crucial to many multimedia applications. We first develop an artificial neural network that classifies music excerpts by evaluation preference, i.e., which tracking method is expected to perform better on a given excerpt. With this preference classification, we then obtain accurate tempo and beat estimates using either Ellis's method or Dixon's method. We test our approach on a mixed data set containing ten music genres from the 'ballroom dancer' database. Our experimental results show that our method achieves higher accuracy than either Ellis's method or Dixon's method used alone.
Online publication date: Fri, 07-Aug-2009
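The abstract outlines a two-stage pipeline: a neural network first predicts which beat tracker is likely to work better on an excerpt, and the preferred tracker is then applied. The sketch below is only a minimal illustration of that idea, assuming librosa and scikit-learn; the MFCC features, network size, and the `dixon_beat_track` stand-in are assumptions for illustration, not the authors' implementation (librosa's built-in tracker follows Ellis's dynamic-programming approach, while Dixon's BeatRoot-style tracker is left as a hypothetical hook).

```python
# Minimal sketch of a preference-based tempo/beat pipeline.
# Assumptions: MFCC summary features, an MLP as the preference classifier,
# librosa's beat tracker as the Ellis-style method, and a placeholder for Dixon's.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def excerpt_features(path, sr=22050):
    """Summary features for one excerpt (assumed choice: mean/std of 13 MFCCs)."""
    y, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def dixon_beat_track(y, sr):
    """Hypothetical hook for Dixon's multi-agent beat tracker (e.g. BeatRoot);
    not provided by librosa, so it must be supplied separately."""
    raise NotImplementedError

def train_preference(paths, labels):
    """Train the preference classifier.
    labels: 0 if Ellis's method is more accurate on the excerpt, 1 if Dixon's is."""
    X = np.vstack([excerpt_features(p) for p in paths])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    clf.fit(X, labels)
    return clf

def estimate_tempo_and_beats(path, clf):
    """Pick the preferred tracker for this excerpt, then estimate tempo and beats."""
    y, sr = librosa.load(path, sr=22050)
    pref = clf.predict(excerpt_features(path).reshape(1, -1))[0]
    if pref == 0:
        # Ellis-style dynamic-programming beat tracking (librosa's default).
        tempo, beats = librosa.beat.beat_track(y=y, sr=sr)
    else:
        tempo, beats = dixon_beat_track(y, sr)
    return tempo, beats
```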