Title: Tempo and beat tracking for audio signals with music genre classification

Authors: Mao-Yuan Kao, Chang-Biau Yang, Shyue-Horng Shiau

Addresses: Department of Computer Science and Engineering, National Sun Yat-sen University, No. 70, Lienhai Rd., Kaohsiung, Taiwan; Department of Computer Science and Engineering, National Sun Yat-sen University, No. 70, Lienhai Rd., Kaohsiung, Taiwan; Department of Computer Aided Media Design, Chang Jung Christian University, No. 396, Chang Jung Rd., Sec. 1, Kway Jen, Tainan, Taiwan

Abstract: Most people sometimes hum along with music or tap along with its rhythm. The same piece of music may be interpreted or felt differently by different listeners, so without a music notation we cannot obtain a definitive description of its style. Tempo and beats are essential elements of perceived music. Tempo estimation and beat tracking are therefore fundamental techniques in automatic audio processing and are crucial to multimedia applications. We first develop an artificial neural network to classify music excerpts by evaluation preference. Then, guided by this preference classification, we obtain an accurate tempo and beat estimate using either Ellis's method or Dixon's method. We test our method on a mixed data set containing ten music genres from the 'ballroom dancer' database. Our experimental results show that the accuracy of our method is higher than that of either Ellis's method or Dixon's method used alone.
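The selection strategy described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: `ellis_tempo` and `dixon_tempo` are hypothetical stand-ins for the two published trackers, and a hand-weighted linear score stands in for the trained ANN preference classifier; the feature names (`onset_density`, `spectral_flux`) are assumptions for illustration.

```python
import numpy as np

# Hypothetical stand-ins for the two published beat trackers; in practice
# these would wrap Ellis's dynamic-programming tracker and Dixon's
# inter-onset-interval clustering tracker (BeatRoot).
def ellis_tempo(features):
    return features["autocorr_peak_bpm"]

def dixon_tempo(features):
    return features["ioi_cluster_bpm"]

def preference_score(features, w, b):
    # Tiny linear model standing in for the trained ANN: a positive
    # score means the excerpt is predicted to favour Ellis's method.
    x = np.array([features["onset_density"], features["spectral_flux"]])
    return float(np.dot(w, x) + b)

def estimate_tempo(features, w, b):
    # Route each excerpt to whichever tracker the classifier prefers,
    # mirroring the classify-then-estimate pipeline in the abstract.
    if preference_score(features, w, b) > 0:
        return ellis_tempo(features)
    return dixon_tempo(features)
```

The point of the design is that neither tracker dominates across all ten genres, so a cheap per-excerpt classification step lets the system use each tracker only where it is expected to win.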

Keywords: tempo estimation; beat tracking; audio processing; artificial neural networks; classification; music genre; multimedia; ANNs; musical excerpts; evaluation preferences; audio signals.

DOI: 10.1504/IJIIDS.2009.027687

International Journal of Intelligent Information and Database Systems, 2009 Vol.3 No.3, pp.275 - 290

Published online: 07 Aug 2009
