Title: Hybrid dynamic k-nearest-neighbour and distance and attribute weighted method for classification
Authors: Jia Wu; Zhi-hua Cai; Shuang Ao
Addresses: School of Computer Science, China University of Geosciences, No. 388 Lumo Road, Wuhan 430074, China (all authors)
Abstract: K-nearest-neighbour (KNN) is an important classification method that has been widely used in data mining. However, its classification accuracy is affected by the class probability estimation, the neighbourhood size, and the type of distance function. Many researchers have focused on improving the accuracy of KNN via distance weighting, attribute weighting, dynamic neighbourhood selection, and related methods. In this paper, we first review improved KNN algorithms in the three categories mentioned above. We then single out an improved algorithm, dynamic KNN with distance and attribute weighting (DKNDAW). We tested the new algorithm experimentally in the Weka system, comparing it with KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms the other algorithms in terms of classification accuracy. In addition, we study how to learn a DKNDAW with accurate ranking from data, or more precisely, which attribute weighting method enables DKNDAW to produce accurate rankings. We explore several methods: the gain ratio method, the correlation-based feature selection method, and the decision tree-based method. We conclude that the gain ratio method is the most suitable for our improved KNN algorithm, DKNDAW.
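The core ideas combined in the abstract (attribute weights scaling each dimension of the distance, and inverse-distance vote weighting among the k nearest neighbours) can be sketched as follows. This is a minimal illustration of those two mechanisms, not the authors' exact DKNDAW implementation; the function name, the toy data, and the hand-set attribute weights (standing in for gain-ratio-derived weights) are assumptions for illustration.

```python
import math
from collections import defaultdict

def weighted_knn_predict(train, labels, query, attr_w, k=3):
    """Distance- and attribute-weighted KNN sketch.

    attr_w scales each attribute's contribution to the distance
    (in DKNDAW such weights would come from, e.g., gain ratio);
    the k nearest neighbours then vote with inverse-distance weights.
    """
    # attribute-weighted Euclidean distance to every training instance
    dists = []
    for x, y in zip(train, labels):
        d = math.sqrt(sum(w * (a - b) ** 2
                          for w, a, b in zip(attr_w, x, query)))
        dists.append((d, y))
    dists.sort(key=lambda t: t[0])

    # distance-weighted vote: closer neighbours count more
    votes = defaultdict(float)
    for d, y in dists[:k]:
        votes[y] += 1.0 / (d + 1e-9)
    return max(votes, key=votes.get)

# Toy example: the first attribute separates the classes; the second
# is noise, so it is given a small weight.
train = [(1.0, 9.0), (1.2, 0.0), (0.8, 5.0),
         (5.0, 1.0), (5.2, 8.0), (4.8, 4.0)]
labels = ["a", "a", "a", "b", "b", "b"]
pred = weighted_knn_predict(train, labels, (1.1, 7.0),
                            attr_w=(1.0, 0.01), k=3)
```

With the noisy second attribute down-weighted, the query lands among the class "a" instances; with equal weights, the noise dimension could pull in neighbours from the wrong class, which is the motivation for attribute weighting.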
Keywords: dynamic KNN; k-nearest-neighbour; distance weighted; attribute weighted; decision tree; classification accuracy; gain ratio; feature selection.
International Journal of Computer Applications in Technology, 2012 Vol.43 No.4, pp.378 - 384
Published online: 01 Jun 2012