Title: Classification by majority voting in feature partitions
Authors: Hari Seetha; M. Narasimha Murty; R. Saravanan
Addresses: School of Computing Science and Engineering, VIT University, Vellore-632014, India; Department of Computer Science and Automation, Indian Institute of Science, Bangalore-12, India; School of Information Technology and Engineering, VIT University, Vellore-632014, India
Abstract: The nearest neighbour classifier and the support vector machine (SVM) are successful classifiers widely used in many important application areas, but both suffer from the curse of dimensionality. Nearest neighbour search in high-dimensional data using Euclidean distance is questionable, since all pairwise distances tend to be almost the same. To overcome this problem, we propose a novel classification system based on majority voting. First, we partition the features into a number of blocks and construct a classifier for each block. Majority voting is then performed across all classifiers to determine the final class label. Classification is also performed using non-negative matrix factorisation (NNMF), which embeds high-dimensional data into a low-dimensional space. Experiments were conducted on three benchmark datasets, and the results showed that the proposed system outperformed conventional classification with both k-nearest neighbour (k-NN) and SVM classifiers. The proposed system also performed better than 1NN and SVM classifiers applied to NNMF-based dimensionality-reduced data.
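The feature-partition scheme described in the abstract can be sketched as follows. This is not the authors' code: the block partitioning (contiguous splits via `np.array_split`), the per-block 1-NN base classifier, and the toy data are all illustrative assumptions, intended only to show the partition-then-vote idea.

```python
import numpy as np

def predict_majority(X_train, y_train, X_test, n_blocks=4):
    """Split features into contiguous blocks, run a Euclidean 1-NN
    classifier per block, then take the majority vote across blocks
    for each test point. (Illustrative sketch, not the paper's code.)"""
    blocks = np.array_split(np.arange(X_train.shape[1]), n_blocks)
    votes = []
    for idx in blocks:
        # Pairwise Euclidean distances restricted to this feature block
        d = np.linalg.norm(
            X_test[:, idx][:, None, :] - X_train[:, idx][None, :, :], axis=2
        )
        # Label of the nearest training point within the block
        votes.append(y_train[d.argmin(axis=1)])
    votes = np.stack(votes)  # shape: (n_blocks, n_test)
    # Majority vote per test sample (ties broken towards the smaller label)
    return np.array([np.bincount(v).argmax() for v in votes.T])

# Toy example: two well-separated classes in 8 dimensions
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 0.3, size=(20, 8))
X1 = rng.normal(2.0, 0.3, size=(20, 8))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)
X_test = np.array([[0.1] * 8, [2.1] * 8])
print(predict_majority(X_train, y_train, X_test))  # → [0 1]
```

Each block classifier sees only a low-dimensional slice of the data, so the distance concentration problem the abstract mentions is mitigated within each vote.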
Keywords: curse of dimensionality; classification; support vector machines; SVM; non-negative matrix factorisation; NNMF; nearest neighbour classifier; majority voting; feature partitions; k-nearest neighbour; k-NN.
DOI: 10.1504/IJIDS.2016.076509
International Journal of Information and Decision Sciences, 2016 Vol.8 No.2, pp.109 - 124
Published online: 11 May 2016