Title: Data generalisation with k-means for scalable data mining

Authors: F.A. Siddiky; Faisal Kabir; S.M. Monzurur Rahman

Addresses: School of Computer Science and Engineering, United International University, House 80 Road No. 8/A, Dhanmondi, Dhaka 1209, Bangladesh

Abstract: A classification paradigm is a data mining framework that contains all the concepts extracted from a training dataset to discriminate one class from the other classes present in the data. Most classification frameworks operate on either the entire dataset or only a fraction of it. When a framework considers the whole dataset, the algorithm may become unusable because of its inherently unscalable nature as well as the rapid growth of the training data. The alternative way of keeping classification usable is to select a small portion of the training dataset; this limits how fully the training data is represented in the algorithm, which results in poor classification accuracy when unseen data is presented to the framework. This paper first addresses these problems inherent in classifiers and then proposes a framework using the k-means and C4.5 classification algorithms as the solution. The framework is scalable in that it can classify a dataset of any size with a significant accuracy rate.
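The abstract's two-stage idea can be sketched roughly as follows. This is a hedged illustration, not the authors' implementation: scikit-learn's `DecisionTreeClassifier` with the entropy criterion stands in for C4.5, the per-class clustering strategy and the cluster count `k` are assumptions for illustration, and the dataset is synthetic.

```python
# Sketch of the abstract's idea: generalise a large training set with
# k-means, then train a decision tree on the much smaller generalised set.
# NOTE: C4.5 is approximated here by sklearn's CART tree with the entropy
# criterion; per-class clustering and k=20 are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a large training dataset
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: generalise each class into k cluster centroids
k = 20
gen_X, gen_y = [], []
for label in np.unique(y_train):
    members = X_train[y_train == label]
    km = KMeans(n_clusters=min(k, len(members)), n_init=10,
                random_state=0).fit(members)
    gen_X.append(km.cluster_centers_)
    gen_y.append(np.full(len(km.cluster_centers_), label))
gen_X, gen_y = np.vstack(gen_X), np.concatenate(gen_y)

# Step 2: train the tree on the generalised set only, so training cost
# depends on the number of centroids rather than the raw dataset size
tree = DecisionTreeClassifier(criterion="entropy",
                              random_state=0).fit(gen_X, gen_y)
print(f"generalised set size: {len(gen_X)} (from {len(X_train)})")
print(f"test accuracy: {tree.score(X_test, y_test):.3f}")
```

Under this sketch the tree is fit on 40 centroids instead of 1,500 training rows, which is the scalability claim of the abstract: the classifier's input size is controlled by `k`, not by the size of the raw data.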

Keywords: data mining; scalability; CRISP-DM; SEMMA; classification; decision trees; k-means; Bayesian; data generalisation; classification accuracy.

DOI: 10.1504/IJKEDM.2012.051242

International Journal of Knowledge Engineering and Data Mining, 2012 Vol.2 No.2/3, pp.215 - 235

Received: 08 May 2021
Accepted: 12 May 2021

Published online: 29 Dec 2012
