Data generalisation with k-means for scalable data mining
by F.A. Siddiky; Faisal Kabir; S.M. Monzurur Rahman
International Journal of Knowledge Engineering and Data Mining (IJKEDM), Vol. 2, No. 2/3, 2012

Abstract: A classification paradigm is a data mining framework containing all the concepts extracted from a training dataset to discriminate one class from the other classes present in the data. Most classification frameworks operate either on the entire dataset or on a fraction of it. When a framework considers the whole dataset, the algorithm may become unusable because of its inherently unscalable nature as well as the rapid growth of training data. The alternative way to keep classification usable is to select a small portion of the training dataset, but this limits how fully the training data is represented in the algorithm and results in poor classification accuracy when unseen data is presented to the framework. This paper first addresses these problems inherent in classifiers and then proposes, as a solution, a framework combining k-means clustering with the C4.5 classification algorithm. The framework is scalable, as it can classify a dataset of any size with a significant accuracy rate.
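The pipeline the abstract describes can be sketched roughly as follows. The details (number of clusters, how centroids are labelled) are assumptions, not taken from the paper, and scikit-learn's `DecisionTreeClassifier` (a CART implementation) stands in for C4.5, which is not available in scikit-learn:

```python
# Hedged sketch of a k-means generalisation step feeding a decision tree.
# Assumptions: centroids replace the raw data, and each centroid is labelled
# with the majority class of its cluster; the paper may differ in detail.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# A synthetic stand-in for a large training dataset.
X, y = make_classification(n_samples=5000, n_features=10, random_state=0)

# Step 1: generalise -- summarise the full dataset as k cluster centroids.
k = 50
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
centroids = km.cluster_centers_

# Step 2: label each centroid with the majority class of its cluster members.
labels = np.array([np.bincount(y[km.labels_ == c]).argmax() for c in range(k)])

# Step 3: train the classifier on the small generalised set (k rows, not 5000),
# which is what makes the approach scalable to large datasets.
tree = DecisionTreeClassifier(random_state=0).fit(centroids, labels)
print(tree.score(X, y))  # accuracy on the original data
```

The design point is that the tree's training cost now depends on k, not on the dataset size, while the centroids still cover the whole training distribution rather than an arbitrary sample of it.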

Online publication date: Sat, 13-Sep-2014
