Feature selection with ensemble learning using enriched SOM Online publication date: Mon, 24-Jul-2017
by Ameni Filali; Chiraz Jlassi; Najet Arous
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 16, No. 3, 2017
Abstract: Finding pertinent subspaces in very high-dimensional datasets is a challenging task. Feature selection should be stable while, at the same time, improving clustering results. Ensemble methods have successfully increased stability and clustering accuracy, but their runtime prevents them from scaling up to real-world applications. This paper addresses the problem of selecting, for each cluster, a subset of the most relevant features from a dataset. The proposed model extends the random forests method to unlabelled data using an enriched self-organising map (SOM), assessing out-of-bag (oob) feature importance from an ensemble of partitions. Each partition is produced using a different bootstrap sample and a random subset of the features. We assess the accuracy and scalability of the proposed method on 19 benchmark datasets and compare its effectiveness against other unsupervised feature selection methods with ensemble learning.
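The ensemble procedure described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' enriched-SOM implementation: the basic SOM training loop, the permutation-based oob importance measure, and all parameter values (grid size, epochs, number of maps, feature fraction) are simplifying assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid=(4, 4), epochs=10):
    # Minimal sequential SOM with shrinking neighbourhood and learning rate.
    n, d = X.shape
    w = rng.normal(size=(grid[0] * grid[1], d))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)
        sigma = max(grid) / 2 * (1 - t / epochs) + 0.5
        for x in X[rng.permutation(n)]:
            bmu = np.argmin(((w - x) ** 2).sum(1))        # best-matching unit
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(1) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

def quantization_error(w, X):
    # Mean distance from each sample to its best-matching unit.
    dist = np.sqrt(((X[:, None, :] - w[None]) ** 2).sum(-1))
    return dist.min(1).mean()

def oob_feature_importance(X, n_maps=10, feat_frac=0.6):
    # Ensemble of SOMs: each map is trained on a bootstrap sample and a
    # random feature subset; a feature's importance is the increase in
    # oob quantization error when that feature is permuted.
    n, d = X.shape
    imp, counts = np.zeros(d), np.zeros(d)
    k = max(2, int(feat_frac * d))
    for _ in range(n_maps):
        boot = rng.integers(0, n, n)                      # bootstrap sample
        oob = np.setdiff1d(np.arange(n), boot)            # out-of-bag rows
        feats = rng.choice(d, k, replace=False)           # random feature subset
        w = train_som(X[np.ix_(boot, feats)])
        base = quantization_error(w, X[np.ix_(oob, feats)])
        for pos, f in enumerate(feats):
            Xp = X[np.ix_(oob, feats)].copy()
            Xp[:, pos] = rng.permutation(Xp[:, pos])      # permute one feature
            imp[f] += quantization_error(w, Xp) - base
            counts[f] += 1
    return imp / np.maximum(counts, 1)

# Toy data: features 0-1 separate two clusters; features 2-4 are pure noise.
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(4, 1, (100, 5))])
X[:, 2:] = rng.normal(size=(200, 3))
scores = oob_feature_importance(X)
```

On such data, permuting an informative feature should raise the oob quantization error noticeably more than permuting a noise feature, so the cluster-relevant features receive higher scores.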