Combining multiple classifiers for wrapper feature selection
Online publication date: Wed, 14-Jan-2009
by Kyriacos Chrysostomou, Sherry Y. Chen, Xiaohui Liu
International Journal of Data Mining, Modelling and Management (IJDMMM), Vol. 1, No. 1, 2008
Abstract: Wrapper feature selection methods are widely used to select relevant features. However, a wrapper typically uses only a single classifier. The downside to this approach is that each classifier has its own biases and will therefore select very different features. To overcome the biases of individual classifiers, this study introduces a new data mining method called wrapper-based decision trees (WDT), which combines different classifiers and uses decision trees to classify the selected features. Because the WDT method combines multiple classifiers, choosing which classifiers to include in a combination is an important issue. We therefore investigate how the number and nature of classifiers influence the results of feature selection. Regarding the number of classifiers, the results showed that combinations of few classifiers selected more relevant features, whereas combinations of many classifiers selected fewer features. Regarding the nature of the classifiers, combinations that included decision tree classifiers selected more features, and those features yielded much higher classification accuracies than the features selected by other classifiers.
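The combination idea described in the abstract can be sketched as follows. This is a minimal, illustrative interpretation, not the authors' actual WDT implementation: each "classifier" is stood in for by a toy scoring function with its own bias, a greedy forward wrapper selects features for each, and the per-classifier selections are then combined by voting. All function names, scores, and the voting threshold are assumptions for illustration.

```python
# Toy stand-ins for cross-validated classifiers: each scores a feature
# subset with a different built-in bias (illustrative numbers, not real data).
def score_tree(subset):   # decision-tree-like bias: favours features 0 and 1
    return 0.5 + 0.2 * (0 in subset) + 0.15 * (1 in subset) - 0.01 * len(subset)

def score_knn(subset):    # kNN-like bias: favours features 1 and 2
    return 0.5 + 0.2 * (1 in subset) + 0.15 * (2 in subset) - 0.01 * len(subset)

def score_nb(subset):     # naive-Bayes-like bias: favours features 0 and 3
    return 0.5 + 0.2 * (0 in subset) + 0.15 * (3 in subset) - 0.01 * len(subset)

def greedy_wrapper(score, n_features):
    """Forward-selection wrapper: repeatedly add the single feature
    that most improves the classifier's score, until none helps."""
    selected, best = set(), score(set())
    improved = True
    while improved:
        improved = False
        for f in range(n_features):
            if f in selected:
                continue
            s = score(selected | {f})
            if s > best:
                best, best_f, improved = s, f, True
        if improved:
            selected.add(best_f)
    return selected

def combine(selections, min_votes=2):
    """Keep features chosen by at least `min_votes` of the wrappers,
    so no single classifier's bias dominates the final feature set."""
    votes = {}
    for sel in selections:
        for f in sel:
            votes[f] = votes.get(f, 0) + 1
    return {f for f, v in votes.items() if v >= min_votes}

sels = [greedy_wrapper(s, 4) for s in (score_tree, score_knn, score_nb)]
print("per-classifier selections:", sels)
print("combined selection:", combine(sels))
```

With these toy biases, each wrapper picks the two features its classifier favours, and the vote keeps only the features that at least two classifiers agree on, which is the motivation the abstract gives for combining classifiers rather than trusting any single one.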