Performance analysis of the Bayesian data reduction algorithm
by Douglas M. Kline, Craig S. Galbraith
International Journal of Data Mining, Modelling and Management (IJDMMM), Vol. 1, No. 3, 2009

Abstract: This paper compares the performance of the recently proposed Bayesian data reduction algorithm (BDRA) with a rigorously trained automated feed-forward back-propagation artificial neural network (ANN) classifier on a number of benchmark problems. Using the UCI Machine Learning Repository, six two-group classification problems were examined: Wisconsin breast cancer disease, glass identification, ionosphere, IRIS plant, Pima Indian diabetes and liver disorders. Using a re-sampling process to reduce sample bias, the two classifiers were compared along the dimensions of in-sample classification, test-sample classification, dimensionality reduction and training time requirements. Significant differences in performance were determined by pair-wise repeated-measures t-tests between means. The results indicated that the BDRA consistently outperformed the neural network in dimensionality reduction and training time requirements, while obtaining, with the exception of one database, comparable classification rates. For benchmarking purposes, both the BDRA and ANN were compared with a step-wise linear regression classification model.
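The comparison methodology in the abstract, scoring two classifiers on repeated re-sampled splits and testing the per-split differences with a paired t-test, can be sketched in plain Python. The accuracy figures below are illustrative placeholders, not the paper's results, and the critical value is the standard two-sided t value for 9 degrees of freedom at alpha = 0.05.

```python
import math

# Hypothetical test-sample accuracies for the two classifiers on ten
# re-sampled train/test splits of one benchmark database (illustrative
# numbers only, not the paper's actual results).
bdra = [0.94, 0.95, 0.93, 0.96, 0.94, 0.95, 0.93, 0.94, 0.96, 0.95]
ann  = [0.93, 0.96, 0.92, 0.95, 0.93, 0.94, 0.94, 0.93, 0.95, 0.94]

def paired_t(a, b):
    """Repeated-measures (paired) t statistic on per-split differences.

    Returns the t statistic and its degrees of freedom (n - 1).
    """
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    # Sample variance of the differences (n - 1 denominator).
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n), n - 1

t, df = paired_t(bdra, ann)
# Two-sided critical value for df = 9 at alpha = 0.05 is about 2.262.
print(f"t = {t:.3f} on {df} df; significant at 0.05: {abs(t) > 2.262}")
```

In practice a library routine such as `scipy.stats.ttest_rel` would replace the hand-rolled statistic; the point here is only the shape of the repeated-measures comparison across re-sampled splits.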

Online publication date: Sun, 19-Jul-2009
