Title: Framework for stream learning algorithms

Authors: Archana Patil; Vahida Attar

Addresses: Department of Computer Engineering and Information Technology, College of Engineering, Pune, Shivaji Nagar, Maharashtra, 411005, India (both authors)

Abstract: Stream learning algorithms learn decision models that continuously evolve over time, run in resource-aware environments, and detect and react to changes in the environment generating the data. The model should perform well on both training and unseen data and has to give the best possible results with minimal resources. The performance of different classifiers on the same task in the same environment can differ. Hence, there is a need for a user-friendly interface that supports selecting multiple classifiers for performance comparison, saving the environment for future use, and plotting performance graphs of classifiers, so that one can select the classifier best suited to the task at hand. As performance varies with the type and distribution of the data, different measures must be provided for performance comparison. The objective of this paper is to provide a framework by enhancing existing software for stream data analysis with the above-mentioned facilities. Memory is one of the resources a classifier consumes while working; this paper also implements a new classifier that reduces memory usage without significantly compromising accuracy.
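The paper's own framework and classifier are not reproduced in this listing. As an illustration of the test-then-train (prequential) evaluation that stream-classifier comparison of the kind described above typically relies on, the following is a minimal self-contained sketch. All class names, the synthetic stream, and the two toy classifiers are hypothetical stand-ins, not the paper's implementation.

```python
import math
import random
from collections import defaultdict

class MajorityClass:
    """Baseline: always predicts the most frequent label seen so far."""
    def __init__(self):
        self.counts = defaultdict(int)

    def predict(self, x):
        return max(self.counts, key=self.counts.get) if self.counts else 0

    def learn(self, x, y):
        self.counts[y] += 1

class OnlineNaiveBayes:
    """Incremental naive Bayes for binary features and binary labels,
    with add-one smoothing. Updates counts one example at a time."""
    def __init__(self, n_features):
        self.n = n_features
        self.label_counts = defaultdict(int)
        self.feat_counts = defaultdict(int)  # (label, index, value) -> count

    def predict(self, x):
        best, best_score = 0, float("-inf")
        total = sum(self.label_counts.values()) or 1
        for y in (0, 1):
            ly = self.label_counts[y]
            score = math.log((ly + 1) / (total + 2))  # smoothed prior
            for i, v in enumerate(x):
                score += math.log((self.feat_counts[(y, i, v)] + 1) / (ly + 2))
            if score > best_score:
                best, best_score = y, score
        return best

    def learn(self, x, y):
        self.label_counts[y] += 1
        for i, v in enumerate(x):
            self.feat_counts[(y, i, v)] += 1

def make_stream(n, noise=0.1, seed=7):
    """Synthetic binary stream: label copies feature 0, flipped with
    probability `noise`."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        x = [rng.randint(0, 1) for _ in range(3)]
        y = x[0] if rng.random() > noise else 1 - x[0]
        out.append((x, y))
    return out

def prequential_accuracy(model, data):
    """Test-then-train: predict each example before learning from it."""
    correct = 0
    for x, y in data:
        if model.predict(x) == y:
            correct += 1
        model.learn(x, y)
    return correct / len(data)

if __name__ == "__main__":
    data = make_stream(2000)
    for name, model in [("majority", MajorityClass()),
                        ("naive-bayes", OnlineNaiveBayes(3))]:
        print(name, round(prequential_accuracy(model, data), 3))
```

Because every example is used first for testing and then for training, the resulting accuracy reflects performance on unseen data while the model keeps evolving, which is the evaluation setting the abstract describes. A framework like the one proposed would run several such models over the same stream and plot their accuracy curves side by side.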

Keywords: stream data analysis; merging; learning algorithms; classification; equal predictability; performance evaluation; performance measures; decision tree; framework; classifier selection; multiple classifiers; performance comparison; performance graphs.

DOI: 10.1504/IJCISTUDIES.2012.050367

International Journal of Computational Intelligence Studies, 2012, Vol. 1, No. 4, pp. 368-383

Received: 02 Mar 2012
Accepted: 01 Jun 2012

Published online: 28 Aug 2014
