Title: An empirical comparison of incremental linear feature extraction methods

Authors: Xue-Qiang Zeng; Guo-Zheng Li; Hua-Xing Zou; Qian-Sheng Chen

Addresses: Computer Center, Nanchang University, Nanchang 330029, China; Data Centre of Traditional Chinese Medicine, China Academy of Chinese Medical Science, Beijing 100700, China; Computer Center, Nanchang University, Nanchang 330029, China; Military College of Nanchang, Nanchang 330013, China

Abstract: Incremental feature extraction methods are effective for analysing applications with large numbers of instances. However, most current incremental feature extraction methods are unsuitable for processing large-scale, high-dimensional data, since few of them have low time complexity. In recent years, several highly efficient incremental linear feature extraction methods have been proposed whose time complexity is linear in both the number of instances and the number of features, such as Incremental Principal Component Analysis (IPCA), Incremental Maximum Margin Criterion (IMMC) and Incremental Inter-class Scatter (IIS). Nevertheless, the performance of these incremental methods has not yet been compared directly. This paper presents a comparative study of incremental feature extraction methods. Extensive experiments on a handwritten digit recognition data set demonstrate the performance of the compared methods; based on these results, IMMC is found to be the best of the compared feature extraction models.
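The incremental methods named above share a common pattern: summary statistics such as the data mean and a scatter (covariance) matrix are updated one instance at a time, so the per-instance cost does not grow with the number of instances already seen. The following is a minimal NumPy sketch of this idea for PCA; it is illustrative only and not the exact IPCA, IMMC, or IIS algorithm evaluated in the paper (the function name and streaming interface are assumptions):

```python
import numpy as np

def streaming_pca(stream, n_components):
    """Illustrative incremental PCA: Welford-style running updates of the
    mean and scatter matrix (O(d^2) per instance, linear in the number of
    instances), followed by a single eigendecomposition at the end."""
    mean = None
    scatter = None
    n = 0
    for x in stream:
        x = np.asarray(x, dtype=float)
        if mean is None:
            mean = np.zeros_like(x)
            scatter = np.zeros((x.size, x.size))
        n += 1
        delta = x - mean              # deviation from the old mean
        mean += delta / n             # running mean update
        scatter += np.outer(delta, x - mean)  # Welford scatter update
    cov = scatter / max(n - 1, 1)
    vals, vecs = np.linalg.eigh(cov)          # eigenvectors of covariance
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order], vals[order], mean

# Usage: data whose variance is concentrated along the first axis;
# the leading recovered component should align with that axis.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5)) * np.array([5.0, 0.1, 0.1, 0.1, 0.1])
components, variances, mean = streaming_pca(iter(X), n_components=2)
```

The true incremental algorithms compared in the paper avoid even the final eigendecomposition by updating the eigenvectors themselves per instance; the sketch above only shows why the accumulation phase is linear in the number of instances.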

Keywords: linear feature extraction; incremental feature extraction; incremental learning; large-scale data; handwritten digit recognition; incremental principal component analysis; IPCA; incremental maximum margin criterion; IMMC; incremental inter-class scatter; IIS.

DOI: 10.1504/IJWMC.2015.069387

International Journal of Wireless and Mobile Computing, 2015 Vol.8 No.3, pp.249 - 255

Received: 19 Jul 2014
Accepted: 01 Sep 2014

Published online: 14 May 2015
