Title: A hybrid feature selection algorithm for big data dimensionality reduction

Authors: M.D. Anto Praveena; B. Bharathi

Addresses: Sathyabama Institute of Science and Technology, Old Mamallapuram Road, Chennai, Tamilnadu 600119, India; Sathyabama Institute of Science and Technology, Old Mamallapuram Road, Chennai, Tamilnadu 600119, India

Abstract: High data dimensionality causes problems such as large storage requirements due to data redundancy, noisy data visualisation, and high computational cost in analytical processes. Hence, reducing the dimensionality of high-dimensional data sets is a vital task in big data analytics. Feature selection is a dimensionality reduction approach that aims to reduce the high-dimensional data to a smaller subset of uncorrelated principal variables by retaining key features from the data source and eliminating redundant, noisy and irrelevant features. In the proposed work, a hybrid feature selection algorithm based on ant colony optimisation (ACO) and quick branch and bound (QBB) is presented to improve the efficiency of feature selection in big data analytics. ACO performs feature selection using a search strategy inspired by the behaviour of real ants foraging for food. The QBB algorithm is used to initially partition the large data set into two partitions. The two algorithms are combined to reduce data dimensionality through feature selection. The proposed algorithm was simulated and compared against existing feature selection algorithms on a set of performance metrics, including precision, recall, F-score, classification accuracy and the size of the selected feature set, demonstrating the efficacy of the proposed hybrid algorithm.
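For readers unfamiliar with ACO-style wrapper feature selection, the following is a minimal illustrative sketch of the general idea described in the abstract, not the authors' implementation: pheromone values bias which features each ant includes, and subsets are scored with a simple classifier. The function name `aco_feature_select`, the parameter values, the scoring surrogate, and the omission of the QBB partitioning step are all assumptions made for demonstration only.

```python
# Illustrative ACO-based feature selection sketch (assumed design, not the paper's code).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def score_subset(X, y, mask):
    """Wrapper evaluation: mean cross-validated accuracy of a simple classifier."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def aco_feature_select(X, y, n_ants=10, n_iter=20, rho=0.2, q0=0.8, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    tau = np.ones(n_feat)                     # pheromone level per feature
    best_mask, best_score = None, -np.inf
    for _ in range(n_iter):
        for _ant in range(n_ants):
            # each feature is included with probability proportional to its pheromone
            prob = np.clip(tau / tau.sum() * n_feat * q0, 0, 1)
            mask = rng.random(n_feat) < prob
            s = score_subset(X, y, mask)
            if s > best_score:
                best_mask, best_score = mask.copy(), s
        # evaporate pheromone, then reinforce features in the best subset so far
        tau = (1 - rho) * tau
        tau[best_mask] += best_score
    return best_mask, best_score

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    mask, acc = aco_feature_select(X, y)
    print(f"selected {mask.sum()} of {X.shape[1]} features, CV accuracy {acc:.3f}")
```

In the paper's hybrid scheme, a QBB step would first split the large data set into two partitions before this kind of ACO search is applied; that step is left out of the sketch above.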

Keywords: dimensionality reduction; feature selection; ant colony optimisation; ACO; quick branch and bound; QBB.

DOI: 10.1504/IJAIP.2021.116363

International Journal of Advanced Intelligence Paradigms, 2021 Vol.19 No.3/4, pp.380 - 392

Received: 19 May 2018
Accepted: 12 Jun 2018

Published online: 21 Jul 2021
