Title: Bit mask-oriented genetic algorithm for grammatical inference and premature convergence

Authors: Hari Mohan Pandey; Ankit Chaudhary; Deepti Mehrotra

Addresses: Department of Computer Science and Engineering, ASET, Amity University, Sector 125, Noida, U.P., India; Department of Computer Science, Northwest Missouri State University, Maryville, MO 64468, USA; Department of Information Technology, ASET, Amity University, Sector 125, Noida, U.P., India

Abstract: In this paper, a bit mask-oriented genetic algorithm (BMOGA) is presented for grammatical inference (GI). GI is the technique of inferring a context-free grammar from a set of corpora. The BMOGA combines the traditional genetic algorithm with a bit-mask oriented data structure and a Boolean-based procedure (using Boolean operators) that can produce an optimum offspring. Extensive parameter tuning makes the BMOGA more robust, statistically sound, and quickly convergent. The BMOGA is applied to context-free as well as regular languages of varying complexities. The results show that the BMOGA finds optimal or close-to-optimal solutions. The Boolean operators introduce diversity into the population, which helps to explore the search space adequately and thus alleviates premature convergence. First, we evaluate the performance of the BMOGA against three algorithms: the genetic algorithm, particle swarm optimisation, and simulated annealing. Then, the BMOGA is tested against two different offspring generation algorithms: random offspring generation and the elite mating pool approach. Statistical tests indicate the superiority of the proposed algorithm over the others.
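The abstract describes combining bit masks with Boolean operators to generate offspring. The paper's exact procedure is not reproduced here; the sketch below is only a hypothetical illustration of the general idea, where a random bit mask and the operators AND, OR, and NOT select each child bit from one of two parent bit strings (the function name and encoding are assumptions, not the authors' implementation):

```python
import random

def boolean_mask_crossover(parent_a, parent_b, mask):
    """Illustrative mask-based crossover using Boolean operators.

    For each position, the mask selects the bit from parent_a (mask bit 1)
    or parent_b (mask bit 0): child = (a AND m) OR (b AND NOT m).
    """
    return [(a & m) | (b & (1 - m))
            for a, b, m in zip(parent_a, parent_b, mask)]

# Example: a random mask mixes bits from both parents.
random.seed(0)
a = [1, 1, 1, 1, 0, 0, 0, 0]
b = [0, 0, 0, 0, 1, 1, 1, 1]
mask = [random.randint(0, 1) for _ in a]
child = boolean_mask_crossover(a, b, mask)
```

Because the mask varies per mating, such Boolean mixing can inject diversity into the population, which is consistent with the abstract's claim that the Boolean operators help explore the search space and counter premature convergence.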

Keywords: bit-masking oriented data structure; context free grammar; CFG; genetic algorithm; grammar inference; learning system.

DOI: 10.1504/IJBIC.2018.093339

International Journal of Bio-Inspired Computation, 2018 Vol.12 No.1, pp.54 - 69

Accepted: 06 Jan 2017
Published online: 29 Jun 2018
