Title: A novel particle swarm optimisation with search space tuning parameter to avoid premature convergence

Authors: Raja Chandrasekaran; R. Agilesh Saravanan; D. Ashok Kumar; N. Gangatharan

Addresses: Department of ECE, KL University, Vijayawada, India; Department of ECE, KL University, Vijayawada, India; Department of Biomedical Engineering, SRMIST, Kattankulathur, Chennai, India; Department of ECE, R.M.K. College of Engineering and Technology, Puduvoyal, Chennai, India

Abstract: Particle swarm optimisation (PSO) is a popular optimisation technique inspired by the spatial navigational intelligence of bird flocks. The technique has remained popular among researchers for several decades because each generation is guided by the zonal (local) and universal (global) best members. In several trials on mathematical benchmarks and real-time applications, PSO has been found to perform better than several other optimisation techniques. However, the algorithm's overly simple orientation style often leads the population to premature convergence. The inertia weight parameter is used to tune the explorability of the population. In this paper, an inertia weight tuning scheme based on a zonal monitor (success in recent iterations) is redressed by including universal monitors (success from a universal fitness perspective). The proposed algorithm outperforms both conventional PSO and PSO with zonal monitors alone. Moreover, the inertia weight of the PSO with zonal monitors alone is not dynamic, whereas the inertia weight of the proposed PSO is found to be more dynamic, tuning explorability with regard to both the zonal and universal context of fitness.
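The abstract describes inertia-weight adaptation driven by two success signals: a zonal monitor (did particles improve their personal bests recently?) and a universal monitor (did the global best improve?). The paper's exact update rules are not given here, so the following is only a minimal sketch under assumed rules: the inertia weight is raised with the per-iteration personal-best success rate and additionally damped when the global best stagnates. All function names, coefficients, and bounds are illustrative assumptions, not the authors' method.

```python
import random

def sphere(x):
    # Sphere benchmark: f(x) = sum(x_i^2), global minimum 0 at the origin
    return sum(v * v for v in x)

def pso_adaptive(f, dim=5, n_particles=20, iters=200, seed=1):
    """PSO with a success-monitor-based adaptive inertia weight (illustrative sketch)."""
    rng = random.Random(seed)
    lo, hi = -5.0, 5.0
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.9, 2.0, 2.0          # assumed coefficients, not from the paper

    for _ in range(iters):
        successes = 0                   # zonal monitor: personal-best improvements
        improved_global = False         # universal monitor: global-best improvement
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                successes += 1
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
                    improved_global = True
        # Adapt inertia: a high zonal success rate keeps exploration alive,
        # while universal stagnation applies extra damping toward exploitation.
        w = 0.4 + 0.5 * (successes / n_particles)
        if not improved_global:
            w *= 0.95
        w = min(0.9, max(0.3, w))       # keep w in a conventional range
    return gbest_val
```

The key design choice sketched here is that the weight reacts to both monitors each iteration, so it stays dynamic throughout the run rather than following a fixed decay schedule.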

Keywords: particle swarm optimisation; PSO; adaptive inertia weight; search space tuning.

DOI: 10.1504/IJMMNO.2019.10021528

International Journal of Mathematical Modelling and Numerical Optimisation, 2019 Vol.9 No.4, pp.382 - 399

Received: 22 May 2018
Accepted: 24 Sep 2018

Published online: 23 May 2019
