Title: Cooperative co-evolution with improved differential grouping method for large-scale global optimisation

Authors: Rui Wang; Fuxing Zhang; Tao Zhang; Peter J. Fleming

Addresses: College of Systems Engineering, National University of Defense Technology, Changsha, Hunan 410073, China ' Department of Automatic Control and Systems Engineering, University of Sheffield, Mappin Street, Sheffield S1 3JD, UK

Abstract: The cooperative co-evolution (CC) framework has been shown to be effective for large-scale global optimisation (LSGO). However, the performance of CC-based algorithms is often affected by the chosen variable grouping method, i.e., how decision variables are grouped into sub-components. In this study, an improved variable grouping strategy based on differential grouping (DG) is proposed, namely ε-based differential grouping (ε-DG). The ε-DG strategy can identify both direct and indirect interactions between variables. Moreover, a simple yet effective method is introduced in ε-DG to identify the computational error that is detrimental to variable grouping in the original DG method, which requires an appropriately set threshold value. ε-DG is compared against DG on the CEC 2010 LSGO benchmarks and is found to achieve better grouping accuracy on almost all problems. Furthermore, a CC-based differential evolution algorithm equipped with the ε-DG strategy shows good performance on the CEC 2010 LSGO benchmarks.
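The core idea behind differential grouping, which ε-DG builds on, is that two variables interact when the change in the objective caused by perturbing one variable depends on the value of the other. The sketch below is a hypothetical, minimal illustration of that pairwise test only; the function name `interact`, the bounds, and the fixed threshold `eps` are assumptions for illustration, and it does not reproduce ε-DG's indirect-interaction detection or its error-identification mechanism.

```python
def interact(f, dim, i, j, lb=0.0, ub=1.0, eps=1e-3):
    """Return True if variables i and j appear to interact under f.

    Measures the change in f caused by perturbing x_i, once with x_j
    at the lower bound and once with x_j at the midpoint. If the two
    changes differ by more than eps, f is non-separable in (i, j).
    """
    base = [lb] * dim

    # Effect of moving x_i from lb to ub, with x_j at its lower bound
    x1 = base[:]
    x1[i] = ub
    delta1 = f(x1) - f(base)

    # Same perturbation of x_i, but with x_j moved to the midpoint
    x2 = base[:]
    x2[j] = (lb + ub) / 2
    x3 = x2[:]
    x3[i] = ub
    delta2 = f(x3) - f(x2)

    # If the effect of x_i depends on x_j, the deltas differ
    return abs(delta1 - delta2) > eps
```

For a fully separable function such as a sum of squares the two deltas coincide, so no interaction is reported; adding a product term `x[0]*x[1]` makes the deltas differ and the pair is flagged as interacting. In the original DG the threshold plays the role of `eps`, and a poorly chosen value misclassifies pairs, which is the calculation-error issue ε-DG targets.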

Keywords: large-scale global optimisation; LSGO; variable grouping strategy; differential grouping; differential evolution; evolutionary algorithms.

DOI: 10.1504/IJBIC.2018.096481

International Journal of Bio-Inspired Computation, 2018 Vol.12 No.4, pp.214 - 225

Accepted: 14 Jul 2018
Published online: 04 Dec 2018
