Forthcoming articles


International Journal of Metaheuristics


These articles have been peer-reviewed and accepted for publication in IJMHeur, but are pending final changes, are not yet published, and may not appear here in their final order of publication until they are assigned to issues. The content therefore conforms to our standards, but the presentation (e.g. typesetting and proofreading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.


Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.




International Journal of Metaheuristics (4 papers in press)


Regular Issues


  • Particle swarm optimization with population size and acceleration coefficients adaptation using hidden Markov model state classification
    by Oussama Aoun, Malek Sarhani, Abdellatif El Afia 
    Abstract: Particle swarm optimization (PSO) is a population-based metaheuristic that has succeeded in solving a large number of optimization problems. Several adaptive PSO algorithms have been proposed to enhance the performance of the original one; in particular, parameter adaptation has become a promising direction for PSO. In this article, we propose an adaptive control of two PSO parameters using HMM classification to enhance PSO performance, called HMM Adaptive Control of PSO (HMM-ACPSO). That is, we integrate a hidden Markov model (HMM) to perform a stochastic classification of the swarm state at each iteration. The state classified by the HMM is then used to adapt both the PSO acceleration coefficients and the population size. Furthermore, several swarm-variation strategies are adopted according to the classified state. We evaluated the HMM-ACPSO algorithm on several benchmark functions. Experimental results reveal that our suggested scheme gives competitive results compared with PSO variants regarding both solution accuracy and convergence speed.
    Keywords: particle swarm optimization; swarm intelligence; hidden Markov model; machine learning; adaptive population size; parameters adaptation; metaheuristics control.
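    The quantities this abstract says are adapted — the acceleration coefficients and the population size — sit inside the standard PSO velocity update. A minimal sketch of plain PSO with fixed coefficients follows; the HMM state classification and the paper's actual adaptation rules are not reproduced here, and the test function and settings are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x * x))

def pso(fn, dim=5, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain PSO. c1/c2 are the acceleration coefficients that
    HMM-ACPSO would re-tune each iteration from the classified state."""
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fn(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # cognitive pull toward personal bests, social pull toward the global best
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fn(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest_val

best = pso(sphere)  # converges toward 0 on the sphere function
```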

  • Flying Elephants Method applied to the problem of covering solid bodies with spheres
    by Daniela Cristina Lubke, Vinicius Layter Xavier, Helder Manoel Venceslau, Adilson Elias Xavier 
    Abstract: The use of the Flying Elephants Method engenders a simple, single-level, completely differentiable optimization problem and overcomes the main difficulties presented by the original non-differentiable formulation. Computational results obtained for the covering of some solid-body test instances show the good performance of the proposed methodology.
    Keywords: location problems; min-max-min problems; non-differentiable programming; smoothing.
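    The smoothing idea behind this abstract is to replace the non-differentiable min (and max) operators of the min-max-min covering formulation with smooth surrogates. The sketch below uses a log-sum-exp soft-min as an illustrative stand-in; the actual smoothing functions of the Flying Elephants Method may differ:

```python
import numpy as np

def soft_min(values, lam=50.0):
    """Differentiable surrogate for min(values): a log-sum-exp
    soft-min that approaches the true minimum as lam grows."""
    v = np.asarray(values, dtype=float)
    m = v.min()  # shift by the minimum for numerical stability
    return m - np.log(np.sum(np.exp(-lam * (v - m)))) / lam

# e.g. the smoothed distance from a body point to its nearest sphere center
d = [3.0, 1.2, 2.5]
approx = soft_min(d)  # close to min(d) = 1.2, but smooth in each d_i
```

Because the surrogate is differentiable everywhere, gradient-based solvers can be applied to the resulting single-level problem.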

  • Azeotropy in a refrigerant system: a useful scenario to test and compare metaheuristics
    by Gustavo Platt, Lucas Lima 
    Abstract: The comparison of metaheuristics has been frequently addressed in recent decades and, to some extent, there is controversy regarding the techniques to be employed in such comparisons. In a multimodal optimization problem, the capability of the algorithm to identify more than one solution (possibly using auxiliary techniques for the identification of multiple solutions) must be considered. Computation time and/or the number of fitness-function evaluations are possible metrics for comparison; on the other hand, the robustness and accuracy of the methodologies are also fundamental quantities. In this work, we present a scenario of several comparisons between two arbitrarily chosen metaheuristics: Differential Evolution and Symbiotic Organisms Search. This scenario consists of a problem characterized by a nonlinear algebraic system (converted into an optimization problem) with two solutions in a narrow range of temperatures: the double azeotrope problem in the refrigerant fluid formed by ammonia + R-125. The statistical analysis of the results indicates that Differential Evolution and Symbiotic Organisms Search exhibit similar performance in the search for the first minimum of the problem. Nevertheless, Differential Evolution outperformed Symbiotic Organisms Search with respect to the capability to identify both minima.
    Keywords: Metaheuristics; Azeotropy; Refrigerant Systems; Thermodynamics.
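    Of the two algorithms compared, Differential Evolution is the more standard; a sketch of the common DE/rand/1/bin variant follows (the abstract does not state which variant or settings were used, so these are assumptions). On a toy bimodal objective with minima at x = ±1, a single run locates only one of the two minima, which is why the abstract stresses the capability to identify both solutions:

```python
import numpy as np

rng = np.random.default_rng(1)

def de(fn, bounds, pop_size=20, F=0.8, CR=0.9, iters=300):
    """DE/rand/1/bin: mutate with a scaled difference of two random
    members, binomially cross with the target, keep the better one."""
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    vals = np.array([fn(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # force at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            fv = fn(trial)
            if fv <= vals[i]:  # greedy one-to-one selection
                pop[i], vals[i] = trial, fv
    k = np.argmin(vals)
    return pop[k], vals[k]

# bimodal toy stand-in: two minima at x = -1 and x = +1
x, v = de(lambda p: (p[0] ** 2 - 1) ** 2,
          (np.array([-3.0]), np.array([3.0])))
```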

  • Binary Whale Optimization: An Effective Swarm Algorithm for Feature Selection
    by Heba Eid 
    Abstract: Feature selection is considered one of the most difficult challenges in machine learning and has attracted many researchers recently. The main disadvantages of feature selection based on classical optimization algorithms are slow convergence and stagnation in local optima, which makes bio-inspired optimization algorithms a reliable alternative for alleviating these drawbacks. In this work, a novel binary version of the Whale Optimization (WO) algorithm is proposed for selecting the optimal feature subset and increasing classification accuracy. The performance of the proposed binary whale optimization (BWO) is verified by comparison with three well-known optimization-based feature selection algorithms (genetic algorithm, ant colony optimization and particle swarm optimization) on nine benchmark datasets. The qualitative and quantitative results show the capability of the proposed BWO to search the feature space for optimal feature combinations. Moreover, the results show that the proposed BWO-based feature selection outperforms the current algorithms on the majority of datasets in terms of both average classification accuracy and convergence speed.
    Keywords: Heuristic algorithm; Bio-Inspired Optimization; Whale Optimization; Binary Whale Optimization (BWO); Feature selection; Particle swarm optimization; Genetic algorithm; Ant Colony Optimization.
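    Binary swarm algorithms for feature selection typically keep the continuous update equations and binarize each position into a 0/1 feature mask through a transfer function. The sketch below uses a sigmoid transfer function and a standard accuracy-versus-subset-size fitness; both are common conventions, labeled here as assumptions, since the abstract does not specify BWO's exact binarization or fitness:

```python
import numpy as np

rng = np.random.default_rng(2)

def binarize(position):
    """Map a continuous position vector to a 0/1 feature mask via a
    sigmoid transfer function: large positive components are selected
    with probability near 1, large negative ones near 0."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

def fitness(mask, acc, alpha=0.99):
    """Common feature-selection fitness (to minimize): trade the
    classification error against the fraction of selected features."""
    return alpha * (1 - acc) + (1 - alpha) * mask.sum() / mask.size

mask = binarize(np.array([2.0, -2.0, 0.0, 3.0]))  # e.g. a 4-feature mask
```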