Int. J. of Wireless and Mobile Computing   »   2017 Vol.12, No.2

 

 

Title: An improved particle swarm-ant colony hybrid algorithm for HMM training

 

Authors: Rong Li; Qing-Shan Zhao

 

Addresses:
Department of Computer, Xinzhou Teachers' University, Xinzhou, Shanxi 034000, China
Department of Computer, Xinzhou Teachers' University, Xinzhou, Shanxi 034000, China

 

Abstract: Traditional parameter estimation methods for Hidden Markov Models (HMMs) tend to become trapped in local optima, are sensitive to initial parameter values and may suffer from over-coupling. To improve the robustness and recognition performance of the model, a novel HMM parameter training method based on an improved particle swarm-ant colony algorithm (IPSAA) is presented. First, an extremum disturbance is added to the particle swarm optimisation (PSO) algorithm, and the parameters of the ant colony algorithm (ACA), such as the stimulating factor, volatilisation coefficient and pheromone, are all adapted dynamically. Second, the fitness values of the particles' historical best solutions found during the PSO coarse search are used to set the initial pheromone distribution for the ACA fine search. Finally, the Baum-Welch (B-W) algorithm is adopted to locally refine the approximate global optimal solution. The new algorithm not only removes B-W's dependence on initial values and its tendency to become trapped in local optima, but also makes full use of the global search ability of IPSAA and the local refinement ability of B-W. Experimental results show that a system using the new algorithm is more efficient, more stable and has better classification performance than one trained with the traditional HMM training algorithm.
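The two-stage structure the abstract describes — a population-based coarse search whose best solution seeds a local refinement step — can be sketched in miniature. The code below is an illustrative toy only, not the authors' IPSAA: it uses a simple quadratic objective in place of the HMM likelihood, plain PSO without the paper's extremum disturbance or the adaptive ACA stage, and a coordinate-wise hill climb standing in for Baum-Welch. All function names and parameter values are assumptions for demonstration.

```python
import random

random.seed(0)

def sphere(x):
    """Toy objective standing in for the (negated) HMM likelihood; minimise."""
    return sum(v * v for v in x)

def pso_coarse_search(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Plain PSO: returns the swarm's best position and value after a coarse search."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

def local_refine(f, x, step=0.1, iters=200):
    """Coordinate-wise hill climb standing in for the Baum-Welch local step:
    only accepts moves that improve f, so it can never worsen its input."""
    x, fx = x[:], f(x)
    for _ in range(iters):
        improved = False
        for d in range(len(x)):
            for delta in (step, -step):
                y = x[:]
                y[d] += delta
                fy = f(y)
                if fy < fx:
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5  # shrink the step once no neighbour improves
    return x, fx

# Coarse global search, then local refinement of its best solution.
coarse, f_coarse = pso_coarse_search(sphere)
fine, f_fine = local_refine(sphere, coarse)
print(f_coarse, f_fine)
```

In the paper, the coarse stage is itself two-phase (PSO followed by ACA seeded with PSO-derived pheromone), and the local stage is Baum-Welch on the HMM parameters; the sketch only shows why the seeding matters — the local step inherits a good starting point instead of depending on random initial values.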

 

Keywords: hidden Markov model; parameter training; improved optimisation algorithm; particle swarm algorithm; ant colony algorithm.

 

DOI: 10.1504/IJWMC.2017.10004967

 

Int. J. of Wireless and Mobile Computing, 2017 Vol.12, No.2, pp.174 - 181

 

Submission date: 16 May 2016
Date of acceptance: 22 Dec 2016
Available online: 08 May 2017

 

 
