Chapter 1: Invited Addresses and Tutorials on Signals, Coding,
  Systems and Intelligent Techniques

Title: Combining backpropagation and genetic algorithms to train neural networks

Author(s): George Papakostas, Yiannis Boutalis, Sofoklis Samartzidis, Dimitrios Karras, Basil Mertzios

Address: Department of Electrical and Computer Engineering, Democritus University of Thrace, Xanthi, Greece (Papakostas, Boutalis, Samartzidis) | Department of Automation, Chalkis Institute of Technology, Chalkis, Greece (Karras) | Department of Automation, Laboratory of Control Systems and Computational Intelligence, Thessaloniki Institute of Technology, Thessaloniki, Greece (Mertzios)

Reference: 12th International Workshop on Systems, Signals and Image Processing, pp. 171-177

Abstract/Summary: This paper presents a comparative study of two possible combinations of Backpropagation (BP) and a Genetic Algorithm (GA) for neural network training. The performance of these hybrid approaches is compared with each other and with each algorithm used separately in the training procedure. Hybrid optimization algorithms arise from the need to tackle difficult optimization problems by combining the advantages of individual methods. The local and global search behaviour of BP and GA is investigated through the presented hybrid structures, which are applied to five popular benchmark problems. It is concluded that a more sophisticated structure, based on the collaboration of two powerful optimization algorithms, can train a typical neural network more efficiently.
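One common GA-then-BP collaboration is to let the GA perform a global search over the weight space and then hand its best individual to gradient-based backpropagation for local refinement. The sketch below illustrates this idea on the XOR benchmark with a tiny 2-2-1 network; it is a generic, minimal reconstruction under assumed hyperparameters (population size, mutation scale, learning rate), not the specific hybrid structures evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR benchmark: a classic toy problem for neural network training studies
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    # 2-2-1 network: W1 (2x2), b1 (2), W2 (2x1), b2 (1) -> 9 parameters
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
    return h, out

def loss(w):
    # mean squared error over the four XOR patterns
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

def grad(w):
    # backpropagation of the MSE loss through the 2-2-1 network above
    W1, b1, W2, b2 = unpack(w)
    h, out = forward(w)
    d_out = 2.0 * (out - y) / len(X) * out * (1 - out)  # sigmoid derivative
    gW2 = h.T @ d_out
    gb2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)                 # tanh derivative
    gW1 = X.T @ d_h
    gb1 = d_h.sum(axis=0)
    return np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def ga_search(pop_size=30, gens=40):
    # simple real-coded GA: tournament selection, blend crossover,
    # Gaussian mutation, and elitism (hyperparameters are assumptions)
    pop = rng.normal(0.0, 1.0, (pop_size, 9))
    for _ in range(gens):
        fit = np.array([loss(ind) for ind in pop])

        def select():
            i, j = rng.integers(pop_size, size=2)
            return pop[i] if fit[i] < fit[j] else pop[j]

        new = [pop[fit.argmin()].copy()]           # keep the best individual
        while len(new) < pop_size:
            alpha = rng.random()
            child = alpha * select() + (1 - alpha) * select()  # blend crossover
            child = child + rng.normal(0.0, 0.1, 9)            # mutation
            new.append(child)
        pop = np.array(new)
    fit = np.array([loss(ind) for ind in pop])
    return pop[fit.argmin()]

def bp_refine(w, lr=0.1, steps=2000):
    # plain gradient-descent backpropagation, fine-tuning the GA solution
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_ga = ga_search()        # GA: global exploration of the weight space
w_hybrid = bp_refine(w_ga)  # BP: local refinement from the GA's best point
```

The division of labour reflects the local/global behaviour studied in the paper: the GA explores broadly and is unlikely to get trapped in poor regions, while BP exploits gradient information to converge quickly once a promising basin has been found.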
