Full Citation and Abstract
|
Title: |
Combining backpropagation and genetic algorithms to train neural networks |
|
Author(s): |
George Papakostas, Yiannis Boutalis, Sofoklis Samartzidis, Dimitrios Karras, Basil Mertzios |
|
Address: |
Department of Electrical and Computer Engineering,
Democritus University of Thrace, Xanthi, Greece
Department of Automation
Chalkis Institute of Technology, Chalkis, Greece
Department of Automation, Laboratory of Control Systems and Comp. Intell.
Thessaloniki Institute of Technology, Thessaloniki, Greece
gpapakos@ee.duth.gr, ybout@ee.duth.gr, ssamrt@ee.duth.gr, dakarras@teihal.gr, mertzios@uom.gr |
|
Reference: |
SSIP-SP1, 2005, pp. 171-177 |
|
Abstract/Summary |
In the present paper, a comparative study of two possible combinations of the Backpropagation (BP) algorithm and a Genetic Algorithm (GA) for neural network training is performed. The performance of these hybrid approaches is compared to each other and to each algorithm applied separately in the training procedure. The construction of hybrid optimization algorithms originates from the need to tackle difficult optimization problems by combining the advantages of the constituent methods. The local and global search behaviour of BP and GA is investigated through the presented hybrid structures by applying them to five popular benchmark problems. It is concluded that a more sophisticated structure, based on the collaboration of two powerful optimization algorithms, can train a typical neural network more efficiently. |
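The hybrid idea described in the abstract — a GA performing global search over the weight space, followed by gradient-based local refinement in the spirit of BP — can be illustrated with a minimal sketch. The example below is not the authors' method: it substitutes a toy multimodal objective (the Rastrigin function) for an actual neural network loss, and plain gradient descent for backpropagation, purely to show the global-then-local division of labour. All parameter choices (population size, mutation rate, learning rate) are illustrative assumptions.

```python
import math
import random

def loss(w):
    # Rastrigin function: many local minima, global minimum 0 at the origin.
    # Stands in here for a neural network's error surface.
    return sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in w)

def grad(w):
    # Analytic gradient of the Rastrigin function.
    return [2 * x + 20 * math.pi * math.sin(2 * math.pi * x) for x in w]

def ga_search(pop_size=40, gens=60, dim=2, seed=0):
    # Global phase: a simple GA with truncation selection,
    # arithmetic crossover, and Gaussian mutation.
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=loss)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
            if rng.random() < 0.3:                       # mutation
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.5)
            children.append(child)
        pop = elite + children
    return min(pop, key=loss)

def gd_refine(w, lr=0.001, steps=2000):
    # Local phase: gradient descent, playing the role that
    # backpropagation plays when the objective is a network loss.
    w = list(w)
    for _ in range(steps):
        g = grad(w)
        w = [x - lr * gx for x, gx in zip(w, g)]
    return w

w_ga = ga_search()       # GA locates a promising basin
w_final = gd_refine(w_ga)  # gradient descent refines within it
print("GA loss:", loss(w_ga), "-> refined loss:", loss(w_final))
```

The design choice mirrors the paper's premise: the GA is insensitive to local minima but slow to converge precisely, while the gradient step converges quickly but only locally, so chaining them combines their advantages.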
|
|
|