Authors: M. Sornam; P. Thangavel
Addresses: Department of Computer Science, University of Madras, Chepauk, Chennai 600005, Tamil Nadu, India (both authors)
Abstract: An improved optical backpropagation (OBP) algorithm with a third term, for training single-hidden-layer feedforward neural networks, is proposed. The major limitations of the backpropagation algorithm are the local minima problem and its slow rate of convergence. To address these problems, we propose an algorithm that introduces a third term into optical backpropagation (OBPWT). The method is applied to a multilayer neural network to improve efficiency in terms of convergence speed. In the proposed algorithm, a non-linear function is applied to the error term before the backpropagation phase, and this transformed error is then used together with a third term in the weight update rule. We show that the proposed algorithm drastically accelerates training convergence while maintaining the neural network's performance. Its effectiveness is demonstrated on five benchmark problems; the simulation results show that the proposed algorithm speeds up learning and hence the rate of convergence.
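The abstract's two ingredients can be sketched in code: a non-linear amplification of the output error before the backward pass, and a weight update with a third term alongside the usual gradient and momentum terms. The sketch below is a minimal illustration under stated assumptions, not the paper's exact equations: the amplification form sign(e)·(1 + exp(e²)) and the proportional third term follow the common OBP and three-term backpropagation literature, and all network sizes and hyperparameters are illustrative.

```python
import numpy as np

# Hedged sketch of the abstract's idea: (1) OBP replaces the raw output
# error e = d - o with a non-linearly amplified version before the
# backward pass; (2) the weight update adds a third term on top of the
# usual gradient and momentum terms.  The exact functional forms here
# are assumptions drawn from the OBP / three-term backpropagation
# literature, not taken from the paper.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def obp_error(e):
    """Assumed OBP amplification: sign(e) * (1 + exp(e**2))."""
    return np.sign(e) * (1.0 + np.exp(e ** 2))

# Toy problem: XOR, single hidden layer 2 -> 4 -> 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)

# learning rate, momentum factor, third-term (proportional) factor
eta, alpha, gamma = 0.05, 0.5, 0.005
dW1_prev = np.zeros_like(W1); dW2_prev = np.zeros_like(W2)

losses = []
for epoch in range(3000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    e = T - O
    losses.append(float(np.mean(e ** 2)))

    # backward pass with the amplified (OBP) output error
    delta_o = obp_error(e) * O * (1 - O)
    delta_h = (delta_o @ W2.T) * H * (1 - H)

    # plain-error deltas, used here as the assumed proportional third term
    delta_o_plain = e * O * (1 - O)
    delta_h_plain = (delta_o_plain @ W2.T) * H * (1 - H)

    # three-term update: gradient + momentum + proportional term
    dW2 = eta * H.T @ delta_o + alpha * dW2_prev + gamma * H.T @ delta_o_plain
    dW1 = eta * X.T @ delta_h + alpha * dW1_prev + gamma * X.T @ delta_h_plain

    W2 += dW2; b2 += eta * delta_o.sum(axis=0)
    W1 += dW1; b1 += eta * delta_h.sum(axis=0)
    dW2_prev, dW1_prev = dW2, dW1

print(f"initial MSE {losses[0]:.4f}, final MSE {losses[-1]:.4f}")
```

Because the amplified error stays near ±2 even for small residuals, this transform keeps the update signal strong late in training, which is the intuition behind OBP's faster convergence; the proportional third term plays the role of the extra factor the abstract adds to the update rule.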
Keywords: optical backpropagation; multilayer neural networks; weight updation; nonlinear function; third term; convergence speed.
International Journal of Artificial Intelligence and Soft Computing, 2011 Vol.2 No.4, pp.321 - 333
Available online: 27 Sep 2011