Title: A simple way to enhance the efficiency of Nguyen-Widrow method in neural networks initialisation

Authors: Faiza Saadi; Ahmed Chibat

Addresses: Applied Mathematics and Modeling Laboratory, Department of Mathematics, University of Constantine 1, 25000 Constantine, Algeria; Deceased, formerly of: University of Constantine 1, Algeria

Abstract: The choice of the initial values of neural network parameters is of crucial importance for the conduct and successful completion of training. The best-known and most widely used method for this task is the Nguyen-Widrow method. It brings a well-established advantage over the traditional method of purely random choice. However, this advantage is not always guaranteed. A hidden defect can appear in some situations, leading to a deterioration in training quality. In this work, the existence of this hidden defect is revealed and a way to remedy it is proposed. We show how a simple procedure, built on conditional resets, eliminates unsuitable starting conditions and ensures consistently good training quality. In this way, the search for the optimal architecture of a network for any given problem becomes faster and more reliable.
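For context, the standard Nguyen-Widrow initialisation that the abstract builds on can be sketched as follows. This is a minimal illustration of the baseline method for a single hidden layer, not the authors' enhanced procedure; the function name and signature are ours, and it assumes inputs normalised to [-1, 1].

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Baseline Nguyen-Widrow initialisation for one hidden layer.

    Weights are first drawn uniformly in [-1, 1]; each hidden unit's
    weight vector is then rescaled to magnitude
    beta = 0.7 * n_hidden**(1 / n_inputs), and biases are drawn
    uniformly in [-beta, beta].
    """
    rng = np.random.default_rng(rng)
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)
    # Random directions, then normalise each row to length beta.
    w = rng.uniform(-1.0, 1.0, size=(n_hidden, n_inputs))
    w *= beta / np.linalg.norm(w, axis=1, keepdims=True)
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b
```

The rescaling spreads the hidden units' active regions across the input domain, which is the advantage over purely random initialisation that the abstract refers to; the paper's contribution is a conditional-reset step applied on top of such a starting point.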

Keywords: neural networks; weight initialisation; function approximation; regression.

DOI: 10.1504/IJMOR.2022.122804

International Journal of Mathematics in Operational Research, 2022 Vol.21 No.4, pp.497-514

Received: 25 Oct 2020
Accepted: 14 Feb 2021

Published online: 12 May 2022
