Title: A simple and practical review of over-fitting in neural network learning

Authors: Oyebade K. Oyedotun; Ebenezer O. Olaniyi; Adnan Khashman

Addresses: Interdisciplinary Centre for Security, Reliability and Trust (SnT), University of Luxembourg, Luxembourg; European Centre of Research and Academic Affairs (ECRAA), Lefkosa, Mersin 10, North Cyprus. European Centre of Research and Academic Affairs (ECRAA), Lefkosa, Mersin 10, North Cyprus; Faculty of Engineering, Adeleke University, Ede, Osun State, Nigeria. European Centre of Research and Academic Affairs (ECRAA), Lefkosa, Mersin 10, North Cyprus; Final International University, Girne, Mersin 10, Turkey.

Abstract: Training a neural network involves the adaptation of its internal parameters for modelling a specific task. The states of the internal parameters during training describe how much experiential knowledge the model has acquired. Although it is desirable that a trained neural network achieves zero classification error on the training examples while tuning its internal parameters for a task, the amount of generalisation power that is lost while enforcing such a learning constraint on the model is an important consideration. In this paper, we review from a practical perspective the consequences of enforcing such a learning constraint, which results in a model that has essentially 'memorised' the training data rather than learned a smooth mapping function. In addition, we investigate how the curse of dimensionality relates to such a learning constraint. For our experiments, we consider handwritten character recognition applications using publicly available datasets.
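The trade-off the abstract describes can be observed directly by tracking training and test accuracy as a network is trained for longer. The sketch below is illustrative only and is not the authors' experimental setup: the dataset (scikit-learn's bundled handwritten digits), the network width, the disabled regularisation (alpha=0.0), and the iteration counts are all assumptions chosen to make the memorisation effect visible.

```python
# Illustrative sketch: pushing training error toward zero on a handwritten-digit
# task while monitoring generalisation (test accuracy). Not the paper's setup.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train progressively longer: as training accuracy approaches 1.0, the model
# may begin to 'memorise' the training set while test accuracy stalls or drops.
for max_iter in (10, 50, 200, 1000):
    mlp = MLPClassifier(hidden_layer_sizes=(512,),  # deliberately over-sized
                        alpha=0.0,                  # L2 regularisation off
                        max_iter=max_iter, random_state=0)
    mlp.fit(X_train, y_train)
    print(f"iters={max_iter:4d}  "
          f"train acc={mlp.score(X_train, y_train):.3f}  "
          f"test acc={mlp.score(X_test, y_test):.3f}")
```

Comparing the two accuracy columns across rows gives a concrete view of the generalisation power lost as the zero-training-error constraint is approached.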

Keywords: neural network; memorisation; over-fitting; generalisation.

DOI: 10.1504/IJAPR.2017.089384

International Journal of Applied Pattern Recognition, 2017 Vol.4 No.4, pp.307 - 328

Received: 27 Oct 2016
Accepted: 11 May 2017

Published online: 22 Jan 2018
