A simple and practical review of over-fitting in neural network learning
by Oyebade K. Oyedotun; Ebenezer O. Olaniyi; Adnan Khashman
International Journal of Applied Pattern Recognition (IJAPR), Vol. 4, No. 4, 2017

Abstract: Training a neural network involves adapting its internal parameters to model a specific task. The states of the internal parameters during training describe how much experiential knowledge the model has acquired. Although it is desirable that a trained neural network achieves zero classification error on the training examples while tuning its internal parameters for a task, the amount of generalisation power lost in enforcing such a learning constraint on the model is quite important. In this paper, we review from a practical perspective the consequences of enforcing such a learning constraint, which results in a model that has failed to learn a smooth mapping function and has essentially 'memorised' the training data. In addition, we investigate how the curse of dimensionality relates to such a learning constraint. For our experiments, we consider handwritten character recognition applications using publicly available datasets.
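The trade-off the abstract describes can be illustrated without the full text. The following is a minimal sketch (not the paper's experiment, which uses neural networks on handwritten character datasets): a high-degree polynomial stands in for an over-parameterised model, driving training error to essentially zero on a few noisy points while test error on the underlying function grows. All data, degrees, and function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task: noisy samples of y = sin(x).
x_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(x_train) + rng.normal(0.0, 0.2, x_train.size)
x_test = np.linspace(0.1, 2.9, 50)
y_test = np.sin(x_test)  # noise-free target for measuring generalisation

def fit_and_errors(degree):
    """Least-squares polynomial fit; returns (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

# A modest-capacity model vs. one with enough capacity to
# interpolate all 10 training points ('memorisation').
train_lo, test_lo = fit_and_errors(3)
train_hi, test_hi = fit_and_errors(9)
print(f"degree 3: train={train_lo:.4f} test={test_lo:.4f}")
print(f"degree 9: train={train_hi:.4f} test={test_hi:.4f}")
```

The degree-9 fit achieves near-zero training error, mirroring the "zero classification error" constraint discussed in the abstract, yet its test error exceeds its training error by orders of magnitude because the fitted curve oscillates between the memorised points rather than following a smooth mapping.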

Online publication date: Mon, 22-Jan-2018
