MLP neural network using constructive training algorithm: application to face recognition and facial expression recognition
by Hayet Boughrara; Mohamed Chtourou; Chokri Ben Amar
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 16, No. 1, 2017

Abstract: This paper presents a constructive training algorithm applied to face recognition and facial expression recognition. The multilayer perceptron (MLP) neural network has a single hidden layer with a predefined number of neurons and is trained initially on a small number of training patterns. During learning, the number of hidden neurons is incremented whenever the mean square error (MSE) on the training data (TD) does not reach a predefined value. Input patterns are learned incrementally until all patterns of the TD have been presented. The proposed algorithm determines the synthesis parameters: the number of patterns from each class to be presented initially in the training step, the initial number of hidden neurons, the number of iterations, and the MSE threshold. The feature extraction stage is based on perceived facial images and the Gabor filter. Compared with results from the literature and with a fixed MLP architecture, experimental results demonstrate the efficiency of the proposed approach.
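The core loop described in the abstract — grow the hidden layer whenever the MSE target is missed — can be sketched as below. This is a minimal illustration, not the authors' exact procedure: the network sizes, learning rate, thresholds, and the XOR toy data are all assumptions, and the incremental presentation of training patterns is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SingleHiddenMLP:
    """One-hidden-layer MLP with sigmoid units, trained by batch
    gradient descent on the MSE loss (illustrative sketch)."""

    def __init__(self, n_in, n_out, n_hidden):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.H = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.H @ self.W2 + self.b2)

    def train_epoch(self, X, T, lr=0.5):
        Y = self.forward(X)
        # Backpropagation for sigmoid activations and MSE loss.
        dY = (Y - T) * Y * (1.0 - Y)
        dH = (dY @ self.W2.T) * self.H * (1.0 - self.H)
        n = len(X)
        self.W2 -= lr * self.H.T @ dY / n
        self.b2 -= lr * dY.mean(axis=0)
        self.W1 -= lr * X.T @ dH / n
        self.b1 -= lr * dH.mean(axis=0)
        return float(np.mean((Y - T) ** 2))

def constructive_train(X, T, mse_target=0.02, max_epochs=3000,
                       n_hidden=2, max_hidden=20):
    """Increment the hidden-neuron count until the MSE on the
    training data reaches the predefined target (or a cap is hit)."""
    while n_hidden <= max_hidden:
        net = SingleHiddenMLP(X.shape[1], T.shape[1], n_hidden)
        for _ in range(max_epochs):
            mse = net.train_epoch(X, T)
            if mse <= mse_target:
                return net, n_hidden, mse
        n_hidden += 1  # MSE target missed: grow the hidden layer
    return net, n_hidden - 1, mse

# Toy usage on XOR, a problem that requires hidden units at all.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
net, hidden, mse = constructive_train(X, T)
```

In this sketch the network is re-initialised at each growth step; a closer rendering of a constructive algorithm could instead keep the trained weights and append one freshly initialised neuron, which is a design choice the abstract leaves open.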

Online publication date: Thu, 29-Dec-2016
