Title: MLP neural network using constructive training algorithm: application to face recognition and facial expression recognition

Authors: Hayet Boughrara; Mohamed Chtourou; Chokri Ben Amar

Addresses: Control and Energy Management Laboratory, University of Sfax, ENIS, BP. 1173, 3038 Sfax, Tunisia; Control and Energy Management Laboratory, University of Sfax, ENIS, BP. 1173, 3038 Sfax, Tunisia; REGIM-Lab.: Research Groups in Intelligent Machines, University of Sfax, ENIS, BP. 1173, 3038 Sfax, Tunisia

Abstract: This paper presents a constructive training algorithm applied to face recognition and facial expression recognition. The multilayer perceptron (MLP) neural network starts with a single hidden layer containing a predefined number of neurons and a small number of training patterns. During learning, the number of hidden neurons is incremented whenever the mean square error (MSE) on the training data (TD) does not reach a predefined value. Input patterns are learned incrementally until all patterns of the TD have been presented. The proposed algorithm determines synthesis parameters such as the number of patterns from each class to be presented initially in the training step, the initial number of hidden neurons, the number of iterations, and the MSE threshold. The feature extraction stage is based on perceived facial images (PFI) and Gabor filters. Experimental results demonstrate the efficiency of the proposed approach compared with results reported in the literature and with a fixed MLP architecture.
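The constructive scheme outlined in the abstract (grow the hidden layer when the MSE target is missed, then admit more training patterns) can be illustrated with a short script. The following is a minimal sketch, assuming a NumPy implementation with sigmoid activations and plain batch back-propagation; the function names, learning rate, chunk size, and thresholds (mse_goal, max_hidden, etc.) are illustrative assumptions, not the authors' exact settings.

import numpy as np

rng = np.random.default_rng(0)

def init_mlp(n_in, n_hidden, n_out):
    """Single-hidden-layer MLP with sigmoid activations."""
    return {
        "W1": rng.normal(0, 0.1, (n_hidden, n_in)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_out, n_hidden)),
        "b2": np.zeros(n_out),
    }

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(net, X):
    H = sigmoid(X @ net["W1"].T + net["b1"])   # hidden activations
    Y = sigmoid(H @ net["W2"].T + net["b2"])   # output activations
    return H, Y

def train_epochs(net, X, T, lr=0.5, epochs=200):
    """Plain batch back-propagation; returns the final MSE."""
    for _ in range(epochs):
        H, Y = forward(net, X)
        dY = (Y - T) * Y * (1 - Y)             # output-layer deltas
        dH = (dY @ net["W2"]) * H * (1 - H)    # hidden-layer deltas
        net["W2"] -= lr * dY.T @ H / len(X)
        net["b2"] -= lr * dY.mean(axis=0)
        net["W1"] -= lr * dH.T @ X / len(X)
        net["b1"] -= lr * dH.mean(axis=0)
    return np.mean((forward(net, X)[1] - T) ** 2)

def add_hidden_neuron(net, n_in):
    """Grow the hidden layer by one randomly initialised unit."""
    net["W1"] = np.vstack([net["W1"], rng.normal(0, 0.1, (1, n_in))])
    net["b1"] = np.append(net["b1"], 0.0)
    net["W2"] = np.hstack([net["W2"],
                           rng.normal(0, 0.1, (net["W2"].shape[0], 1))])
    return net

def constructive_train(X, T, n_hidden0=2, mse_goal=0.01,
                       chunk=5, max_hidden=50):
    """Present patterns incrementally; add a hidden neuron whenever
    the MSE goal is not reached on the current training subset."""
    n_in, n_out = X.shape[1], T.shape[1]
    net = init_mlp(n_in, n_hidden0, n_out)
    n_used = chunk
    while n_used <= len(X):
        mse = train_epochs(net, X[:n_used], T[:n_used])
        if mse > mse_goal and net["b1"].size < max_hidden:
            net = add_hidden_neuron(net, n_in)   # grow, then retrain
        else:
            n_used += chunk                      # admit more patterns
    return net

As a usage illustration, calling constructive_train on a matrix X of Gabor-derived feature vectors with one-hot class labels T would return a trained network whose hidden-layer size was selected by the data rather than fixed in advance, which is the contrast with the fixed MLP architecture drawn in the abstract.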

Keywords: constructive training algorithm; MLP neural networks; multilayer perceptron; back-propagation neural networks; face recognition; facial expressions; expression recognition; PFI; perceived facial images; Gabor filters; biometrics; mean square error; MSE.

DOI: 10.1504/IJISTA.2017.081316

International Journal of Intelligent Systems Technologies and Applications, 2017, Vol. 16, No. 1, pp.53-79

Received: 19 Nov 2015
Accepted: 12 Jun 2016

Published online: 04 Jan 2017
