Title: Stacked autoencoder for Arabic handwriting word recognition

Authors: Samir Benbakreti; Mohamed Benouis; Ahmed Roumane; Soumia Benbakreti

Addresses: Department of Speciality, National Institute of Telecommunication and ICT, Oran, Algeria; Computer Science Department, University of M'sila, M'sila, Algeria; Department of Speciality, National Institute of Telecommunication and ICT, Oran, Algeria; Laboratory of Mathematics, University of Sidi Bel Abbes, Sidi Bel Abbes, Algeria

Abstract: Arabic handwriting recognition systems face several challenges, such as highly diverse scripting styles, the presence of pseudo-words, and the position-dependent shape of a character within a given word. These characteristics complicate the task of feature extraction. Our proposed solution to this problem is a stacked autoencoder (SAE) unsupervised learning approach applied to unconstrained Arabic handwritten word recognition. Our strategy consists of an unsupervised pre-training stage, in which the SAE extracts features layer by layer; then, through fine-tuning, the complete network is used for classification. This gives our system the advantage of a holistic approach, i.e., recognition without word segmentation. To train our model, we enhanced the NOUN v3 hybrid (i.e., offline and online) database, which contains 9,600 handwritten Arabic words and 4,800 characters. This work focuses on the offline recognition of Arabic handwritten words using an SAE-based architecture for image classification. Our experimental study shows that, after careful tuning of the main SAE parameters, we achieved a recognition rate of 98.03%.
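The pre-training scheme described in the abstract can be sketched as follows. This is a minimal, self-contained illustration of greedy layer-wise SAE pre-training, not the authors' implementation: the layer sizes, learning rate, and the toy binary "images" standing in for word images are all assumptions made for the example, and the supervised fine-tuning stage is only indicated by where the final features would feed a classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Autoencoder:
    """Single-hidden-layer autoencoder trained by gradient descent on MSE."""
    def __init__(self, n_in, n_hidden, lr=0.5):
        self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))   # encoder weights
        self.b = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_in))  # decoder weights
        self.b2 = np.zeros(n_in)
        self.lr = lr

    def encode(self, X):
        return sigmoid(X @ self.W + self.b)

    def train(self, X, epochs=200):
        n = len(X)
        for _ in range(epochs):
            H = self.encode(X)
            R = sigmoid(H @ self.W2 + self.b2)      # reconstruction of the input
            dR = (R - X) * R * (1.0 - R)            # output-layer delta (MSE)
            dH = (dR @ self.W2.T) * H * (1.0 - H)   # hidden-layer delta
            self.W2 -= self.lr * H.T @ dR / n
            self.b2 -= self.lr * dR.mean(axis=0)
            self.W  -= self.lr * X.T @ dH / n
            self.b  -= self.lr * dH.mean(axis=0)
        R = sigmoid(self.encode(X) @ self.W2 + self.b2)
        return float(np.mean((R - X) ** 2))         # final reconstruction error

# Toy data: 100 binary vectors of length 64, stand-ins for word images.
X = (rng.random((100, 64)) > 0.5).astype(float)

# Greedy layer-wise pre-training: each autoencoder learns to reconstruct
# the previous layer's activations, building the feature stack.
ae1 = Autoencoder(64, 32)
loss1 = ae1.train(X)
H1 = ae1.encode(X)

ae2 = Autoencoder(32, 16)
loss2 = ae2.train(H1)
features = ae2.encode(H1)  # 16-dim codes; a classifier head would be
                           # attached here and the whole stack fine-tuned
print(features.shape)
```

After pre-training, the decoder halves are discarded and the stacked encoders plus a softmax output layer are fine-tuned with labels, which is the supervised stage the abstract refers to.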

Keywords: Arabic handwriting; offline recognition; neural network; deep learning; unsupervised training; stacked autoencoder; SAE.

DOI: 10.1504/IJCSE.2021.119988

International Journal of Computational Science and Engineering, 2021 Vol.24 No.6, pp.629 - 638

Received: 11 May 2020
Accepted: 12 Mar 2021

Published online: 04 Jan 2022
