Enhanced structural perceptual feature extraction model for Arabic literal amount recognition Online publication date: Mon, 15-Aug-2016
by Qais Al-Nuzaili; Siti Z. Mohd. Hashim; Faisal Saeed; Mohammed Sayim Khalil; Dzulkifli Bin Mohamad
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 15, No. 3, 2016
Abstract: An important application of document recognition is bank cheque processing, in particular the recognition of the literal (worded) amount. Compared with Latin and Chinese systems, relatively few studies have addressed Arabic bank cheque processing. The Arabic script has a number of characteristics that make it unique among scripts. Since humans are the best pattern recognisers, features modelled on what a human detects while reading the script can yield better recognition rates. Therefore, proposing human-reading-inspired features (called perceptual features) can address the technical challenges specific to Arabic literal amount recognition. In this paper, an enhanced structural perceptual feature extraction model (PFM) is proposed. Two main groups of features, the components and dots features and the loops and character-shape features, were combined to construct the PFM. The model was evaluated on the standard Arabic Handwriting DataBase (AHDB) dataset, and the PFM outperformed the results reported in previous studies.
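The two structural feature groups named in the abstract, components and dots on the one hand and loops on the other, can be illustrated with a minimal sketch on a tiny binarised word image (1 = ink, 0 = paper). This is an illustrative reconstruction under assumed conventions (8-connected ink components, loops counted as background holes via flood fill), not the authors' PFM implementation; the helper names `count_components` and `count_loops` are hypothetical.

```python
from collections import deque

def count_components(img):
    """Count 8-connected ink components (sub-words and dots together)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] == 1 and not seen[y][x]:
                count += 1
                # Flood-fill this component so it is counted once.
                q = deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if 0 <= ny < h and 0 <= nx < w \
                                    and img[ny][nx] == 1 and not seen[ny][nx]:
                                seen[ny][nx] = True
                                q.append((ny, nx))
    return count

def count_loops(img):
    """Count loops as 4-connected background regions unreachable from the border."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    q = deque()
    # Seed the flood fill from every background pixel on the image border.
    for y in range(h):
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) \
                    and img[y][x] == 0 and not seen[y][x]:
                seen[y][x] = True
                q.append((y, x))
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    while q:
        cy, cx = q.popleft()
        for dy, dx in steps:
            ny, nx = cy + dy, cx + dx
            if 0 <= ny < h and 0 <= nx < w \
                    and img[ny][nx] == 0 and not seen[ny][nx]:
                seen[ny][nx] = True
                q.append((ny, nx))
    # Any background left unvisited is enclosed by ink: a loop.
    holes = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] == 0 and not seen[y][x]:
                holes += 1
                q = deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    for dy, dx in steps:
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] == 0 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return holes
```

On a toy image containing one ring-shaped stroke and one separate dot, `count_components` reports two ink components and `count_loops` reports one enclosed loop; real perceptual features would record such counts per word image alongside the character-shape descriptors.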