Gain parameter and dropout-based fine tuning of deep networks
by M. Arif Wani; Saduf Afzal
International Journal of Intelligent Information and Database Systems (IJIIDS), Vol. 11, No. 4, 2018

Abstract: Training of deep neural networks can involve two phases: unsupervised pre-training and supervised fine tuning. Unsupervised pre-training learns initial parameter values for the deep network, while supervised fine tuning improves upon what was learned in the pre-training stage. The backpropagation algorithm can be used for supervised fine tuning of deep neural networks. In this paper we evaluate the backpropagation with gain parameter algorithm for fine tuning of deep networks. We further propose a modification that integrates the backpropagation with gain parameter algorithm with the dropout technique, and we evaluate its performance in fine tuning of deep networks. The effectiveness of the fine tuning performed by the proposed technique is also compared with that of other variants of the backpropagation algorithm on benchmark datasets. The experimental results show that fine tuning deep networks with the proposed technique yields the most promising results among all the studied methods on the tested datasets.
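The abstract combines two ideas: a backpropagation variant in which the activation function carries a gain (slope) parameter that is itself adapted during training, and dropout applied during fine tuning. Since the paper's exact update rules are not reproduced here, the following is only an illustrative numpy sketch of that combination under stated assumptions: a logistic activation f(z) = 1/(1 + e^(-cz)) whose shared gain c is updated by gradient descent alongside the weights, with inverted dropout on the hidden layer. The toy XOR task, network size, learning rates, and the clipping guard on the gain are all assumptions for the demonstration, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z, gain):
    """Logistic activation with a gain (slope) parameter c: f(z) = 1 / (1 + exp(-c*z))."""
    return 1.0 / (1.0 + np.exp(-gain * z))

# Toy fine-tuning problem: XOR with a 2-8-1 network.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0.0, 0.5, (2, 8))    # stand-ins for pre-trained weights
W2 = rng.normal(0.0, 0.5, (8, 1))
gain = 1.0                            # shared gain parameter, updated below
lr, lr_gain, p_keep = 0.5, 0.01, 0.8  # assumed hyperparameters

losses = []
for _ in range(2000):
    # Forward pass with inverted dropout on the hidden layer.
    z1 = X @ W1
    h = sigmoid(z1, gain)
    mask = (rng.random(h.shape) < p_keep) / p_keep
    h_d = h * mask
    z2 = h_d @ W2
    out = sigmoid(z2, gain)

    err = out - y
    losses.append(float(np.mean(err ** 2)))

    # Backward pass. For the gain sigmoid: df/dz = c*f*(1-f), df/dc = z*f*(1-f).
    g_out = err * gain * out * (1.0 - out)   # gradient at output pre-activation
    dW2 = h_d.T @ g_out
    up_h = (g_out @ W2.T) * mask             # gradient reaching hidden activations
    g_h = up_h * gain * h * (1.0 - h)        # gradient at hidden pre-activation
    dW1 = X.T @ g_h

    # Gradient with respect to the gain parameter, summed over both layers.
    d_gain = float(np.sum(err * out * (1.0 - out) * z2)
                   + np.sum(up_h * h * (1.0 - h) * z1))

    W2 -= lr * dW2
    W1 -= lr * dW1
    gain -= lr_gain * d_gain
    gain = float(np.clip(gain, 0.1, 5.0))    # stability guard (an assumption)

# Evaluate without dropout (the mask is a train-time device only).
final_mse = float(np.mean((sigmoid(sigmoid(X @ W1, gain) @ W2, gain) - y) ** 2))
```

The inverted-dropout scaling (dividing the mask by p_keep) keeps the expected hidden activation unchanged, so no rescaling is needed at evaluation time.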

Online publication date: Tue, 04-Dec-2018

The full text of this article is only available to individual subscribers or to users at subscribing institutions.
