Characteristics of contrastive Hebbian learning with pseudorehearsal for multilayer neural networks on reduction of catastrophic forgetting
by Motonobu Hattori; Shunta Nakano
International Journal of Computational Intelligence Studies (IJCISTUDIES), Vol. 7, No. 3/4, 2018

Abstract: Neural networks suffer serious catastrophic forgetting, or catastrophic interference, when information is learned sequentially. One method that can reduce catastrophic forgetting is pseudorehearsal, in which pseudopatterns are learned alongside the training patterns. This method has shown superior performance for multilayer neural networks trained by the backpropagation algorithm. However, the backpropagation algorithm is biologically implausible because it requires error signals to be propagated backward from output neurons to input neurons; that is, the learning cannot be executed locally. In contrast, contrastive Hebbian learning (CHL) is a learning method based on the Hebbian rule for synaptic weight changes. Since Hebbian learning can be performed locally between two neurons and does not need to take into account error information computed at the output neurons, it is much more biologically plausible than the backpropagation algorithm. In this paper, we examine the characteristics of multilayer neural networks trained by CHL with pseudorehearsal when information is presented sequentially, and show how catastrophic forgetting can be reduced.
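The pseudorehearsal idea described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: for self-containment it uses a tiny two-layer network trained by gradient descent (a stand-in for the CHL training the paper studies), and all network sizes, tasks, and learning-rate values are made up for illustration. The key step, `make_pseudoitems`, generates random inputs, passes them through the current network, and records the network's own outputs as pseudo targets; these pseudoitems are then rehearsed together with the new task's patterns.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Net:
    """Tiny 2-layer sigmoid network (sizes chosen arbitrarily for the sketch)."""
    def __init__(self, n_in=4, n_hid=8, n_out=2):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))

    def forward(self, x):
        h = sigmoid(x @ self.W1)
        return sigmoid(h @ self.W2), h

    def train(self, X, T, lr=0.5, epochs=2000):
        # Plain gradient-descent updates, standing in for CHL so the
        # sketch stays short and self-contained.
        for _ in range(epochs):
            y, h = self.forward(X)
            d2 = (y - T) * y * (1.0 - y)
            d1 = (d2 @ self.W2.T) * h * (1.0 - h)
            self.W2 -= lr * h.T @ d2 / len(X)
            self.W1 -= lr * X.T @ d1 / len(X)

def make_pseudoitems(net, n_items, n_in):
    # Pseudorehearsal: random binary inputs are fed through the *current*
    # network, and its outputs become the targets of the pseudopatterns.
    Xp = rng.integers(0, 2, (n_items, n_in)).astype(float)
    Tp, _ = net.forward(Xp)
    return Xp, Tp

# Two hypothetical tasks learned sequentially.
XA = np.array([[0, 0, 1, 1], [1, 1, 0, 0]], float)
TA = np.array([[1, 0], [0, 1]], float)
XB = np.array([[1, 0, 1, 0], [0, 1, 0, 1]], float)
TB = np.array([[1, 1], [0, 0]], float)

net = Net()
net.train(XA, TA)                         # learn task A
Xp, Tp = make_pseudoitems(net, 8, 4)      # snapshot of the learned function
# Rehearse pseudoitems while learning task B, reducing forgetting of A.
net.train(np.vstack([XB, Xp]), np.vstack([TB, Tp]))
yA, _ = net.forward(XA)
```

Without the pseudoitems, the second `train` call would overwrite the weights that encode task A; mixing them in pulls the network back toward its previous input-output function at each update.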

Online publication date: Thu, 15-Nov-2018
