Title: Characteristics of contrastive Hebbian learning with pseudorehearsal for multilayer neural networks on reduction of catastrophic forgetting
Authors: Motonobu Hattori; Shunta Nakano
Addresses: Faculty of Interdisciplinary Research, University of Yamanashi, 4-3-11 Takeda, Kofu, Yamanashi 400-8511, Japan; Integrated Graduate School of Medicine, Engineering and Agricultural Sciences, University of Yamanashi, 4-3-11 Takeda, Kofu, Yamanashi 400-8511, Japan
Abstract: Neural networks suffer serious catastrophic forgetting, also called catastrophic interference, when information is learned sequentially. One method that can reduce catastrophic forgetting is pseudorehearsal, in which pseudopatterns are learned together with the training patterns. This method has shown superior performance for multilayer neural networks trained by the backpropagation algorithm. However, backpropagation is biologically implausible because it requires error signals to be passed backward from the output neurons to the input neurons; that is, learning cannot be executed locally. Contrastive Hebbian learning (CHL), on the other hand, changes synaptic weights using the Hebbian rule. Since Hebbian learning is performed locally between two neurons and does not require error information computed at the output neurons, it is far more biologically plausible than backpropagation. In this paper, we examine the characteristics of multilayer neural networks trained by CHL with pseudorehearsal when information is presented sequentially, and show how catastrophic forgetting can be reduced.
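The pseudorehearsal idea described in the abstract can be sketched as follows: probe a previously trained network with random inputs, record its responses as pseudopatterns, and rehearse those pairs alongside the new task's patterns. This is a minimal illustrative sketch, not the paper's implementation; the network sizes, the `TinyMLP` class, and the uniform probe distribution are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyMLP:
    """Hypothetical two-layer network standing in for the trained model."""
    def __init__(self, n_in=4, n_hid=8, n_out=2):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))

    def forward(self, x):
        return sigmoid(sigmoid(x @ self.W1) @ self.W2)

def make_pseudopatterns(net, n_items, n_in):
    """Pseudorehearsal step: feed random inputs to the trained network and
    record its outputs. The resulting (input, output) pairs approximate the
    mapping already stored in the weights and are mixed into the training
    set for the next task to protect old knowledge."""
    x = rng.uniform(0.0, 1.0, (n_items, n_in))
    y = net.forward(x)
    return x, y

net = TinyMLP()
px, py = make_pseudopatterns(net, n_items=16, n_in=4)
# the next task is then trained on (new patterns) plus (px, py)
```

In the paper's setting the same pseudopatterns would be rehearsed under CHL rather than backpropagation, but the generation step is identical in either case.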
Keywords: contrastive Hebbian learning; pseudorehearsal; multilayer neural networks; catastrophic forgetting; pseudopatterns; Hebbian rule; additional learning.
International Journal of Computational Intelligence Studies, 2018 Vol.7 No.3/4, pp.289 - 311
Received: 10 Feb 2018
Accepted: 09 Apr 2018
Published online: 13 Nov 2018