Strong Wolfe condition-based variable stacking length multi-gradient parameter identification algorithm
by Yiqiao Shi; Shaoxue Jing
International Journal of Modelling, Identification and Control (IJMIC), Vol. 41, No. 4, 2022

Abstract: This paper considers accelerating the gradient algorithm for linear models. The traditional stochastic gradient algorithm requires little computation per step but converges to the true parameters slowly. To accelerate it, a novel gradient algorithm that uses several gradients is proposed. A key issue for the proposed algorithm is how to determine the stacking length, i.e., the number of gradients used in each recursion. A variable stacking length based on the strong Wolfe (SW) condition is presented, which ensures that the proposed multi-gradient algorithm converges faster. Several experiments are conducted to validate the proposed algorithm.

Online publication date: Tue, 17-Jan-2023
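
The scheme described in the abstract — stacking several sample gradients into one update and choosing the stacking length via the strong Wolfe condition — might be sketched as follows for a least-squares linear model. This is a minimal illustration based only on the abstract: the acceptance rule, the fixed step size `alpha`, the fallback step, and all function names are assumptions, not the paper's actual algorithm.

```python
import numpy as np

def multi_gradient_sgd(phi, y, theta0, alpha=0.05, L_max=8,
                       c1=1e-4, c2=0.9, epochs=20):
    """Sketch of a multi-gradient update with a variable stacking length.

    At sample k, the search direction stacks the gradients of the last L
    samples; L grows until the step alpha*d satisfies the strong Wolfe
    (SW) conditions on the full quadratic loss. Details are illustrative
    assumptions, since only the abstract is available here.
    """
    theta = np.asarray(theta0, dtype=float).copy()
    n = len(y)

    def loss(th):                          # full least-squares loss
        r = phi @ th - y
        return 0.5 * (r @ r) / n

    def sample_grad(th, i):                # gradient of one sample's squared error
        return (phi[i] @ th - y[i]) * phi[i]

    def full_grad(th):
        return phi.T @ (phi @ th - y) / n

    for _ in range(epochs):
        for k in range(n):
            f0, g0 = loss(theta), full_grad(theta)
            d = np.zeros_like(theta)
            accepted = None
            for L in range(1, L_max + 1):  # grow the stacking length
                d = d - sample_grad(theta, (k - L + 1) % n)
                slope0 = g0 @ d
                if slope0 >= 0:            # no longer a descent direction
                    break
                th_new = theta + alpha * d
                if loss(th_new) > f0 + c1 * alpha * slope0:
                    break                  # sufficient decrease fails: stop growing
                accepted = th_new          # Armijo condition holds for this L
                if abs(full_grad(th_new) @ d) <= c2 * abs(slope0):
                    break                  # curvature also holds: SW satisfied
            if accepted is None:           # fall back to a plain stochastic step
                theta = theta - alpha * sample_grad(theta, k)
            else:
                theta = accepted
    return theta
```

A larger stacking length lengthens the step `alpha * d`, so the curvature half of the strong Wolfe condition is what pushes L upward, while the sufficient-decrease half caps it — this matches the abstract's claim that the SW-based stacking length balances speed against stability.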
