Adaptive and iterative least squares support vector regression based on quadratic Renyi entropy
by Jingqing Jiang, Chuyi Song, Haiyan Zhao, Chunguo Wu, Yanchun Liang
International Journal of Granular Computing, Rough Sets and Intelligent Systems (IJGCRSIS), Vol. 1, No. 3, 2010

Abstract: This paper presents an adaptive and iterative least squares support vector regression (LS-SVR) algorithm based on quadratic Renyi entropy. Standard LS-SVM loses the sparseness of the support vectors, which is one of the important advantages of conventional SVM; the proposed algorithm overcomes this drawback. Quadratic Renyi entropy serves as the criterion for working set selection, and the size of the working set is determined adaptively during the iterations. The regression parameters are computed by incremental learning, so the inversion of a large-scale matrix is avoided and the running speed is improved. The algorithm preserves the sparseness of the support vectors while improving learning speed.
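The following is a minimal sketch of the two ideas summarised in the abstract: a Parzen-window estimate of the quadratic Renyi entropy used to pick a working set, and an LS-SVR trained only on that working set so the model keeps a sparse set of support vectors. It assumes a Gaussian (RBF) kernel; the function and variable names, the random candidate search, and the direct linear solve (rather than the paper's adaptive sizing and incremental update) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def quadratic_renyi_entropy(X, sigma=1.0):
    """Parzen-window estimate H_2 = -log((1/N^2) * sum_ij k(x_i, x_j))."""
    K = rbf_kernel(X, X, sigma * np.sqrt(2.0))  # width of two convolved Gaussians
    return -np.log(K.mean())

def select_working_set(X, size, sigma=1.0, n_trials=200, rng=None):
    """Pick the candidate subset with the largest quadratic Renyi entropy,
    i.e. the subset that spreads most evenly over the input space."""
    rng = np.random.default_rng(rng)
    best_idx, best_h = None, -np.inf
    for _ in range(n_trials):
        idx = rng.choice(len(X), size=size, replace=False)
        h = quadratic_renyi_entropy(X[idx], sigma)
        if h > best_h:
            best_idx, best_h = idx, h
    return best_idx

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    """Standard LS-SVR dual: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    N = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvr_predict(X_train, alpha, b, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Usage: train only on an entropy-selected working set (hypothetical data).
X = np.random.rand(500, 2)
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(500)
ws = select_working_set(X, size=50, sigma=0.5)
alpha, b = lssvr_fit(X[ws], y[ws], gamma=10.0, sigma=0.5)
y_hat = lssvr_predict(X[ws], alpha, b, X, sigma=0.5)
```

Maximising the quadratic Renyi entropy of the working set favours points that cover the input space uniformly, which is why a small working set can still represent the full training set; the paper additionally grows the working set adaptively and updates the solution incrementally instead of re-solving the full linear system.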

Online publication date: Mon, 30-Nov-2009
