Authors: Natsuki Sano; Tomomichi Suzuki
Addresses: Department of Economics, Management and Information Science, Onomichi City University, 1600-2 Hisayamada-cho, Onomichi, Hiroshima, Japan; Department of Industrial Administration, Tokyo University of Science, 2641 Yamazaki, Noda, Chiba, Japan
Abstract: Support vector regression (SVR) is a nonlinear prediction method using kernel functions and has been widely applied to real-world problems. Although the accuracy of an effectively tuned SVR is high, its performance strongly depends on its hyperparameters. Therefore, determining these parameters is important when applying SVR to real-world problems. Although the optimum parameters are usually determined by an exhaustive grid search, this method is not realistic when the sample size is very large. To decrease the computational time required to determine the optimum parameters, we employ an orthogonal array and propose two efficient methods for SVR parameter tuning based on variable selection in the Taguchi method. The proposed methods can reduce the computational time to approximately one-twelfth of that taken by a grid search. We also validated the actual computational time and accuracy of the proposed methods by applying them to five real datasets from the UCI repository.
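The saving described in the abstract can be illustrated with a small sketch. Instead of evaluating every point of a full three-level grid over the SVR hyperparameters (C, gamma, epsilon) — 3 × 3 × 3 = 27 settings — an orthogonal-array design tries only the nine rows of a standard L9(3^4) Taguchi array. The parameter levels below are hypothetical placeholders, not the values used in the paper, and the snippet shows only the combinatorial reduction, not the authors' full selection procedure.

```python
from itertools import product

# Hypothetical three-level settings for each SVR hyperparameter;
# the actual levels would be chosen for the dataset at hand.
levels = {
    "C":       [1, 10, 100],
    "gamma":   [0.01, 0.1, 1.0],
    "epsilon": [0.01, 0.1, 1.0],
}

# Exhaustive grid search: every combination of the three levels.
full_grid = list(product(levels["C"], levels["gamma"], levels["epsilon"]))

# Standard L9(3^4) orthogonal array, using its first three columns:
# each row assigns a level index (0-2) to each factor, and every pair
# of columns covers all 9 level pairs exactly once.
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]

# Orthogonal-array search: evaluate only the 9 rows of L9
# instead of all 27 grid points.
oa_trials = [
    (levels["C"][i], levels["gamma"][j], levels["epsilon"][k])
    for i, j, k in L9
]

print(len(full_grid), "grid points vs", len(oa_trials), "array runs")
```

With more factors or more levels the relative saving grows, which is how reductions on the order of the paper's reported one-twelfth become possible.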
Keywords: machine learning; support vector regression; SVR; hyperparameter; orthogonal array; Taguchi method.
International Journal of Computational Intelligence Studies, 2017 Vol.6 No.1, pp.40 - 51
Received: 12 Nov 2015
Accepted: 03 Feb 2016
Published online: 11 Aug 2017