Abstract
Hyperparameter selection is an important and challenging problem in machine learning, and it is crucial for achieving optimal model performance. Least Squares Support Vector Regression (LSSVR) combines the powerful modeling capabilities of Support Vector Machines (SVM) with the simplicity of the least squares method. In this paper, we propose a new hyperparameter selection method for LSSVR by constructing a novel optimization problem, which is then solved using the gradient descent algorithm. Experiments comparing our method against grid search and several heuristic algorithms in terms of mean squared error (MSE) demonstrate that it selects more suitable hyperparameters than the other approaches, ultimately yielding smaller MSE values.
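The idea summarized above can be illustrated with a minimal sketch. The code below is not the paper's implementation; it assumes the standard LSSVR formulation (a single linear system in the bias and dual weights, with an RBF kernel) and tunes the two hyperparameters (regularization γ and kernel width σ) by gradient descent on a held-out validation MSE. For simplicity the gradient is approximated by central finite differences in log-hyperparameter space, whereas the paper derives its own optimization problem; the function names and learning-rate settings are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Pairwise squared Euclidean distances, then Gaussian RBF kernel
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvr_fit(X, y, gamma, sigma):
    # Standard LSSVR training: solve the (n+1)x(n+1) linear system
    #   [ 0        1^T       ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma  ] [ alpha ] = [ y ]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvr_predict(X_train, b, alpha, sigma, X_new):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def tune_by_gradient_descent(X_tr, y_tr, X_val, y_val,
                             lr=0.1, steps=50, eps=1e-4):
    # Gradient descent on validation MSE over theta = (log gamma, log sigma);
    # the gradient is approximated by central differences (an assumption here,
    # not the paper's analytic construction).
    theta = np.array([0.0, 0.0])  # start at gamma = sigma = 1

    def val_mse(t):
        g, s = np.exp(t)
        b, a = lssvr_fit(X_tr, y_tr, g, s)
        pred = lssvr_predict(X_tr, b, a, s, X_val)
        return np.mean((pred - y_val) ** 2)

    for _ in range(steps):
        grad = np.zeros(2)
        for i in range(2):
            e = np.zeros(2)
            e[i] = eps
            grad[i] = (val_mse(theta + e) - val_mse(theta - e)) / (2 * eps)
        theta -= lr * grad
    return np.exp(theta)  # (gamma, sigma)
```

Working in log-space keeps both hyperparameters positive without explicit constraints; grid search would instead evaluate `val_mse` on a fixed lattice of (γ, σ) pairs, which is the baseline the paper compares against.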
Acknowledgments
This work is supported in part by the National Natural Science Foundation of China (Nos. 62106112, 62366035, and 61966024), and in part by the Natural Science Foundation of Inner Mongolia Autonomous Region of China (No. 2023MS01006).
Ethics declarations
Disclosure of Interests
The authors have no competing interests to declare that are relevant to the content of this article.
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Le, A., Lan, B., Zhen, W. (2025). Gradient Descent for Hyperparameter Selection in Least Squares Support Vector Regression. In: Huang, DS., Chen, W., Zhang, C., Pan, Y., Zhang, Q., Kong, X. (eds) Applied Intelligence. ICAI 2024. Communications in Computer and Information Science, vol 2388. Springer, Singapore. https://doi.org/10.1007/978-981-96-1904-7_26
Publisher Name: Springer, Singapore
Print ISBN: 978-981-96-1903-0
Online ISBN: 978-981-96-1904-7