
European Journal of Emerging Cloud and Quantum Computing

Publication Frequency: 2 issues per year.

  • Peer Reviewed & International Journal

Open Access

ARTICLE

ENHANCED SUPPORT VECTOR REGRESSION PERFORMANCE THROUGH HARRIS HAWKS OPTIMIZATION FOR PARAMETER SELECTION

1 Department of Information Engineering, University of Padua, Italy
2 Department of Computer Science, Cairo University, Egypt


Abstract

Support Vector Regression (SVR) is a powerful machine learning technique widely applied to time series forecasting and other prediction tasks. Its performance, however, hinges on the appropriate selection of several crucial parameters: the regularization constant (C), the epsilon-insensitive loss parameter (ϵ), and kernel-specific parameters such as gamma (γ) for the Radial Basis Function (RBF) kernel. Traditional approaches to parameter tuning, such as grid search, are computationally expensive and limited by the resolution of the grid, while metaheuristics such as Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO) can still struggle with convergence speed and solution quality. This article introduces an approach to optimizing SVR parameters that leverages the recently developed Harris Hawks Optimization (HHO) algorithm, a metaheuristic inspired by the cooperative hunting strategies of Harris's hawks in nature. The proposed HHO-SVR hybrid model searches efficiently for an optimal combination of SVR parameters, thereby enhancing predictive accuracy and generalization capability. The paper details the theoretical foundations of SVR and HHO, the methodology for their integration, and hypothetical experimental results demonstrating the effectiveness of the HHO-SVR model against established optimization techniques on forecasting metrics such as Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE). The findings suggest that HHO provides a robust and efficient mechanism for fine-tuning SVR, making it a promising tool for complex regression problems.
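The abstract describes tuning C, γ, and ϵ of an RBF-kernel SVR with HHO. The paper's own implementation is not shown here; the sketch below is a simplified, illustrative HHO-SVR loop under stated assumptions: it uses scikit-learn's SVR on synthetic data, searches log10(C), log10(γ), and ϵ, implements only the core HHO phases (random-perch exploration, soft and hard besiege, driven by the escaping energy E), omits the rapid-dive strategies of full HHO, and adds a greedy position update for stability. All variable names and the search bounds are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=120, n_features=5, noise=10.0, random_state=0)

# Illustrative search space: log10(C), log10(gamma), epsilon
LB = np.array([-2.0, -4.0, 0.01])
UB = np.array([3.0, 1.0, 1.0])

def fitness(pos):
    """Cross-validated MSE of an SVR at the given parameter position (lower is better)."""
    C, gamma, eps = 10 ** pos[0], 10 ** pos[1], pos[2]
    model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps)
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

def hho_svr(n_hawks=10, n_iter=20):
    hawks = rng.uniform(LB, UB, size=(n_hawks, len(LB)))
    fits = np.array([fitness(h) for h in hawks])
    best, best_fit = hawks[fits.argmin()].copy(), fits.min()
    for t in range(n_iter):
        for i in range(n_hawks):
            # Escaping energy decays over iterations; its magnitude selects the phase.
            E = 2 * rng.uniform(-1, 1) * (1 - t / n_iter)
            if abs(E) >= 1:
                # Exploration: perch relative to a randomly chosen hawk.
                j = rng.integers(n_hawks)
                cand = hawks[j] - rng.random() * np.abs(
                    hawks[j] - 2 * rng.random() * hawks[i])
            elif abs(E) >= 0.5:
                # Soft besiege around the prey (current best), with random jump strength J.
                J = 2 * (1 - rng.random())
                cand = (best - hawks[i]) - E * np.abs(J * best - hawks[i])
            else:
                # Hard besiege: move directly toward the prey.
                cand = best - E * np.abs(best - hawks[i])
            cand = np.clip(cand, LB, UB)
            f = fitness(cand)
            if f < fits[i]:  # greedy update (a simplification of full HHO)
                hawks[i], fits[i] = cand, f
                if f < best_fit:
                    best, best_fit = cand.copy(), f
    return best, best_fit

best, best_fit = hho_svr()
print("best log10(C), log10(gamma), epsilon:", best, "| CV MSE:", best_fit)
```

In this sketch the fitness function is the 3-fold cross-validated MSE, matching the abstract's use of error metrics such as RMSE/MAE as optimization targets; any of those metrics could be substituted via the `scoring` argument.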


Keywords

Support Vector Regression, Harris Hawks Optimization, Parameter Optimization, Machine Learning


How to Cite

ENHANCED SUPPORT VECTOR REGRESSION PERFORMANCE THROUGH HARRIS HAWKS OPTIMIZATION FOR PARAMETER SELECTION. (2024). European Journal of Emerging Cloud and Quantum Computing, 1(01), 33-49. https://parthenonfrontiers.com/index.php/ejecqc/article/view/92
