International Journal of Machine Learning and Cybernetics, vol. 17, no. 2, 2026 (SCI-Expanded, Scopus)
In Machine Learning (ML), the optimization of hyperparameters is a pivotal step in building highly effective models. Careful hyperparameter configuration improves model generalization and resilience, mitigating the risks of overfitting and underfitting, and optimization algorithms play a decisive role in finding suitable hyperparameters for ML algorithms. This study introduces a novel Hyperparameter Optimization (HPO) approach based on the Marine Predators Algorithm (MPA), designed for regression problems. The performance of the proposed algorithm is benchmarked against a spectrum of popular and classical optimization algorithms from the literature. Furthermore, HPO is examined for three widely employed ML algorithms: Extreme Gradient Boosting (XGB), Light Gradient Boosting Machine (LGBM), and Support Vector Regression (SVR). Regression models are constructed on a real-world dataset for predicting shipment status time, and the proposed algorithm is further evaluated on three public datasets suitable for regression analysis. A new hyperparameter decoding approach is proposed that turns HPO into a numerical optimization problem regardless of parameter type. Additionally, the proposed approach is evaluated on the benchmark test suite of the Congress on Evolutionary Computation (CEC) 2020 competition, assessing the convergence ability and stability of the method across a range of test problems. The experimental results indicate that the proposed optimization algorithm outperformed the compared optimizers in HPO for all three ML algorithms across all regression datasets.
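The idea of a hyperparameter decoding layer can be illustrated with a minimal sketch. The scheme below is a common convention, not necessarily the paper's exact method: every hyperparameter, whatever its native type, is represented by one continuous gene in [0, 1], so any numerical optimizer such as MPA can search the space directly. The parameter names and ranges (for an XGB-like model) are illustrative assumptions.

```python
def decode(genes):
    """Map a vector of genes in [0, 1] to mixed-type hyperparameters.

    Illustrative decoding, assuming three XGB-like hyperparameters;
    the actual parameter set and ranges in the paper may differ.
    """
    lr_gene, depth_gene, booster_gene = genes

    # Continuous parameter: linear scaling into [0.01, 0.3].
    learning_rate = 0.01 + lr_gene * (0.3 - 0.01)

    # Integer parameter: scale then round into {2, ..., 12}.
    max_depth = 2 + round(depth_gene * 10)

    # Categorical parameter: partition [0, 1] into equal-width bins.
    boosters = ["gbtree", "dart", "gblinear"]
    booster = boosters[min(int(booster_gene * len(boosters)),
                           len(boosters) - 1)]

    return {"learning_rate": learning_rate,
            "max_depth": max_depth,
            "booster": booster}

# The optimizer only ever sees a continuous vector:
params = decode([0.5, 0.5, 0.9])
```

Because the optimizer operates purely on the continuous gene vector, the same algorithm can tune continuous, integer, and categorical hyperparameters without any type-specific operators.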