A Combinatorial Approach to Hyperparameter Optimization
Abstract
In machine learning, hyperparameter optimization (HPO) is essential for effective model training and significantly impacts model performance. Hyperparameters are predefined model settings that fine-tune the model’s behavior and are critical to modeling complex data patterns. Traditional HPO approaches such as Grid Search, Random Search, and Bayesian Optimization have been widely used in this field. However, as datasets grow and models increase in complexity, these approaches often require substantial time and resources for HPO. This research introduces a novel approach using 𝑡-way testing—a combinatorial approach to software testing used to identify faults with a test set that covers all 𝑡-way interactions—for HPO. 𝑇-way testing substantially narrows the search space while still covering parameter interactions. Our experimental results show that our approach reduces the number of necessary model evaluations and significantly cuts computational cost, while still outperforming traditional HPO approaches for the models studied in our experiments.
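To illustrate the general idea of 𝑡-way testing applied to HPO (not the exact algorithm or covering-array generator used in this work), the following minimal Python sketch greedily builds a 2-way (pairwise) covering set of hyperparameter configurations, so that every pair of values across any two hyperparameters appears in at least one configuration. The hyperparameter names and values are purely illustrative.

```python
# Illustrative sketch only: greedy construction of a 2-way (pairwise)
# covering set over a hypothetical hyperparameter search space.
from itertools import combinations, product

search_space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
    "num_layers": [2, 3, 4],
    "dropout": [0.0, 0.2, 0.5],
}
names = list(search_space)

def pairs_of(config):
    """All (parameter, value) pairs covered by one configuration."""
    items = list(zip(names, config))
    return {frozenset(p) for p in combinations(items, 2)}

# Enumerate every value pair that must be covered at least once.
uncovered = set()
for a, b in combinations(names, 2):
    for va, vb in product(search_space[a], search_space[b]):
        uncovered.add(frozenset([(a, va), (b, vb)]))

all_configs = list(product(*search_space.values()))
covering_set = []
while uncovered:
    # Greedily pick the configuration covering the most uncovered pairs.
    best = max(all_configs, key=lambda c: len(pairs_of(c) & uncovered))
    covering_set.append(dict(zip(names, best)))
    uncovered -= pairs_of(best)

print(f"Full grid: {len(all_configs)} configs; "
      f"2-way covering set: {len(covering_set)} configs")
# Each configuration in covering_set would then be trained and scored,
# replacing an exhaustive Grid Search over all combinations.
```

In this toy example the full grid contains 81 configurations, while a pairwise covering set needs only about a dozen, which is the source of the reduction in model evaluations described in the abstract; higher interaction strengths (𝑡 > 2) trade a larger covering set for stronger coverage guarantees.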