Leveraging axis-aligned subspaces for high-dimensional Bayesian optimization with group testing
(2025)
- Abstract
- Bayesian optimization (BO) is an effective method for optimizing expensive-to-evaluate black-box functions. While high-dimensional problems can be particularly challenging, due to the multitude of parameter choices and the potentially high number of data points required to fit the model, this limitation can be addressed if the problem satisfies simplifying assumptions. Axis-aligned subspace approaches, where few dimensions have a significant impact on the objective, have motivated several algorithms for high-dimensional BO. However, the validity of this assumption is rarely verified, and the assumption is rarely exploited to its full extent. We propose a group testing (GT) approach to identify active variables and thereby facilitate efficient optimization in these domains. The proposed algorithm, Group Testing Bayesian Optimization (GTBO), first runs a testing phase in which groups of variables are systematically selected and tested for whether they influence the objective, terminating once the active dimensions are identified. To that end, we extend the well-established GT theory to functions over continuous domains. In the second phase, GTBO guides optimization by placing more importance on the active dimensions. By leveraging the axis-aligned subspace assumption, GTBO outperforms state-of-the-art methods on benchmarks satisfying that assumption, while offering improved interpretability.
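The testing phase described above can be illustrated with a deliberately simplified sketch: perturb whole groups of coordinates at once and recursively bisect any group whose perturbation changes the objective. This is a hypothetical toy illustration only — the function `find_active_dims`, its parameters (`eps`, `delta`, the default point `x0`), and the noiseless bisection strategy are assumptions of this sketch, not the actual GTBO procedure, which extends probabilistic GT theory to noisy continuous functions.

```python
import numpy as np

def find_active_dims(f, dim, x0=None, eps=1e-6, delta=0.5):
    """Toy group test: recursively perturb groups of coordinates and
    keep only groups whose perturbation changes f (noiseless setting)."""
    if x0 is None:
        x0 = np.zeros(dim)
    base = f(x0)

    def group_is_active(group):
        x = x0.copy()
        x[group] += delta          # perturb the whole group at once
        return abs(f(x) - base) > eps

    def recurse(group):
        if not group_is_active(group):  # no active variable detected here
            return []
        if len(group) == 1:
            return group
        mid = len(group) // 2
        return recurse(group[:mid]) + recurse(group[mid:])

    return recurse(list(range(dim)))

# Toy objective: only dimensions 2 and 7 matter out of 10.
f = lambda x: (x[2] - 1.0) ** 2 + np.sin(x[7])
print(find_active_dims(f, 10))  # → [2, 7]
```

Note the design trade-off this sketch ignores: with observation noise or cancelling effects within a group, a single perturbation is not a reliable test, which is why the paper develops a probabilistic GT formulation rather than plain bisection.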
Please use this URL to cite or link to this publication:
https://lup.lub.lu.se/record/df8a5df6-6da6-4038-ba6a-8796020bf2f3
- author
- Hellsten, Erik LU; Hvarfner, Carl LU; Papenmeier, Leonard LU and Nardi, Luigi LU
- organization
- publishing date
- 2025
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- submitted
- subject
- host publication
- International Conference on Automated Machine Learning 2025
- pages
- 18 pages
- language
- English
- LU publication?
- yes
- id
- df8a5df6-6da6-4038-ba6a-8796020bf2f3
- alternative location
- https://arxiv.org/abs/2504.06111
- date added to LUP
- 2025-05-07 15:22:21
- date last changed
- 2025-05-15 11:34:56
@inproceedings{df8a5df6-6da6-4038-ba6a-8796020bf2f3,
  abstract  = {{Bayesian optimization (BO) is an effective method for optimizing expensive-to-evaluate black-box functions. While high-dimensional problems can be particularly challenging, due to the multitude of parameter choices and the potentially high number of data points required to fit the model, this limitation can be addressed if the problem satisfies simplifying assumptions. Axis-aligned subspace approaches, where few dimensions have a significant impact on the objective, have motivated several algorithms for high-dimensional BO. However, the validity of this assumption is rarely verified, and the assumption is rarely exploited to its full extent. We propose a group testing (GT) approach to identify active variables and thereby facilitate efficient optimization in these domains. The proposed algorithm, Group Testing Bayesian Optimization (GTBO), first runs a testing phase in which groups of variables are systematically selected and tested for whether they influence the objective, terminating once the active dimensions are identified. To that end, we extend the well-established GT theory to functions over continuous domains. In the second phase, GTBO guides optimization by placing more importance on the active dimensions. By leveraging the axis-aligned subspace assumption, GTBO outperforms state-of-the-art methods on benchmarks satisfying that assumption, while offering improved interpretability.}},
  author    = {{Hellsten, Erik and Hvarfner, Carl and Papenmeier, Leonard and Nardi, Luigi}},
  booktitle = {{International Conference on Automated Machine Learning 2025}},
  language  = {{eng}},
  title     = {{Leveraging axis-aligned subspaces for high-dimensional Bayesian optimization with group testing}},
  url       = {{https://arxiv.org/abs/2504.06111}},
  year      = {{2025}},
}