Vanilla Bayesian Optimization Performs Great in High Dimensions
(2024) 41st International Conference on Machine Learning, ICML 2024. In: Proceedings of Machine Learning Research 235, p. 20793-20817
- Abstract
High-dimensional problems have long been considered the Achilles' heel of Bayesian optimization. Spurred by the curse of dimensionality, a large collection of algorithms aim to make it more performant in this setting, commonly by imposing various simplifying assumptions on the objective. In this paper, we identify the degeneracies that make vanilla Bayesian optimization poorly suited to high-dimensional tasks, and further show how existing algorithms address these degeneracies through the lens of lowering the model complexity. Moreover, we propose an enhancement to the prior assumptions that are typical to vanilla Bayesian optimization, which reduces the complexity to manageable levels without imposing structural restrictions on the objective. Our modification - a simple scaling of the Gaussian process lengthscale prior with the dimensionality - reveals that standard Bayesian optimization works drastically better than previously thought in high dimensions, clearly outperforming existing state-of-the-art algorithms on multiple commonly considered real-world high-dimensional tasks.
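The modification described in the abstract is a lengthscale prior that scales with the input dimensionality. As a minimal illustrative sketch (not taken from the paper's code), a GPyTorch-style ARD kernel with a dimension-scaled LogNormal lengthscale prior might look as follows; the specific prior parameters are assumptions chosen for illustration, not values stated in this record.

```python
import math
import gpytorch


def make_kernel(d: int) -> gpytorch.kernels.Kernel:
    """Matern-5/2 ARD kernel whose lengthscale prior scales with dimension d.

    Illustrative assumption: a LogNormal prior whose location grows with
    log(sqrt(d)), so the prior median lengthscale is proportional to sqrt(d).
    The exact hyperparameters used in the paper are not given in this record.
    """
    lengthscale_prior = gpytorch.priors.LogNormalPrior(
        loc=math.sqrt(2.0) + 0.5 * math.log(d),  # median lengthscale ~ sqrt(d)
        scale=math.sqrt(3.0),
    )
    return gpytorch.kernels.ScaleKernel(
        gpytorch.kernels.MaternKernel(
            nu=2.5,
            ard_num_dims=d,
            lengthscale_prior=lengthscale_prior,
        )
    )
```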
- author
- Hvarfner, Carl LU ; Hellsten, Erik O. LU and Nardi, Luigi LU
- organization
- publishing date
- 2024
- type
- Contribution to journal
- publication status
- published
- subject
- in
- Proceedings of Machine Learning Research
- volume
- 235
- pages
- 25 pages
- publisher
- ML Research Press
- conference name
- 41st International Conference on Machine Learning, ICML 2024
- conference location
- Vienna, Austria
- conference dates
- 2024-07-21 - 2024-07-27
- external identifiers
- scopus:85203797485
- ISSN
- 2640-3498
- language
- English
- LU publication?
- yes
- additional info
- Publisher Copyright: Copyright 2024 by the author(s)
- id
- 4d9e7bc3-2aab-4c5d-a341-67c40b8e20d3
- alternative location
- https://proceedings.mlr.press/v235/hussain24a.html
- date added to LUP
- 2024-12-13 09:20:50
- date last changed
- 2025-05-30 22:39:58
@article{4d9e7bc3-2aab-4c5d-a341-67c40b8e20d3,
  abstract  = {{High-dimensional problems have long been considered the Achilles' heel of Bayesian optimization. Spurred by the curse of dimensionality, a large collection of algorithms aim to make it more performant in this setting, commonly by imposing various simplifying assumptions on the objective. In this paper, we identify the degeneracies that make vanilla Bayesian optimization poorly suited to high-dimensional tasks, and further show how existing algorithms address these degeneracies through the lens of lowering the model complexity. Moreover, we propose an enhancement to the prior assumptions that are typical to vanilla Bayesian optimization, which reduces the complexity to manageable levels without imposing structural restrictions on the objective. Our modification - a simple scaling of the Gaussian process lengthscale prior with the dimensionality - reveals that standard Bayesian optimization works drastically better than previously thought in high dimensions, clearly outperforming existing state-of-the-art algorithms on multiple commonly considered real-world high-dimensional tasks.}},
  author    = {{Hvarfner, Carl and Hellsten, Erik O. and Nardi, Luigi}},
  issn      = {{2640-3498}},
  language  = {{eng}},
  pages     = {{20793--20817}},
  publisher = {{ML Research Press}},
  series    = {{Proceedings of Machine Learning Research}},
  title     = {{Vanilla Bayesian Optimization Performs Great in High Dimensions}},
  url       = {{https://proceedings.mlr.press/v235/hussain24a.html}},
  volume    = {{235}},
  year      = {{2024}},
}