Understanding high-dimensional Bayesian optimization
(2025) 42nd International Conference on Machine Learning, ICML 2025, vol. 267, p. 47902-47923

Abstract
Recent work reported that simple Bayesian optimization methods perform well for high-dimensional real-world tasks, seemingly contradicting prior work and tribal knowledge. This paper investigates the 'why'. We identify fundamental challenges that arise in high-dimensional Bayesian optimization and explain why recent methods succeed. Our analysis shows that vanishing gradients caused by Gaussian process initialization schemes play a major role in the failures of high-dimensional Bayesian optimization and that methods that promote local search behaviors are better suited for the task. We find that maximum likelihood estimation of Gaussian process length scales suffices for state-of-the-art performance. Based on this, we propose a simple variant of maximum likelihood estimation called MSR that leverages these findings to achieve state-of-the-art performance on a comprehensive set of real-world applications. We also present targeted experiments to illustrate and confirm our findings.
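The abstract's central technique — fitting a Gaussian process length scale by maximizing the marginal likelihood — can be sketched in a few lines. This is a minimal illustration on toy data, not the paper's MSR method; the kernel, data, and all names here are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def rbf_kernel(X, ell):
    # Squared-exponential kernel with a single shared length scale `ell`.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / ell**2)

def neg_log_marginal_likelihood(ell, X, y, noise=1e-4):
    # Standard GP log marginal likelihood via a Cholesky factorization;
    # `noise` is a small jitter for numerical stability.
    K = rbf_kernel(X, ell) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha + np.log(np.diag(L)).sum()
            + 0.5 * len(X) * np.log(2 * np.pi))

# Toy high-dimensional-style setup: 5-D inputs, objective varies along one axis.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 5))
y = np.sin(3.0 * X[:, 0])

# Maximum likelihood estimate of the length scale over a bounded interval.
res = minimize_scalar(neg_log_marginal_likelihood, bounds=(0.05, 10.0),
                      args=(X, y), method="bounded")
print(f"MLE length scale: {res.x:.3f}")
```

In practice one length scale per input dimension is fitted jointly (automatic relevance determination) rather than a single shared scale as in this sketch.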
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/1c8d8028-aeb6-42fc-9bd0-47d28333e8dd
- author: Papenmeier, Leonard (LU); Poloczek, Matthias; Nardi, Luigi (LU)
- publishing date: 2025
- type: Chapter in Book/Report/Conference proceeding
- publication status: published
- host publication: Proceedings of Machine Learning Research
- volume: 267
- pages: 47902-47923
- publisher: ML Research Press
- conference name: 42nd International Conference on Machine Learning, ICML 2025
- conference location: Vancouver, Canada
- conference dates: 2025-07-13 - 2025-07-19
- external identifiers: scopus:105023567114
- language: English
- LU publication?: yes
- id: 1c8d8028-aeb6-42fc-9bd0-47d28333e8dd
- alternative location: https://arxiv.org/abs/2502.09198
- date added to LUP: 2025-05-07 15:19:33
- date last changed: 2026-02-03 16:13:04
@inproceedings{1c8d8028-aeb6-42fc-9bd0-47d28333e8dd,
abstract = {{Recent work reported that simple Bayesian optimization methods perform well for high-dimensional real-world tasks, seemingly contradicting prior work and tribal knowledge. This paper investigates the 'why'. We identify fundamental challenges that arise in high-dimensional Bayesian optimization and explain why recent methods succeed. Our analysis shows that vanishing gradients caused by Gaussian process initialization schemes play a major role in the failures of high-dimensional Bayesian optimization and that methods that promote local search behaviors are better suited for the task. We find that maximum likelihood estimation of Gaussian process length scales suffices for state-of-the-art performance. Based on this, we propose a simple variant of maximum likelihood estimation called MSR that leverages these findings to achieve state-of-the-art performance on a comprehensive set of real-world applications. We also present targeted experiments to illustrate and confirm our findings.}},
author = {{Papenmeier, Leonard and Poloczek, Matthias and Nardi, Luigi}},
booktitle = {{Proceedings of Machine Learning Research}},
language = {{eng}},
pages = {{47902--47923}},
publisher = {{ML Research Press}},
title = {{Understanding high-dimensional Bayesian optimization}},
url = {{https://arxiv.org/abs/2502.09198}},
volume = {{267}},
year = {{2025}},
}