
Lund University Publications


Bayesian optimization in high dimensions : a journey through subspaces and challenges

Papenmeier, Leonard (2025)
Abstract
This thesis explores the challenges and advancements in high-dimensional Bayesian optimization (HDBO), focusing on understanding, quantifying, and improving optimization techniques in high-dimensional spaces.
Bayesian optimization (BO) is a powerful method for optimizing expensive black-box functions, but its effectiveness diminishes as the dimensionality of the search space increases due to the curse of dimensionality. The thesis introduces novel algorithms and methodologies to make HDBO more practical.
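The loop described here — fit a surrogate to the evaluations so far, maximize an acquisition function, then spend one expensive evaluation — can be illustrated with a deliberately simplified toy. Everything below is an assumption made for the sketch, not code from the thesis: a crude distance-based surrogate stands in for the Gaussian process, and the acquisition is a basic upper confidence bound maximized over a grid.

```python
import random

def objective(x):
    # Stands in for an expensive black-box function (maximum at x = 0.3).
    return -(x - 0.3) ** 2

def surrogate(X, y, x):
    # Toy surrogate: predicted mean = value at the nearest evaluated point,
    # predicted uncertainty = distance to that point (a GP would go here).
    d, fx = min((abs(x - xi), yi) for xi, yi in zip(X, y))
    return fx, d

def ucb(X, y, x, beta=2.0):
    # Upper-confidence-bound acquisition: mean plus scaled uncertainty.
    mu, sigma = surrogate(X, y, x)
    return mu + beta * sigma

def bayes_opt(n_iter=20, seed=0):
    rng = random.Random(seed)
    X = [rng.random()]               # one random initial design point
    y = [objective(X[0])]
    grid = [i / 200 for i in range(201)]
    for _ in range(n_iter):
        x_next = max(grid, key=lambda x: ucb(X, y, x))  # inner optimization
        X.append(x_next)
        y.append(objective(x_next))  # the single expensive evaluation per step
    return max(y)                    # best observed value
```

In high dimensions the grid-based inner optimization and the surrogate fit are exactly the steps that break down, which is the failure regime the thesis addresses.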
Key contributions include the development of the BAxUS algorithm, which leverages nested subspaces to optimize high-dimensional problems without estimating the dimensionality of the effective subspace.
Additionally, the Bounce algorithm extends these techniques to combinatorial and mixed spaces, providing robust solutions for real-world applications.
The thesis also examines exploration in acquisition functions, proposing new methods for quantifying it and strategies for designing more effective optimization approaches.
Furthermore, this work analyzes why simple BO setups have recently shown promising performance in high-dimensional spaces, challenging the conventional belief that BO is limited to low-dimensional problems.
This thesis offers insights and recommendations for designing more efficient HDBO algorithms by identifying and addressing failure modes such as vanishing gradients and biases in model fitting. Through a combination of theoretical analysis, empirical evaluations, and practical implementations, this thesis contributes to the field of BO by advancing our understanding of high-dimensional optimization and providing actionable methods to improve its performance in complex scenarios.
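For intuition on the subspace idea behind embedding-based methods such as BAxUS: candidates are chosen in a low-dimensional space z and mapped to the full space via a sparse embedding, x = S z, and the subspace is grown over time so that earlier points remain representable. The sketch below is a simplification for illustration under that description, not the published algorithm; all names are hypothetical.

```python
import random

def sparse_embedding(d_high, d_low, rng):
    # Count-sketch-style sparse embedding: each high-dimensional coordinate
    # is assigned to one low-dimensional axis ("bucket") with a random sign.
    return [(rng.randrange(d_low), rng.choice((-1.0, 1.0)))
            for _ in range(d_high)]

def project_up(S, z):
    # Map a low-dimensional point z to the full space: x_i = sign_i * z[bucket_i].
    return [sign * z[bucket] for bucket, sign in S]

def split_bucket(S, b, new_axis, rng):
    # Grow the subspace by splitting bucket b: roughly half of its
    # coordinates move to a fresh axis. An old point z stays representable:
    # extending z with a copy of z[b] reproduces exactly the same x.
    return [(new_axis, sign) if bucket == b and rng.random() < 0.5
            else (bucket, sign)
            for bucket, sign in S]
```

The point of the split step is the nesting property: evaluations made in the smaller subspace can be reused after the subspace grows, since each old z extends to a z' with the same image in the full space.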
author
Papenmeier, Leonard
supervisor
opponent
  • Assoc. Prof. Garnett, Roman, Washington University in St. Louis, USA.
organization
publishing date
2025
type
Thesis
publication status
published
subject
keywords
optimization, Bayesian optimization, Gaussian process, machine learning
pages
318 pages
publisher
Computer Science, Lund University
defense location
Lecture Hall E:1406, building E, Klas Anshelms väg 10, Faculty of Engineering LTH, Lund University, Lund.
defense date
2025-06-12 13:15:00
ISBN
978-91-8104-548-2
978-91-8104-547-5
language
English
LU publication?
yes
id
c3f066f5-7cc7-4256-9502-75a0ba83ecfa
date added to LUP
2025-05-07 15:04:29
date last changed
2025-05-15 11:02:23
@phdthesis{c3f066f5-7cc7-4256-9502-75a0ba83ecfa,
  abstract     = {{This thesis explores the challenges and advancements in high-dimensional Bayesian optimization (HDBO), focusing on understanding, quantifying, and improving optimization techniques in high-dimensional spaces. Bayesian optimization (BO) is a powerful method for optimizing expensive black-box functions, but its effectiveness diminishes as the dimensionality of the search space increases due to the curse of dimensionality. The thesis introduces novel algorithms and methodologies to make HDBO more practical. Key contributions include the development of the BAxUS algorithm, which leverages nested subspaces to optimize high-dimensional problems without estimating the dimensionality of the effective subspace. Additionally, the Bounce algorithm extends these techniques to combinatorial and mixed spaces, providing robust solutions for real-world applications. The thesis also examines exploration in acquisition functions, proposing new methods for quantifying it and strategies for designing more effective optimization approaches. Furthermore, this work analyzes why simple BO setups have recently shown promising performance in high-dimensional spaces, challenging the conventional belief that BO is limited to low-dimensional problems. This thesis offers insights and recommendations for designing more efficient HDBO algorithms by identifying and addressing failure modes such as vanishing gradients and biases in model fitting. Through a combination of theoretical analysis, empirical evaluations, and practical implementations, this thesis contributes to the field of BO by advancing our understanding of high-dimensional optimization and providing actionable methods to improve its performance in complex scenarios.}},
  author       = {{Papenmeier, Leonard}},
  isbn         = {{978-91-8104-548-2}},
  keywords     = {{optimization; Bayesian optimization; Gaussian process; machine learning}},
  language     = {{eng}},
  publisher    = {{Computer Science, Lund University}},
  school       = {{Lund University}},
  title        = {{Bayesian optimization in high dimensions : a journey through subspaces and challenges}},
  url          = {{https://lup.lub.lu.se/search/files/218696831/thesis_may_5_final.pdf}},
  year         = {{2025}},
}