
Lund University Publications


Bayesian optimization across the spectrum of knowledge : enhancing efficiency through beliefs, information and assumptions

Hvarfner, Carl LU (2025)
Abstract
Bayesian Optimization has emerged as a crucial technique for optimizing black-box functions whose evaluations are expensive, such as scientific experiments and machine learning hyperparameter optimization. By combining probabilistic modeling with sequential decision-making, Bayesian Optimization achieves efficient exploration, guiding the search toward optimal parameters with minimal data. However, real-world applications present three main challenges: leveraging expert knowledge, ensuring accurate model assumptions, and managing high-dimensional search spaces.

This thesis addresses these challenges by advancing Bayesian Optimization in three key areas. First, it develops methods to incorporate practitioner insights directly into the optimization process, using domain expert knowledge to guide the search more efficiently and reduce the need for extensive evaluations. Second, it proposes techniques for dynamically validating and adapting model assumptions, enabling the Gaussian Process surrogates commonly used in Bayesian Optimization to align more closely with the complexities of real-world objective functions. Finally, this work introduces adaptive strategies for high-dimensional optimization, allowing Bayesian Optimization to focus on relevant subspaces and improve sample efficiency in vast parameter spaces, thereby mitigating the "Curse of Dimensionality."

These contributions collectively enhance Bayesian Optimization’s robustness, adaptability, and efficiency, positioning it as a more powerful tool for sample-efficient optimization in complex, resource-intensive scenarios. By demonstrating these improvements through theoretical insights and empirical evaluations, this thesis establishes a pathway for more effective Bayesian Optimization in diverse, real-world applications where data is sparse and costly to obtain.
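The loop the abstract describes — fit a probabilistic surrogate, use it to decide the next evaluation, repeat — can be sketched in a few lines. Below is a minimal NumPy-only illustration, not taken from the thesis: the squared-exponential kernel, the fixed lengthscale, the grid of candidates, and the toy quadratic objective are all illustrative assumptions, standing in for a real Gaussian Process library and a genuinely expensive objective.

```python
import math
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    # Squared-exponential kernel between 1-D input arrays a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Posterior mean and std of a zero-mean GP at each query point.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train)
    alpha = np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, K_s.T)
    mean = K_s @ alpha
    var = 1.0 - np.sum(K_s * v.T, axis=1)  # k(x, x) = 1 for this kernel
    return mean, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mean, std, best):
    # Expected Improvement acquisition, for minimization.
    std = np.maximum(std, 1e-9)
    z = (best - mean) / std
    cdf = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2))))(z)
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mean) * cdf + std * pdf

def bayes_opt(f, n_iter=10):
    # The BO loop: fit surrogate, maximize acquisition, evaluate, repeat.
    x = np.array([0.1, 0.5, 0.9])        # small initial design
    y = f(x)
    grid = np.linspace(0.0, 1.0, 201)    # candidate points on [0, 1]
    for _ in range(n_iter):
        mean, std = gp_posterior(x, y - y.mean(), grid)
        ei = expected_improvement(mean + y.mean(), std, y.min())
        x_next = grid[np.argmax(ei)]     # most promising next evaluation
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return x[np.argmin(y)], y.min()

f = lambda x: (x - 0.7) ** 2             # toy "expensive" objective
best_x, best_y = bayes_opt(f)
```

The point of the sketch is the sample-efficiency argument from the abstract: each iteration spends exactly one evaluation of `f`, chosen where the surrogate's uncertainty and predicted value together suggest the most improvement, rather than sampling the space exhaustively.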
author
supervisor
opponent
  • Doc. Acerbi, Luigi, University of Helsinki, Finland.
organization
publishing date
type: Thesis
publication status: published
subject
keywords: Machine Learning (ML), Probability, Gaussian Process, Bayesian Statistics, Bayesian Optimization
pages: 250
publisher: Department of Computer Science, Lund University
defense location: Lecture Hall M:D, building M, Ole Römers väg 1, Faculty of Engineering LTH, Lund University, Lund.
defense date: 2025-03-17 13:00:00
ISBN: 978-91-8104-389-1, 978-91-8104-388-4
language: English
LU publication?: yes
id: 472a0a99-b364-4498-a22a-d18f63d7e0f7
date added to LUP: 2025-02-18 15:18:58
date last changed: 2025-04-08 13:12:28
@phdthesis{472a0a99-b364-4498-a22a-d18f63d7e0f7,
  abstract     = {{Bayesian Optimization has emerged as a crucial technique for optimizing black-box functions whose evaluations are expensive, such as scientific experiments and machine learning hyperparameter optimization. By combining probabilistic modeling with sequential decision-making, Bayesian Optimization achieves efficient exploration, guiding the search toward optimal parameters with minimal data. However, real-world applications present three main challenges: leveraging expert knowledge, ensuring accurate model assumptions, and managing high-dimensional search spaces.<br/><br/>This thesis addresses these challenges by advancing Bayesian Optimization in three key areas. First, it develops methods to incorporate practitioner insights directly into the optimization process, using domain expert knowledge to guide the search more efficiently and reduce the need for extensive evaluations. Second, it proposes techniques for dynamically validating and adapting model assumptions, enabling the Gaussian Process surrogates commonly used in Bayesian Optimization to align more closely with the complexities of real-world objective functions. Finally, this work introduces adaptive strategies for high-dimensional optimization, allowing Bayesian Optimization to focus on relevant subspaces and improve sample efficiency in vast parameter spaces, thereby mitigating the "Curse of Dimensionality."<br/><br/>These contributions collectively enhance Bayesian Optimization’s robustness, adaptability, and efficiency, positioning it as a more powerful tool for sample-efficient optimization in complex, resource-intensive scenarios. By demonstrating these improvements through theoretical insights and empirical evaluations, this thesis establishes a pathway for more effective Bayesian Optimization in diverse, real-world applications where data is sparse and costly to obtain.}},
  author       = {{Hvarfner, Carl}},
  isbn         = {{978-91-8104-389-1}},
  keywords     = {{Machine Learning (ML); Probability; Gaussian Process; Bayesian Statistics; Bayesian Optimization}},
  language     = {{eng}},
  month        = {{02}},
  publisher    = {{Department of Computer Science, Lund University}},
  school       = {{Lund University}},
  title        = {{Bayesian optimization across the spectrum of knowledge : enhancing efficiency through beliefs, information and assumptions}},
  url          = {{https://lup.lub.lu.se/search/files/215006757/PhD_Thesis_20250212-2.pdf}},
  year         = {{2025}},
}