
Lund University Publications


LassoBench : A High-Dimensional Hyperparameter Optimization Benchmark Suite for Lasso

Šehić, Kenan; Gramfort, Alexandre; Salmon, Joseph and Nardi, Luigi (2022) 1st International Conference on Automated Machine Learning, AutoML 2022 188.
Abstract


While Weighted Lasso sparse regression has appealing statistical guarantees that would entail a major real-world impact in finance, genomics, and brain imaging applications, it remains scarcely adopted due to its complex high-dimensional space composed of thousands of hyperparameters. On the other hand, the latest progress with high-dimensional hyperparameter optimization (HD-HPO) methods for black-box functions demonstrates that high-dimensional applications can indeed be efficiently optimized. Despite this initial success, HD-HPO approaches are mostly applied to synthetic problems with a moderate number of dimensions, which limits their impact in scientific and engineering applications. We propose LassoBench, the first benchmark suite tailored for Weighted Lasso regression. LassoBench consists of benchmarks for both well-controlled synthetic setups (number of samples, noise level, ambient and effective dimensionalities, and multiple fidelities) and real-world datasets, which enables many flavors of HPO algorithms to be studied and extended in the high-dimensional Lasso setting. We evaluate 6 state-of-the-art HPO methods and 3 Lasso baselines, and demonstrate that Bayesian optimization and evolutionary strategies can improve over the methods commonly used for sparse regression while highlighting limitations of these frameworks in very high-dimensional and noisy settings.

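To make the setting concrete, the abstract's framing can be sketched in plain numpy: a Weighted Lasso assigns one penalty λ_j per feature, and the HPO problem is to minimize validation loss as a black-box function of the d-dimensional penalty vector. This is a minimal illustrative sketch, not the LassoBench API; the solver (proximal gradient / ISTA) and the function names `weighted_lasso` and `hpo_objective` are hypothetical.

```python
import numpy as np

def weighted_lasso(X, y, lam, n_iter=500):
    """Solve min_beta 0.5/n * ||y - X @ beta||^2 + sum_j lam[j] * |beta[j]|
    by proximal gradient descent (ISTA). `lam` holds one penalty per
    feature -- these per-feature penalties are the hyperparameters
    that make the tuning problem high-dimensional."""
    n, d = X.shape
    beta = np.zeros(d)
    # Step size 1/L, where L = sigma_max(X)^2 / n is the gradient's
    # Lipschitz constant for the scaled least-squares loss.
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n          # gradient of the smooth part
        z = beta - step * grad
        # Per-feature soft-thresholding: proximal operator of the
        # weighted l1 penalty.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

def hpo_objective(log_lam, X_tr, y_tr, X_val, y_val):
    """Black-box HPO objective: validation MSE as a function of the
    d-dimensional log-penalty vector (one search dimension per feature)."""
    beta = weighted_lasso(X_tr, y_tr, np.exp(log_lam))
    return np.mean((y_val - X_val @ beta) ** 2)
```

An HD-HPO method (Bayesian optimization, evolutionary strategies, ...) would then search over `log_lam` in R^d, which is exactly the thousands-of-hyperparameters space the abstract refers to.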
Please use this url to cite or link to this publication:
author
Šehić, Kenan; Gramfort, Alexandre; Salmon, Joseph and Nardi, Luigi
organization
publishing date
2022
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
Proceedings of Machine Learning Research
volume
188
article number
189437
conference name
1st International Conference on Automated Machine Learning, AutoML 2022
conference location
Baltimore, United States
conference dates
2022-07-25 - 2022-07-27
external identifiers
  • scopus:85163568913
language
English
LU publication?
yes
id
1dbfd9bc-fde6-4960-b83e-7343e7fd4228
alternative location
https://openreview.net/pdf?id=S4leJbTrLg5
date added to LUP
2023-10-30 11:45:16
date last changed
2023-12-19 16:06:35
@inproceedings{1dbfd9bc-fde6-4960-b83e-7343e7fd4228,
  abstract     = {{While Weighted Lasso sparse regression has appealing statistical guarantees that would entail a major real-world impact in finance, genomics, and brain imaging applications, it remains scarcely adopted due to its complex high-dimensional space composed of thousands of hyperparameters. On the other hand, the latest progress with high-dimensional hyperparameter optimization (HD-HPO) methods for black-box functions demonstrates that high-dimensional applications can indeed be efficiently optimized. Despite this initial success, HD-HPO approaches are mostly applied to synthetic problems with a moderate number of dimensions, which limits their impact in scientific and engineering applications. We propose LassoBench, the first benchmark suite tailored for Weighted Lasso regression. LassoBench consists of benchmarks for both well-controlled synthetic setups (number of samples, noise level, ambient and effective dimensionalities, and multiple fidelities) and real-world datasets, which enables many flavors of HPO algorithms to be studied and extended in the high-dimensional Lasso setting. We evaluate 6 state-of-the-art HPO methods and 3 Lasso baselines, and demonstrate that Bayesian optimization and evolutionary strategies can improve over the methods commonly used for sparse regression while highlighting limitations of these frameworks in very high-dimensional and noisy settings.}},
  author       = {{Šehić, Kenan and Gramfort, Alexandre and Salmon, Joseph and Nardi, Luigi}},
  booktitle    = {{Proceedings of Machine Learning Research}},
  language     = {{eng}},
  title        = {{LassoBench : A High-Dimensional Hyperparameter Optimization Benchmark Suite for Lasso}},
  url          = {{https://openreview.net/pdf?id=S4leJbTrLg5}},
  volume       = {{188}},
  year         = {{2022}},
}