Metric selection in fast dual forward-backward splitting

Giselsson, Pontus (LU) and Boyd, Stephen (2015). In Automatica 62, pp. 1-10
Abstract
The performance of fast forward-backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric that is defined by the space on which the optimization problem is stated; selecting a space on which the optimization data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexity, to find a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing the performance to the case when the Euclidean space is used. For the most ill-conditioned problem we consider, the computational complexity is improved by two to three orders of magnitude. We also report comparable or superior performance compared to state-of-the-art optimization software. (C) 2015 Elsevier Ltd. All rights reserved.
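The effect the abstract describes can be illustrated with a minimal sketch: a fast (accelerated) gradient iteration on an ill-conditioned least-squares problem, run once with the Euclidean metric (a single global step length 1/L) and once with a simple diagonal metric. The diagonal Gershgorin-row-sum choice below is our own illustration, not one of the paper's proposed selection procedures, which are more refined.

```python
import numpy as np

def fista(grad, x0, d, n_iter=500):
    """Accelerated gradient method where coordinate i uses step 1/d[i],
    i.e. a gradient step in the metric induced by diag(d)."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / d
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + (t - 1.0) / t_new * (x_new - x)
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20)) * np.logspace(0, 3, 20)  # badly scaled columns
b = rng.standard_normal(40)
grad = lambda x: A.T @ (A @ x - b)  # gradient of 0.5 * ||Ax - b||^2

# Euclidean metric: one global step 1/L with L = ||A^T A||_2.
L = np.linalg.norm(A.T @ A, 2)
x_plain = fista(grad, np.zeros(20), np.full(20, L))

# Diagonal metric: diag(d) majorizes A^T A by Gershgorin row sums,
# so per-coordinate steps 1/d[i] are valid and adapt to the scaling.
d = np.abs(A.T @ A).sum(axis=1)
x_prec = fista(grad, np.zeros(20), d)

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
err_plain = np.linalg.norm(x_plain - x_star)
err_prec = np.linalg.norm(x_prec - x_star)
```

With the columns of A spanning three orders of magnitude in scale, the single global step is dictated by the worst direction and the Euclidean run makes little progress in 500 iterations, while the diagonally rescaled run gets much closer to the least-squares solution; this is the conditioning-via-metric effect the paper exploits.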
type: Contribution to journal
publication status: published
keywords: First order optimization algorithms, Metric selection, Preconditioning, Model predictive control, Distributed optimization
in: Automatica
volume: 62
pages: 1-10
publisher: Pergamon
external identifiers:
  • wos:000366233700001
  • scopus:84947813956
ISSN: 0005-1098
DOI: 10.1016/j.automatica.2015.09.010
language: English
LU publication?: yes
id: 0f719ac1-f4a7-4843-b707-701ad10be82f (old id 8556996)
date added to LUP: 2016-01-26 13:33:43
date last changed: 2017-10-29 03:50:34
@article{0f719ac1-f4a7-4843-b707-701ad10be82f,
  abstract     = {The performance of fast forward-backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric that is defined by the space on which the optimization problem is stated; selecting a space on which the optimization data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexity, to find a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing the performance to the case when the Euclidean space is used. For the most ill-conditioned problem we consider, the computational complexity is improved by two to three orders of magnitude. We also report comparable or superior performance compared to state-of-the-art optimization software. (C) 2015 Elsevier Ltd. All rights reserved.},
  author       = {Giselsson, Pontus and Boyd, Stephen},
  issn         = {0005-1098},
  doi          = {10.1016/j.automatica.2015.09.010},
  keywords     = {First order optimization algorithms, Metric selection, Preconditioning, Model predictive control, Distributed optimization},
  language     = {eng},
  pages        = {1--10},
  publisher    = {Pergamon},
  journal      = {Automatica},
  title        = {Metric selection in fast dual forward-backward splitting},
  url          = {http://dx.doi.org/10.1016/j.automatica.2015.09.010},
  volume       = {62},
  year         = {2015},
}