Metric selection in fast dual forward-backward splitting
(2015) In Automatica 62, p. 1–10

Abstract
The performance of fast forward-backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric that is defined by the space on which the optimization problem is stated; selecting a space on which the optimization data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexity, to find a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing the performance to the case when the Euclidean space is used. For the most ill-conditioned problem we consider, the computational complexity is improved by two to three orders of magnitude. We also report performance that is comparable or superior to state-of-the-art optimization software. (C) 2015 Elsevier Ltd. All rights reserved.
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/8556996
- author: Giselsson, Pontus and Boyd, Stephen
- publishing date: 2015
- type: Contribution to journal
- publication status: published
- keywords: First order optimization algorithms, Metric selection, Preconditioning, Model predictive control, Distributed optimization
- in: Automatica
- volume: 62
- pages: 1–10
- publisher: Pergamon Press Ltd.
- external identifiers: wos:000366233700001, scopus:84947813956
- ISSN: 0005-1098
- DOI: 10.1016/j.automatica.2015.09.010
- language: English
- LU publication?: yes
- id: 0f719ac1-f4a7-4843-b707-701ad10be82f (old id 8556996)
- date added to LUP: 2016-04-01 13:22:50
- date last changed: 2023-11-12 15:49:47
@article{0f719ac1-f4a7-4843-b707-701ad10be82f,
  abstract  = {{The performance of fast forward-backward splitting, or equivalently fast proximal gradient methods, depends on the conditioning of the optimization problem data. This conditioning is related to a metric that is defined by the space on which the optimization problem is stated; selecting a space on which the optimization data is better conditioned improves the performance of the algorithm. In this paper, we propose several methods, with different computational complexity, to find a space on which the algorithm performs well. We evaluate the proposed metric selection procedures by comparing the performance to the case when the Euclidean space is used. For the most ill-conditioned problem we consider, the computational complexity is improved by two to three orders of magnitude. We also report performance that is comparable or superior to state-of-the-art optimization software. (C) 2015 Elsevier Ltd. All rights reserved.}},
  author    = {{Giselsson, Pontus and Boyd, Stephen}},
  issn      = {{0005-1098}},
  keywords  = {{First order optimization algorithms; Metric selection; Preconditioning; Model predictive control; Distributed optimization}},
  language  = {{eng}},
  pages     = {{1--10}},
  publisher = {{Pergamon Press Ltd.}},
  series    = {{Automatica}},
  title     = {{Metric selection in fast dual forward-backward splitting}},
  url       = {{http://dx.doi.org/10.1016/j.automatica.2015.09.010}},
  doi       = {{10.1016/j.automatica.2015.09.010}},
  volume    = {{62}},
  year      = {{2015}},
}