
Lund University Publications


Efficient and Flexible First-Order Optimization Algorithms

Sadeghi, Hamed LU (2022)
Abstract
Optimization problems occur in many areas of science and engineering. When the optimization problem at hand is large-scale, the computational cost of the optimization algorithm is a main concern. First-order optimization algorithms—in which updates are performed using only the gradient or a subgradient of the objective function—have a low per-iteration computational cost, which makes them suitable for tackling large-scale optimization problems. Even though the per-iteration computational cost of these methods is reasonably low, the number of iterations needed to find a solution—especially if medium or high accuracy is needed—can in practice be very high; as a result, the overall computational cost of using these methods can still be high.

This thesis focuses on one of the most widely used first-order optimization algorithms, the forward–backward splitting algorithm, and attempts to improve its performance. To that end, the thesis proposes novel first-order optimization algorithms, all of which are built upon the forward–backward method. An important feature of the proposed methods is their flexibility. By combining this flexibility with the notion of safeguarding, the thesis provides a framework through which many new and efficient optimization algorithms can be developed.
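To make the starting point concrete, the following is a minimal sketch of the basic forward–backward (proximal gradient) iteration that the thesis builds on, illustrated on a lasso problem. The problem data, step-size rule, and iteration count are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, n_iter=500):
    """Plain forward-backward splitting: x+ = prox_{step*g}(x - step*grad_f(x))."""
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)  # forward (gradient) step, then backward (proximal) step
    return x

# Illustrative instance: lasso, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)  # soft-thresholding
step = 1.0 / np.linalg.norm(A, 2) ** 2  # step <= 1/L, with L the Lipschitz constant of grad f

x_hat = forward_backward(grad_f, prox_g, np.zeros(50), step)
```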

To improve the efficiency of the forward–backward algorithm, two main approaches are taken in this thesis. In the first, a technique is proposed to adjust the point at which the forward–backward operator is evaluated. This is done by including additive terms—called deviations—in the input argument of the forward–backward operator. In order to obtain a convergent algorithm, the deviations have to satisfy a safeguard condition at each iteration. Incorporating deviations provides great flexibility to the algorithm and paves the way for designing new and improved forward–backward-based methods. A few instances of using this flexibility to derive new algorithms are presented in the thesis.
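The sketch below illustrates the idea of evaluating the forward–backward operator at a deviated point and checking the deviation against a safeguard test. The norm-based safeguard and the momentum-style deviation used here are simplified placeholders chosen for illustration; the actual conditions and deviation choices are developed in the thesis.

```python
import numpy as np

def fb_with_deviations(grad_f, prox_g, x0, step, propose_deviation, rho=0.5, n_iter=500):
    """Forward-backward splitting evaluated at a deviated point x + u.
    The deviation u is kept only if it passes a safeguard test; otherwise
    the plain forward-backward step is taken (placeholder condition)."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        u = propose_deviation(x, x_prev)  # candidate deviation (any heuristic may be plugged in)
        if np.linalg.norm(u) > rho * np.linalg.norm(x - x_prev):  # hypothetical safeguard condition
            u = np.zeros_like(x)  # reject the deviation to preserve convergence
        y = x + u  # deviated evaluation point
        x_prev, x = x, prox_g(y - step * grad_f(y), step)  # forward-backward step at y
    return x

# One illustrative deviation choice: an inertial (momentum-like) term.
momentum = lambda x, x_prev: 0.8 * (x - x_prev)
```

With the lasso data from the previous sketch, this could be called as `fb_with_deviations(grad_f, prox_g, np.zeros(50), step, momentum)`.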

In the second approach, a globally (but potentially slowly) convergent algorithm is combined with a fast, locally convergent one to form an efficient optimization scheme. The role of the globally convergent method is to ensure convergence of the overall scheme. The fast local algorithm’s role is to speed up convergence; this is done by switching from the globally convergent algorithm to the local one whenever it is safe to do so, i.e., when a safeguard condition is satisfied. This approach, which allows different global and local algorithms to be combined within its framework, can result in fast and globally convergent optimization schemes.
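The sketch below shows one way such a safeguarded combination can be organized: a fast local update is tried first and accepted only if it sufficiently reduces a residual measure; otherwise the globally convergent update is taken. The acceptance test and the factor c are simplified stand-ins for the safeguard condition analyzed in the thesis, and the global_step, fast_step, and residual arguments are placeholders for concrete methods.

```python
def safeguarded_hybrid(global_step, fast_step, residual, x0, c=0.9, n_iter=100):
    """Globally convergent method combined with a fast local one.
    The fast candidate is accepted only when it shrinks the residual by a
    factor c (placeholder safeguard); otherwise the global step is taken,
    which is what secures convergence of the overall scheme."""
    x = x0
    for _ in range(n_iter):
        cand = fast_step(x)  # e.g., a (quasi-)Newton-type step, fast near the solution
        if residual(cand) <= c * residual(x):  # hypothetical safeguard test
            x = cand  # safe to use the fast local method
        else:
            x = global_step(x)  # fall back to the globally convergent method
    return x
```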
author
supervisor
opponent
  • Professor Dirk Lorenz, Technical University of Braunschweig, Germany.
organization
publishing date
type
Thesis
publication status
published
subject
keywords
Monotone Inclusion, Convex Optimization, First-Order Methods, Fixed-Point Iterations, Forward–Backward Splitting, Primal–Dual Algorithm
pages
160 pages
publisher
Department of Automatic Control, Lund University
defense location
Lecture hall C, building KC4, Naturvetarvägen 18, Lund. Zoom: https://lu-se.zoom.us/j/62525384194
defense date
2022-12-15 10:15:00
ISBN
978-91-8039-468-0
978-91-8039-467-3
project
Large-Scale Optimization
On acceleration of first-order convex optimization methods
language
English
LU publication?
yes
id
f7e20e00-0d5f-4891-a68d-13c91802aabe
date added to LUP
2022-11-21 19:01:09
date last changed
2022-11-23 02:41:47
@phdthesis{f7e20e00-0d5f-4891-a68d-13c91802aabe,
  abstract     = {{Optimization problems occur in many areas of science and engineering. When the optimization problem at hand is large-scale, the computational cost of the optimization algorithm is a main concern. First-order optimization algorithms—in which updates are performed using only the gradient or a subgradient of the objective function—have a low per-iteration computational cost, which makes them suitable for tackling large-scale optimization problems. Even though the per-iteration computational cost of these methods is reasonably low, the number of iterations needed to find a solution—especially if medium or high accuracy is needed—can in practice be very high; as a result, the overall computational cost of using these methods can still be high.<br/><br/>This thesis focuses on one of the most widely used first-order optimization algorithms, the forward–backward splitting algorithm, and attempts to improve its performance. To that end, the thesis proposes novel first-order optimization algorithms, all of which are built upon the forward–backward method. An important feature of the proposed methods is their flexibility. By combining this flexibility with the notion of safeguarding, the thesis provides a framework through which many new and efficient optimization algorithms can be developed.<br/><br/>To improve the efficiency of the forward–backward algorithm, two main approaches are taken in this thesis. In the first, a technique is proposed to adjust the point at which the forward–backward operator is evaluated. This is done by including additive terms—called deviations—in the input argument of the forward–backward operator. In order to obtain a convergent algorithm, the deviations have to satisfy a safeguard condition at each iteration. Incorporating deviations provides great flexibility to the algorithm and paves the way for designing new and improved forward–backward-based methods. A few instances of using this flexibility to derive new algorithms are presented in the thesis.<br/><br/>In the second approach, a globally (but potentially slowly) convergent algorithm is combined with a fast, locally convergent one to form an efficient optimization scheme. The role of the globally convergent method is to ensure convergence of the overall scheme. The fast local algorithm’s role is to speed up convergence; this is done by switching from the globally convergent algorithm to the local one whenever it is safe to do so, i.e., when a safeguard condition is satisfied. This approach, which allows different global and local algorithms to be combined within its framework, can result in fast and globally convergent optimization schemes.}},
  author       = {{Sadeghi, Hamed}},
  isbn         = {{978-91-8039-468-0}},
  keywords     = {{Monotone Inclusion; Convex Optimization; First-Order Methods; Fixed-Point Iterations; Forward–Backward Splitting; Primal–Dual Algorithm}},
  language     = {{eng}},
  month        = {{11}},
  publisher    = {{Department of Automatic Control, Lund University}},
  school       = {{Lund University}},
  title        = {{Efficient and Flexible First-Order Optimization Algorithms}},
  url          = {{https://lup.lub.lu.se/search/files/129108225/Doctoral_Dissertation.pdf}},
  year         = {{2022}},
}