Sampling and Update Frequencies in Proximal Variance-Reduced Stochastic Gradient Methods
(2025) In Journal of Optimization Theory and Applications 205(3).
- abstract
Variance-reduced stochastic gradient methods have gained popularity in recent times. Several variants exist with different strategies for storing and sampling gradients, and this work concerns the interactions between these two aspects. We present a general proximal variance-reduced gradient method and analyze it under strong convexity assumptions. Special cases of the algorithm include SAGA, L-SVRG, and their proximal variants. Our analysis sheds light on epoch-length selection and the need to balance the convergence of the iterates with how often gradients are stored. The analysis improves on other convergence rates found in the literature and produces a new and faster converging sampling strategy for SAGA. Problem instances for which the predicted rates are the same as the practical rates are presented together with problems based on real-world data.
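The record contains no pseudocode, but the abstract's description of a general proximal variance-reduced gradient method with SAGA and L-SVRG as special cases can be made concrete. The sketch below is an illustrative assumption, not the paper's exact algorithm: it targets a finite-sum problem min_x (1/n) sum_i f_i(x) + g(x) with g an l1-regularizer, and the names pvrg and prox_l1, the step size, and the refresh_prob parameter are all hypothetical.

    import numpy as np

    def prox_l1(x, t):
        # Proximal operator of t * ||x||_1 (soft thresholding).
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def pvrg(grad_i, n, x0, step, reg, n_iters, refresh_prob=None, seed=0):
        # Generic proximal variance-reduced gradient loop (illustrative).
        # grad_i(i, x): gradient of the i-th component function at x.
        # refresh_prob=None -> SAGA-style: store the sampled gradient each step.
        # refresh_prob=p    -> L-SVRG-style: refresh all stored gradients w.p. p.
        rng = np.random.default_rng(seed)
        x = x0.copy()
        table = np.stack([grad_i(i, x) for i in range(n)])  # stored gradients
        avg = table.mean(axis=0)
        for _ in range(n_iters):
            i = rng.integers(n)                    # sample one component
            g = grad_i(i, x)
            v = g - table[i] + avg                 # variance-reduced direction
            x = prox_l1(x - step * v, step * reg)  # proximal gradient step
            if refresh_prob is None:               # SAGA: update one entry
                avg += (g - table[i]) / n
                table[i] = g
            elif rng.random() < refresh_prob:      # L-SVRG: full refresh
                table = np.stack([grad_i(j, x) for j in range(n)])
                avg = table.mean(axis=0)
        return x

    # Toy usage: l1-regularized least squares, f_i(x) = 0.5 * (a_i @ x - b_i)^2.
    A = np.random.default_rng(1).normal(size=(50, 10))
    b = A @ np.ones(10)
    x_hat = pvrg(lambda i, x: (A[i] @ x - b[i]) * A[i], n=50,
                 x0=np.zeros(10), step=0.02, reg=0.01,
                 n_iters=3000, refresh_prob=1 / 50)

Setting refresh_prob=None recovers a SAGA-style method (one stored gradient updated per step), while refresh_prob=p in (0, 1] gives an L-SVRG-style method with expected epoch length 1/p; for simplicity this sketch keeps the full gradient table even in the L-SVRG case, whereas L-SVRG proper only stores a reference point and its full gradient.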
- author
- Morin, Martin (LU) and Giselsson, Pontus (LU)
- publishing date
- 2025-06
- type
- Contribution to journal
- publication status
- published
- keywords
- Epoch length, L-SVRG, SAGA, Sampling, Stochastic gradient, Variance-reduced
- in
- Journal of Optimization Theory and Applications
- volume
- 205
- issue
- 3
- article number
- 58
- publisher
- Springer
- external identifiers
- scopus:105003321216
- ISSN
- 0022-3239
- DOI
- 10.1007/s10957-025-02671-y
- language
- English
- LU publication?
- yes
- id
- 20da8231-8e44-419f-b138-95c00ba98bbf
- date added to LUP
- 2025-07-29 09:50:29
- date last changed
- 2025-07-29 09:51:39
@article{20da8231-8e44-419f-b138-95c00ba98bbf,
  abstract  = {{Variance-reduced stochastic gradient methods have gained popularity in recent times. Several variants exist with different strategies for storing and sampling gradients, and this work concerns the interactions between these two aspects. We present a general proximal variance-reduced gradient method and analyze it under strong convexity assumptions. Special cases of the algorithm include SAGA, L-SVRG, and their proximal variants. Our analysis sheds light on epoch-length selection and the need to balance the convergence of the iterates with how often gradients are stored. The analysis improves on other convergence rates found in the literature and produces a new and faster converging sampling strategy for SAGA. Problem instances for which the predicted rates are the same as the practical rates are presented together with problems based on real-world data.}},
  author    = {{Morin, Martin and Giselsson, Pontus}},
  issn      = {{0022-3239}},
  keywords  = {{Epoch length; L-SVRG; SAGA; Sampling; Stochastic gradient; Variance-reduced}},
  language  = {{eng}},
  number    = {{3}},
  publisher = {{Springer}},
  series    = {{Journal of Optimization Theory and Applications}},
  title     = {{Sampling and Update Frequencies in Proximal Variance-Reduced Stochastic Gradient Methods}},
  url       = {{http://dx.doi.org/10.1007/s10957-025-02671-y}},
  doi       = {{10.1007/s10957-025-02671-y}},
  volume    = {{205}},
  year      = {{2025}},
}