Using Similarity to Accelerate the Primal-Dual Hybrid Gradient Algorithm for Linear Programming
(2025) Department of Automatic Control
- Abstract
- This thesis investigates the use of several acceleration schemes to improve the performance of the primal-dual hybrid gradient algorithm for linear programming (PDLP), a state-of-the-art first-order method for linear programming (LP). The three main acceleration schemes investigated are: Polyak heavy ball momentum, Nesterov acceleration, and steering vectors selected in a residual momentum direction. The three benchmarks used for measuring the performance of the algorithms are: MIP Relaxations, LP Benchmark, and the Netlib LP benchmark, which were previously used to evaluate the performance of the PDLP algorithm. A novel condition for applying acceleration schemes dubbed similarity thresholding is developed and evaluated, where the cosine similarity of the two previous steps is used to decide whether or not to apply the acceleration scheme. Combining PDLP with similarity thresholded Nesterov acceleration results in an algorithm which outperforms PDLP on all of the three benchmarks for both moderate and high accuracies. This indicates that this algorithm could improve the state-of-the-art performance for massive scale LP. However, this requires testing on the largest scales of LP problems to be verified, which is left for future work because of limited computational resources. Similarity thresholding is also shown to be able to improve the performance of Polyak momentum and steering vector acceleration schemes for these benchmarks, and improve the performance for Nesterov and Polyak schemes on the Rosenbrock problem. This suggests that similarity thresholding is a versatile technique that can be used to improve the performance for several acceleration schemes and problem settings.
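The core idea of similarity thresholding described in the abstract can be sketched in a few lines: compute the cosine similarity of the two previous iterate steps, and apply the momentum term only when it exceeds a threshold. The sketch below is a minimal, hypothetical illustration using plain gradient descent with Polyak heavy ball momentum on a small quadratic, not the PDLP algorithm studied in the thesis; the threshold `tau`, step size `lr`, and momentum coefficient `beta` are placeholder values, not those used in the work.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity of two vectors; 0.0 if either is the zero vector."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    if nu == 0.0 or nv == 0.0:
        return 0.0
    return dot / (nu * nv)

def gd_similarity_thresholded_momentum(grad, x0, lr=0.1, beta=0.5,
                                       tau=0.9, iters=100):
    """Gradient descent where the Polyak momentum term is applied only
    when the cosine similarity of the two previous steps exceeds tau
    (illustrative stand-in for the thesis's PDLP setting)."""
    x = list(x0)
    prev_step = None       # step taken at iteration k-1
    prev_prev_step = None  # step taken at iteration k-2
    for _ in range(iters):
        g = grad(x)
        step = [-lr * gi for gi in g]
        # Similarity thresholding: only add momentum when the last two
        # steps point in nearly the same direction.
        if prev_step is not None and prev_prev_step is not None:
            if cosine_similarity(prev_step, prev_prev_step) > tau:
                step = [s + beta * p for s, p in zip(step, prev_step)]
        x = [xi + s for xi, s in zip(x, step)]
        prev_prev_step = prev_step
        prev_step = step
    return x

# Minimize the convex quadratic f(x) = x1^2 + 2*x2^2, whose minimizer is (0, 0).
grad = lambda x: [2.0 * x[0], 4.0 * x[1]]
x_star = gd_similarity_thresholded_momentum(grad, [5.0, -3.0])
```

The gating step is the only change from standard heavy ball momentum: when successive steps disagree in direction (low cosine similarity, e.g. near oscillation), the iteration falls back to a plain gradient step.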
Please use this URL to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9205572
- author
- Anderberg Törngren, Theodor
- supervisor
- organization
- year
- 2025
- type
- H3 - Professional qualifications (4 Years - )
- subject
- report number
- TFRT-6273
- other publication id
- 0280-5316
- language
- English
- id
- 9205572
- date added to LUP
- 2025-06-26 08:57:36
- date last changed
- 2025-06-26 08:57:36
@misc{9205572,
  abstract = {{This thesis investigates the use of several acceleration schemes to improve the performance of the primal-dual hybrid gradient algorithm for linear programming (PDLP), a state-of-the-art first-order method for linear programming (LP). The three main acceleration schemes investigated are: Polyak heavy ball momentum, Nesterov acceleration, and steering vectors selected in a residual momentum direction. The three benchmarks used for measuring the performance of the algorithms are: MIP Relaxations, LP Benchmark, and the Netlib LP benchmark, which were previously used to evaluate the performance of the PDLP algorithm. A novel condition for applying acceleration schemes dubbed similarity thresholding is developed and evaluated, where the cosine similarity of the two previous steps is used to decide whether or not to apply the acceleration scheme. Combining PDLP with similarity thresholded Nesterov acceleration results in an algorithm which outperforms PDLP on all of the three benchmarks for both moderate and high accuracies. This indicates that this algorithm could improve the state-of-the-art performance for massive scale LP. However, this requires testing on the largest scales of LP problems to be verified, which is left for future work because of limited computational resources. Similarity thresholding is also shown to be able to improve the performance of Polyak momentum and steering vector acceleration schemes for these benchmarks, and improve the performance for Nesterov and Polyak schemes on the Rosenbrock problem. This suggests that similarity thresholding is a versatile technique that can be used to improve the performance for several acceleration schemes and problem settings.}},
  author = {{Anderberg Törngren, Theodor}},
  language = {{eng}},
  note = {{Student Paper}},
  title = {{Using Similarity to Accelerate the Primal-Dual Hybrid Gradient Algorithm for Linear Programming}},
  year = {{2025}},
}