Lund University Publications

Bilinear parameterization for differentiable rank-regularization

Ornhag, Marcus Valtonen; Olsson, Carl and Heyden, Anders (2020). 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2020-June, p. 1416-1425.
Abstract

Low rank approximation is a commonly occurring problem in many computer vision and machine learning applications. There are two common ways of optimizing the resulting models. Either the set of matrices with a given rank can be explicitly parametrized using a bilinear factorization, or low rank can be implicitly enforced using regularization terms penalizing non-zero singular values. While the former approach results in differentiable problems that can be efficiently optimized using local quadratic approximation, the latter is typically not differentiable (sometimes even discontinuous) and requires first order subgradient or splitting methods. It is well known that gradient based methods exhibit slow convergence for ill-conditioned problems. In this paper we show how many non-differentiable regularization methods can be reformulated into smooth objectives using bilinear parameterization. This allows us to use standard second order methods, such as Levenberg-Marquardt (LM) and Variable Projection (VarPro), to achieve accurate solutions for ill-conditioned cases. We show on several real and synthetic experiments that our second order formulation converges to substantially more accurate solutions than competing state-of-the-art methods.
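The abstract contrasts non-differentiable singular-value penalties with smooth bilinear factorizations. A minimal illustration of the smooth-surrogate idea (not the authors' actual LM/VarPro implementation) is plain gradient descent on the well-known factor-norm surrogate of the nuclear norm; the matrix sizes, the weight `lam`, the step size, and the iteration count below are arbitrary assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic measurement: a rank-2 ground-truth matrix plus small noise.
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
M += 0.01 * rng.standard_normal((m, n))

# Bilinear parameterization X = B @ C.T, with factor rank k chosen
# larger than the true rank so the regularizer must prune the excess.
k = 5
B = rng.standard_normal((m, k))
C = rng.standard_normal((n, k))

lam = 0.1    # regularization weight (hypothetical choice)
step = 0.01  # gradient step size (hypothetical choice)

# Smooth surrogate objective:
#   0.5 * ||B C^T - M||_F^2 + (lam/2) * (||B||_F^2 + ||C||_F^2)
# Minimizing (||B||_F^2 + ||C||_F^2)/2 over all factorizations of a
# fixed X = B C^T yields the nuclear norm ||X||_*, so the factor
# penalty suppresses singular values while the objective stays
# differentiable in (B, C).
for _ in range(5000):
    R = B @ C.T - M         # residual
    gB = R @ C + lam * B    # gradient w.r.t. B
    gC = R.T @ B + lam * C  # gradient w.r.t. C
    B -= step * gB
    C -= step * gC

X = B @ C.T
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
svals = np.linalg.svd(X, compute_uv=False)
print("relative error:", rel_err)
print("singular values:", np.round(svals, 4))
```

Since the factor penalty matches the nuclear norm at the optimum, the small singular values are driven toward zero and the excess factor rank is pruned, while the objective remains smooth, which is what makes second order methods such as LM or VarPro applicable.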
author
Ornhag, Marcus Valtonen; Olsson, Carl and Heyden, Anders
organization
publishing date
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
Proceedings - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
series title
IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
volume
2020-June
article number
9151079
pages
10 pages
publisher
IEEE Computer Society
conference name
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
conference location
Virtual, Online, United States
conference dates
2020-06-14 - 2020-06-19
external identifiers
  • scopus:85090146175
ISSN
2160-7516
2160-7508
ISBN
9781728193601
DOI
10.1109/CVPRW50498.2020.00181
language
English
LU publication?
yes
id
69cb3e0c-ea9b-404e-8ab0-bee6ac14e136
date added to LUP
2020-09-24 14:09:14
date last changed
2024-04-03 13:24:33
@inproceedings{69cb3e0c-ea9b-404e-8ab0-bee6ac14e136,
  abstract     = {{<p>Low rank approximation is a commonly occurring problem in many computer vision and machine learning applications. There are two common ways of optimizing the resulting models. Either the set of matrices with a given rank can be explicitly parametrized using a bilinear factorization, or low rank can be implicitly enforced using regularization terms penalizing non-zero singular values. While the former approach results in differentiable problems that can be efficiently optimized using local quadratic approximation, the latter is typically not differentiable (sometimes even discontinuous) and requires first order subgradient or splitting methods. It is well known that gradient based methods exhibit slow convergence for ill-conditioned problems. In this paper we show how many non-differentiable regularization methods can be reformulated into smooth objectives using bilinear parameterization. This allows us to use standard second order methods, such as Levenberg-Marquardt (LM) and Variable Projection (VarPro), to achieve accurate solutions for ill-conditioned cases. We show on several real and synthetic experiments that our second order formulation converges to substantially more accurate solutions than competing state-of-the-art methods.<sup>1</sup></p>}},
  author       = {{Ornhag, Marcus Valtonen and Olsson, Carl and Heyden, Anders}},
  booktitle    = {{Proceedings - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020}},
  isbn         = {{9781728193601}},
  issn         = {{2160-7516}},
  language     = {{eng}},
  month        = {{06}},
  pages        = {{1416--1425}},
  publisher    = {{IEEE Computer Society}},
  series       = {{IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops}},
  title        = {{Bilinear parameterization for differentiable rank-regularization}},
  url          = {{http://dx.doi.org/10.1109/CVPRW50498.2020.00181}},
  doi          = {{10.1109/CVPRW50498.2020.00181}},
  volume       = {{2020-June}},
  year         = {{2020}},
}