
Lund University Publications


Low-rank inducing norms with optimality interpretations

Grussler, Christian and Giselsson, Pontus (2018) In SIAM Journal on Optimization 28(4). p. 3057-3078
Abstract

Optimization problems with rank constraints appear in many diverse fields such as control, machine learning, and image analysis. Since the rank constraint is nonconvex, these problems are often approximately solved via convex relaxations. Nuclear norm regularization is the prevailing convexifying technique for dealing with these types of problem. This paper introduces a family of low-rank inducing norms and regularizers which include the nuclear norm as a special case. A posteriori guarantees on solving an underlying rank constrained optimization problem with these convex relaxations are provided. We evaluate the performance of the low-rank inducing norms on three matrix completion problems. In all examples, the nuclear norm heuristic is outperformed by convex relaxations based on other low-rank inducing norms. For two of the problems there exist low-rank inducing norms that succeed in recovering the partially unknown matrix, while the nuclear norm fails. These low-rank inducing norms are shown to be representable as semidefinite programs. Moreover, these norms have cheaply computable proximal mappings, which make it possible to also solve problems of large size using first-order methods.

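For readers who want to experiment with the ideas summarized above, the following is a minimal, illustrative sketch (not code from the paper) of the nuclear norm heuristic that the low-rank inducing norms are compared against: matrix completion solved by a proximal-gradient loop, where the proximal mapping of the nuclear norm is singular value soft-thresholding. All function names, parameters, and the toy data below are assumptions made purely for illustration.

import numpy as np

def prox_nuclear(Z, t):
    # Proximal mapping of t * (nuclear norm): soft-threshold the singular values.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def complete_matrix(M, mask, lam=0.1, step=1.0, iters=500):
    # Minimize 0.5*||mask*(X - M)||_F^2 + lam*||X||_* by proximal gradient.
    X = np.zeros_like(M)
    for _ in range(iters):
        grad = mask * (X - M)              # gradient of the data-fit term
        X = prox_nuclear(X - step * grad, step * lam)
    return X

# Toy usage: recover a rank-1 matrix from roughly 60% observed entries.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(20), rng.standard_normal(20))
mask = rng.random(M.shape) < 0.6
X_hat = complete_matrix(mask * M, mask)

Swapping prox_nuclear for the proximal mapping of one of the paper's low-rank inducing norms would give the corresponding relaxation; as the abstract notes, those proximal mappings are also cheap to compute, so the same first-order scheme applies to large problems.
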
author: Grussler, Christian and Giselsson, Pontus
organization:
publishing date: 2018
type: Contribution to journal
publication status: published
subject:
keywords: Low-rank inducing norms, Low-rank optimization, Matrix completion, Regularization, Semidefinite programming
in: SIAM Journal on Optimization
volume: 28
issue: 4
pages: 3057-3078 (22 pages)
publisher: Society for Industrial and Applied Mathematics
external identifiers: scopus:85060277532
ISSN: 1052-6234
DOI: 10.1137/17M1115770
language: English
LU publication?: yes
id: 22510557-d19d-4cc6-96eb-76d539396894
date added to LUP: 2019-02-04 12:55:36
date last changed: 2023-10-20 22:22:05
@article{22510557-d19d-4cc6-96eb-76d539396894,
  abstract     = {{Optimization problems with rank constraints appear in many diverse fields such as control, machine learning, and image analysis. Since the rank constraint is nonconvex, these problems are often approximately solved via convex relaxations. Nuclear norm regularization is the prevailing convexifying technique for dealing with these types of problem. This paper introduces a family of low-rank inducing norms and regularizers which include the nuclear norm as a special case. A posteriori guarantees on solving an underlying rank constrained optimization problem with these convex relaxations are provided. We evaluate the performance of the low-rank inducing norms on three matrix completion problems. In all examples, the nuclear norm heuristic is outperformed by convex relaxations based on other low-rank inducing norms. For two of the problems there exist low-rank inducing norms that succeed in recovering the partially unknown matrix, while the nuclear norm fails. These low-rank inducing norms are shown to be representable as semidefinite programs. Moreover, these norms have cheaply computable proximal mappings, which make it possible to also solve problems of large size using first-order methods.}},
  author       = {{Grussler, Christian and Giselsson, Pontus}},
  issn         = {{1052-6234}},
  keywords     = {{Low-rank inducing norms; Low-rank optimization; Matrix completion; Regularization; Semidefinite programming}},
  language     = {{eng}},
  number       = {{4}},
  pages        = {{3057--3078}},
  publisher    = {{Society for Industrial and Applied Mathematics}},
  journal      = {{SIAM Journal on Optimization}},
  title        = {{Low-rank inducing norms with optimality interpretations}},
  url          = {{http://dx.doi.org/10.1137/17M1115770}},
  doi          = {{10.1137/17M1115770}},
  volume       = {{28}},
  year         = {{2018}},
}