
Lund University Publications


Non-convex Rank/Sparsity Regularization and Local Minima

Olsson, Carl; Carlsson, Marcus; Andersson, Fredrik and Larsson, Viktor (2017) 16th IEEE International Conference on Computer Vision, ICCV 2017, 2017-October. p. 332-340
Abstract

This paper considers the problem of recovering either a low rank matrix or a sparse vector from observations of linear combinations of the vector or matrix elements. Recent methods replace the non-convex regularization with ℓ1 or nuclear norm relaxations. It is well known that this approach recovers near optimal solutions if a so called restricted isometry property (RIP) holds. On the other hand it also has a shrinking bias which can degrade the solution. In this paper we study an alternative non-convex regularization term that does not suffer from this bias. Our main theoretical results show that if a RIP holds then the stationary points are often well separated, in the sense that their differences must be of high cardinality/rank. Thus, with a suitable initial solution the approach is unlikely to fall into a bad local minimum. Our numerical tests show that the approach is likely to converge to a better solution than standard ℓ1/nuclear-norm relaxation even when starting from trivial initializations. In many cases our results can also be used to verify global optimality of our method.
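The abstract contrasts the shrinking bias of the ℓ1/nuclear-norm relaxation with a bias-free non-convex alternative. A minimal sketch of that contrast (not the paper's actual regularizer, whose exact form is not given on this page) is to compare the proximal operator of λ‖x‖₁, i.e. soft thresholding, which pulls every surviving entry towards zero by λ, with the proximal operator of the cardinality penalty μ·card(x), i.e. hard thresholding, which leaves surviving entries untouched:

import numpy as np

# Illustrative sketch only; lam and mu are hypothetical regularization weights.

def prox_l1(y, lam):
    # Soft thresholding: argmin_x lam*||x||_1 + 0.5*||x - y||^2.
    # Every surviving entry is shrunk towards zero by lam (the bias).
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def prox_card(y, mu):
    # Hard thresholding: argmin_x mu*card(x) + 0.5*||x - y||^2.
    # Entries with |y_i| > sqrt(2*mu) are kept at their original value (no bias).
    x = y.copy()
    x[np.abs(y) < np.sqrt(2.0 * mu)] = 0.0
    return x

y = np.array([3.0, 0.9, -2.5, 0.1])
print(prox_l1(y, lam=1.0))    # [ 2.   0.  -1.5  0. ]  large entries shrunk by 1
print(prox_card(y, mu=0.5))   # [ 3.   0.  -2.5  0. ]  large entries unchanged

The low-rank matrix case is analogous, with soft versus hard thresholding applied to the singular values instead of the entries.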
author
Olsson, Carl; Carlsson, Marcus; Andersson, Fredrik and Larsson, Viktor
organization
publishing date
2017
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
Proceedings - 2017 IEEE International Conference on Computer Vision, ICCV 2017
volume
2017-October
article number
8237306
pages
9 pages
publisher
IEEE - Institute of Electrical and Electronics Engineers Inc.
conference name
16th IEEE International Conference on Computer Vision, ICCV 2017
conference location
Venice, Italy
conference dates
2017-10-22 - 2017-10-29
external identifiers
  • scopus:85041894852
ISBN
9781538610329
DOI
10.1109/ICCV.2017.44
language
English
LU publication?
yes
id
e010ffd0-9073-4a0e-bd12-799fa82fe3d1
date added to LUP
2018-02-22 09:04:02
date last changed
2022-09-06 09:57:22
@inproceedings{e010ffd0-9073-4a0e-bd12-799fa82fe3d1,
  abstract     = {{This paper considers the problem of recovering either a low rank matrix or a sparse vector from observations of linear combinations of the vector or matrix elements. Recent methods replace the non-convex regularization with ℓ1 or nuclear norm relaxations. It is well known that this approach recovers near optimal solutions if a so called restricted isometry property (RIP) holds. On the other hand it also has a shrinking bias which can degrade the solution. In this paper we study an alternative non-convex regularization term that does not suffer from this bias. Our main theoretical results show that if a RIP holds then the stationary points are often well separated, in the sense that their differences must be of high cardinality/rank. Thus, with a suitable initial solution the approach is unlikely to fall into a bad local minimum. Our numerical tests show that the approach is likely to converge to a better solution than standard ℓ1/nuclear-norm relaxation even when starting from trivial initializations. In many cases our results can also be used to verify global optimality of our method.}},
  author       = {{Olsson, Carl and Carlsson, Marcus and Andersson, Fredrik and Larsson, Viktor}},
  booktitle    = {{Proceedings - 2017 IEEE International Conference on Computer Vision, ICCV 2017}},
  isbn         = {{9781538610329}},
  language     = {{eng}},
  month        = {{12}},
  pages        = {{332--340}},
  publisher    = {{IEEE - Institute of Electrical and Electronics Engineers Inc.}},
  title        = {{Non-convex Rank/Sparsity Regularization and Local Minima}},
  url          = {{http://dx.doi.org/10.1109/ICCV.2017.44}},
  doi          = {{10.1109/ICCV.2017.44}},
  volume       = {{2017-October}},
  year         = {{2017}},
}