On gradient based descent algorithms for joint diagonalization of matrices
(2024) In 32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings (European Signal Processing Conference), pp. 2632-2636.
- Abstract
Joint diagonalization of collections of matrices, i.e. the problem of finding a joint set of approximate eigenvectors, is an important problem that appears in many applied contexts. It is commonly formulated as finding the minimizer, over the set of all possible bases, of a certain non-convex functional that measures the size of the off-diagonal elements. Many approaches have been studied in the literature, some of the most popular ones working with approximations of this cost functional. In this work, we deviate from this philosophy and instead attempt to find a minimizer directly, making use of the gradient and Hessian of the original functional. Our main contributions are as follows. First, we design and study gradient descent and conjugate gradient algorithms. Second, we show that the intricate geometry of the functional makes it beneficial to change basis at each iteration, leading to faster convergence. Third, we conduct extensive numerical experiments indicating that our proposed descent methods yield competitive results when compared to popular methods such as WJDTE.
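To make the formulation concrete, below is a minimal numerical sketch of the approach described in the abstract. It is not the authors' implementation: all names are illustrative assumptions, and a central finite-difference gradient stands in for the analytic gradient and Hessian used in the paper. The sketch minimizes the off-diagonal functional J(V) = sum_k ||off(V^{-1} A_k V)||_F^2 by plain gradient descent, and mimics the per-iteration change of basis by conjugating the collection after every accepted step and restarting from the identity.

```python
import numpy as np

def off_cost(mats):
    # Sum of squared off-diagonal entries over the whole collection.
    return sum(np.sum(np.abs(M) ** 2) - np.sum(np.abs(np.diag(M)) ** 2)
               for M in mats)

def cost_at(V, mats):
    # Off-diagonal cost of the collection expressed in the basis V.
    Vinv = np.linalg.inv(V)
    return off_cost([Vinv @ A @ V for A in mats])

def numeric_grad(mats, eps=1e-6):
    # Central finite-difference gradient of V -> cost_at(V, mats) at V = I.
    # (A stand-in for the paper's analytic gradient; illustrative only.)
    n = mats[0].shape[0]
    G = np.zeros((n, n))
    I = np.eye(n)
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = eps
            G[i, j] = (cost_at(I + E, mats) - cost_at(I - E, mats)) / (2 * eps)
    return G

def joint_diag_gd(mats, n_iter=200, step=1e-2):
    # Gradient descent with a per-iteration change of basis: after each
    # accepted step the collection is conjugated by that step's V and the
    # search restarts from the identity, accumulating the product of steps.
    n = mats[0].shape[0]
    mats = [np.array(A, dtype=float) for A in mats]
    V_total = np.eye(n)
    for _ in range(n_iter):
        G = numeric_grad(mats)
        s = step
        V = np.eye(n) - s * G
        # Crude backtracking line search: halve the step until the cost drops.
        while cost_at(V, mats) > off_cost(mats) and s > 1e-12:
            s *= 0.5
            V = np.eye(n) - s * G
        Vinv = np.linalg.inv(V)
        mats = [Vinv @ A @ V for A in mats]
        V_total = V_total @ V
    return V_total

# Toy check: two matrices sharing the eigenbasis of a random V0.
rng = np.random.default_rng(0)
V0 = rng.standard_normal((4, 4))
mats = [V0 @ np.diag(rng.standard_normal(4)) @ np.linalg.inv(V0)
        for _ in range(2)]
V = joint_diag_gd(mats)
print(cost_at(np.eye(4), mats), "->", cost_at(V, mats))  # cost should drop sharply
```

Restarting from the identity after each conjugation means every gradient is evaluated at the same base point of the parameter space, which matches the abstract's observation that the functional's intricate geometry makes a per-iteration basis change pay off in convergence speed.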
- author
- Troedsson, Erik (LU); Carlsson, Marcus (LU) and Wendt, Herwig
- publishing date
- 2024
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- published
- keywords
- conjugate gradient, gradient descent, joint eigen-decomposition, matrix diagonalization, simultaneous diagonalization
- host publication
- 32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings
- series title
- European Signal Processing Conference
- pages
- 2632-2636 (5 pages)
- publisher
- European Signal Processing Conference, EUSIPCO
- conference name
- 32nd European Signal Processing Conference, EUSIPCO 2024
- conference location
- Lyon, France
- conference dates
- 2024-08-26 - 2024-08-30
- external identifiers
- scopus:85205836619
- ISSN
- 2219-5491
- ISBN
- 9789464593617
- DOI
- 10.23919/eusipco63174.2024.10715124
- language
- English
- LU publication?
- yes
- id
- e57bb4fb-e53c-4754-945a-94725fd73ea8
- date added to LUP
- 2025-01-16 10:58:15
- date last changed
- 2025-04-04 13:58:32
@inproceedings{e57bb4fb-e53c-4754-945a-94725fd73ea8,
  abstract     = {{Joint diagonalization of collections of matrices, i.e. the problem of finding a joint set of approximate eigenvectors, is an important problem that appears in many applied contexts. It is commonly formulated as finding the minimizer, over the set of all possible bases, of a certain non-convex functional that measures the size of the off-diagonal elements. Many approaches have been studied in the literature, some of the most popular ones working with approximations of this cost functional. In this work, we deviate from this philosophy and instead attempt to find a minimizer directly, making use of the gradient and Hessian of the original functional. Our main contributions are as follows. First, we design and study gradient descent and conjugate gradient algorithms. Second, we show that the intricate geometry of the functional makes it beneficial to change basis at each iteration, leading to faster convergence. Third, we conduct extensive numerical experiments indicating that our proposed descent methods yield competitive results when compared to popular methods such as WJDTE.}},
  author       = {{Troedsson, Erik and Carlsson, Marcus and Wendt, Herwig}},
  booktitle    = {{32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings}},
  isbn         = {{9789464593617}},
  issn         = {{2219-5491}},
  keywords     = {{conjugate gradient; gradient descent; joint eigen-decomposition; matrix diagonalization; simultaneous diagonalization}},
  language     = {{eng}},
  pages        = {{2632--2636}},
  publisher    = {{European Signal Processing Conference, EUSIPCO}},
  series       = {{European Signal Processing Conference}},
  title        = {{On gradient based descent algorithms for joint diagonalization of matrices}},
  url          = {{http://dx.doi.org/10.23919/eusipco63174.2024.10715124}},
  doi          = {{10.23919/eusipco63174.2024.10715124}},
  year         = {{2024}},
}