Lund University Publications

The Rank-Reduced Kalman Filter: Approximate Dynamical-Low-Rank Filtering In High Dimensions

Schmidt, Jonathan; Nick, Jörg; Hennig, Philipp and Tronarp, Filip (LU) (2023). 37th Conference on Neural Information Processing Systems, NeurIPS 2023, vol. 36.
Abstract

Inference and simulation in the context of high-dimensional dynamical systems remain computationally challenging problems. Some form of dimensionality reduction is required to make the problem tractable in general. In this paper, we propose a novel approximate Gaussian filtering and smoothing method which propagates low-rank approximations of the covariance matrices. This is accomplished by projecting the Lyapunov equations associated with the prediction step to a manifold of low-rank matrices, which are then solved by a recently developed, numerically stable, dynamical low-rank integrator. Meanwhile, the update steps are made tractable by noting that the covariance update only transforms the column space of the covariance matrix, which is low-rank by construction. The algorithm differentiates itself from existing ensemble-based approaches in that the low-rank approximations of the covariance matrices are deterministic, rather than stochastic. Crucially, this enables the method to reproduce the exact Kalman filter as the low-rank dimension approaches the true dimensionality of the problem. Our method reduces computational complexity from cubic (for the Kalman filter) to quadratic in the state-space size in the worst-case, and can achieve linear complexity if the state-space model satisfies certain criteria. Through a set of experiments in classical data-assimilation and spatio-temporal regression, we show that the proposed method consistently outperforms the ensemble-based methods in terms of error in the mean and covariance with respect to the exact Kalman filter. This comes at no additional cost in terms of asymptotic computational complexity.

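To make the update step concrete, the following is a minimal NumPy sketch of a Kalman measurement update when the predicted covariance is kept in a rank-r factorization P = U S U^T, illustrating the abstract's point that the update only transforms the column space of the covariance: the basis U is reused and only the small r x r core S changes. This is an illustrative reading of the abstract, not the authors' implementation, and all names and shapes (n, r, d, H, R, y) are assumptions.

import numpy as np

# Minimal sketch (not the paper's reference code) of a Kalman measurement
# update with the predicted covariance kept in the factorized form
# P = U @ S @ U.T, where U has shape (n, r). The column space U is left
# untouched; only the r x r core S is updated.
# All names and shapes below (n, r, d, H, R, y) are illustrative assumptions.
def low_rank_kalman_update(m, U, S, H, R, y):
    HU = H @ U                                    # (d, r): observation model restricted to the basis
    S_inn = HU @ S @ HU.T + R                     # (d, d) innovation covariance
    K = U @ S @ HU.T @ np.linalg.inv(S_inn)       # (n, d) Kalman gain, lives in span(U)
    m_new = m + K @ (y - H @ m)                   # updated mean
    S_new = S - S @ HU.T @ np.linalg.solve(S_inn, HU @ S)  # r x r core update
    return m_new, U, S_new                        # same U: column space is preserved

# Toy usage with random data
rng = np.random.default_rng(0)
n, r, d = 500, 10, 5
U, _ = np.linalg.qr(rng.standard_normal((n, r)))  # orthonormal rank-r basis
S = np.diag(rng.uniform(0.5, 2.0, size=r))        # r x r covariance core
H = rng.standard_normal((d, n))                   # linear observation model
R = 0.1 * np.eye(d)                               # observation noise covariance
m, y = np.zeros(n), rng.standard_normal(d)
m_new, U_new, S_new = low_rank_kalman_update(m, U, S, H, R, y)
print(S_new.shape)                                # (10, 10): the rank never exceeds r

Note that the dense n x n covariance is never formed in this sketch, which is what keeps the update step tractable for large n.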
author
Schmidt, Jonathan; Nick, Jörg; Hennig, Philipp and Tronarp, Filip
organization
publishing date
2023
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
Advances in Neural Information Processing Systems
volume
36
publisher
Morgan Kaufmann Publishers
conference name
37th Conference on Neural Information Processing Systems, NeurIPS 2023
conference location
New Orleans, United States
conference dates
2023-12-10 - 2023-12-16
external identifiers
  • scopus:85191166081
language
English
LU publication?
yes
id
3a2171a5-862f-4d75-b40c-4f59af123b12
date added to LUP
2024-05-08 10:14:50
date last changed
2024-05-08 10:16:15
@inproceedings{3a2171a5-862f-4d75-b40c-4f59af123b12,
  abstract     = {{Inference and simulation in the context of high-dimensional dynamical systems remain computationally challenging problems. Some form of dimensionality reduction is required to make the problem tractable in general. In this paper, we propose a novel approximate Gaussian filtering and smoothing method which propagates low-rank approximations of the covariance matrices. This is accomplished by projecting the Lyapunov equations associated with the prediction step to a manifold of low-rank matrices, which are then solved by a recently developed, numerically stable, dynamical low-rank integrator. Meanwhile, the update steps are made tractable by noting that the covariance update only transforms the column space of the covariance matrix, which is low-rank by construction. The algorithm differentiates itself from existing ensemble-based approaches in that the low-rank approximations of the covariance matrices are deterministic, rather than stochastic. Crucially, this enables the method to reproduce the exact Kalman filter as the low-rank dimension approaches the true dimensionality of the problem. Our method reduces computational complexity from cubic (for the Kalman filter) to quadratic in the state-space size in the worst-case, and can achieve linear complexity if the state-space model satisfies certain criteria. Through a set of experiments in classical data-assimilation and spatio-temporal regression, we show that the proposed method consistently outperforms the ensemble-based methods in terms of error in the mean and covariance with respect to the exact Kalman filter. This comes at no additional cost in terms of asymptotic computational complexity.}},
  author       = {{Schmidt, Jonathan and Nick, Jörg and Hennig, Philipp and Tronarp, Filip}},
  booktitle    = {{Advances in Neural Information Processing Systems}},
  language     = {{eng}},
  publisher    = {{Morgan Kaufmann Publishers}},
  title        = {{The Rank-Reduced Kalman Filter: Approximate Dynamical-Low-Rank Filtering In High Dimensions}},
  volume       = {{36}},
  year         = {{2023}},
}