
Adaptive sequential Monte Carlo by means of mixture of experts

Cornebise, Julien; Moulines, Eric; Olsson, Jimmy (2014). Statistics and Computing 24(3), pp. 317–337
Abstract
Appropriately designing the proposal kernel of particle filters is an issue of significant importance, since a bad choice may lead to deterioration of the particle sample and, consequently, waste of computational power. In this paper we introduce a novel algorithm adaptively approximating the so-called optimal proposal kernel by a mixture of integrated curved exponential distributions with logistic weights. This family of distributions, referred to as mixtures of experts, is broad enough to be used in the presence of multi-modality or strongly skewed distributions. The mixtures are fitted, via online-EM methods, to the optimal kernel through minimisation of the Kullback-Leibler divergence between the auxiliary target and instrumental distributions of the particle filter. At each iteration of the particle filter, the algorithm is required to solve only a single optimisation problem for the whole particle sample, yielding an algorithm with only linear complexity. In addition, we illustrate in a simulation study how the method can be successfully applied to optimal filtering in nonlinear state-space models.
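The abstract describes a proposal kernel built as a mixture of experts: component densities whose mixture weights depend on the previous particle through a logistic (softmax) gate. The following is a minimal illustrative sketch of such a proposal in a one-dimensional setting, with Gaussian experts and hand-picked placeholder parameters; it is not the authors' implementation (which uses integrated curved exponential families and fits the parameters by online EM), only a sketch of the proposal structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts proposal: K Gaussian "experts" whose mixture
# weights depend on the previous particle through a logistic (softmax) gate.
# All parameter values below are illustrative placeholders, not fitted values.
K = 2
gate_w = np.array([1.0, -1.0])       # gating coefficients (one feature, K experts)
gate_b = np.array([0.0, 0.0])        # gating intercepts
expert_mu = np.array([-1.0, 1.0])    # expert mean offsets from the previous state
expert_sigma = np.array([0.5, 1.5])  # expert standard deviations

def gate_probs(x_prev):
    """Logistic (softmax) mixture weights as a function of the previous state."""
    logits = x_prev * gate_w + gate_b
    logits = logits - logits.max()   # numerical stabilisation
    p = np.exp(logits)
    return p / p.sum()

def sample_proposal(x_prev):
    """Draw a new particle from the mixture-of-experts proposal kernel."""
    p = gate_probs(x_prev)
    k = rng.choice(K, p=p)
    return rng.normal(x_prev + expert_mu[k], expert_sigma[k])

def proposal_logpdf(x_new, x_prev):
    """Log-density of the mixture proposal, needed for importance weighting."""
    p = gate_probs(x_prev)
    log_comp = (-0.5 * ((x_new - x_prev - expert_mu) / expert_sigma) ** 2
                - np.log(expert_sigma * np.sqrt(2.0 * np.pi)))
    return np.log(np.sum(p * np.exp(log_comp)))

# One proposal step for a small particle cloud.
particles = rng.normal(size=5)
new_particles = np.array([sample_proposal(x) for x in particles])
logq = np.array([proposal_logpdf(xn, xp)
                 for xn, xp in zip(new_particles, particles)])
```

In the paper's adaptive scheme, the gating and expert parameters would be updated online (via EM) so that this proposal tracks the optimal kernel, with one optimisation problem solved per filter iteration for the whole particle cloud.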
author: Cornebise, Julien; Moulines, Eric; Olsson, Jimmy
organization:
publishing date: 2014
type: Contribution to journal
publication status: published
subject:
keywords: Optimal proposal kernel; Adaptive algorithms; Kullback-Leibler divergence; Coefficient of variation; Expectation-maximisation; Particle filter; Sequential Monte Carlo; Shannon entropy
in: Statistics and Computing
volume: 24
issue: 3
pages: 317–337
publisher: Springer
external identifiers:
  • wos:000334435400003
  • scopus:84898548772
ISSN: 0960-3174
DOI: 10.1007/s11222-012-9372-2
language: English
LU publication?: yes
id: d93994fd-09a4-461c-bf79-430d075ba6fb (old id 4438916)
date added to LUP: 2014-05-21 13:52:51
date last changed: 2017-01-01 05:23:29
@article{d93994fd-09a4-461c-bf79-430d075ba6fb,
  abstract     = {Appropriately designing the proposal kernel of particle filters is an issue of significant importance, since a bad choice may lead to deterioration of the particle sample and, consequently, waste of computational power. In this paper we introduce a novel algorithm adaptively approximating the so-called optimal proposal kernel by a mixture of integrated curved exponential distributions with logistic weights. This family of distributions, referred to as mixtures of experts, is broad enough to be used in the presence of multi-modality or strongly skewed distributions. The mixtures are fitted, via online-EM methods, to the optimal kernel through minimisation of the Kullback-Leibler divergence between the auxiliary target and instrumental distributions of the particle filter. At each iteration of the particle filter, the algorithm is required to solve only a single optimisation problem for the whole particle sample, yielding an algorithm with only linear complexity. In addition, we illustrate in a simulation study how the method can be successfully applied to optimal filtering in nonlinear state-space models.},
  author       = {Cornebise, Julien and Moulines, Eric and Olsson, Jimmy},
  doi          = {10.1007/s11222-012-9372-2},
  issn         = {0960-3174},
  keyword      = {Optimal proposal kernel, Adaptive algorithms, Kullback-Leibler divergence, Coefficient of variation, Expectation-maximisation, Particle filter, Sequential Monte Carlo, Shannon entropy},
  language     = {eng},
  number       = {3},
  pages        = {317--337},
  publisher    = {Springer},
  journal      = {Statistics and Computing},
  title        = {Adaptive sequential Monte Carlo by means of mixture of experts},
  url          = {http://dx.doi.org/10.1007/s11222-012-9372-2},
  volume       = {24},
  year         = {2014},
}