
Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

Wiqvist, Samuel (LU); Mattei, Pierre-Alexandre; Picchini, Umberto and Frellsen, Jes (2019)
Abstract
We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch-invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
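To make the setup concrete, here is a minimal illustrative sketch (not the authors' implementation) of ABC rejection sampling in which a permutation-invariant, DeepSets-style function — the fully exchangeable special case of a PEN mentioned in the abstract — plays the role of the summary statistic. The mappings `phi` and `rho` below are hand-coded stand-ins for the trained networks used in the paper, and the Gaussian toy model is purely for illustration.

```python
import random
import math

def phi(x):
    # Inner mapping applied element-wise (stand-in for a trained network).
    return (x, x * x)

def rho(s1, s2, n):
    # Outer mapping applied to the pooled representation: here it recovers
    # the sample mean and (biased) sample variance.
    return (s1 / n, s2 / n - (s1 / n) ** 2)

def summary(data):
    # DeepSets form rho(sum_i phi(x_i)): summing makes the statistic
    # invariant to permutations of the observations.
    s1 = sum(phi(x)[0] for x in data)
    s2 = sum(phi(x)[1] for x in data)
    return rho(s1, s2, len(data))

def abc_rejection(observed, n_sims=2000, tol=0.2, seed=1):
    # Rejection ABC: draw theta from the prior, simulate data, and accept
    # theta when the simulated summary is close to the observed summary.
    rng = random.Random(seed)
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = rng.uniform(-5.0, 5.0)              # prior draw for the mean
        sim = [rng.gauss(theta, 1.0) for _ in observed]
        s_sim = summary(sim)
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(s_obs, s_sim)))
        if dist < tol:
            accepted.append(theta)
    return accepted

rng = random.Random(0)
obs = [rng.gauss(2.0, 1.0) for _ in range(100)]    # data with true mean 2
post = abc_rejection(obs)                          # approximate posterior draws
```

The accepted `theta` values concentrate around the true mean; replacing the hand-coded `summary` with a learned, exchangeability-respecting network is the role PENs play in the ABC pipeline.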
author
Wiqvist, Samuel; Mattei, Pierre-Alexandre; Picchini, Umberto; Frellsen, Jes
organization
publishing date
2019
type
Working Paper
publication status
unpublished
subject
pages
13 pages
language
English
LU publication?
yes
id
c48881bb-23d4-4181-8f1d-3c638eec30ce
date added to LUP
2019-01-30 10:57:20
date last changed
2019-05-24 19:26:20
@misc{c48881bb-23d4-4181-8f1d-3c638eec30ce,
  abstract     = {We present a novel family of deep neural architectures, named partially exchangeable networks (PENs) that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, both considering time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data. },
  author       = {Wiqvist, Samuel and Mattei, Pierre-Alexandre and Picchini, Umberto and Frellsen, Jes},
  language     = {eng},
  note         = {Working Paper},
  pages        = {13},
  title        = {Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation},
  year         = {2019},
}