Variational auto-encoders with Student’s t-prior
(2019) 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
- Abstract
- We propose a new structure for the variational auto-encoder (VAE) prior, using the weakly informative multivariate Student’s t-distribution. In the proposed model, all distribution parameters are trained, allowing a more robust approximation of the underlying data distribution. We used Fashion-MNIST data in two experiments to compare the proposed VAEs with VAEs using the standard Gaussian prior. Both experiments showed better image reconstruction with the Student’s t-prior.
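The abstract's key point is that a Student's t-prior is heavier-tailed than the standard Gaussian prior, so the ELBO's KL term must be estimated rather than computed in closed form. The following is a minimal sketch of that idea (not the authors' code), assuming SciPy is available; the function name and the chosen parameter values are illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def mc_kl_gaussian_vs_student_t(mu, sigma, df, n_samples=100_000):
    """Estimate KL( N(mu, sigma^2) || StudentT(df) ) by Monte Carlo.

    With a Student's t-prior the KL term of the ELBO has no closed form,
    so it is approximated from samples of the Gaussian posterior q(z|x).
    """
    z = rng.normal(mu, sigma, size=n_samples)          # samples from q(z|x)
    log_q = stats.norm.logpdf(z, loc=mu, scale=sigma)  # log q(z|x)
    log_p = stats.t.logpdf(z, df=df)                   # log p(z), t-prior
    return float(np.mean(log_q - log_p))               # E_q[log q - log p]

# Hypothetical values: a posterior centered away from the origin against
# a heavy-tailed (df=3) prior still yields a finite, moderate KL penalty.
kl_t = mc_kl_gaussian_vs_student_t(mu=1.0, sigma=0.5, df=3.0)
```

In a full VAE, `df`, `loc`, and `scale` of the prior would themselves be trainable, as the abstract states; the sketch fixes them only to keep the example self-contained.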
Please use this URL to cite or link to this publication:
https://lup.lub.lu.se/record/24babff9-6288-42ab-a099-4272559768c3
- author
- Abiri, Najmeh (LU) and Ohlsson, Mattias (LU)
- publishing date
- 2019
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- published
- host publication
- ESANN 2019 - Proceedings: The 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
- conference name
- 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
- conference location
- Bruges, Belgium
- conference dates
- 2019-04-24 - 2019-04-26
- external identifiers
- scopus:85071324436
- ISBN
- 978-287-587-065-0
- project
- Lund University AI Research
- language
- English
- LU publication?
- yes
- id
- 24babff9-6288-42ab-a099-4272559768c3
- alternative location
- https://www.elen.ucl.ac.be/Proceedings/esann/esannpdf/es2019-42.pdf
- date added to LUP
- 2019-07-29 09:56:44
- date last changed
- 2024-03-03 21:38:03
@inproceedings{24babff9-6288-42ab-a099-4272559768c3,
  abstract  = {{We propose a new structure for the variational auto-encoders (VAEs) prior, with the weakly informative multivariate Student’s t-distribution. In the proposed model all distribution parameters are trained, thereby allowing for a more robust approximation of the underlying data distribution. We used Fashion-MNIST data in two experiments to compare the proposed VAEs with the standard Gaussian priors. Both experiments showed a better reconstruction of the images with VAEs using Student’s t-prior distribution.}},
  author    = {{Abiri, Najmeh and Ohlsson, Mattias}},
  booktitle = {{ESANN 2019 - Proceedings : The 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning}},
  isbn      = {{978-287-587-065-0}},
  language  = {{eng}},
  title     = {{Variational auto-encoders with Student’s t-prior}},
  url       = {{https://www.elen.ucl.ac.be/Proceedings/esann/esannpdf/es2019-42.pdf}},
  year      = {{2019}},
}