Uncertainty Estimation in Deep Neural Networks: A Comparative Study of Bayesian Approximation and Conformal Prediction
(2025) DABN01 20251
Department of Economics
Department of Statistics
- Abstract
- Deep neural networks have been increasingly used in various scientific fields due to their versatility and high performance. Despite achieving high classification accuracy, deep learning models can be poorly calibrated, assigning overly confident probabilities to incorrect predictions. This overconfidence highlights the absence of built-in mechanisms for uncertainty quantification.
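As an illustration of what "poorly calibrated" means in practice, the sketch below computes Expected Calibration Error (ECE), a common calibration metric that measures the gap between a model's confidence and its actual accuracy. The metric choice, function name, and binning scheme are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

# Hypothetical sketch of Expected Calibration Error (ECE).
# probs: (n, n_classes) softmax outputs; labels: (n,) true class indices.
def expected_calibration_error(probs, labels, n_bins=10):
    conf = probs.max(axis=1)             # predicted confidence
    pred = probs.argmax(axis=1)          # predicted class
    correct = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            # Weighted gap between mean confidence and accuracy in this bin.
            ece += mask.mean() * abs(conf[mask].mean() - correct[mask].mean())
    return ece
```

A well-calibrated model has a small ECE: among predictions made with, say, 90% confidence, about 90% are correct.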
The thesis compares two distinct approaches to uncertainty estimation: Bayesian approximation using Monte Carlo Dropout and the nonparametric Conformal Prediction framework. These methods are applied to predictions generated by two convolutional neural network architectures, H-CNN VGG16 and GoogLeNet, trained on the image classification benchmark dataset Fashion-MNIST.
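For orientation, here is a minimal PyTorch sketch of Monte Carlo Dropout at inference time: dropout layers stay active during prediction, and the spread across stochastic forward passes serves as an uncertainty signal. The framework, helper name, sample count, and model interface are assumptions for illustration; the thesis's actual training setup for H-CNN VGG16 and GoogLeNet is not reproduced here.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of Monte Carlo Dropout prediction.
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 50):
    model.eval()
    # Re-enable only the dropout layers so each forward pass
    # samples a different dropout mask.
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )  # shape: (n_samples, batch, n_classes)
    mean = probs.mean(dim=0)  # predictive mean over stochastic passes
    std = probs.std(dim=0)    # per-class spread, used as an uncertainty signal
    return mean, std
```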
The results highlight that H-CNN VGG16 achieves higher accuracy but tends to be overconfident, while GoogLeNet yields better-calibrated uncertainty estimates. Moreover, Conformal Prediction provides valid prediction sets that can compensate for unreliable or misleading uncertainty estimates, making it a valuable approach in high-risk settings.
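The prediction sets mentioned above can be illustrated with split conformal prediction. The sketch below uses the common 1 − softmax(true class) nonconformity score and a finite-sample corrected quantile to build sets with coverage at least 1 − α; the score function and all names are assumptions, since the record does not specify the thesis's exact procedure.

```python
import numpy as np

# Hypothetical sketch of split conformal prediction for classification.
# cal_probs: (n_cal, n_classes) softmax outputs on a held-out calibration set;
# cal_labels: (n_cal,) true labels.
def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    n = len(cal_labels)
    # Nonconformity score: one minus the probability of the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample corrected quantile level guarantees >= 1 - alpha coverage.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q_level, method="higher")

def prediction_sets(test_probs, qhat):
    # Include every class whose nonconformity score is below the threshold.
    return [np.where(1.0 - p <= qhat)[0] for p in test_probs]
```

Even when the base model's probabilities are miscalibrated, the resulting sets remain valid: across test points, the true class falls inside the set at the target rate, with uncertainty expressed through set size rather than through the raw confidence scores.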
The thesis contributes to the evaluation of uncertainty estimation techniques by comparing Conformal Prediction to Bayesian approximation across two different architectures. It provides a deeper understanding of the need to assess a model's performance beyond its accuracy.
Please use this URL to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9192215
- author
- Ruijs, Sanne and Kosiakova, Alina
- course
- DABN01 20251
- year
- 2025
- type
- H1 - Master's Degree (One Year)
- keywords
- Uncertainty estimation, Conformal Prediction, deep neural networks, Bayesian approximation, Monte Carlo Dropout, machine learning
- language
- English
- id
- 9192215
- date added to LUP
- 2025-09-12 09:05:47
- date last changed
- 2025-09-12 09:05:47
@misc{9192215,
  abstract = {{Deep neural networks have been increasingly used in various scientific fields due to their versatility and high performance. Despite achieving high classification accuracy, deep learning models can be poorly calibrated, assigning overly confident probabilities to incorrect predictions. This overconfidence highlights the absence of built-in mechanisms for uncertainty quantification. The thesis compares two distinct approaches to uncertainty estimation: Bayesian approximation using Monte Carlo Dropout and the nonparametric Conformal Prediction framework. These methods are applied to predictions generated by two convolutional neural network architectures, H-CNN VGG16, and GoogLeNet, trained on the image classification benchmark dataset Fashion-MNIST. The results highlight that H-CNN VGG16 achieves higher accuracy but tends to be overconfident, while GoogLeNet results in better calibrated uncertainty estimation. Moreover, Conformal Predictions provide valid prediction sets that can compensate for unreliable or misleading uncertainty estimations. As a result, it is a valuable approach in high-risk settings. The thesis contributes to evaluating uncertainty estimation techniques by comparing Conformal Predictions to Bayesian approximation over two different architectures. It provides a deeper understanding of the need to assess a model’s performance beyond its accuracy.}},
  author   = {{Ruijs, Sanne and Kosiakova, Alina}},
  language = {{eng}},
  note     = {{Student Paper}},
  title    = {{Uncertainty Estimation in Deep Neural Networks: A Comparative Study of Bayesian Approximation and Conformal Prediction}},
  year     = {{2025}},
}