Evaluation of Out-of-Distribution Detection Performance on Autonomous Driving Datasets
(2023) In Proceedings - 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023, pp. 74-81
- abstract
Safety measures need to be systematically investigated with respect to the extent to which they evaluate the intended performance of Deep Neural Networks (DNNs) for critical applications. Due to a lack of verification methods for high-dimensional DNNs, a trade-off is needed between accepted performance and the handling of out-of-distribution (OOD) samples. This work evaluates rejecting outputs from semantic segmentation DNNs by applying a Mahalanobis distance (MD), based on the most probable class-conditional Gaussian distribution for the predicted class, as an OOD score. The evaluation covers three DNNs trained on the Cityscapes dataset and tested on four automotive datasets, and finds that classification risk can be drastically reduced at the cost of pixel coverage, even when applied to unseen datasets. These findings support legitimizing safety measures and motivate their usage when arguing for the safe use of DNNs in automotive perception.
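The MD-based OOD score described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the use of a shared (tied) covariance matrix, and the regularization term are assumptions. The idea is to fit one class-conditional Gaussian over in-distribution feature vectors and to score a test feature by its squared Mahalanobis distance to the Gaussian of the predicted class, where a larger distance suggests an OOD input.

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Fit per-class means and a shared (tied) covariance over training features.

    features: (N, D) array of feature vectors; labels: (N,) class indices.
    Returns per-class means and the regularized precision (inverse covariance).
    """
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Center each class by its own mean, then pool for a tied covariance.
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    cov = np.cov(centered, rowvar=False)
    precision = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # small ridge for stability
    return means, precision

def mahalanobis_ood_score(x, predicted_class, means, precision):
    """Squared Mahalanobis distance of x to the predicted class's Gaussian.

    Larger values indicate the sample lies farther from the training
    distribution of that class, i.e. it is more likely OOD.
    """
    d = x - means[predicted_class]
    return float(d @ precision @ d)
```

A prediction would then be rejected when its score exceeds a threshold chosen on held-out data, trading pixel coverage against classification risk as the paper describes.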
- author
- Henriksson, Jens; Berger, Christian; Ursing, Stig and Borg, Markus
- publishing date
- 2023
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- published
- keywords
- automotive safety, out-of-distribution detection, semantic segmentation
- host publication
- Proceedings - 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023
- series title
- Proceedings - 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023
- pages
- 8 pages
- publisher
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- conference name
- 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023
- conference location
- Athens, Greece
- conference dates
- 2023-07-17 - 2023-07-20
- external identifiers
- scopus:85172280991
- ISBN
- 9798350336290
- DOI
- 10.1109/AITest58265.2023.00021
- language
- English
- LU publication?
- yes
- additional info
- Publisher Copyright: © 2023 IEEE.
- id
- 20cf1783-9de4-4d42-8b8f-dc9f283f84f8
- date added to LUP
- 2023-12-21 11:10:58
- date last changed
- 2024-02-09 10:38:45
@inproceedings{20cf1783-9de4-4d42-8b8f-dc9f283f84f8,
  abstract  = {{<p>Safety measures need to be systematically investigated with respect to the extent to which they evaluate the intended performance of Deep Neural Networks (DNNs) for critical applications. Due to a lack of verification methods for high-dimensional DNNs, a trade-off is needed between accepted performance and the handling of out-of-distribution (OOD) samples. This work evaluates rejecting outputs from semantic segmentation DNNs by applying a Mahalanobis distance (MD), based on the most probable class-conditional Gaussian distribution for the predicted class, as an OOD score. The evaluation covers three DNNs trained on the Cityscapes dataset and tested on four automotive datasets, and finds that classification risk can be drastically reduced at the cost of pixel coverage, even when applied to unseen datasets. These findings support legitimizing safety measures and motivate their usage when arguing for the safe use of DNNs in automotive perception.</p>}},
  author    = {{Henriksson, Jens and Berger, Christian and Ursing, Stig and Borg, Markus}},
  booktitle = {{Proceedings - 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023}},
  isbn      = {{9798350336290}},
  keywords  = {{automotive safety; out-of-distribution detection; semantic segmentation}},
  language  = {{eng}},
  pages     = {{74--81}},
  publisher = {{IEEE - Institute of Electrical and Electronics Engineers Inc.}},
  series    = {{Proceedings - 5th IEEE International Conference on Artificial Intelligence Testing, AITest 2023}},
  title     = {{Evaluation of Out-of-Distribution Detection Performance on Autonomous Driving Datasets}},
  url       = {{http://dx.doi.org/10.1109/AITest58265.2023.00021}},
  doi       = {{10.1109/AITest58265.2023.00021}},
  year      = {{2023}},
}