Classification of point-of-care ultrasound in breast imaging using deep learning
(2023) In Medical Imaging 2023: Computer-Aided Diagnosis, Proceedings of SPIE 12465.
- Abstract
Early detection of breast cancer is important to reduce morbidity and mortality. Access to breast imaging is limited in low- and middle-income countries compared to high-income countries. This contributes to advanced-stage breast cancer presentation with poor survival. A pocket-sized portable ultrasound device, also known as point-of-care ultrasound (POCUS), aided by decision support using deep learning-based algorithms for lesion classification, could be a cost-effective way to enable access to breast imaging in low-resource settings. A previous study, which used convolutional neural networks (CNNs) to classify breast cancer in conventional ultrasound (US) images, showed promising results. The aim of the present study is to classify POCUS breast images. A POCUS data set containing 1100 breast images was collected. To increase the size of the data set, a Cycle-Consistent Adversarial Network (CycleGAN) was trained on US images to generate synthetic POCUS images. A CNN was implemented, trained, validated and tested on POCUS images. To improve performance, the CNN was trained with different combinations of data consisting of POCUS images, US images, CycleGAN-generated POCUS images and spatial augmentation. The best result was achieved by a CNN trained on a combination of POCUS images, CycleGAN-generated POCUS images and spatial augmentation, which achieved a 95% confidence interval for the AUC of 93.5%–96.6%.
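The abstract describes training a CNN classifier on a mix of real POCUS images, CycleGAN-generated POCUS images and spatially augmented images. The paper does not specify the architecture or training details, so the sketch below is only a minimal illustration of that data-combination idea: the directory layout, ResNet-18 backbone, augmentation choices and hyperparameters are all assumptions, not the authors' setup.

```python
# Minimal sketch (not the authors' code): train a CNN on real POCUS images
# combined with CycleGAN-generated POCUS images, with spatial augmentation.
# Paths, backbone and hyperparameters below are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, models, transforms

# Spatial augmentation applied to training images (illustrative choices).
train_tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
])

# Assumed layout: <root>/benign and <root>/malignant subfolders per source.
real_pocus = datasets.ImageFolder("data/pocus_train", transform=train_tf)
synthetic_pocus = datasets.ImageFolder("data/cyclegan_pocus", transform=train_tf)
train_loader = DataLoader(ConcatDataset([real_pocus, synthetic_pocus]),
                          batch_size=32, shuffle=True)

# A standard CNN backbone with a two-class head (benign vs. malignant).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Performance would then be reported as an AUC on a held-out POCUS test set, with the 95% confidence interval typically obtained by a resampling method such as bootstrapping.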
- author
- Karlsson, Jennie (LU); Arvidsson, Ida (LU); Sahlin, Freja; Åström, Kalle (LU); Overgaard, Niels Christian (LU); Lång, Kristina (LU) and Heyden, Anders (LU)
- organization
- Mathematics (Faculty of Engineering)
- eSSENCE: The e-Science Collaboration
- LU Profile Area: Proactive Ageing
- LTH Profile Area: AI and Digitalization
- ELLIIT: the Linköping-Lund initiative on IT and mobile communication
- LU Profile Area: Light and Materials
- LU Profile Area: Natural and Artificial Cognition
- Stroke Imaging Research group (research group)
- Mathematical Imaging Group (research group)
- LTH Profile Area: Engineering Health
- Engineering Mathematics (M.Sc.Eng.)
- Partial differential equations (research group)
- LUCC: Lund University Cancer Centre
- Radiology Diagnostics, Malmö (research group)
- Centre for Mathematical Sciences
- publishing date
- 2023
- type
- Chapter in Book/Report/Conference proceeding
- publication status
- published
- subject
- keywords
- Breast Cancer, Breast Ultrasound, Convolutional Neural Networks, CycleGAN, Point-of-Care Ultrasound
- host publication
- Medical Imaging 2023: Computer-Aided Diagnosis
- series title
- Proceedings of SPIE
- editor
- Iftekharuddin, Khan M. and Chen, Weijie
- volume
- 12465
- article number
- 124650Y
- publisher
- SPIE
- conference name
- SPIE Medical Imaging 2023
- conference dates
- 2023-02-19 - 2023-02-23
- external identifiers
- scopus:85160213715
- ISSN
- 2410-9045
- 1605-7422
- ISBN
- 9781510660359
- DOI
- 10.1117/12.2654251
- language
- English
- LU publication?
- yes
- id
- 11be8846-1022-424a-a5d7-93fcfb5e7d7e
- date added to LUP
- 2023-04-26 13:46:25
- date last changed
- 2025-02-09 02:15:09
@inproceedings{11be8846-1022-424a-a5d7-93fcfb5e7d7e,
  abstract     = {{Early detection of breast cancer is important to reduce morbidity and mortality. Access to breast imaging is limited in low- and middle-income countries compared to high-income countries. This contributes to advanced-stage breast cancer presentation with poor survival. A pocket-sized portable ultrasound device, also known as point-of-care ultrasound (POCUS), aided by decision support using deep learning-based algorithms for lesion classification, could be a cost-effective way to enable access to breast imaging in low-resource settings. A previous study, which used convolutional neural networks (CNNs) to classify breast cancer in conventional ultrasound (US) images, showed promising results. The aim of the present study is to classify POCUS breast images. A POCUS data set containing 1100 breast images was collected. To increase the size of the data set, a Cycle-Consistent Adversarial Network (CycleGAN) was trained on US images to generate synthetic POCUS images. A CNN was implemented, trained, validated and tested on POCUS images. To improve performance, the CNN was trained with different combinations of data consisting of POCUS images, US images, CycleGAN-generated POCUS images and spatial augmentation. The best result was achieved by a CNN trained on a combination of POCUS images, CycleGAN-generated POCUS images and spatial augmentation, which achieved a 95% confidence interval for the AUC of 93.5%-96.6%.}},
  author       = {{Karlsson, Jennie and Arvidsson, Ida and Sahlin, Freja and Åström, Kalle and Overgaard, Niels Christian and Lång, Kristina and Heyden, Anders}},
  booktitle    = {{Medical Imaging 2023: Computer-Aided Diagnosis}},
  editor       = {{Iftekharuddin, Khan M. and Chen, Weijie}},
  isbn         = {{9781510660359}},
  issn         = {{2410-9045}},
  keywords     = {{Breast Cancer; Breast Ultrasound; Convolutional Neural Networks; CycleGAN; Point-of-Care Ultrasound}},
  language     = {{eng}},
  publisher    = {{SPIE}},
  series       = {{Proceedings of SPIE}},
  title        = {{Classification of point-of-care ultrasound in breast imaging using deep learning}},
  url          = {{http://dx.doi.org/10.1117/12.2654251}},
  doi          = {{10.1117/12.2654251}},
  volume       = {{12465}},
  year         = {{2023}},
}