Detection of Breast Cancer in Pocket Ultrasound Images Using Deep Learning
(2022) In Master's Theses in Mathematical Sciences, FMAM05 20221, Mathematics (Faculty of Engineering)
- Abstract
- Breast cancer is the most frequently diagnosed form of cancer worldwide. In 2020, 2 260 000 people were diagnosed with breast cancer and 685 000 died from it. In low-income countries, breast cancer is commonly detected at a later stage, when it is harder to treat, entailing a higher mortality rate. This is primarily due to the lack of knowledge and diagnostic tools available. A low-cost breast diagnostic tool could therefore be a valuable solution in low-income countries.
The objective of this thesis is to create a deep learning algorithm that can classify pocket ultrasound images of the breast as malignant, benign or normal. Two data sets were used: one ultrasound data set with 2062 images and one pocket ultrasound data set with 598 images. Four approaches using convolutional neural networks (CNNs) were tested to produce the best model on the pocket ultrasound data set. In the first part, several CNNs were created and trained on the ultrasound data set; the two models with the best results on the pocket ultrasound validation set were chosen for further evaluation. In the second part, the ultrasound data set was augmented and the augmented images were used to train the two chosen CNNs. In the third part, transfer learning was used to train the CNNs on both data sets. The last part of the thesis consisted of training the CNNs on the pocket ultrasound data set alone.
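The thesis text does not publish its code, and the exact augmentation transforms are not listed in the abstract, so the second part (augmenting the ultrasound images) can only be illustrated. A minimal numpy sketch, using a random horizontal flip and mild brightness jitter as hypothetical example transforms:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Return a randomly augmented copy of a grayscale image in [0, 1].

    The flip and brightness jitter here are common choices for
    ultrasound data; the transforms actually used in the thesis
    may differ.
    """
    out = image.copy()
    if rng.random() < 0.5:           # random horizontal flip
        out = out[:, ::-1]
    gain = rng.uniform(0.9, 1.1)     # mild brightness jitter
    out = np.clip(out * gain, 0.0, 1.0)
    return out

img = rng.random((128, 128))         # stand-in for one ultrasound image
aug = augment(img)
print(aug.shape)                     # spatial size is preserved
```

Applying such label-preserving transforms enlarges the effective training set without collecting new images, which is why augmentation was a natural second step given only 2062 ultrasound images.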
The best CNN achieved an accuracy of 86.8% and an AUC of 0.93 on the pocket ultrasound test set. This was achieved by training on both ultrasound and pocket ultrasound breast images using transfer learning. Performance on the pocket ultrasound test set was not improved by training the CNNs on augmented ultrasound images, but training solely on pocket ultrasound images could be a good strategy once more data is available. The results seem promising, and a refined model, trained on more pocket ultrasound data, could possibly be implemented as a low-cost diagnostic tool in countries without breast diagnostics.
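The transfer-learning strategy in the third part (pre-train on the larger ultrasound set, then continue training on the smaller pocket ultrasound set) can be sketched with a toy numpy model in place of a CNN. Everything below (the logistic model, the synthetic data, the learning rate) is a hypothetical stand-in chosen only to make the two-stage training pattern concrete:

```python
import numpy as np

rng = np.random.default_rng(1)

def train(w, X, y, lr=0.1, steps=200):
    """Plain gradient descent on the logistic loss, starting from weights w."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)    # gradient step
    return w

# Large "source" set (standing in for the 2062 ultrasound images).
Xs = rng.normal(size=(400, 5))
ys = (Xs[:, 0] > 0).astype(float)
# Small "target" set (standing in for the 598 pocket ultrasound images).
Xt = rng.normal(size=(60, 5))
yt = (Xt[:, 0] > 0).astype(float)

w_pre = train(np.zeros(5), Xs, ys)   # stage 1: pre-train on the source set
w_ft = train(w_pre, Xt, yt)          # stage 2: fine-tune on the target set

acc = float(np.mean(((Xt @ w_ft) > 0) == yt.astype(bool)))
print(round(acc, 2))
```

The key point mirrored from the thesis is the initialization: the second training stage starts from weights already fitted to the related source data rather than from scratch, which is what lets a small target set like the 598 pocket ultrasound images suffice.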
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9109103
- author
- Sahlin, Freja LU
- supervisor
- Ida Arvidsson LU
- Jennie Karlsson LU
- Kristina Lång LU
- organization
- course
- FMAM05 20221
- year
- 2022
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- breast cancer, point-of-care ultrasound, medical image analysis, convolutional neural networks, transfer learning
- publication/series
- Master's Theses in Mathematical Sciences
- report number
- LUTFMA-3466-2022
- ISSN
- 1404-6342
- other publication id
- 2022:E16
- language
- English
- id
- 9109103
- date added to LUP
- 2023-07-03 14:47:32
- date last changed
- 2023-07-03 14:47:32
@misc{9109103, abstract = {{Breast cancer is the most frequently diagnosed form of cancer worldwide. In 2020, 2 260 000 people were diagnosed with breast cancer and 685 000 died from it. In low-income countries, breast cancer is commonly detected at a later stage, when it is harder to treat, entailing a higher mortality rate. This is primarily due to the lack of knowledge and diagnostic tools available. A low-cost breast diagnostic tool could therefore be a valuable solution in low-income countries. The objective of this thesis is to create a deep learning algorithm that can classify pocket ultrasound images of the breast as malignant, benign or normal. Two data sets were used: one ultrasound data set with 2062 images and one pocket ultrasound data set with 598 images. Four approaches using convolutional neural networks (CNNs) were tested to produce the best model on the pocket ultrasound data set. In the first part, several CNNs were created and trained on the ultrasound data set; the two models with the best results on the pocket ultrasound validation set were chosen for further evaluation. In the second part, the ultrasound data set was augmented and the augmented images were used to train the two chosen CNNs. In the third part, transfer learning was used to train the CNNs on both data sets. The last part of the thesis consisted of training the CNNs on the pocket ultrasound data set alone. The best CNN achieved an accuracy of 86.8\% and an AUC of 0.93 on the pocket ultrasound test set. This was achieved by training on both ultrasound and pocket ultrasound breast images using transfer learning. Performance on the pocket ultrasound test set was not improved by training the CNNs on augmented ultrasound images, but training solely on pocket ultrasound images could be a good strategy once more data is available. The results seem promising, and a refined model, trained on more pocket ultrasound data, could possibly be implemented as a low-cost diagnostic tool in countries without breast diagnostics.}}, author = {{Sahlin, Freja}}, issn = {{1404-6342}}, language = {{eng}}, note = {{Student Paper}}, series = {{Master's Theses in Mathematical Sciences}}, title = {{Detection of Breast Cancer in Pocket Ultrasound Images Using Deep Learning}}, year = {{2022}}, }