Anatomically informed deep learning framework for generating fast, low-dose synthetic CBCT for prostate radiotherapy
(2025) In Scientific Reports 15(1).
- Abstract
Precise patient positioning and daily anatomical verification are crucial in external beam radiotherapy to ensure accurate dose delivery and minimize harm to healthy tissues. However, current image-guided radiotherapy techniques struggle to balance high-quality volumetric anatomical visualization with rapid, low-dose imaging. Reconstructing volumetric images from ultra-sparse X-ray projections therefore holds promise for significantly reducing patient radiation exposure and potentially enabling real-time anatomy verification. Here, we present a novel deep learning (DL) based framework that generates synthetic volumetric cone-beam CT in real time from two orthogonal projection views and a reference planning CT for prostate cancer patients. Our model learns the mapping between the 2D and 3D domains and generalizes across patients without retraining. We demonstrate that our framework produces high-fidelity volumetric reconstructions in real time, potentially supporting clinical workflows without hardware modifications. This approach could reduce imaging dose and treatment time while preserving comprehensive anatomical information, offering a pathway toward safer, more efficient prostate radiotherapy workflows.
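To make the abstract's 2D-to-3D mapping concrete, the sketch below illustrates a generic dual-view reconstruction network of the kind described: two orthogonal projections and a reference planning CT are encoded, fused, and decoded into a synthetic 3D volume. This is a minimal illustration only; all layer choices, names (DualViewTo3D, enc2d, enc3d) and tensor sizes are assumptions and are not taken from the published model.

# Illustrative sketch only: a generic dual-view 2D-to-3D network in PyTorch.
# Two orthogonal projections plus a reference planning CT are mapped to a 3D volume.
# Architecture, layer sizes, and names are assumptions, not the authors' published model.
import torch
import torch.nn as nn

class DualViewTo3D(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared 2D encoder applied to each orthogonal projection (e.g. AP and lateral views).
        self.enc2d = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
        )
        # 3D encoder for the reference planning CT (anatomical prior).
        self.enc3d = nn.Sequential(
            nn.Conv3d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(4),
        )
        fused = 2 * 32 * 8 * 8 + 16 * 4 * 4 * 4
        # Map the fused latent code to a coarse 3D grid, then upsample to the output volume.
        self.fc = nn.Linear(fused, 32 * 8 * 8 * 8)
        self.dec3d = nn.Sequential(
            nn.ConvTranspose3d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(16, 8, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(8, 1, 4, stride=2, padding=1),
        )

    def forward(self, proj_ap, proj_lat, planning_ct):
        # proj_*: (B, 1, H, W) projections; planning_ct: (B, 1, D, H, W) volume.
        f_ap = self.enc2d(proj_ap).flatten(1)
        f_lat = self.enc2d(proj_lat).flatten(1)
        f_ct = self.enc3d(planning_ct).flatten(1)
        z = torch.cat([f_ap, f_lat, f_ct], dim=1)
        vol = self.fc(z).view(-1, 32, 8, 8, 8)
        return self.dec3d(vol)  # (B, 1, 64, 64, 64) synthetic CBCT-like volume

if __name__ == "__main__":
    net = DualViewTo3D()
    ap = torch.randn(1, 1, 128, 128)
    lat = torch.randn(1, 1, 128, 128)
    ct = torch.randn(1, 1, 64, 64, 64)
    print(net(ap, lat, ct).shape)  # torch.Size([1, 1, 64, 64, 64])

In practice such a model would be trained on paired projections and volumetric ground truth; the simple concatenation-based fusion used here is only one of many possible ways to condition the reconstruction on the planning CT.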
- author
- Kadhim, Mustafa (LU); Persson, Emilia (LU); Haraldsson, André (LU); Gustafsson, Christian Jamtheim (LU); Nilsson, Mikael (LU); Kügele, Malin (LU); Bäck, Sven (LU) and Ceberg, Sofie (LU)
- organization
- Medical Radiation Physics, Lund
- LUCC: Lund University Cancer Centre
- Radiotherapy Physics (research group)
- Medical Radiation Physics, Malmö (research group)
- Mathematical Imaging Group (research group)
- LTH Profile Area: Engineering Health
- LU Profile Area: Natural and Artificial Cognition
- Computer Vision and Machine Learning (research group)
- publishing date
- 2025-12
- type
- Contribution to journal
- publication status
- published
- subject
- in
- Scientific Reports
- volume
- 15
- issue
- 1
- article number
- 36106
- publisher
- Nature Publishing Group
- external identifiers
- pmid:41094004
- scopus:105018836627
- ISSN
- 2045-2322
- DOI
- 10.1038/s41598-025-23781-7
- language
- English
- LU publication?
- yes
- id
- d76cea11-ae54-46f2-af23-98d4d973459c
- date added to LUP
- 2025-12-11 14:23:32
- date last changed
- 2025-12-12 03:07:56
@article{d76cea11-ae54-46f2-af23-98d4d973459c,
abstract = {{Precise patient positioning and daily anatomical verification are crucial in external beam radiotherapy to ensure accurate dose delivery and minimize harm to healthy tissues. However, current image-guided radiotherapy techniques struggle to balance high-quality volumetric anatomical visualization with rapid, low-dose imaging. Reconstructing volumetric images from ultra-sparse X-ray projections therefore holds promise for significantly reducing patient radiation exposure and potentially enabling real-time anatomy verification. Here, we present a novel deep learning (DL) based framework that generates synthetic volumetric cone-beam CT in real time from two orthogonal projection views and a reference planning CT for prostate cancer patients. Our model learns the mapping between the 2D and 3D domains and generalizes across patients without retraining. We demonstrate that our framework produces high-fidelity volumetric reconstructions in real time, potentially supporting clinical workflows without hardware modifications. This approach could reduce imaging dose and treatment time while preserving comprehensive anatomical information, offering a pathway toward safer, more efficient prostate radiotherapy workflows.}},
author = {{Kadhim, Mustafa and Persson, Emilia and Haraldsson, André and Gustafsson, Christian Jamtheim and Nilsson, Mikael and Kügele, Malin and Bäck, Sven and Ceberg, Sofie}},
issn = {{2045-2322}},
language = {{eng}},
number = {{1}},
publisher = {{Nature Publishing Group}},
series = {{Scientific Reports}},
title = {{Anatomically informed deep learning framework for generating fast, low-dose synthetic CBCT for prostate radiotherapy}},
url = {{http://dx.doi.org/10.1038/s41598-025-23781-7}},
doi = {{10.1038/s41598-025-23781-7}},
volume = {{15}},
year = {{2025}},
}