Lund University Publications

ONIX : an X-ray deep-learning tool for 3D reconstructions from sparse views

Zhang, Yuhe; Yao, Zisheng; Ritschel, Tobias and Villanueva Perez, Pablo (2023) In Applied Research 2(4).
Abstract
Time-resolved three-dimensional (3D) X-ray imaging techniques rely on obtaining 3D information for each time point and are crucial for materials-science applications in academia and industry. Standard 3D X-ray imaging techniques like tomography and confocal microscopy access 3D information by scanning the sample with respect to the X-ray source. However, the scanning process limits the temporal resolution when studying dynamics and is not feasible for many materials-science applications, such as cell-wall rupture of metallic foams. Alternatives to obtaining 3D information when scanning is not possible are X-ray stereoscopy and multi-projection imaging, but these approaches suffer from limited volumetric information as they only acquire a very small number of views or projections compared to traditional 3D scanning techniques. Here, we present optimized neural implicit X-ray imaging (ONIX), a deep-learning algorithm capable of retrieving a continuous 3D object representation from only a small and limited set of sparse projections. ONIX is based on an accurate differentiable model of the physics of X-ray propagation. It generalizes across different instances of similar samples to overcome the limited volumetric information provided by limited sparse views. We demonstrate the capabilities of ONIX compared to state-of-the-art tomographic reconstruction algorithms by applying it to simulated and experimental datasets, where a maximum of eight projections are acquired. ONIX, although it does not have access to any volumetric information, outperforms unsupervised reconstruction algorithms, which reconstruct using single instances without generalization over different instances. We anticipate that ONIX will become a crucial tool for the X-ray community by (i) enabling the study of fast dynamics not possible today when implemented together with X-ray multi-projection imaging and (ii) enhancing the volumetric information and capabilities of X-ray stereoscopic imaging.
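The abstract names two key ingredients: a neural implicit representation (a network mapping 3D coordinates to an attenuation value) and a differentiable model of X-ray propagation, so that a handful of measured projections can supervise the 3D representation directly. The paper provides the full method; the sketch below is only a minimal, single-sample illustration of that idea in PyTorch, with hypothetical names (ImplicitVolume, project) and a toy parallel-beam geometry, and it omits ONIX's generalization across sample instances.

```python
# Minimal, illustrative sketch (NOT the authors' code): a coordinate MLP as a
# neural implicit 3D representation, rendered through a differentiable
# X-ray projection (Beer-Lambert line integral of attenuation).
# Names (ImplicitVolume, project) and the geometry are hypothetical.
import torch
import torch.nn as nn

class ImplicitVolume(nn.Module):
    """Maps a 3D coordinate (x, y, z) to a scalar attenuation value."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Softplus(),  # attenuation is non-negative
        )

    def forward(self, xyz):               # xyz: (N, 3)
        return self.net(xyz).squeeze(-1)  # (N,)

def project(volume, ray_origins, ray_dirs, n_samples=64, near=-1.0, far=1.0):
    """Differentiable parallel-beam line integral along each ray.

    Approximates I = exp(-integral of mu ds) by sampling mu at n_samples
    points per ray and taking a Riemann sum.
    """
    t = torch.linspace(near, far, n_samples)                           # (S,)
    pts = ray_origins[:, None, :] + t[None, :, None] * ray_dirs[:, None, :]
    mu = volume(pts.reshape(-1, 3)).reshape(-1, n_samples)             # (R, S)
    ds = (far - near) / n_samples
    return torch.exp(-mu.sum(dim=-1) * ds)  # transmitted intensity per ray

# Toy usage: fit the implicit volume to a few "measured" projections.
vol = ImplicitVolume()
opt = torch.optim.Adam(vol.parameters(), lr=1e-3)
origins = torch.randn(256, 3)                              # stand-in geometry
dirs = torch.nn.functional.normalize(torch.randn(256, 3), dim=-1)
target = torch.rand(256)                                   # stand-in data
for _ in range(100):
    opt.zero_grad()
    loss = ((project(vol, origins, dirs) - target) ** 2).mean()
    loss.backward()
    opt.step()
```

Because the projection operator is differentiable end to end, gradients of the projection-domain loss flow back into the implicit volume; this is what lets a small number of views constrain a continuous 3D representation rather than a fixed voxel grid.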
Please use this URL to cite or link to this publication: http://dx.doi.org/10.1002/appl.202300016
author
Zhang, Yuhe; Yao, Zisheng; Ritschel, Tobias and Villanueva Perez, Pablo
organization
publishing date
2023-04
type
Contribution to journal
publication status
published
subject
in
Applied Research
volume
2
issue
4
article number
e202300016
pages
13 pages
publisher
Wiley-VCH Verlag
external identifiers
  • scopus:85161693397
ISSN
2702-4288
DOI
10.1002/appl.202300016
language
English
LU publication?
yes
id
f779d191-9924-48d1-8bd9-105a26592145
date added to LUP
2024-05-20 15:16:29
date last changed
2024-05-21 08:23:50
@article{f779d191-9924-48d1-8bd9-105a26592145,
  abstract     = {{Time-resolved three-dimensional (3D) X-ray imaging techniques rely on obtaining 3D information for each time point and are crucial for materials-science applications in academia and industry. Standard 3D X-ray imaging techniques like tomography and confocal microscopy access 3D information by scanning the sample with respect to the X-ray source. However, the scanning process limits the temporal resolution when studying dynamics and is not feasible for many materials-science applications, such as cell-wall rupture of metallic foams. Alternatives to obtaining 3D information when scanning is not possible are X-ray stereoscopy and multi-projection imaging, but these approaches suffer from limited volumetric information as they only acquire a very small number of views or projections compared to traditional 3D scanning techniques. Here, we present optimized neural implicit X-ray imaging (ONIX), a deep-learning algorithm capable of retrieving a continuous 3D object representation from only a small and limited set of sparse projections. ONIX is based on an accurate differentiable model of the physics of X-ray propagation. It generalizes across different instances of similar samples to overcome the limited volumetric information provided by limited sparse views. We demonstrate the capabilities of ONIX compared to state-of-the-art tomographic reconstruction algorithms by applying it to simulated and experimental datasets, where a maximum of eight projections are acquired. ONIX, although it does not have access to any volumetric information, outperforms unsupervised reconstruction algorithms, which reconstruct using single instances without generalization over different instances. We anticipate that ONIX will become a crucial tool for the X-ray community by (i) enabling the study of fast dynamics not possible today when implemented together with X-ray multi-projection imaging and (ii) enhancing the volumetric information and capabilities of X-ray stereoscopic imaging.}},
  author       = {{Zhang, Yuhe and Yao, Zisheng and Ritschel, Tobias and Villanueva Perez, Pablo}},
  issn         = {{2702-4288}},
  language     = {{eng}},
  month        = {{04}},
  number       = {{4}},
  publisher    = {{Wiley-VCH Verlag}},
  series       = {{Applied Research}},
  title        = {{ONIX : an X-ray deep-learning tool for 3D reconstructions from sparse views}},
  url          = {{http://dx.doi.org/10.1002/appl.202300016}},
  doi          = {{10.1002/appl.202300016}},
  volume       = {{2}},
  year         = {{2023}},
}