Instantaneous 3D imaging of flame species using coded laser illumination

Kristensson, Elias; Li, Zheming; Berrocal, Edouard; Richter, Mattias, et al. (2017). Instantaneous 3D imaging of flame species using coded laser illumination. Proceedings of the Combustion Institute, 36, (3), 4585 - 4591
Status: Published | Language: English
Authors:
Kristensson, Elias; Li, Zheming; Berrocal, Edouard; Richter, Mattias, et al.
Department:
Combustion Physics
Abstract:

Three-dimensional (3D) imaging of dynamic objects that rapidly undergo structural changes, such as turbulent combusting flows, has been a long-standing challenge, mainly due to the common need for sequential image acquisitions. To accurately sense the 3D shape of the sample, all acquisitions need to be recorded within a sufficiently short time-scale during which the sample appears stationary. Here we present a versatile diagnostic method, named Frequency Recognition Algorithm for Multiple Exposures (FRAME), that enables instantaneous 3D imaging. FRAME is based on volumetric laser sheet imaging but permits several layers to be probed in parallel and acquired using a single detector. To differentiate between the signals arising from the different layers, FRAME incorporates a line-coding strategy in which each laser sheet is given a unique spatial intensity modulation. Although the signals from all laser sheets overlap in the spatial domain, this line-coding approach separates them in the frequency domain, where they can be accessed individually by means of digital filtering. Here we demonstrate this method by studying laser-induced fluorescence from formaldehyde in a flame and present instantaneously acquired 3D images of the flame topology.
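The frequency-domain separation described in the abstract can be illustrated with a minimal numerical sketch. The example below is a hypothetical toy model, not the authors' implementation: two synthetic "layer" images are each tagged with a distinct sinusoidal spatial carrier, summed onto a single simulated detector frame, and then recovered individually by shifting each carrier peak to DC and applying a low-pass Fourier filter. All scene shapes, carrier frequencies, and filter sizes are illustrative assumptions.

```python
import numpy as np

# Toy model of FRAME-style frequency demultiplexing (illustrative only):
# two coded "laser sheet" layers overlap on one detector; digital Fourier
# filtering separates them because their carriers occupy distinct
# regions of the frequency domain.

n = 128
y, x = np.mgrid[0:n, 0:n]

# Two distinct scenes (stand-ins for fluorescence from two flame layers)
scene_a = np.exp(-((x - 40)**2 + (y - 64)**2) / 200.0)
scene_b = np.exp(-((x - 90)**2 + (y - 64)**2) / 200.0)

# Unique line codes: intensity modulation at different spatial carriers
fa = (0.25, 0.0)   # carrier of layer A: 0.25 cycles/pixel along x
fb = (0.0, 0.25)   # carrier of layer B: 0.25 cycles/pixel along y
carrier_a = 0.5 * (1 + np.cos(2 * np.pi * (fa[0] * x + fa[1] * y)))
carrier_b = 0.5 * (1 + np.cos(2 * np.pi * (fb[0] * x + fb[1] * y)))

# Single-detector image: both coded layers overlap in the spatial domain
detector = scene_a * carrier_a + scene_b * carrier_b

def demodulate(img, f, radius=8):
    """Shift the carrier at frequency f to DC, then low-pass filter."""
    shifted = img * np.exp(-2j * np.pi * (f[0] * x + f[1] * y))
    spec = np.fft.fftshift(np.fft.fft2(shifted))
    c = n // 2
    mask = ((x - c)**2 + (y - c)**2) <= radius**2  # low-pass window at DC
    return 2 * np.abs(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

rec_a = demodulate(detector, fa)   # recovered layer A
rec_b = demodulate(detector, fb)   # recovered layer B
```

Because the chosen carriers (32 frequency pixels from DC) lie well outside the low-pass window, the cross-terms from the other layer are rejected, and each recovered image is a band-limited copy of its own scene.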

Keywords:
Coded imaging ; Instantaneous imaging ; Structured illumination ; Three-dimensional imaging ; Atom and Molecular Physics and Optics ; Energy Engineering
ISSN:
1540-7489
LUP-ID:
837c9223-16c5-4d10-b138-6d02a491b041 | Link: https://lup.lub.lu.se/record/837c9223-16c5-4d10-b138-6d02a491b041