US 12,467,867 B2
Method for estimating a three-dimensional spatial distribution of fluorescence inside an object
Cédric Allier, Grenoble (FR); and Lionel Herve, Grenoble (FR)
Assigned to Commissariat a l'Energie Atomique et aux Energies Alternatives, Paris (FR)
Filed by Commissariat a l'Energie Atomique et aux Energies Alternatives, Paris (FR)
Filed on Dec. 29, 2022, as Appl. No. 18/147,772.
Claims priority of application No. 2114627 (FR), filed on Dec. 29, 2021.
Prior Publication US 2023/0221255 A1, Jul. 13, 2023
Int. Cl. G01N 21/64 (2006.01); G06T 17/20 (2006.01); G01N 21/17 (2006.01)
CPC G01N 21/6456 (2013.01) [G01N 21/6458 (2013.01); G06T 17/20 (2013.01); G01N 2021/174 (2013.01); G01N 2021/1765 (2013.01); G01N 2021/1782 (2013.01); G01N 2021/1785 (2013.01); G01N 2021/1787 (2013.01); G01N 2021/646 (2013.01)] 11 Claims
OG exemplary drawing
 
1. A method for reconstructing a three-dimensional spatial distribution of fluorescence inside an object, the object being capable of emitting fluorescence light under an effect of illumination in an excitation spectral band, the object being discretized into object voxels, wherein:
each object voxel of the object voxels is capable of emitting light in a fluorescence spectral band;
the three-dimensional spatial distribution of fluorescence inside the object defines an intensity of emission of fluorescence light in each object voxel of the object voxels;
the method comprising the following steps:
a) placing the object facing an image sensor, the image sensor being configured to form an image in an object focal plane, the object focal plane being configured to lie successively at various depths in the object;
b) illuminating the object in the excitation spectral band, and successively acquiring a plurality of elementary images, the object focal plane respectively lying at the various depths in the object, each elementary image of the plurality of elementary images being divided into pixels, one measured intensity value corresponding to each pixel of the pixels, the plurality of elementary images being combined to form a three-dimensional acquired image, each pixel of an elementary image forming one image voxel of the three-dimensional acquired image;
c) initializing the three-dimensional spatial distribution of fluorescence inside the object, so as to form an initialized three-dimensional spatial distribution of fluorescence inside the object;
d) taking into account the initialized three-dimensional spatial distribution of fluorescence inside the object or the three-dimensional spatial distribution of fluorescence inside the object resulting from a preceding iteration;
e) assigning a random phase value to each object voxel of the object voxels, and implementing an algorithm modelling a propagation of light through the object, so as to estimate a complex amplitude of the light detected by each image voxel;
f) repeating step e) W times, W being a positive integer, so as to obtain, for each image voxel, W complex-amplitude estimates;
g) for each image voxel, on the basis of the W complex-amplitude estimates resulting from step f), computing a reconstructed intensity, the reconstructed intensities of all of the image voxels together forming a three-dimensional reconstructed image;
h) optionally forming a differential image, the differential image corresponding to a comparison between the three-dimensional reconstructed image and the three-dimensional acquired image;
i) assigning a random phase value to each image voxel of the three-dimensional reconstructed image resulting from step g) or of the differential image resulting from step h), and implementing an algorithm modelling a back-propagation of light through the object, so as to obtain, for each object voxel of the object voxels:
a complex fluorescence-emission amplitude corresponding to the three-dimensional reconstructed image resulting from step g); or
a complex differential fluorescence-emission amplitude corresponding to the differential image resulting from step h);
j) repeating step i) W′ times, W′ being a positive integer, so as to obtain, for each object voxel of the object voxels, W′ complex-amplitude estimates corresponding to the three-dimensional reconstructed image or W′ complex-amplitude estimates corresponding to the differential image;
k) on the basis of the W′ complex-amplitude estimates resulting from step j), computing, for each object voxel of the object voxels, an emission intensity or a differential emission intensity;
l) updating the three-dimensional spatial distribution of fluorescence inside the object on the basis of the emission intensity or of the differential emission intensity computed for each object voxel of the object voxels in step k); and
m) reiterating steps d) to l) until a stop criterion for the iterations is met.
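The claimed loop can also be read as a short Python sketch. The sketch below is only one possible reading of steps c) to m), not the patented implementation: the angular-spectrum kernel used as the propagation model, the averaging of the W and W′ squared moduli into intensities, the separate back-propagation of the positive and negative parts of the differential image, the additive update with a step size, and the fixed iteration count standing in for the stop criterion of step m) are all assumptions introduced here for illustration, as are the helper names reconstruct, random_phase_project and step; only W, W′ and the step labels map directly onto the claim.

import numpy as np


def angular_spectrum(field, dz, wavelength, pixel_size):
    # Free-space angular-spectrum propagation of a 2-D complex field over a
    # distance dz. This kernel is only a generic stand-in for the light
    # propagation model of steps e) and i), which claim 1 does not specify.
    ny, nx = field.shape
    fx, fy = np.meshgrid(np.fft.fftfreq(nx, pixel_size),
                         np.fft.fftfreq(ny, pixel_size))
    kz2 = np.maximum(1.0 / wavelength ** 2 - fx ** 2 - fy ** 2, 0.0)
    kernel = np.exp(2j * np.pi * dz * np.sqrt(kz2))
    return np.fft.ifft2(np.fft.fft2(field) * kernel)


def random_phase_project(intensity, depths, wavelength, pixel_size, n_draws, rng):
    # Steps e)-g) (and, symmetrically, i)-k)): draw a random phase per voxel,
    # propagate the complex amplitude sqrt(intensity)*exp(i*phase) from every
    # plane to every plane, repeat n_draws times and average the squared moduli.
    n_z, ny, nx = intensity.shape
    out = np.zeros_like(intensity, dtype=float)
    for _ in range(n_draws):
        amp = np.zeros((n_z, ny, nx), dtype=complex)
        for iz_src in range(n_z):
            phase = rng.uniform(0.0, 2.0 * np.pi, (ny, nx))
            src = np.sqrt(intensity[iz_src]) * np.exp(1j * phase)
            for iz_dst in range(n_z):
                amp[iz_dst] += angular_spectrum(
                    src, depths[iz_dst] - depths[iz_src], wavelength, pixel_size)
        out += np.abs(amp) ** 2 / n_draws
    return out


def reconstruct(acquired, depths, wavelength, pixel_size,
                W=8, W_prime=8, n_iter=20, step=0.5, seed=0):
    # acquired: (n_depths, ny, nx) stack of measured intensities, i.e. the
    # three-dimensional acquired image of step b); the object voxels are
    # assumed here to coincide with the acquisition planes.
    rng = np.random.default_rng(seed)
    fluo = np.clip(acquired.astype(float), 0.0, None)               # step c)
    for _ in range(n_iter):                                          # steps d)-m)
        recon = random_phase_project(fluo, depths, wavelength,
                                     pixel_size, W, rng)             # steps e)-g)
        diff = acquired - recon                                      # step h)
        # Steps i)-k): back-propagate the positive and negative parts of the
        # differential image separately so that its sign survives the squared
        # modulus (one possible handling; the claim leaves this open).
        up_pos = random_phase_project(np.clip(diff, 0.0, None), depths,
                                      wavelength, pixel_size, W_prime, rng)
        up_neg = random_phase_project(np.clip(-diff, 0.0, None), depths,
                                      wavelength, pixel_size, W_prime, rng)
        fluo = np.clip(fluo + step * (up_pos - up_neg), 0.0, None)   # step l)
    return fluo                                                      # step m): fixed count

Because steps e) to g) and i) to k) are symmetric in this reading (a random phase per voxel, a propagation, then an intensity formed from the W or W′ complex-amplitude estimates), a single random_phase_project routine serves both directions, the sign of the propagation distance distinguishing forward propagation from back-propagation.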