US 11,915,438 B2
Method and apparatus for depth-map estimation of a scene
Manu Alibay, Alfortville (FR); Olivier Pothier, Sceaux (FR); Victor Macela, Paris (FR); Alain Bellon, Grenoble (FR); and Arnaud Bourge, Paris (FR)
Assigned to STMicroelectronics France, Montrouge (FR)
Filed by STMicroelectronics France, Montrouge (FR)
Filed on Sep. 17, 2021, as Appl. No. 17/478,643.
Application 17/478,643 is a division of application No. 16/548,138, filed on Aug. 22, 2019, granted, now 11,138,749.
Application 16/548,138 is a continuation of application No. 15/692,794, filed on Aug. 31, 2017, granted, now 10,445,892, issued on Oct. 15, 2019.
Claims priority of application No. 1751539 (FR), filed on Feb. 27, 2017.
Prior Publication US 2022/0005214 A1, Jan. 6, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 7/521 (2017.01); G01S 17/08 (2006.01); G06T 7/55 (2017.01); G06T 7/593 (2017.01); H04N 13/207 (2018.01); H04N 13/254 (2018.01); H04N 13/271 (2018.01); H04N 13/00 (2018.01); H04N 13/204 (2018.01)
CPC G06T 7/521 (2017.01) [G01S 17/08 (2013.01); G06T 7/55 (2017.01); G06T 7/593 (2017.01); H04N 13/207 (2018.05); H04N 13/254 (2018.05); H04N 13/271 (2018.05); G06T 2207/10012 (2013.01); G06T 2207/10021 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/20212 (2013.01); H04N 2013/0081 (2013.01); H04N 13/204 (2018.05)] 15 Claims
OG exemplary drawing
 
1. A device comprising:
a time of flight sensor configured to generate a distance map of a scene, the time of flight sensor being configured to generate a corresponding distance histogram for each acquisition zone of the scene; and
a stereoscopic image acquisition device configured to acquire two images of the scene at two different viewpoints, wherein the device is configured to
identify regions of a depth map to be generated from the two images that correspond to the distance map,
generate a range of values of disparities, region by region, from extreme values of the distances of the corresponding histogram, and
extrapolate distances of the scene from the disparities between the two images, wherein, for each region, the extrapolation of the distances of the scene is performed based on the range of values of the disparities for a corresponding region.
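The claim's core step can be sketched as follows: for each region, the extreme distances of the corresponding time-of-flight histogram are converted into a disparity search interval via the standard pinhole-stereo relation disparity = focal × baseline / distance, and matched disparities are inverted through the same relation to recover scene distances. This is a minimal illustrative sketch, not the patented implementation; all function names and numeric values (focal length, baseline, distances) are hypothetical.

```python
# Hypothetical sketch of the per-region disparity-range step described in
# claim 1. Assumes a rectified stereo pair and a pinhole camera model.
# All names and numbers are illustrative, not taken from the patent.

def disparity_range(dist_min_m, dist_max_m, focal_px, baseline_m):
    """Map a zone's extreme ToF-histogram distances to a disparity interval,
    using disparity = focal * baseline / distance."""
    disp_max = focal_px * baseline_m / dist_min_m  # nearest point -> largest disparity
    disp_min = focal_px * baseline_m / dist_max_m  # farthest point -> smallest disparity
    return disp_min, disp_max

def distance_from_disparity(disp_px, focal_px, baseline_m):
    """Invert the same relation to extrapolate a scene distance from a
    matched disparity found within the region's range."""
    return focal_px * baseline_m / disp_px

# Example: a zone whose histogram spans 0.5 m to 2.0 m, with a 700 px
# focal length and a 5 cm stereo baseline.
lo, hi = disparity_range(0.5, 2.0, focal_px=700.0, baseline_m=0.05)
print(lo, hi)  # 17.5 70.0

# A stereo matcher would then search only disparities in [lo, hi] for
# pixels of the corresponding depth-map region, and each matched
# disparity maps back to a distance:
print(distance_from_disparity(35.0, focal_px=700.0, baseline_m=0.05))  # 1.0 m
```

Restricting the matcher's search to the per-region interval is what lets the coarse time-of-flight distance map guide the finer stereoscopic depth estimation.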