US 11,808,841 B2
Fusion of depth imager and radar system in a vehicle
Emanuel Mordechai, Mishmarot (IL); and Igal Bilik, Rehovot (IL)
Assigned to GM GLOBAL TECHNOLOGY OPERATIONS LLC, Detroit, MI (US)
Filed by GM Global Technology Operations LLC, Detroit, MI (US)
Filed on Oct. 5, 2020, as Appl. No. 17/062,965.
Prior Publication US 2022/0107405 A1, Apr. 7, 2022
Int. Cl. G01S 13/86 (2006.01); G01S 13/89 (2006.01); G01S 7/41 (2006.01); G01S 13/931 (2020.01)
CPC G01S 13/865 (2013.01) [G01S 7/411 (2013.01); G01S 7/415 (2013.01); G01S 13/89 (2013.01); G01S 13/931 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method of performing sensor fusion with a depth imager and a radar system, the method comprising:
transmitting radio frequency (RF) energy from the radar system to a region;
emitting light to the region using a light source simultaneously with transmission of the RF energy;
receiving, at an optical camera including a depth imager aligned with the light source, reflected light from the region resulting from the light emitted by the light source;
receiving, at the radar system, RF reflections resulting from reflection of the RF energy emitted by the radar system by one or more objects in the region;
processing the reflected light to obtain azimuth, elevation, range, variance in range, and reflectivity to each pixel among a plurality of pixels that make up the region;
processing the RF reflections to obtain azimuth, elevation, range, variance in range, velocity, and variance in velocity to a subset of the plurality of pixels of the region corresponding to the one or more objects, the subset of the plurality of pixels in the region representing a region of interest; and
obtaining a high resolution image of the region of interest based on performing the sensor fusion in the region of interest determined by the radar system by using the azimuth, the elevation, the variance in range, and the reflectivity resulting from the depth imager and the range, the velocity, and the variance in velocity resulting from the radar system, wherein performing the sensor fusion includes assigning the azimuth and the elevation resulting from the depth imager to each pixel in the region of interest.
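The following is a minimal, illustrative sketch of the per-pixel combination the claim recites: azimuth, elevation, variance in range, and reflectivity are taken from the depth imager, while range, velocity, and variance in velocity are taken from the radar, for each pixel in the radar-determined region of interest. The data layout, field names, and the fuse_region_of_interest function are hypothetical and not the patented implementation; the claim does not prescribe a specific data structure.

```python
# Hypothetical sketch of the claimed fusion step: pair the depth-imager and
# radar quantities named in claim 1 for each pixel of the radar-derived ROI.
# Array names and the dataclass layout are assumptions for illustration.
from dataclasses import dataclass
import numpy as np


@dataclass
class FusedPixel:
    azimuth: float        # from the depth imager
    elevation: float      # from the depth imager
    range_m: float        # from the radar system
    range_var: float      # from the depth imager
    velocity: float       # from the radar system
    velocity_var: float   # from the radar system
    reflectivity: float   # from the depth imager


def fuse_region_of_interest(imager, radar, roi_mask):
    """Combine depth-imager and radar measurements over the radar-derived ROI.

    `imager` and `radar` are dicts of 2-D arrays (one value per pixel of the
    region); `roi_mask` is a boolean array marking the subset of pixels the
    radar associated with detected objects (the region of interest).
    """
    fused = {}
    for row, col in zip(*np.nonzero(roi_mask)):
        fused[(row, col)] = FusedPixel(
            azimuth=imager["azimuth"][row, col],
            elevation=imager["elevation"][row, col],
            range_m=radar["range"][row, col],
            range_var=imager["range_var"][row, col],
            velocity=radar["velocity"][row, col],
            velocity_var=radar["velocity_var"][row, col],
            reflectivity=imager["reflectivity"][row, col],
        )
    return fused
```

The sketch simply copies each claimed quantity into a fused per-pixel record; a working system would additionally need the two sensors' pixel grids to be registered to one another (the claim relies on the depth imager being aligned with the light source and the fusion being confined to the radar-determined region of interest).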