CPC G06V 10/764 (2022.01) [G06F 18/251 (2023.01); G06T 7/11 (2017.01); G06T 7/593 (2017.01); G06T 7/90 (2017.01); G06V 10/82 (2022.01); G06V 20/56 (2022.01); G06T 2207/10012 (2013.01); G06T 2207/30252 (2013.01)] | 17 Claims |
1. A computer-implemented method for detecting sky using a camera unit and an inertial measurement unit (IMU), both carried by a common autonomous vehicle, the method comprising:
segmenting a two-dimensional (2D) color image obtained by the camera unit into a plurality of regions, wherein the color image depicts at least a portion of an environment surrounding the autonomous vehicle;
determining an indication of the horizon relative to the color image based, at least in part, on sensory data obtained from the IMU;
filtering out a first subset of the regions based, at least in part, on the indication of the horizon to generate a reduced set of regions;
identifying a second subset of regions from the reduced set of regions as corresponding to the sky, based, at least in part, on color information associated with the reduced set of regions, wherein regions located above the horizon line are treated as candidate regions, and statistical information of color-based gradients is used to determine which candidate regions constitute the second subset of regions;
transforming the identified second subset of regions into a reference system of a stereo camera unit, wherein data obtained by the stereo camera unit is undistorted before transforming data corresponding to the identified second subset of regions; and
performing environment detection using at least the stereo camera unit based, at least in part, on exclusion of data corresponding to the second subset of regions, such that the stereo camera unit skips sensing certain parts of the environment.
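The claimed pipeline (segment, project the IMU-derived horizon, filter regions below it, classify the remainder by color-gradient statistics) can be sketched as follows. This is a minimal illustration, not the claimed implementation: the fixed-grid "segmentation", the pinhole horizon projection in `horizon_row`, and all threshold values are hypothetical stand-ins that the claim does not specify.

```python
import numpy as np

def horizon_row(pitch_rad, fy, cy):
    # Hypothetical helper: project zero-elevation (the horizon) into the
    # image row using a pinhole model; the claim does not fix a camera model.
    return cy - fy * np.tan(pitch_rad)

def detect_sky_mask(image, pitch_rad, fy, cy, grad_thresh=5.0, blue_thresh=1.1):
    """Sketch of the claimed steps on an HxWx3 float image.

    Thresholds and the 16x16 block "segmentation" are illustrative
    assumptions, not values from the claim.
    """
    h, w, _ = image.shape
    hrow = int(np.clip(horizon_row(pitch_rad, fy, cy), 0, h))
    mask = np.zeros((h, w), dtype=bool)
    bs = 16  # block size of the stand-in segmenter
    for r in range(0, h, bs):
        if r + bs > hrow:
            # Filter out regions at or below the horizon line
            # (the "first subset" of the claim).
            continue
        for c in range(0, w, bs):
            block = image[r:r + bs, c:c + bs]
            # Color-based gradient statistics: sky regions tend to be
            # smooth (low gradient energy) and bluish on average.
            gy = np.abs(np.diff(block, axis=0)).mean()
            gx = np.abs(np.diff(block, axis=1)).mean()
            bluish = block[..., 2].mean() > blue_thresh * block[..., 0].mean()
            if gx + gy < grad_thresh and bluish:
                mask[r:r + bs, c:c + bs] = True
    return mask
```

The resulting mask would then be undistorted and transformed into the stereo camera's reference system so that stereo processing can exclude the masked sky regions.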