US 12,072,443 B2
Segmentation of lidar range images
Nikolai Smolyanskiy, Seattle, WA (US); Ryan Oldja, Redmond, WA (US); Ke Chen, Sunnyvale, CA (US); Alexander Popov, Kirkland, WA (US); Joachim Pehserl, Lynnwood, WA (US); Ibrahim Eden, Redmond, WA (US); Tilman Wekel, Sunnyvale, CA (US); David Wehr, Redmond, WA (US); Ruchi Bhargava, Redmond, WA (US); and David Nister, Bellevue, WA (US)
Filed by NVIDIA Corporation, Santa Clara, CA (US)
Filed on Jul. 15, 2021, as Appl. No. 17/377,053.
Application 17/377,053 is a continuation of application No. 16/915,346, filed on Jun. 29, 2020, granted, now 11,532,168.
Application 17/377,053 is a continuation of application No. 16/836,583, filed on Mar. 31, 2020, granted, now 11,885,907.
Application 17/377,053 is a continuation of application No. 16/836,618, filed on Mar. 31, 2020, granted, now 11,531,088.
Claims priority of provisional application 62/938,852, filed on Nov. 21, 2019.
Claims priority of provisional application 62/936,080, filed on Nov. 15, 2019.
Prior Publication US 2021/0342608 A1, Nov. 4, 2021
Int. Cl. G01S 7/48 (2006.01); B60W 60/00 (2020.01); G01S 17/89 (2020.01); G01S 17/931 (2020.01); G05D 1/00 (2006.01); G06N 3/045 (2023.01); G06T 19/00 (2011.01); G06V 10/25 (2022.01); G06V 10/26 (2022.01); G06V 10/44 (2022.01); G06V 10/764 (2022.01); G06V 10/774 (2022.01); G06V 10/80 (2022.01); G06V 10/82 (2022.01); G06V 20/56 (2022.01); G06V 20/58 (2022.01); G06V 10/10 (2022.01)
CPC G01S 7/4802 (2013.01) [B60W 60/0011 (2020.02); B60W 60/0016 (2020.02); B60W 60/0027 (2020.02); G01S 17/89 (2013.01); G01S 17/931 (2020.01); G05D 1/0088 (2013.01); G06N 3/045 (2023.01); G06T 19/006 (2013.01); G06V 10/25 (2022.01); G06V 10/26 (2022.01); G06V 10/454 (2022.01); G06V 10/764 (2022.01); G06V 10/774 (2022.01); G06V 10/803 (2022.01); G06V 10/82 (2022.01); G06V 20/56 (2022.01); G06V 20/58 (2022.01); G06V 20/584 (2022.01); B60W 2420/403 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/30261 (2013.01); G06V 10/16 (2022.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
generating, from sensor data comprising LiDAR data of an environment, a representation of a range image;
extracting, based at least on the representation of the range image and using one or more Neural Networks (NNs), classification data representing one or more extracted classifications of elements in the environment;
generating labeled geometry data based at least on associating a projected representation of the one or more extracted classifications with corresponding geometry data of the elements;
generating one or more bounding shapes of the elements based at least on the labeled geometry data; and
providing data representing the one or more bounding shapes to a control component of an autonomous vehicle.
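Claim 1 describes a pipeline: project LiDAR points into a range image, classify its pixels with a neural network, carry those per-pixel classifications back to the 3D geometry they came from, and fit bounding shapes to the labeled points. The sketch below illustrates one way those steps could fit together; every function name is hypothetical, and a simple range threshold stands in for the patent's neural network. It is a minimal illustration of the claimed data flow, not the patented implementation.

```python
import numpy as np

def to_range_image(points, h=32, w=64, fov_up=15.0, fov_down=-15.0):
    """Spherically project Nx3 LiDAR points into an h x w range image.

    Returns the range image and, per pixel, the index of the point that
    produced it (-1 where no point landed). Hypothetical helper.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                              # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))
    fu, fd = np.radians(fov_up), np.radians(fov_down)
    u = ((yaw + np.pi) / (2 * np.pi) * w).astype(int) % w
    v = ((fu - pitch) / (fu - fd) * h).astype(int).clip(0, h - 1)
    img = np.full((h, w), np.inf)
    idx = np.full((h, w), -1, dtype=int)
    for i in range(len(points)):                        # keep nearest return per pixel
        if r[i] < img[v[i], u[i]]:
            img[v[i], u[i]] = r[i]
            idx[v[i], u[i]] = i
    return img, idx

def segment_range_image(img):
    """Stand-in for the claimed neural network(s): pixels with range under
    10 m get class 1 ("obstacle"), everything else class 0 (background)."""
    return np.where(np.isfinite(img) & (img < 10.0), 1, 0)

def label_points(points, classes, idx):
    """Associate per-pixel classifications with the corresponding 3D geometry,
    producing labeled geometry data (one class label per point)."""
    labels = np.zeros(len(points), dtype=int)
    valid = idx >= 0
    labels[idx[valid]] = classes[valid]
    return labels

def bounding_boxes(points, labels):
    """Generate an axis-aligned bounding shape per non-background class:
    a (min_corner, max_corner) pair suitable for handing to a downstream
    control component."""
    boxes = {}
    for c in np.unique(labels):
        if c == 0:
            continue
        pts = points[labels == c]
        boxes[int(c)] = (pts.min(axis=0), pts.max(axis=0))
    return boxes
```

A usage pass chains the four steps exactly in the order of the claim: `to_range_image` → `segment_range_image` → `label_points` → `bounding_boxes`, after which the box dictionary is what would be forwarded to the vehicle's control component.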