US 11,940,539 B2
Camera-to-LiDAR calibration and validation
Paul Aurel Diederichs, Graefelfing (DE); Maurilio Di Cicco, Singapore (SG); Jun Shern Chan, Singapore (SG); Andreas Jianhao Aumiller, Singapore (SG); and Francisco Alejandro Suarez Ruiz, Singapore (SG)
Assigned to Motional AD LLC, Boston, MA (US)
Filed by Motional AD LLC, Boston, MA (US)
Filed on Dec. 16, 2020, as Appl. No. 17/124,468.
Claims priority of provisional application 62/950,076, filed on Dec. 18, 2019.
Prior Publication US 2021/0192788 A1, Jun. 24, 2021
Int. Cl. G01S 17/931 (2020.01); G01S 7/497 (2006.01); G01S 17/89 (2020.01); G06F 18/22 (2023.01); G06F 18/23 (2023.01); G06T 7/80 (2017.01); G06V 10/762 (2022.01); G06V 20/56 (2022.01)
CPC G01S 17/931 (2020.01) [G01S 7/497 (2013.01); G01S 17/89 (2013.01); G06F 18/22 (2023.01); G06F 18/23 (2023.01); G06T 7/80 (2017.01); G06V 10/7635 (2022.01); G06T 2207/10028 (2013.01); G06T 2207/30252 (2013.01); G06V 20/56 (2022.01)] 21 Claims
OG exemplary drawing
 
1. A method comprising:
receiving, from a light detection and ranging (LiDAR) sensor of a vehicle, a first point cloud including a first set of LiDAR points returned from one or more calibration targets;
receiving, from a camera sensor of the vehicle, a first camera image including the one or more calibration targets;
extracting, using one or more processors of the vehicle, features of the one or more calibration targets from the first set of LiDAR points and the first camera image;
associating, using the one or more processors, the extracted features from the first set of LiDAR points and the first camera image to determine matching features;
estimating, using the one or more processors, extrinsic parameters of a coordinate transformation from LiDAR coordinates to camera coordinates or from the camera coordinates to the LiDAR coordinates, based at least in part on the matching features;
receiving, from the LiDAR sensor, a second point cloud including a second set of LiDAR points returned from one or more validation targets;
receiving, from the camera sensor, a second camera image including the one or more validation targets;
using the coordinate transformation to project the second set of LiDAR points onto the one or more validation targets in the second camera image;
estimating, using the one or more processors, one or more upper bounds on accuracy of the estimated extrinsic parameters;
determining, using the one or more processors, whether a specified number or percentage of LiDAR points in the second set of LiDAR points lie on or within the one or more validation targets included in the second camera image in accordance with the estimated one or more upper bounds on accuracy; and
in accordance with the specified number or percentage of LiDAR points from the second set of LiDAR points lying on or within the one or more validation targets in the second camera image within the one or more upper bounds on accuracy, deeming the estimated extrinsic parameters of the coordinate transformation valid.
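The validation steps recited in claim 1 — projecting the second set of LiDAR points into the second camera image via the estimated coordinate transformation, then deeming the extrinsics valid when a specified fraction of the projected points lies within the validation targets — can be sketched as below. This is an illustrative reading of the claim, not the patented implementation: the function names, the camera intrinsic matrix, the axis-aligned bounding box standing in for a validation target, and the 90% threshold are all assumptions introduced here.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project Nx3 LiDAR points into pixel coordinates using the
    estimated extrinsics (R, t) and camera intrinsics K.
    Points behind the camera (z <= 0) are dropped."""
    pts_cam = points_lidar @ R.T + t           # LiDAR frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]       # keep points in front of camera
    pix = pts_cam @ K.T                        # apply pinhole intrinsics
    return pix[:, :2] / pix[:, 2:3]            # perspective division -> (u, v)

def validate_extrinsics(pix, target_bbox, min_fraction=0.9):
    """Deem the extrinsics valid when at least `min_fraction` of the
    projected points fall inside the validation target's image-space
    bounding box (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = target_bbox
    inside = ((pix[:, 0] >= xmin) & (pix[:, 0] <= xmax) &
              (pix[:, 1] >= ymin) & (pix[:, 1] <= ymax))
    fraction = float(inside.mean())
    return fraction >= min_fraction, fraction

# Hypothetical check: a planar target 10 m ahead, identity extrinsics.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
points = np.column_stack([xs.ravel(), ys.ravel(), np.full(25, 10.0)])
pix = project_lidar_to_image(points, R, t, K)
valid, fraction = validate_extrinsics(pix, target_bbox=(250, 170, 390, 310))
```

In practice the thresholds in `validate_extrinsics` would be tightened or loosened according to the estimated upper bounds on accuracy that the claim recites, and a real validation target outline would be a detected polygon rather than a fixed bounding box.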