US 11,892,560 B2
High precision multi-sensor extrinsic calibration via production line and mobile station
Hiu Hong Yu, Alameda, CA (US); Zhenxiang Jian, Sunnyvale, CA (US); Tong Lin, Cupertino, CA (US); Xu Chen, Livermore, CA (US); Zhongkui Wang, San Jose, CA (US); Antonio Antonellis Rufo, San Jose, CA (US); and Waylon Chen, San Jose, CA (US)
Assigned to NIO Technology (Anhui) Co., Ltd., Hefei (CN)
Filed by NIO Technology (Anhui) Co., Ltd., Anhui (CN)
Filed on Feb. 3, 2020, as Appl. No. 16/780,122.
Prior Publication US 2021/0239793 A1, Aug. 5, 2021
Int. Cl. G01S 7/40 (2006.01); G01S 13/86 (2006.01); G01S 13/931 (2020.01)
CPC G01S 7/40 (2013.01) [G01S 13/865 (2013.01); G01S 13/867 (2013.01); G01S 13/931 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method for providing multi-sensor extrinsic calibration in a vehicle, the method comprising:
providing the vehicle within an extrinsic calibration station having a two-dimensional (2D) calibration target;
receiving, by one or more processors of the vehicle, sensor data from a first sensor and a second sensor of the vehicle, the first sensor being a LIDAR sensor and the second sensor being an object-detection sensor or an imaging sensor, the sensor data comprising:
a three-dimensional (3D) point cloud including a first representation of the 2D calibration target as sensed by the first sensor, the first representation corresponding to a partial representation of the 2D calibration target sensed by the first sensor; and
other image data including a second representation of the 2D calibration target as sensed by the second sensor;
clustering and segmenting, by the one or more processors, detection points in the 3D point cloud to detect a shape of the 2D calibration target from the partial representation of the 2D calibration target;
determining, by the one or more processors, a center of the 2D calibration target using the detected shape of the 2D calibration target;
determining, by the one or more processors, a center of the 2D calibration target in the second representation based on one or more features of the 2D calibration target in the second representation;
matching, by the one or more processors, the center of the 2D calibration target as determined using the detected shape to the center of the 2D calibration target in the second representation;
computing, by the one or more processors, a six-degree of freedom rigid body transformation of the first sensor and the second sensor based on the matched centers of the 2D calibration target as determined using the detected shape and of the 2D calibration target in the second representation; and
computing, by the one or more processors, a projection of the first sensor to the second sensor based on the computed six-degree of freedom rigid body transformation of the first sensor and the second sensor.
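By way of illustration only, the following Python sketch shows one plausible realization of the clustering, segmenting, and center-determination steps recited above for the LIDAR point cloud: detection points are grouped by Euclidean proximity, a plane is fitted to the candidate target cluster, and the centroid of the plane inliers is taken as the target center. The single-linkage clustering, the distance thresholds, and the centroid-as-center assumption are illustrative choices made here, not the claimed method.

# Minimal sketch, assuming an (N, 3) array of LIDAR detection points and a roughly
# planar, partially visible 2D calibration target. Not the patented implementation.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def segment_target_clusters(points, max_gap=0.15, min_points=50):
    """Group 3D detection points by Euclidean proximity (single-linkage clustering)
    and keep clusters large enough to plausibly be the calibration target.
    max_gap and min_points are illustrative thresholds, not claimed values."""
    labels = fcluster(linkage(points, method="single"), t=max_gap, criterion="distance")
    return [points[labels == k] for k in np.unique(labels)
            if np.count_nonzero(labels == k) >= min_points]

def target_center_from_cluster(cluster, plane_tol=0.02):
    """Fit a plane to the cluster by SVD, discard points far from that plane, and
    take the centroid of the remaining points as an estimate of the target center.
    For a partial but roughly symmetric view of the target this approximates its
    geometric center; fitting the known target shape would give a stronger estimate."""
    centroid = cluster.mean(axis=0)
    _, _, vt = np.linalg.svd(cluster - centroid)
    normal = vt[-1]                               # direction of least variance = plane normal
    dist = np.abs((cluster - centroid) @ normal)  # point-to-plane distances
    inliers = cluster[dist < plane_tol]
    return inliers.mean(axis=0)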
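A second sketch, likewise illustrative rather than the claimed implementation, shows how matched target centers expressed in the first-sensor (LIDAR) frame and the second-sensor frame can yield a six-degree of freedom rigid body transformation via a standard SVD-based (Kabsch/Umeyama) least-squares alignment, and how that transformation packs into a 4x4 matrix that projects first-sensor points into the second sensor's frame. The function names and the use of several matched center correspondences (e.g., from multiple target or station poses) are assumptions.

# Minimal sketch: estimate R, t from matched 3D target centers and build the
# LIDAR-to-second-sensor projection. Assumes at least three non-collinear matches.
import numpy as np

def rigid_transform_from_centers(lidar_centers, other_centers):
    """Estimate R, t such that other_centers ~ R @ lidar_centers + t, using the
    SVD-based least-squares alignment of the two matched (N, 3) point sets."""
    P = np.asarray(lidar_centers, dtype=float)
    Q = np.asarray(other_centers, dtype=float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - p_mean, Q - q_mean               # remove centroids; only rotation remains
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)           # SVD of the cross-covariance matrix
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t

def projection_matrix(R, t):
    """Pack R, t into a 4x4 homogeneous transform that maps first-sensor (LIDAR)
    points into the second sensor's frame."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

if __name__ == "__main__":
    # Synthetic check: a known pose is recovered from noiseless matched centers.
    rng = np.random.default_rng(0)
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    true_R *= np.sign(np.linalg.det(true_R))      # ensure a proper rotation
    true_t = np.array([0.2, -0.1, 1.5])
    lidar_pts = rng.uniform(-2, 2, size=(6, 3))   # hypothetical target centers at 6 poses
    other_pts = lidar_pts @ true_R.T + true_t
    R, t = rigid_transform_from_centers(lidar_pts, other_pts)
    print(np.allclose(R, true_R), np.allclose(t, true_t))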