US 12,086,213 B2
Generating fused sensor data through metadata association
Kevin Sheu, Fremont, CA (US); Jie Mao, Santa Clara, CA (US); and Deling Li, Fremont, CA (US)
Assigned to Pony AI Inc., Grand Cayman (KY)
Filed by Pony AI Inc., Grand Cayman (KY)
Filed on Jun. 30, 2023, as Appl. No. 18/345,380.
Application 18/345,380 is a continuation of application No. 16/938,600, filed on Jul. 24, 2020, granted, now Pat. No. 11,693,927.
Prior Publication US 2023/0350979 A1, Nov. 2, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 18/25 (2023.01); G01S 13/86 (2006.01); G01S 17/89 (2020.01); G06T 11/20 (2006.01); G06V 20/56 (2022.01)
CPC G06F 18/256 (2023.01) [G01S 13/865 (2013.01); G01S 13/867 (2013.01); G01S 17/89 (2013.01); G06T 11/20 (2013.01); G06V 20/56 (2022.01); G06T 2200/04 (2013.01); G06T 2210/12 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A computer-implemented method for fusing sensor data via metadata association, the method comprising:
capturing first sensor data using a first sensor and second sensor data using a second sensor, the first sensor traversing at least a portion of a scan path that includes a 360 degree rotation;
calibrating a set of extrinsics for the first sensor or the second sensor, the calibrated set of extrinsics including rotational or translational transformation data between the first sensor and the second sensor;
performing, based at least in part on the calibrated set of extrinsics, a frame synchronization between the first sensor data and the second sensor data to obtain a set of synchronized frames, wherein each synchronized frame includes a portion or a frame of the first sensor data and a respective portion or a frame of the second sensor data;
generating, based at least in part on an output of the frame synchronization, fused sensor data of the first sensor data and the second sensor data; and
training a machine learning model or classifier based on the fused sensor data.
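The claim recites a pipeline of extrinsic calibration, frame synchronization, fusion, and model training. The following is a minimal sketch of one way such a pipeline could be arranged, not the patented implementation: it assumes the first sensor is a rotating LiDAR and the second a camera, that the calibrated extrinsics are a 4x4 rigid transform, and that frames are associated by nearest-timestamp metadata. All function names, parameters, and data layouts below are illustrative assumptions.

```python
# Hedged sketch of a LiDAR-camera fusion pipeline; assumptions noted above.
import numpy as np

def make_extrinsics(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def synchronize_frames(lidar_frames, camera_frames, max_skew=0.05):
    """Pair each LiDAR sweep (or sweep portion) with the camera frame whose
    timestamp metadata is closest, rejecting pairs that drift too far apart."""
    cam_times = np.array([f["timestamp"] for f in camera_frames])
    pairs = []
    for lf in lidar_frames:
        i = int(np.argmin(np.abs(cam_times - lf["timestamp"])))
        if abs(cam_times[i] - lf["timestamp"]) <= max_skew:
            pairs.append((lf, camera_frames[i]))
    return pairs

def fuse(pair, extrinsics, intrinsics):
    """Project LiDAR points into the camera image plane and attach pixel
    coordinates, producing one fused record per synchronized frame."""
    lidar, camera = pair
    pts = lidar["points"]                            # (N, 3) in LiDAR coordinates
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    cam_pts = (extrinsics @ pts_h.T).T[:, :3]        # LiDAR -> camera frame
    in_front = cam_pts[:, 2] > 0                     # keep points ahead of the camera
    uv = (intrinsics @ cam_pts[in_front].T).T
    uv = uv[:, :2] / uv[:, 2:3]                      # pixel coordinates
    return {"timestamp": camera["timestamp"],
            "points": cam_pts[in_front],
            "pixels": uv,
            "image": camera["image"]}

if __name__ == "__main__":
    # Toy usage with fabricated frames and a near-identity calibration.
    K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])
    T = make_extrinsics(np.eye(3), np.array([0.0, 0.0, 0.1]))
    lidar_frames = [{"timestamp": t, "points": np.random.rand(100, 3) * 10}
                    for t in (0.00, 0.10)]
    camera_frames = [{"timestamp": t, "image": np.zeros((720, 1280, 3))}
                     for t in (0.01, 0.11)]
    fused = [fuse(p, T, K) for p in synchronize_frames(lidar_frames, camera_frames)]
    print(len(fused), "fused frames")                # candidate training examples
```

In this sketch, synchronized pairs are formed purely from per-frame timestamp metadata, one plausible reading of the metadata association named in the title; the resulting fused records, which carry both geometry and image data under the calibrated extrinsics, could then serve as inputs for training a downstream model or classifier.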