CPC B60W 60/0027 (2020.02) [G01S 7/4808 (2013.01); G01S 17/58 (2013.01); G01S 17/66 (2013.01); G01S 17/931 (2020.01); G05D 1/0088 (2013.01); G05D 1/0214 (2013.01); G06N 20/00 (2019.01); B60W 2420/408 (2024.01); B60W 2554/4026 (2020.02); B60W 2554/4029 (2020.02); B60W 2554/4042 (2020.02); B60W 2554/4043 (2020.02); B60W 2554/4044 (2020.02)] | 20 Claims |
1. A computer-implemented method, comprising:
obtaining sensor data comprising a plurality of sensor data points descriptive of an environment of an autonomous vehicle;
generating, based on the sensor data, primary perception data by a primary perception system, the primary perception data representing a plurality of classifiable objects and a plurality of paths representing tracked motion of the plurality of classifiable objects, wherein each classifiable object is classified by the primary perception system as a predefined class of a plurality of predefined classes of objects;
generating, by a secondary perception system that is different from the primary perception system, one or more sensor data point clusters and secondary path data based on the plurality of sensor data points, the secondary path data representing tracked motion of the one or more sensor data point clusters; and
determining fused perception data representing the tracked motion of the plurality of classifiable objects and the tracked motion of one or more unclassifiable objects, wherein the one or more unclassifiable objects are identified by deduplicating the plurality of classifiable objects with the one or more sensor data point clusters.
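The fusion and deduplication step recited in claim 1 can be sketched as follows. All names here are illustrative assumptions, not from the patent: the patent does not specify how duplicates are detected, so this sketch uses a simple centroid-distance heuristic (a cluster whose centroid falls near a classified object is treated as a duplicate of it; the remaining clusters are the unclassifiable objects whose tracked motion is kept in the fused output).

```python
from dataclasses import dataclass

@dataclass
class Track:
    # Tracked motion as a sequence of (x, y) positions over time.
    positions: list

@dataclass
class ClassifiedObject:
    # Output of the primary perception system: a predefined class label,
    # a rough footprint (center + radius), and a motion track.
    label: str
    center: tuple
    radius: float
    track: Track

@dataclass
class Cluster:
    # Output of the secondary perception system: raw sensor data points
    # grouped into a cluster, plus that cluster's motion track.
    points: list
    track: Track

    def centroid(self):
        xs = [p[0] for p in self.points]
        ys = [p[1] for p in self.points]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

def deduplicate(classified, clusters, match_dist=2.0):
    """Drop clusters that coincide with an already-classified object.

    A cluster is a duplicate if its centroid lies within `match_dist` of a
    classified object's footprint (an assumed heuristic; the patent leaves
    the deduplication criterion unspecified). Survivors are treated as
    unclassifiable objects.
    """
    unclassifiable = []
    for c in clusters:
        cx, cy = c.centroid()
        is_dup = any(
            (cx - o.center[0]) ** 2 + (cy - o.center[1]) ** 2
            <= (o.radius + match_dist) ** 2
            for o in classified
        )
        if not is_dup:
            unclassifiable.append(c)
    return unclassifiable

def fuse(classified, clusters):
    # Fused perception data: tracked motion of classifiable objects plus
    # tracked motion of the deduplicated, unclassifiable clusters.
    unclassifiable = deduplicate(classified, clusters)
    return {
        "classified_tracks": [(o.label, o.track.positions) for o in classified],
        "unclassifiable_tracks": [c.track.positions for c in unclassifiable],
    }

# Example: one classified pedestrian, one redundant cluster near it,
# and one far-away cluster that survives deduplication.
ped = ClassifiedObject("pedestrian", (0.0, 0.0), 0.5, Track([(0.0, 0.0), (0.1, 0.0)]))
dup_cluster = Cluster([(0.2, 0.1), (-0.1, 0.2)], Track([(0.05, 0.15)]))
far_cluster = Cluster([(10.0, 10.0), (10.5, 10.2)], Track([(10.25, 10.1)]))
fused = fuse([ped], [dup_cluster, far_cluster])
```

In this sketch the near cluster is discarded as a duplicate of the pedestrian, while the far cluster, matching no classified object, is carried through as an unclassifiable object with its own track.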