US 12,263,849 B2
Data fusion and analysis engine for vehicle sensors
LuAn Tang, Pennington, NJ (US); Yuncong Chen, Plainsboro, NJ (US); Wei Cheng, Princeton Junction, NJ (US); Zhengzhang Chen, Princeton Junction, NJ (US); Haifeng Chen, West Windsor, NJ (US); Yuji Kobayashi, Tokyo (JP); and Yuxiang Ren, Tallahassee, FL (US)
Assigned to NEC Corporation, Tokyo (JP)
Filed by NEC Laboratories America, Inc., Princeton, NJ (US); and NEC Corporation, Tokyo (JP)
Filed on Oct. 6, 2022, as Appl. No. 17/961,169.
Claims priority of provisional application 63/253,164, filed on Oct. 7, 2021.
Prior Publication US 2023/0112441 A1, Apr. 13, 2023
Int. Cl. B60W 40/09 (2012.01); G06F 18/25 (2023.01); G06N 3/084 (2023.01)
CPC B60W 40/09 (2013.01) [G06F 18/25 (2023.01); B60W 2420/403 (2013.01); B60W 2540/30 (2013.01); G06N 3/084 (2013.01)] 17 Claims
OG exemplary drawing
 
1. A method for data fusion and analysis of vehicle sensor data, comprising:
receiving a multiple modality input data stream from a plurality of vehicle sensors;
determining one or more latent features by extracting one or more modality-specific features from the input data stream;
aligning a distribution of the latent features of different modalities by feature-level data fusion;
determining classification probabilities for one or more of the latent features using a fused modality scene classifier;
training a tree-organized neural network to determine path probabilities and driving pattern judgments, the tree-organized neural network comprising a soft tree model and a hard decision leaf, by measuring a similarity between latent features using an inner product after normalization;
issuing one or more driving pattern judgments based on a probability of possible driving patterns derived from the one or more modality-specific features to a pattern analyzer to determine a driving pattern output; and
controlling an operation of an autonomous vehicle based on the driving pattern output.
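The following sketch is illustrative only and is not the patented implementation: it walks through the claimed pipeline steps of modality-specific feature extraction, feature-level fusion with a simple distribution alignment, and a soft tree whose internal nodes route by an inner-product similarity after normalization, ending in a hard decision at the leaf with the highest path probability. All function names, prototype vectors, and the toy sensor data are hypothetical stand-ins introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(streams, projections):
    """Modality-specific feature extraction: project each raw sensor
    stream into a common latent dimension (stand-in for learned encoders)."""
    return [s @ W for s, W in zip(streams, projections)]

def fuse(latents):
    """Feature-level fusion: standardize each modality latent so their
    distributions are roughly aligned, then average them."""
    aligned = [(z - z.mean()) / (z.std() + 1e-8) for z in latents]
    return np.mean(aligned, axis=0)

def cos_sim(a, b):
    """Inner product after L2 normalization (the similarity used for routing)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def soft_tree_judgment(fused, prototypes, leaf_patterns):
    """Soft tree with a hard decision leaf: each internal node routes left
    with probability sigmoid(similarity to its prototype); the leaf with the
    highest accumulated path probability gives the driving-pattern judgment."""
    depth = int(np.log2(len(leaf_patterns)))      # complete binary tree
    leaf_probs = np.ones(len(leaf_patterns))
    for leaf in range(len(leaf_patterns)):
        node = 0                                   # start at the root
        for d in range(depth):
            p_left = 1.0 / (1.0 + np.exp(-cos_sim(fused, prototypes[node])))
            go_left = ((leaf >> (depth - 1 - d)) & 1) == 0
            leaf_probs[leaf] *= p_left if go_left else (1.0 - p_left)
            node = 2 * node + (1 if go_left else 2)
    # hard decision at the leaf: the judgment is the argmax path probability
    return leaf_patterns[int(np.argmax(leaf_probs))], leaf_probs

# Toy example: two modalities (e.g., a camera embedding and CAN-bus features).
camera  = rng.normal(size=(8,))
can_bus = rng.normal(size=(6,))
W_cam, W_can = rng.normal(size=(8, 4)), rng.normal(size=(6, 4))

latents = extract_features([camera, can_bus], [W_cam, W_can])
fused   = fuse(latents)

prototypes = rng.normal(size=(3, 4))               # internal nodes of a depth-2 tree
patterns   = ["smooth", "aggressive", "distracted", "drowsy"]
judgment, path_probs = soft_tree_judgment(fused, prototypes, patterns)
print(judgment, np.round(path_probs, 3))
```

The routing here uses randomly initialized prototypes for brevity; in a trained system the prototypes and projections would be learned (e.g., by backpropagation, consistent with the G06N 3/084 classification), and the resulting driving-pattern judgment would feed the downstream vehicle control step.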