US 11,987,250 B2
Data fusion method and related device
Huan Yu, Wuhan (CN); Xiao Yang, Beijing (CN); and Yonggang Song, Beijing (CN)
Assigned to HUAWEI TECHNOLOGIES CO., LTD., Guangdong (CN)
Filed by HUAWEI TECHNOLOGIES CO., LTD., Guangdong (CN)
Filed on Sep. 15, 2020, as Appl. No. 17/021,911.
Application 17/021,911 is a continuation of application No. PCT/CN2019/078646, filed on Mar. 19, 2019.
Claims priority of application No. 201810232615.5 (CN), filed on Mar. 20, 2018.
Prior Publication US 2020/0409372 A1, Dec. 31, 2020
Int. Cl. B60W 40/04 (2006.01); G05D 1/00 (2006.01); G06F 18/22 (2023.01); G06F 18/25 (2023.01); G06V 10/80 (2022.01); G06V 20/56 (2022.01); G06V 20/54 (2022.01)
CPC B60W 40/04 (2013.01) [G05D 1/0212 (2013.01); G06F 18/22 (2023.01); G06F 18/25 (2023.01); G06V 10/803 (2022.01); G06V 20/56 (2022.01); G06V 20/54 (2022.01); G06V 2201/07 (2022.01)] 20 Claims
OG exemplary drawing
 
1. A data fusion method comprising:
obtaining vehicle sensing data, wherein the vehicle sensing data is obtained by a vehicle sensing apparatus by sensing a road environment in a first sensing range;
obtaining roadside sensing data, wherein the roadside sensing data is obtained by a roadside sensing apparatus by sensing the road environment in a second sensing range; and
fusing, using a processor, the vehicle sensing data and the roadside sensing data by using a fusion formula to obtain a first fusion result with a wider sensing range than the first sensing range and the second sensing range;
wherein the fusion formula is expressed as:
y=f(resultr,resultv)
wherein resultr is a roadside result set, the roadside result set being used to indicate the roadside sensing data, resultv is a vehicle result set, the vehicle result set being used to indicate the vehicle sensing data, y is the first fusion result, and the function f is used to obtain the first fusion result by mapping based on the roadside result set and the vehicle result set; and
wherein
y=f(resultr,resultv)=wr·resultr+wv·resultv,
wherein wr is a confidence factor of the roadside sensing apparatus, wr=(wr1, wr2, . . . , wrM), resultr=(roadside1, roadside2, . . . , roadsideM), M is a quantity of target objects in the second sensing range of the roadside sensing apparatus, wri is a confidence factor corresponding to a target object i in the second sensing range of the roadside sensing apparatus, roadsidei is a roadside result unit corresponding to the target object i in the second sensing range of the roadside sensing apparatus, i is a natural number, 0&lt;i≤M, wv is a confidence factor of the vehicle sensing apparatus, wv=(wv1, wv2, . . . , wvN), resultv=(vehicle1, vehicle2, . . . , vehicleN), N is a quantity of target objects in the first sensing range of the vehicle sensing apparatus, wvj is a confidence factor corresponding to a target object j in the first sensing range of the vehicle sensing apparatus, vehiclej is a vehicle result unit corresponding to the target object j in the first sensing range of the vehicle sensing apparatus, j is a natural number, and 0&lt;j≤N.
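The claimed fusion can be illustrated as a confidence-weighted combination of the two result sets. The sketch below is a minimal, hypothetical rendering of the formula only: the `match` argument (pairing a roadside target i with a vehicle target j when both apparatuses sense the same object) and the per-pair weight normalization are illustrative assumptions, not limitations recited in the claim. Targets sensed by only one apparatus are carried through unweighted, which is what yields a fusion result covering a wider range than either sensing range alone.

```python
def fuse(result_r, w_r, result_v, w_v, match):
    """Confidence-weighted fusion of roadside and vehicle result sets.

    result_r : list of M roadside result units (here, tuples of estimates)
    w_r      : list of M confidence factors wri, one per roadside unit
    result_v : list of N vehicle result units
    w_v      : list of N confidence factors wvj, one per vehicle unit
    match    : dict {i: j} pairing roadside target i with vehicle target j
               when both apparatuses sense the same object (assumed helper,
               not part of the claim)
    Returns the first fusion result y, covering the union of both ranges.
    """
    y = []
    used_v = set()
    for i, roadside in enumerate(result_r):
        j = match.get(i)
        if j is None:
            # Target seen only by the roadside apparatus: pass it through.
            y.append(roadside)
        else:
            # Target seen by both: combine estimates, weighting each by its
            # confidence factor and normalizing so the scale is preserved.
            used_v.add(j)
            wi, wj = w_r[i], w_v[j]
            s = wi + wj
            y.append(tuple((wi * a + wj * b) / s
                           for a, b in zip(roadside, result_v[j])))
    for j, vehicle in enumerate(result_v):
        if j not in used_v:
            # Target seen only by the vehicle apparatus: pass it through.
            y.append(vehicle)
    return y
```

For example, with one target seen by both apparatuses and one seen only by the roadside apparatus, the fused set contains both targets, so its effective sensing range exceeds that of the vehicle apparatus alone.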