US 12,266,269 B2
System and method for fusing asynchronous sensor tracks in a track fusion application
Joshua Yohane Sakamaki, Provo, UT (US); and Matthew Elliott Argyle, Lindon, UT (US)
Assigned to Fortem Technologies, Inc., Pleasant Grove, UT (US)
Filed by Fortem Technologies, Inc., Pleasant Grove, UT (US)
Filed on Feb. 20, 2023, as Appl. No. 18/171,592.
Application 18/171,592 is a continuation-in-part of application No. 17/699,750, filed on Mar. 21, 2022, granted, now 11,587,445.
Application 17/699,750 is a continuation of application No. 16/368,432, filed on Mar. 28, 2019, granted, now 11,282,397, issued on Mar. 22, 2022.
Prior Publication US 2023/0215277 A1, Jul. 6, 2023
Int. Cl. G01S 13/70 (2006.01); G01S 7/41 (2006.01); G01S 13/72 (2006.01); G01S 13/86 (2006.01); G08G 5/00 (2006.01)
CPC G08G 5/0008 (2013.01) [G01S 7/41 (2013.01); G01S 13/72 (2013.01); G01S 13/86 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
generating, via a first sensor, a first group of output tracks associated with a motion of a first target object;
generating, via a second sensor, a second group of output tracks associated with the motion of a second target object;
analyzing, via a track analysis module, the first group of output tracks and the second group of output tracks to determine whether the first target object and the second target object are a same object to yield a determination; and
when the determination indicates that the first target object and the second target object are the same object, presenting a graphical user interface on a computing device that enables a user to select whether to display on the graphical user interface: (1) a single track from the first group of output tracks or the second group of output tracks and (2) a fused group of tracks selected from the first group of output tracks or the second group of output tracks.
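The claimed method can be sketched as a short program: two sensors each produce a group of output tracks, a track analysis module decides whether the two tracked targets are the same object, and when they are, the user selects between displaying a single track or a fused group of tracks. This is a minimal illustrative sketch only, not the patented implementation; the function names, the simple distance-gate association test, and the position-averaging fusion are all hypothetical stand-ins for the track analysis module and fusion logic recited in the claim.

```python
from dataclasses import dataclass
import math

@dataclass
class Track:
    x: float  # position estimate (arbitrary units)
    y: float
    t: float  # timestamp

def same_object(group_a, group_b, gate=5.0):
    """Hypothetical track analysis step: compare the latest track in each
    group and yield a determination that both targets are the same object
    when their positions fall within a distance gate. Real trackers use
    statistical gating with track covariances rather than a fixed radius."""
    a, b = group_a[-1], group_b[-1]
    return math.hypot(a.x - b.x, a.y - b.y) <= gate

def fuse(group_a, group_b):
    """Naive fusion for illustration: average paired track positions and
    keep the later timestamp of each pair."""
    return [Track((a.x + b.x) / 2, (a.y + b.y) / 2, max(a.t, b.t))
            for a, b in zip(group_a, group_b)]

def select_display(group_a, group_b, show_fused):
    """Mimics the claimed user selection on the graphical user interface:
    when the targets are the same object, display either (1) a single
    track from one group or (2) the fused group of tracks."""
    if same_object(group_a, group_b):
        return fuse(group_a, group_b) if show_fused else [group_a[-1]]
    return group_a + group_b  # distinct objects: display both groups
```

A short usage example under these assumptions: two track groups whose latest positions lie well inside the gate are declared the same object, so the display is either the fused group or a single selected track.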