US 12,177,572 B2
Object tracking by event camera
Hadar Cohen-Duwek, Tel-Aviv (IL); and Elishai Ezra Tsur, Jerusalem (IL)
Assigned to The Open University, Raanana (IL)
Filed by The Open University, Raanana (IL)
Filed on Jul. 21, 2022, as Appl. No. 17/814,036.
Claims priority of provisional application 63/224,107, filed on Jul. 21, 2021.
Prior Publication US 2023/0021408 A1, Jan. 26, 2023
Int. Cl. H04N 23/695 (2023.01); B25J 9/16 (2006.01); G06T 1/00 (2006.01); G06T 7/20 (2017.01); G06T 7/50 (2017.01); H04N 23/54 (2023.01); H04N 23/80 (2023.01)
CPC H04N 23/695 (2023.01) [B25J 9/1664 (2013.01); G06T 1/0014 (2013.01); G06T 7/20 (2013.01); G06T 7/50 (2017.01); H04N 23/54 (2023.01); H04N 23/80 (2023.01); G06T 2207/20084 (2013.01)] 22 Claims
OG exemplary drawing
 
1. A tracking system comprising:
one or more dynamic vision sensors configured to generate luminance-transition events associated with a target object;
a depth estimation unit configured to employ a Laplacian pyramid-based monocular depth estimation neural network to generate, based on the luminance-transition events generated by said one or more dynamic vision sensors, depth data/signals indicative of a distance of said target object from said one or more dynamic vision sensors;
a spatial tracking unit configured to employ channel and spatial reliability tracking techniques to generate, based on the luminance-transition events generated by said one or more dynamic vision sensors, spatial tracking signals/data indicative of transitions of said target object in a scene of said target object; and
an error correction unit configured to process the depth and spatial tracking data/signals and to generate, based thereon, error-correcting data/signals for the tracking of said target object by said one or more dynamic vision sensors.
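
Claim 1 reads as a four-stage event-camera tracking pipeline: accumulation of luminance-transition events, monocular depth estimation, channel and spatial reliability (CSRT) tracking, and error correction. The Python sketch below is illustrative only and is not taken from the patent: the event-to-frame accumulation scheme, the depth_net placeholder standing in for the Laplacian pyramid-based monocular depth network, the use of OpenCV's TrackerCSRT as the CSRT implementation, and the proportional form of the error signal are all assumptions made here for illustration.

    # Illustrative sketch only; not the patented implementation.
    import numpy as np
    import cv2


    def events_to_frame(events, height, width):
        # Accumulate luminance-transition events (x, y, polarity, t) into an
        # 8-bit frame centred at mid-grey: ON events brighten, OFF events darken.
        frame = np.full((height, width), 128, dtype=np.int16)
        for x, y, polarity, _t in events:
            frame[y, x] += 20 if polarity > 0 else -20
        gray = np.clip(frame, 0, 255).astype(np.uint8)
        return cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)


    class TrackingSystem:
        def __init__(self, depth_net, height, width):
            # depth_net is an assumed placeholder: a callable mapping a BGR frame
            # to a per-pixel depth map (standing in for the Laplacian pyramid-based
            # monocular depth estimation network named in the claim).
            self.depth_net = depth_net
            self.height, self.width = height, width
            self.tracker = cv2.TrackerCSRT_create()  # channel and spatial reliability tracker
            self.initialized = False

        def start(self, events, bbox):
            # bbox is (x, y, w, h) around the target in the first accumulated frame.
            frame = events_to_frame(events, self.height, self.width)
            self.tracker.init(frame, bbox)
            self.initialized = True

        def step(self, events):
            # One tracking cycle: spatial tracking, depth estimate, error-correcting signal.
            frame = events_to_frame(events, self.height, self.width)
            ok, bbox = self.tracker.update(frame)      # spatial tracking unit
            if not ok:
                return None
            x, y, w, h = [int(v) for v in bbox]
            cx, cy = x + w / 2.0, y + h / 2.0
            depth_map = self.depth_net(frame)          # depth estimation unit (placeholder)
            distance = float(np.median(depth_map[y:y + h, x:x + w]))
            # Error correction unit (simplified here): normalized offset of the
            # target from the image center, usable as a pan/tilt correction,
            # plus the estimated range to the target.
            return ((cx - self.width / 2.0) / self.width,
                    (cy - self.height / 2.0) / self.height,
                    distance)

In the claimed system the error-correcting data/signals feed back into the tracking performed by the dynamic vision sensors (consistent with the H04N 23/695 pan/tilt/zoom control class); the sketch reduces that feedback to a normalized image-plane offset plus an estimated range.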