US 11,677,910 B2
Computer implemented system and method for high performance visual tracking
Supratik Mukhopadhyay, Baton Rouge, LA (US); Saikat Basu, Baton Rouge, LA (US); Malcolm Stagg, Baton Rouge, LA (US); Robert Dibiano, Baton Rouge, LA (US); Manohar Karki, Baton Rouge, LA (US); and Jerry Weltman, Baton Rouge, LA (US)
Assigned to Board of Supervisors of Louisiana State University and Agricultural and Mechanical College, Baton Rouge, LA (US)
Filed by Board of Supervisors of Louisiana State University and Agricultural and Mechanical College, Baton Rouge, LA (US)
Filed on Aug. 24, 2020, as Appl. No. 17/001,049.
Application 17/001,049 is a continuation of application No. 14/047,833, filed on Oct. 7, 2013, granted, now Pat. No. 10,757,369.
Claims priority of provisional application 61/798,182, filed on Mar. 15, 2013.
Claims priority of provisional application 61/728,126, filed on Nov. 19, 2012.
Claims priority of provisional application 61/711,102, filed on Oct. 8, 2012.
Prior Publication US 2020/0389625 A1, Dec. 10, 2020
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 7/18 (2006.01)
CPC H04N 7/18 (2013.01) 20 Claims
OG exemplary drawing
 
1. A computer system for identifying an activity, event, or chain of events in a video system, the computer system comprising:
a computer containing a plurality of software modules; and
a video stream input;
wherein the plurality of software modules includes
a video capturing module;
an object tracking module;
a track identification module;
a periodic motion tracking module;
a periodic motion identifying module; and
a reasoning module;
wherein
the video capturing module receives images from a video stream and outputs image data directly to both the object tracking module and the periodic motion tracking module;
the object tracking module receives the image data from the video capturing module and produces a tracked path of an object in the video stream based on the image data;
the track identification module receives the tracked path from the object tracking module and performs a comparison of the tracked path to a model;
the periodic motion tracking module receives the image data from the video capturing module and creates and outputs a data structure representing motion;
the periodic motion identifying module identifies periodic motion and non-periodic motion within the image data based on the output from the periodic motion tracking module; and
the reasoning module receives both the comparison from the track identification module and data regarding the identification of periodic motion and non-periodic motion from the periodic motion identifying module, and based thereon the reasoning module identifies motion characteristics of the activity, event, or chain of events.
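For illustration only, the following is a minimal Python sketch of the module pipeline recited in claim 1: a video capturing module feeding image data to both an object tracking module and a periodic motion tracking module, whose downstream identification modules feed a reasoning module. All class names, method names, data structures, and thresholds below are hypothetical placeholders chosen for this sketch; they are not the patented implementation.

"""Illustrative sketch (not from the patent) of the claim 1 module pipeline."""
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class TrackedPath:
    # Tracked path of an object: a sequence of (x, y) positions.
    points: List[Tuple[float, float]] = field(default_factory=list)


class VideoCapturingModule:
    def __init__(self, frames):
        self.frames = frames  # stand-in for the video stream input

    def read(self):
        for frame in self.frames:
            yield frame  # image data sent to both tracking modules


class ObjectTrackingModule:
    def track(self, frames) -> TrackedPath:
        # Placeholder: record a (hypothetical) object centroid per frame.
        return TrackedPath(points=[(f["cx"], f["cy"]) for f in frames])


class TrackIdentificationModule:
    def compare(self, path: TrackedPath, model: TrackedPath) -> float:
        # Placeholder comparison of the tracked path to a model:
        # mean point-to-point distance (lower means a closer match).
        pairs = zip(path.points, model.points)
        dists = [((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                 for (x1, y1), (x2, y2) in pairs]
        return sum(dists) / max(len(dists), 1)


class PeriodicMotionTrackingModule:
    def build(self, frames) -> List[float]:
        # Data structure representing motion: here, a 1-D per-frame signal.
        return [f["motion"] for f in frames]


class PeriodicMotionIdentifyingModule:
    def identify(self, signal: List[float]) -> bool:
        # Crude periodicity check: count sign changes of the detrended signal.
        mean = sum(signal) / max(len(signal), 1)
        flips = sum((a - mean) * (b - mean) < 0
                    for a, b in zip(signal, signal[1:]))
        return flips >= 2  # True -> periodic motion present (illustrative)


class ReasoningModule:
    def identify_activity(self, track_score: float, periodic: bool) -> str:
        # Combine the track comparison and the periodicity decision.
        if periodic and track_score < 1.0:
            return "matches modeled periodic activity"
        return "no modeled activity identified"


if __name__ == "__main__":
    # Synthetic frames: centroid drifts in x, motion signal alternates.
    frames = [{"cx": i, "cy": 0.0, "motion": (-1) ** i} for i in range(8)]
    images = list(VideoCapturingModule(frames).read())

    path = ObjectTrackingModule().track(images)
    score = TrackIdentificationModule().compare(path, TrackedPath(path.points))
    signal = PeriodicMotionTrackingModule().build(images)
    periodic = PeriodicMotionIdentifyingModule().identify(signal)

    print(ReasoningModule().identify_activity(score, periodic))

Running the sketch prints "matches modeled periodic activity", showing how the reasoning step depends on both inputs; the actual tracking, periodicity, and reasoning techniques claimed in the patent are not reproduced here.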