US 12,423,834 B1
Systems and methods for monitoring user performance in launching an object at a sporting event
Alan W. Marty, Menlo Park, CA (US); and John Carter, Elkmont, AL (US)
Assigned to Pillar Vision, Inc., Menlo Park, CA (US)
Filed by Pillar Vision, Inc., Menlo Park, CA (US)
Filed on Apr. 23, 2025, as Appl. No. 19/187,378.
Application 19/187,378 is a continuation of application No. 18/217,264, filed on Jun. 30, 2023, granted, now 12,288,344.
Application 18/217,264 is a continuation of application No. 16/938,601, filed on Jul. 24, 2020, granted, now 11,715,214, issued on Aug. 1, 2023.
Application 16/938,601 is a continuation of application No. 16/777,838, filed on Jan. 30, 2020, granted, now 10,762,642, issued on Sep. 1, 2020.
Application 16/777,838 is a continuation of application No. 15/624,527, filed on Jun. 15, 2017, abandoned.
Application 15/624,527 is a continuation of application No. 14/579,916, filed on Dec. 22, 2014, granted, now 9,697,617, issued on Jul. 4, 2017.
Application 14/579,916 is a continuation of application No. 13/921,162, filed on Jun. 18, 2013, granted, now 8,948,457, issued on Feb. 3, 2015.
Claims priority of provisional application 61/808,061, filed on Apr. 3, 2013.
Int. Cl. G06T 7/246 (2017.01); G06K 9/00 (2022.01); G06T 7/80 (2017.01); H04N 23/51 (2023.01)
CPC G06T 7/246 (2017.01) [G06T 7/80 (2017.01); H04N 23/51 (2023.01); G06T 2207/10016 (2013.01); G06T 2207/30224 (2013.01); G06T 2207/30241 (2013.01)] 18 Claims
OG exemplary drawing
 
10. A system, comprising:
a camera positioned on glasses to be worn by a player participating in a sporting event, the camera configured to capture a plurality of two-dimensional (2-D) images of an object launched by the player; and
at least one processor programmed with instructions that, when executed by the at least one processor, cause the at least one processor to:
identify the launched object in the plurality of 2-D images;
determine an orientation of the camera based on at least one accelerometer and at least one of the plurality of 2-D images;
track the launched object based on the plurality of 2-D images by determining positions of the launched object in three-dimensional (3-D) space based on the determined orientation;
identify a stationary object in the plurality of 2-D images;
determine a position of the launched object at a point along the trajectory relative to the stationary object;
provide an output based on the determined position of the launched object at the point along the trajectory relative to the stationary object; and
render a simulation of the trajectory based on tracking of the launched object by the at least one processor.
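The exemplary claim recites a concrete processing pipeline: estimate camera orientation from an accelerometer, lift 2-D detections of the launched object into 3-D, compare the trajectory to a stationary reference object, and report the result. The following is a minimal Python sketch of how such a pipeline could be arranged, assuming a pinhole camera with known intrinsics, per-frame depth estimates, and a gravity-based orientation estimate; the function names (estimate_orientation, backproject, track_launch) and all numeric values are hypothetical illustrations, not the patentee's implementation.

import numpy as np

def estimate_orientation(accel):
    # Roll and pitch of the camera from a single accelerometer sample,
    # treating the sample as the gravity direction in the camera frame.
    # Yaw is not observable from gravity alone and is left at zero.
    g = np.asarray(accel, dtype=float)
    g = g / np.linalg.norm(g)
    pitch = np.arcsin(np.clip(-g[0], -1.0, 1.0))
    roll = np.arctan2(g[1], g[2])
    return roll, pitch

def rotation_cam_to_world(roll, pitch):
    # Camera-to-world rotation built from roll and pitch only.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    return Ry @ Rx

def backproject(pixel, depth, K, R, cam_pos):
    # Lift a 2-D detection (u, v) at an assumed depth to a 3-D world point
    # using camera intrinsics K and the estimated orientation R.
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    return R @ (ray * depth) + cam_pos

def track_launch(detections, depths, accel, K, cam_pos, reference_pos):
    # Convert per-frame 2-D detections of the launched object into 3-D
    # positions, then report its offset from the stationary reference
    # object at the point of closest approach along the trajectory.
    R = rotation_cam_to_world(*estimate_orientation(accel))
    trajectory = np.array([backproject(px, d, K, R, cam_pos)
                           for px, d in zip(detections, depths)])
    dists = np.linalg.norm(trajectory - np.asarray(reference_pos), axis=1)
    nearest = int(np.argmin(dists))
    return trajectory, trajectory[nearest] - np.asarray(reference_pos)

if __name__ == "__main__":
    # Illustrative numbers only: a 640x480 camera, a ball detected in five
    # frames, per-frame depths assumed known (e.g., from apparent ball
    # size), and a stationary hoop as the reference object.
    K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
    detections = [(320, 400), (330, 300), (340, 230), (350, 190), (360, 210)]
    depths = [4.0, 4.2, 4.5, 4.8, 5.0]
    accel = (0.0, 9.5, 1.5)   # gravity mostly along the camera's y axis
    trajectory, offset = track_launch(detections, depths, accel, K,
                                      cam_pos=np.zeros(3),
                                      reference_pos=(0.5, -3.05, 5.0))
    print("Offset from hoop at closest approach (m):", np.round(offset, 2))

One design note grounded in the claim language: an accelerometer alone resolves only roll and pitch, so a full system in the spirit of the claim would refine the orientation estimate against the identified stationary object in the 2-D images (the claim's "based on at least one accelerometer and at least one of the plurality of 2-D images"), and the returned trajectory array is what a renderer would consume for the claimed trajectory simulation.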