US 11,861,892 B2
Object tracking by an unmanned aerial vehicle using visual sensors
Saumitro Dasgupta, Redwood City, CA (US); Hayk Martirosyan, San Francisco, CA (US); Hema Koppula, Palo Alto, CA (US); Alex Kendall, Cambridge (GB); Austin Stone, San Francisco, CA (US); Matthew Donahoe, Redwood City, CA (US); Abraham Galton Bachrach, Emerald Hills, CA (US); and Adam Parker Bry, Redwood City, CA (US)
Assigned to Skydio, Inc., San Mateo, CA (US)
Filed by Skydio, Inc., Redwood City, CA (US)
Filed on Apr. 4, 2022, as Appl. No. 17/712,613.
Application 17/712,613 is a continuation of application No. 15/827,945, filed on Nov. 30, 2017, granted, now Pat. No. 11,295,458.
Claims priority of provisional application 62/428,972, filed on Dec. 1, 2016.
Prior Publication US 2022/0309687 A1, Sep. 29, 2022
This patent is subject to a terminal disclaimer.
Int. Cl. G06V 20/13 (2022.01); G06T 7/292 (2017.01); B64C 39/02 (2023.01); G06T 7/73 (2017.01); G06T 7/10 (2017.01); G06T 3/60 (2006.01); G05D 1/00 (2006.01); H04N 13/243 (2018.01); H04N 13/239 (2018.01); G06T 7/579 (2017.01); H04N 13/282 (2018.01); G06T 7/11 (2017.01); G06T 7/20 (2017.01); G06V 30/262 (2022.01); G06F 18/2431 (2023.01); G06V 10/82 (2022.01); G06V 20/17 (2022.01); H04N 13/00 (2018.01); G06N 3/045 (2023.01); B64U 101/00 (2023.01); B64U 101/30 (2023.01)
CPC G06V 20/13 (2022.01) [B64C 39/024 (2013.01); G05D 1/0011 (2013.01); G05D 1/0094 (2013.01); G06F 18/2431 (2023.01); G06T 3/60 (2013.01); G06T 7/10 (2017.01); G06T 7/11 (2017.01); G06T 7/20 (2013.01); G06T 7/292 (2017.01); G06T 7/579 (2017.01); G06T 7/75 (2017.01); G06V 10/82 (2022.01); G06V 20/17 (2022.01); G06V 30/274 (2022.01); H04N 13/239 (2018.05); H04N 13/243 (2018.05); H04N 13/282 (2018.05); B64U 2101/00 (2023.01); B64U 2101/30 (2023.01); B64U 2201/10 (2023.01); G06N 3/045 (2023.01); G06T 2207/10012 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/10032 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/20088 (2013.01); G06T 2207/30196 (2013.01); G06T 2207/30241 (2013.01); H04N 2013/0081 (2013.01); H04N 2013/0085 (2013.01); H04N 2013/0092 (2013.01)] 20 Claims
OG exemplary drawing
 
1. An autonomous aerial vehicle comprising:
an image capture system configured to capture images of a physical environment while the autonomous aerial vehicle is in flight;
a propulsion system configured to maneuver the autonomous aerial vehicle through the physical environment;
a visual navigation system configured to:
process the captured images to extract semantic information relating to one or more physical objects in the physical environment;
process the semantic information to identify a particular class of physical objects relating to a particular one of the one or more physical objects;
identify a motion model associated with the particular class of physical objects relating to the particular one of the one or more physical objects;
determine, based on the captured images and the motion model, a predicted trajectory associated with the particular one of the one or more physical objects through three-dimensional (3D) space of the physical environment;
track an actual trajectory of the particular one of the one or more physical objects through the 3D space of the physical environment based, at least in part, on the predicted trajectory;
generate and continually update a planned trajectory for the autonomous aerial vehicle through the physical environment that follows the actual trajectory of the particular one of the one or more physical objects; and
a flight control system configured to generate control commands configured to cause the propulsion system of the autonomous aerial vehicle to maneuver along the planned trajectory.
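
The visual navigation elements recited above (extracting semantic information, identifying an object class, selecting a class-specific motion model, predicting a 3D trajectory, and tracking the actual trajectory) can be illustrated with a minimal sketch. This is not the patented implementation: the class names, the constant-velocity motion models, and the detection structure are assumptions introduced only to show the shape of such a pipeline.

```python
# Hypothetical sketch of the claimed visual-navigation pipeline: semantic
# detection -> object class -> class-specific motion model -> predicted 3D
# trajectory -> track update. Names and models are illustrative assumptions,
# not the patent's implementation.
from dataclasses import dataclass, field
import numpy as np


@dataclass
class Detection:
    """A semantic detection extracted from the captured images (assumed)."""
    object_class: str          # e.g. "person", "car", "bicycle"
    position_3d: np.ndarray    # triangulated 3D position in the world frame


@dataclass
class ConstantVelocityModel:
    """Simple class-specific motion model: constant velocity with a
    per-class speed cap (an assumption standing in for learned models)."""
    max_speed: float           # m/s

    def predict(self, position, velocity, dt):
        speed = np.linalg.norm(velocity)
        if speed > self.max_speed:
            velocity = velocity * (self.max_speed / speed)
        return position + velocity * dt


# Class-conditioned motion models keyed by the identified object class.
MOTION_MODELS = {
    "person": ConstantVelocityModel(max_speed=3.0),
    "car": ConstantVelocityModel(max_speed=20.0),
}


@dataclass
class Track:
    """Actual trajectory of one tracked object through 3D space."""
    object_class: str
    positions: list = field(default_factory=list)
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))

    def predicted_trajectory(self, horizon=2.0, dt=0.1):
        """Roll the class-specific motion model forward to predict a path."""
        model = MOTION_MODELS.get(self.object_class,
                                  ConstantVelocityModel(max_speed=5.0))
        path, pos = [], self.positions[-1]
        for _ in range(int(horizon / dt)):
            pos = model.predict(pos, self.velocity, dt)
            path.append(pos)
        return path

    def update(self, detection, dt=0.1):
        """Fold a new detection into the actual trajectory estimate."""
        if self.positions:
            self.velocity = (detection.position_3d - self.positions[-1]) / dt
        self.positions.append(detection.position_3d)


if __name__ == "__main__":
    track = Track(object_class="person")
    # Two synthetic detections standing in for triangulated image detections.
    track.update(Detection("person", np.array([0.0, 0.0, 0.0])))
    track.update(Detection("person", np.array([0.1, 0.0, 0.0])))
    print(track.predicted_trajectory(horizon=0.5)[:3])
```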
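The final two claim elements, continually updating a planned trajectory that follows the tracked object and generating control commands for the propulsion system, can be sketched in the same spirit. The standoff offset, proportional gain, and command format below are assumptions, not the claimed flight control system.

```python
# Hypothetical follow-and-control sketch for the last two claim elements:
# continually re-plan a trajectory that trails the tracked object, then turn
# the next waypoint into a velocity command for the propulsion system.
# Offsets, gains, and the command format are illustrative assumptions.
import numpy as np


def plan_follow_trajectory(target_path, standoff=np.array([-3.0, 0.0, 2.0])):
    """Planned trajectory: the target's predicted path shifted by a standoff
    offset so the vehicle follows behind and above the object."""
    return [p + standoff for p in target_path]


def velocity_command(vehicle_position, planned_trajectory, gain=0.8):
    """Proportional controller toward the next waypoint; a real flight
    control system would output attitude/thrust setpoints instead."""
    error = planned_trajectory[0] - vehicle_position
    return gain * error   # commanded velocity vector, m/s


if __name__ == "__main__":
    predicted = [np.array([t, 0.0, 0.0]) for t in np.linspace(0.1, 1.0, 10)]
    planned = plan_follow_trajectory(predicted)
    print(velocity_command(np.array([-4.0, 0.0, 1.5]), planned))
```

In practice the re-planning loop would be re-run each time the track is updated, so the planned trajectory continually reflects the object's actual trajectory as the claim requires.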