US 12,367,670 B2
Object tracking by an unmanned aerial vehicle using visual sensors
Saumitro Dasgupta, Redwood City, CA (US); Hayk Martirosyan, San Francisco, CA (US); Hema Koppula, Palo Alto, CA (US); Alex Kendall, Cambridge (GB); Austin Stone, San Francisco, CA (US); Matthew Donahoe, Redwood City, CA (US); Abraham Galton Bachrach, Emerald Hills, CA (US); and Adam Parker Bry, Redwood City, CA (US)
Assigned to Skydio, Inc., San Mateo, CA (US)
Filed by Skydio, Inc., San Mateo, CA (US)
Filed on Dec. 29, 2023, as Appl. No. 18/400,113.
Application 18/400,113 is a continuation of application No. 17/712,613, filed on Apr. 4, 2022, granted, now Pat. No. 11,861,892.
Application 17/712,613 is a continuation of application No. 15/827,945, filed on Nov. 30, 2017, granted, now Pat. No. 11,295,458, issued on Apr. 5, 2022.
Claims priority of provisional application 62/428,972, filed on Dec. 1, 2016.
Prior Publication US 2024/0273894 A1, Aug. 15, 2024
This patent is subject to a terminal disclaimer.
Int. Cl. G06V 20/13 (2022.01); B64C 39/02 (2023.01); B64U 10/14 (2023.01); G05D 1/00 (2024.01); G06F 18/2431 (2023.01); G06N 3/045 (2023.01); G06T 3/60 (2024.01); G06T 7/10 (2017.01); G06T 7/11 (2017.01); G06T 7/20 (2017.01); G06T 7/292 (2017.01); G06T 7/579 (2017.01); G06T 7/73 (2017.01); G06V 10/82 (2022.01); G06V 20/17 (2022.01); G06V 30/262 (2022.01); H04N 13/00 (2018.01); H04N 13/239 (2018.01); H04N 13/243 (2018.01); H04N 13/282 (2018.01); B64U 101/00 (2023.01); B64U 101/30 (2023.01); B64U 101/31 (2023.01)
CPC G06V 20/13 (2022.01) [B64C 39/024 (2013.01); B64U 10/14 (2023.01); G05D 1/0011 (2013.01); G05D 1/0094 (2013.01); G06F 18/2431 (2023.01); G06T 3/60 (2013.01); G06T 7/10 (2017.01); G06T 7/11 (2017.01); G06T 7/20 (2013.01); G06T 7/292 (2017.01); G06T 7/579 (2017.01); G06T 7/75 (2017.01); G06V 10/82 (2022.01); G06V 20/17 (2022.01); G06V 30/274 (2022.01); H04N 13/239 (2018.05); H04N 13/243 (2018.05); H04N 13/282 (2018.05); B64U 2101/00 (2023.01); B64U 2101/30 (2023.01); B64U 2101/31 (2023.01); B64U 2201/10 (2023.01); G06N 3/045 (2023.01); G06T 2207/10012 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/10032 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/20088 (2013.01); G06T 2207/30196 (2013.01); G06T 2207/30241 (2013.01); H04N 2013/0081 (2013.01); H04N 2013/0085 (2013.01); H04N 2013/0092 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
receiving, by a computer system of an autonomous vehicle, images of a physical environment captured by one or more image capture devices coupled to the autonomous vehicle;
processing, by the computer system, the received images to:
detect an object in the physical environment associated with a particular class of objects; and
extract semantic information including information related to the detected object in the physical environment and information related to the physical environment itself;
predicting, by the computer system, a trajectory of the detected object in three-dimensional (3D) space of the physical environment based, at least in part, on the extracted semantic information; and
causing, by the computer system, the autonomous vehicle to track the object through the 3D space of the physical environment based, at least in part, on the predicted trajectory.
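
For orientation, the following is a minimal Python/NumPy sketch of the four-step pipeline recited in claim 1: detect an object in incoming frames, extract semantic information, predict a trajectory in 3D space, and derive a tracking setpoint for the vehicle. Every name here (detect_object, extract_semantics, predict_trajectory, track) is a hypothetical stand-in, not code disclosed by the patent; a deployed system would use trained vision models and a predictor conditioned on the extracted semantics, whereas this sketch substitutes a simple constant-velocity model to stay self-contained.

```python
# Illustrative sketch of the claim 1 pipeline. All functions are
# hypothetical placeholders, not the patentee's implementation.
import numpy as np

def detect_object(image: np.ndarray) -> dict:
    # Step 1 stand-in: a real system would run a trained detector
    # (e.g., a CNN) over the captured frame. Here we fabricate a
    # centered bounding box and a class label for illustration.
    h, w = image.shape[:2]
    return {"class": "person", "bbox": (w // 4, h // 4, w // 2, h // 2)}

def extract_semantics(image: np.ndarray, detection: dict) -> dict:
    # Step 2 stand-in: semantic information about the detected object
    # and about the physical environment itself.
    return {"object_class": detection["class"], "scene": "open_field"}

def predict_trajectory(positions, steps: int = 10, dt: float = 0.1):
    # Step 3 stand-in: constant-velocity extrapolation of observed 3D
    # positions. The claim ties prediction to the extracted semantics;
    # this simple model merely keeps the sketch runnable.
    positions = np.asarray(positions, dtype=float)   # (N, 3) observations
    velocity = (positions[-1] - positions[0]) / ((len(positions) - 1) * dt)
    horizon = np.arange(1, steps + 1)[:, None]       # (steps, 1)
    return positions[-1] + velocity * dt * horizon   # (steps, 3) future points

def track(vehicle_pos: np.ndarray, predicted: np.ndarray,
          standoff: float = 3.0) -> np.ndarray:
    # Step 4 stand-in: command a position offset from the first predicted
    # point so the vehicle follows the object at a fixed standoff distance.
    target = predicted[0]
    direction = target - vehicle_pos
    dist = np.linalg.norm(direction)
    if dist < 1e-6:
        return vehicle_pos
    return target - direction / dist * standoff

# Toy usage with a placeholder frame and fabricated 3D observations
# (in practice these would come from stereo or multi-view fusion).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
det = detect_object(frame)
sem = extract_semantics(frame, det)
observed = [[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [1.0, 0.0, 0.0]]
path = predict_trajectory(observed)
setpoint = track(np.array([-2.0, 0.0, 2.0]), path)
print(sem, setpoint)
```

The standoff-based setpoint is one plausible reading of "causing the autonomous vehicle to track the object"; the claim itself does not restrict how the predicted trajectory is converted into vehicle motion.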