US 11,940,795 B2
Performing 3D reconstruction via an unmanned aerial vehicle
Peter Henry, San Francisco, CA (US); Jack Zhu, San Francisco, CA (US); Brian Richman, San Francisco, CA (US); Harrison Zheng, Palo Alto, CA (US); Hayk Martirosyan, San Francisco, CA (US); Matthew Donahoe, Redwood City, CA (US); Abraham Bachrach, Redwood City, CA (US); Adam Bry, Redwood City, CA (US); Ryan David Kennedy, Redwood City, CA (US); Himel Mondal, Windsor (CA); and Quentin Allen Wah Yen Delepine, Cupertino, CA (US)
Assigned to SKYDIO, INC., Redwood City, CA (US)
Filed by SKYDIO, INC., Redwood City, CA (US)
Filed on Jan. 20, 2023, as Appl. No. 18/099,571.
Application 18/099,571 is a continuation of application No. 17/174,585, filed on Feb. 12, 2021, granted, now Pat. No. 11,573,544.
Claims priority of provisional application 63/050,860, filed on Jul. 12, 2020.
Claims priority of provisional application 62/976,231, filed on Feb. 13, 2020.
Prior Publication US 2023/0324911 A1, Oct. 12, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G01C 1/00 (2006.01); B64C 39/02 (2023.01); B64D 31/06 (2006.01); B64D 47/08 (2006.01); G05B 13/02 (2006.01); G05B 17/02 (2006.01); G05D 1/00 (2006.01); G06T 7/55 (2017.01); G06T 7/73 (2017.01); G06T 17/00 (2006.01); G06T 19/20 (2011.01); G06V 20/13 (2022.01); G06V 20/64 (2022.01); H04N 23/60 (2023.01); H04N 23/695 (2023.01); H04N 23/90 (2023.01); B64U 10/13 (2023.01); B64U 101/30 (2023.01)
CPC G05D 1/0094 (2013.01) [B64C 39/024 (2013.01); B64D 31/06 (2013.01); B64D 47/08 (2013.01); G05B 13/0265 (2013.01); G05B 17/02 (2013.01); G05D 1/0088 (2013.01); G05D 1/101 (2013.01); G06T 7/55 (2017.01); G06T 7/74 (2017.01); G06T 17/00 (2013.01); G06T 19/20 (2013.01); G06V 20/13 (2022.01); G06V 20/64 (2022.01); H04N 23/64 (2023.01); H04N 23/695 (2023.01); H04N 23/90 (2023.01); B64U 10/13 (2023.01); B64U 2101/30 (2023.01); G06T 2207/10032 (2013.01); G06T 2207/20221 (2013.01); G06T 2219/2004 (2013.01)] 20 Claims
OG exemplary drawing
 
1. An unmanned aerial vehicle (UAV) comprising:
one or more image sensors;
a propulsion mechanism; and
one or more processors configured by executable instructions to:
receive, from a computing device, an indication of a target and one or more scan parameters for scanning the target;
capture, with the one or more image sensors, while the UAV is in flight, a plurality of images of the target;
compare a first image of the plurality of images with a second image of the plurality of images to determine a difference between a current frame of reference position for the UAV and an estimate of an actual frame of reference position for the UAV; and
determine, based at least on the difference, and while the UAV is in flight, an update to a three-dimensional (3D) model of the target.
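
The following is a minimal, hypothetical sketch of one way the comparison-and-update steps recited in claim 1 could be carried out, not the patented implementation. It uses OpenCV ORB feature matching and an essential-matrix decomposition to estimate the pose difference between two captured frames, then re-expresses existing 3D model points in the corrected frame of reference. The helper names, the camera intrinsics K, and the placeholder file names and model points are illustrative assumptions only.

    # Hypothetical illustration only; not the claimed UAV firmware.
    import cv2
    import numpy as np

    def estimate_pose_difference(first_image, second_image, K):
        """Estimate (R, t): rotation and unit-scale translation of the second
        frame relative to the first, from matched ORB features."""
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(first_image, None)
        kp2, des2 = orb.detectAndCompute(second_image, None)

        # Brute-force Hamming matching with cross-checking for binary descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # RANSAC on the essential matrix rejects outlier correspondences.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                       prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t

    def correct_model_points(points_xyz, R, t, scale=1.0):
        """Re-express existing 3D model points (N x 3) in the corrected
        frame of reference implied by the estimated pose difference."""
        return (R @ points_xyz.T).T + scale * t.reshape(1, 3)

    if __name__ == "__main__":
        # Hypothetical pinhole intrinsics for the UAV's image sensor.
        K = np.array([[800.0, 0.0, 640.0],
                      [0.0, 800.0, 360.0],
                      [0.0, 0.0, 1.0]])
        img1 = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)
        R, t = estimate_pose_difference(img1, img2, K)
        model = np.random.rand(1000, 3)   # placeholder 3D model points
        updated_model = correct_model_points(model, R, t)

In this sketch the rotation and (unit-scale) translation recovered from the two frames stand in for the "difference between a current frame of reference position for the UAV and an estimate of an actual frame of reference position," and applying that transform to the stored points stands in for the in-flight update to the 3D model of the target.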