US 12,002,193 B2
Inspection device for inspecting a building or structure
Kristian Klausen, Trondheim (NO); Øystein Skotheim, Trondheim (NO); and Morten Fyhn Amundsen, Trondheim (NO)
Assigned to SCOUTDI AS, Trondheim (NO)
Filed by ScoutDI AS, Trondheim (NO)
Filed on Aug. 26, 2022, as Appl. No. 17/896,771.
Application 17/896,771 is a continuation of application No. PCT/EP2021/067576, filed on Jun. 25, 2021.
Claims priority of application No. 2010686 (GB), filed on Jul. 10, 2020.
Prior Publication US 2023/0073689 A1, Mar. 9, 2023
Int. Cl. G06T 7/00 (2017.01); G06T 7/11 (2017.01)
CPC G06T 7/0004 (2013.01) [G06T 7/11 (2017.01); G06T 2207/30184 (2013.01)] 24 Claims
OG exemplary drawing
 
1. A method for processing data captured by a drone during inspection of a building or structure, the method comprising:
receiving sets of inspection data captured by one or more inspection sensors onboard the drone, each set of inspection data being captured from a different region of the building or structure, and wherein each set of inspection data is descriptive of a condition of the building or structure in the respective region, each set of inspection data being captured at a respective point in time as the drone maneuvers relative to the building or structure;
logging the respective points in time at which each set of inspection data is captured;
receiving sets of 3D data of the drone's surroundings, each set of 3D data being captured by one or more depth sensors onboard the drone at a respective point in time, wherein each set of 3D data comprises information indicating a distance of the drone from one or more surfaces of the building or structure at the time of capture of the set of 3D data;
logging the respective points in time at which each set of 3D data is captured;
combining the sets of 3D data to obtain a 3D map of the drone's surroundings when carrying out the inspection;
for each set of inspection data:
comparing the time of capture of the respective set of inspection data with the time of capture of different ones of the sets of 3D data, thereby to identify a set of 3D data whose time of capture corresponds to that of the respective set of inspection data; and
using the identified set of 3D data to determine a location in the 3D map where the drone was located when the respective set of inspection data was captured;
tagging each set of inspection data by associating the respective set of inspection data with the position of the drone in the 3D map at which the set of inspection data was captured, thereby to provide tagged inspection data, whereby the tagged inspection data serves to indicate the location of the drone within the 3D map at the time the respective set of inspection data was captured; and
generating a 3D view of an environment based on the 3D map, wherein the 3D view of the environment shows the location of the drone in the 3D environment at the time each set of inspection data was captured.
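
To make the data flow of claim 1 concrete, the following Python sketch shows one way the two logged, time-stamped data streams could be represented: an inspection sample (the data "descriptive of a condition" of a region) and a depth frame (one "set of 3D data"). This is only an illustration, not the patented implementation; the class names, the payload type, and the pose field (which assumes an onboard pose estimate accompanies each depth frame) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class InspectionSample:
    """One set of inspection data (e.g. an image or thickness reading) and its capture time."""
    timestamp: float   # capture time, seconds on a common clock
    payload: bytes     # raw sensor output describing the condition of the inspected region

@dataclass
class DepthFrame:
    """One set of 3D data from a depth sensor and its capture time."""
    timestamp: float   # capture time, same clock as InspectionSample.timestamp
    points: np.ndarray # (N, 3) points on nearby surfaces, in the drone body frame
    pose: np.ndarray   # (4, 4) estimated drone pose in the map frame at capture time (assumed available)

@dataclass
class InspectionLog:
    """Logs the capture times of both data streams, as in the two 'logging' steps of the claim."""
    inspection: List[InspectionSample] = field(default_factory=list)
    depth: List[DepthFrame] = field(default_factory=list)

    def log_inspection(self, sample: InspectionSample) -> None:
        self.inspection.append(sample)

    def log_depth(self, frame: DepthFrame) -> None:
        self.depth.append(frame)
```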
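The step of combining the sets of 3D data into a 3D map could, under the same assumptions, be sketched as a simple point-cloud accumulation: each frame's points are transformed into a common world frame using that frame's estimated pose and merged on a coarse voxel grid. A real system would more likely use a full SLAM or mapping back-end; the `build_map` name and the `voxel_size` parameter are illustrative only.

```python
import numpy as np

def build_map(depth_frames, voxel_size=0.05):
    """Combine per-frame point clouds into a single 3D map in the world frame.

    Minimal accumulation sketch: transform each frame's points with the
    frame's estimated pose, then de-duplicate on a coarse voxel grid.
    """
    occupied = set()
    for frame in depth_frames:
        R, t = frame.pose[:3, :3], frame.pose[:3, 3]
        world_pts = frame.points @ R.T + t  # body frame -> world frame
        occupied.update(map(tuple, np.floor(world_pts / voxel_size).astype(int)))
    # One representative point (the voxel centre) per occupied voxel.
    return np.array([(np.array(v) + 0.5) * voxel_size for v in occupied])
```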
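The time-of-capture comparison and tagging steps amount to a nearest-timestamp lookup: for each inspection sample, identify the depth frame captured closest in time, take the drone position from that frame, and associate it with the sample. A minimal sketch, assuming both streams share a common clock; `tag_inspection_data` is a hypothetical helper, not language from the patent.

```python
import bisect

def tag_inspection_data(log):
    """Associate each inspection sample with the drone position in the 3D map."""
    frames = sorted(log.depth, key=lambda f: f.timestamp)
    times = [f.timestamp for f in frames]
    if not frames:
        return []
    tagged = []
    for sample in log.inspection:
        i = bisect.bisect_left(times, sample.timestamp)
        # Pick the neighbour (before or after) with the smaller time difference.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frames)]
        j = min(candidates, key=lambda j: abs(times[j] - sample.timestamp))
        position = frames[j].pose[:3, 3]  # drone position in the map frame at capture time
        tagged.append((sample, position))
    return tagged
```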
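Finally, the 3D view of the environment could be rendered by plotting the accumulated map together with the tagged drone positions. One simple way, using matplotlib's 3D scatter plot; `show_3d_view` is again a hypothetical helper rather than part of the claimed method.

```python
import matplotlib.pyplot as plt

def show_3d_view(map_points, tagged):
    """Render the 3D map together with the drone position at each capture time."""
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.scatter(map_points[:, 0], map_points[:, 1], map_points[:, 2],
               s=1, c="grey", label="3D map")
    xs = [pos[0] for _, pos in tagged]
    ys = [pos[1] for _, pos in tagged]
    zs = [pos[2] for _, pos in tagged]
    ax.scatter(xs, ys, zs, c="red", marker="^", label="drone at capture times")
    ax.legend()
    plt.show()
```

Chaining these sketches, `build_map(log.depth)`, `tag_inspection_data(log)`, and `show_3d_view(...)`, mirrors the overall flow of claim 1: each set of inspection data ends up associated with the drone's position in the 3D map, and the generated view shows where the drone was at the time each set was captured.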