CPC G01C 21/10 (2013.01) [B64U 10/13 (2023.01); B64U 70/95 (2023.01); B64U 80/84 (2023.01); B64U 2101/30 (2023.01); B64U 2101/64 (2023.01); B64U 2201/10 (2023.01); B64U 2201/104 (2023.01); B64U 2201/20 (2023.01)]

20 Claims

1. An unmanned aircraft system, comprising:
a communications unit accessible by a user, the communications unit having a user interface, a storage medium, and a communication element; and
an unmanned aerial vehicle (UAV) comprising:
a non-transitory computer-readable medium configured to store information and executable programmed modules;
a lift system comprising one or more lift mechanisms configured to propel the UAV;
a sensor system configured to obtain sensor information related to an environment of the UAV and to at least one of store the sensor information in the non-transitory computer-readable medium and transmit the sensor information to the communications unit;
a processor configured to control operation of the lift system and the sensor system, the processor communicatively coupled with the non-transitory computer-readable medium and configured to execute programmed modules stored therein;
an object detection module stored in the non-transitory computer-readable medium and configured to be executed by the processor, the object detection module configured to obtain sensor information stored in the non-transitory computer-readable medium by the sensor system and detect potential mobile landing structures within the environment of the UAV based on an analysis of the sensor information, wherein each of the potential mobile landing structures is capable of accommodating the UAV;
a mobile landing area recognition module stored in the non-transitory computer-readable medium and configured to be executed by the processor, the mobile landing area recognition module configured to obtain sensor information stored in the non-transitory computer-readable medium by the sensor system and identify a mobile landing area on a target mobile landing structure of the potential mobile landing structures based on an analysis of the sensor information obtained from a signal transmitted via a transmitter of the target mobile landing structure and an analysis of an input of a visual identifier located on the mobile landing area; and
a navigation module stored in the non-transitory computer-readable medium and configured to be executed by the processor, the navigation module configured to estimate a real-time state, including a location, of the mobile landing area based on an analysis of the sensor information, the identification of the mobile landing area, or both,
the navigation module further configured to navigate the UAV to the mobile landing area based on a location of the target mobile landing structure obtained via the transmitter and a horizontal distance between a location of the transmitter and the estimated real-time state of the mobile landing area, and control operation of the lift system to bring the UAV into contact with a surface of the mobile landing area,
wherein the navigation module is configured to refine the estimated real-time state based on a previous input of the visual identifier,
wherein the navigation module is further configured to control operation of the lift system to allow the UAV to release a payload onto the mobile landing area,
wherein the real-time state of the mobile landing area comprises at least one of: a position, a velocity, an acceleration, and an orientation,
wherein when the target mobile landing structure is automatically detected, the UAV is positioned directly above a landing pad bearing the visual identifier and begins to descend in a relative guidance mode between the target mobile landing structure and the UAV, such that velocity and attitude commands are issued to the navigation module based on the real-time state of the mobile landing area to match the motion of the target mobile landing structure and track the target mobile landing structure throughout the descent, and
wherein the user interface is configured to display the detected potential mobile landing structures to the user for visual confirmation, and wherein the user interface is configured to receive a user input in response to the displayed potential mobile landing structures.
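The "visual identifier located on the mobile landing area" recited in claim 1 is, in practice, often realized as a fiducial marker. The sketch below is an illustrative assumption, not the patent's disclosed implementation: it detects such a marker with OpenCV's ArUco module (assuming OpenCV >= 4.7 for the ArucoDetector API); the marker dictionary, marker ID, and the helper name detect_landing_marker are all hypothetical.

```python
# Illustrative sketch: detect a fiducial marker standing in for the claimed
# "visual identifier" on the mobile landing area. Assumes OpenCV >= 4.7.
import cv2

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)  # assumed dictionary
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())

def detect_landing_marker(frame_bgr, marker_id=0):
    """Return the (u, v) pixel centre of the landing-area marker, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = DETECTOR.detectMarkers(gray)
    if ids is None:
        return None
    for marker_corners, found_id in zip(corners, ids.flatten()):
        if found_id == marker_id:
            # marker_corners has shape (1, 4, 2): the marker's four corner pixels.
            return marker_corners[0].mean(axis=0)
    return None
```

A full recognition module would additionally recover the marker's pose (for example via cv2.solvePnP with the known marker size and camera intrinsics) and cross-check the result against the location reported by the transmitter of the target mobile landing structure, as the claim recites.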
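Claim 1's navigation module estimates a real-time state of the mobile landing area (at least one of position, velocity, acceleration, and orientation) and refines that estimate from previous inputs of the visual identifier. One plausible realization, sketched below as an assumption rather than the disclosed method, is a constant-velocity Kalman filter over the pad's planar motion, with each visual fix applied as a measurement update; the class name PadStateEstimator and the noise values are illustrative.

```python
import numpy as np

class PadStateEstimator:
    """Constant-velocity Kalman filter over the pad's planar state
    [x, y, vx, vy]. Each visual-identifier fix refines the estimate,
    mirroring the claim's refinement of the real-time state from a
    previous input of the visual identifier. (Hypothetical sketch.)"""

    def __init__(self, meas_noise=0.25, accel_noise=1.0):
        self.x = np.zeros(4)             # state: x, y, vx, vy
        self.P = np.eye(4) * 10.0        # state covariance
        self.R = np.eye(2) * meas_noise  # measurement noise (assumed, m^2)
        self.q = accel_noise             # process-noise intensity (assumed)

    def predict(self, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        # Process noise from unmodelled pad acceleration (e.g., wave motion).
        G = np.array([[0.5 * dt**2, 0], [0, 0.5 * dt**2], [dt, 0], [0, dt]])
        Q = self.q * G @ G.T
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z_xy):
        H = np.zeros((2, 4))
        H[0, 0] = H[1, 1] = 1.0
        y = z_xy - H @ self.x                # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P
```

Each call to update() would carry a pad position derived from a visual-identifier detection (or from the transmitter's reported location), so the filter's velocity estimate gives the motion that the relative guidance mode must match during descent.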
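The relative guidance mode of claim 1 issues velocity and attitude commands based on the real-time state of the mobile landing area so that the UAV matches the structure's motion throughout the descent. A minimal sketch, assuming a flight controller that accepts velocity commands and a simple proportional correction on the horizontal offset; the gain, sink rate, touchdown threshold, and function names are illustrative assumptions.

```python
import numpy as np

K_P = 0.8             # proportional gain on horizontal offset, 1/s (assumed)
DESCENT_RATE = 0.5    # commanded sink rate, m/s (assumed)
TOUCHDOWN_ALT = 0.1   # relative altitude, m, at which descent ends (assumed)

def relative_guidance_command(uav_pos, pad_state):
    """Velocity command for one control cycle of the relative guidance mode.

    uav_pos:   UAV position [x, y, z] in a local frame.
    pad_state: estimated pad state [x, y, vx, vy] from PadStateEstimator.
    Returns a commanded velocity [vx, vy, vz]: a feed-forward of the pad's
    velocity (to match its motion) plus a proportional correction toward
    the pad centre, with a constant descent rate.
    """
    offset = np.array([pad_state[0] - uav_pos[0],
                       pad_state[1] - uav_pos[1]])
    v_xy = pad_state[2:4] + K_P * offset
    return np.array([v_xy[0], v_xy[1], -DESCENT_RATE])

def descending(uav_pos, pad_alt=0.0):
    """Continue the descent until the relative-altitude threshold is reached."""
    return (uav_pos[2] - pad_alt) > TOUCHDOWN_ALT
```

Feeding forward the pad's estimated velocity is what lets the UAV match the motion of the target mobile landing structure rather than chase a moving setpoint; the corresponding attitude commands would follow from the velocity command through the flight controller's inner loops.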