US 12,112,645 B2
Unmanned aerial vehicle positioning method based on millimeter-wave radar
Bin He, Shanghai (CN); Gang Li, Shanghai (CN); Runjie Shen, Shanghai (CN); Yanmin Zhou, Shanghai (CN); Jie Chen, Shanghai (CN); and Shuping Song, Shanghai (CN)
Assigned to TONGJI UNIVERSITY, Shanghai (CN)
Filed by TONGJI UNIVERSITY, Shanghai (CN)
Filed on Jul. 21, 2022, as Appl. No. 17/870,592.
Application 17/870,592 is a continuation of application No. PCT/CN2022/071564, filed on Apr. 2, 2022.
Claims priority of application No. 202110588348.7 (CN), filed on May 28, 2021.
Prior Publication US 2022/0383755 A1, Dec. 1, 2022
Int. Cl. G08G 5/00 (2006.01); B64C 39/02 (2023.01)
CPC G08G 5/006 (2013.01) [B64C 39/024 (2013.01); B64U 2201/10 (2023.01)] 13 Claims
OG exemplary drawing
 
1. A positioning method of an unmanned aerial vehicle (UAV) based on a millimeter-wave radar, wherein the UAV is equipped with the millimeter-wave radar and an inertial measurement unit (IMU), the method comprising a calibration stage and a positioning stage; wherein the calibration stage comprises:
S1, obtaining a map, radar point cloud data measured by the millimeter-wave radar, and ground coordinates of the UAV;
S2, pre-processing the radar point cloud data; extracting key points from the radar point cloud data, obtaining a characteristic line segment based on the key points, and recording the key points on the characteristic line segment as feature points;
S3, projecting the feature points onto the map according to the ground coordinates of the UAV to obtain the ground coordinates of each feature point;
wherein the positioning stage comprises:
S4, acquiring the radar point cloud data of a current frame measured by the millimeter-wave radar, and pre-processing the radar point cloud data;
S5, acquiring UAV motion data measured by the IMU, and fusing the radar point cloud data with the UAV motion data to obtain corrected radar point cloud data;
S6, extracting key points from the radar point cloud data, obtaining a characteristic line segment based on the key points, and recording the key points on the characteristic line segment as feature points;
S7, registering the characteristic line segment of the radar point cloud data of the current frame with the characteristic line segment of the radar point cloud data of the previous frame; finding out the feature points of the current frame that match the previous frame, and marking them as matched feature points; and finding out the feature points added in the current frame relative to the previous frame, and marking them as newly added feature points;
S8, obtaining the ground coordinates of the UAV and the ground coordinates of the newly added feature points based on the ground coordinates of the matched feature points on the map; and
S9, returning to S4 to process the next frame.
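The feature-extraction steps (S2 and S6) can be illustrated with a minimal sketch. The claim does not specify how key points are turned into a characteristic line segment; the sketch below assumes a 2-D point cluster and fits the segment by total least squares (PCA), keeping the extreme projections as the segment's feature points. The function name and the `distance_threshold` parameter are hypothetical, not from the patent.

```python
import numpy as np

def extract_line_feature(points, distance_threshold=0.1):
    """Fit one line segment to a 2-D point cluster by total least squares
    (PCA) and return its endpoints plus the fraction of inlier points.

    points: (N, 2) array of radar returns in the sensor frame.
    """
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal direction of the cluster = direction of the fitted line.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction, normal = vt[0], vt[1]
    # Perpendicular residuals decide which points lie on the segment.
    residuals = np.abs(centered @ normal)
    inliers = residuals < distance_threshold
    # Project inliers onto the line; extreme projections are the endpoints.
    t = centered[inliers] @ direction
    endpoint_a = centroid + t.min() * direction
    endpoint_b = centroid + t.max() * direction
    return endpoint_a, endpoint_b, inliers.mean()
```

A practical pipeline would first cluster the pre-processed point cloud and run this fit per cluster, discarding segments with a low inlier ratio.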
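The fusion step (S5) corrects the radar point cloud with IMU motion data. One common realization, not spelled out in the claim, is motion de-skewing: each return is re-expressed in the frame of the scan-end instant by undoing the rotation and translation the UAV accumulated after that return was captured. The sketch below assumes a constant yaw rate and constant body-frame velocity over one scan; all names are illustrative.

```python
import numpy as np

def deskew_points(points, timestamps, yaw_rate, velocity, scan_end):
    """Motion-compensate one 2-D radar scan with IMU yaw rate and velocity.

    points: (N, 2) returns; timestamps: (N,) seconds, one per return;
    yaw_rate: rad/s; velocity: (2,) m/s; scan_end: scan-end time in seconds.
    """
    points = np.asarray(points, dtype=float)
    velocity = np.asarray(velocity, dtype=float)
    corrected = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        dt = scan_end - t             # time elapsed after this return
        dyaw = yaw_rate * dt          # rotation accumulated in that time
        c, s = np.cos(dyaw), np.sin(dyaw)
        rot = np.array([[c, -s], [s, c]])
        # Rotate into the scan-end frame and subtract the distance traveled.
        corrected[i] = rot @ p - velocity * dt
    return corrected
```

With zero yaw rate and a forward velocity of 1 m/s, a point captured 1 s before scan end moves 1 m closer along the direction of travel, as expected.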
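Steps S7 and S8 amount to matching feature points across frames and recovering the UAV's ground coordinates from matches whose map coordinates are known. A standard way to do this, offered here only as an illustration of S8, is the Kabsch/Umeyama rigid alignment: solve for the rotation R and translation t mapping the matched sensor-frame points onto their ground coordinates; the UAV ground position is then t (the image of the sensor origin), and a newly added feature point p maps to ground coordinates R @ p + t. The function name is hypothetical.

```python
import numpy as np

def estimate_pose_from_matches(sensor_pts, ground_pts):
    """Solve the 2-D rigid transform (R, t) mapping matched sensor-frame
    feature points onto their known ground coordinates (Kabsch, no scale)."""
    P = np.asarray(sensor_pts, dtype=float)   # matched points, sensor frame
    Q = np.asarray(ground_pts, dtype=float)   # same points, ground frame
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t
```

At least two non-collinear matched feature points are needed; in practice the matches come from the registered characteristic line segments of consecutive frames.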