US 12,142,001 B1
Systems and methods for determining a pose and motion of a carrier vehicle
Jerel Nielsen, Spanish Fork, UT (US); Tianxue Chen, Kirkland, WA (US); and Shaocheng Wang, Mountain View, CA (US)
Assigned to Hayden AI Technologies, Inc., San Francisco, CA (US)
Filed by Hayden AI Technologies, Inc., San Francisco, CA (US)
Filed on Dec. 28, 2023, as Appl. No. 18/399,159.
Int. Cl. G06K 9/00 (2022.01); G06T 7/20 (2017.01); G06T 7/70 (2017.01); G06V 10/44 (2022.01)
CPC G06T 7/70 (2017.01) [G06T 7/20 (2013.01); G06V 10/44 (2022.01); G06V 2201/07 (2022.01)] 27 Claims
OG exemplary drawing
 
1. A machine-based method of determining a pose and motion of a carrier vehicle, comprising:
capturing a video of an external environment surrounding a carrier vehicle using a camera of an edge device coupled to the carrier vehicle, wherein the video comprises a plurality of video frames comprising a keyframe and a subsequent video frame captured after the keyframe;
determining, using one or more processors of the edge device, a full vehicle pose of the carrier vehicle with respect to the keyframe based on visual odometry measurements, wherein the visual odometry measurements are made by:
matching image points between the keyframe and the subsequent video frame to obtain a plurality of tracked points,
providing the tracked points as inputs to a random sample consensus-based (RANSAC-based) nonlinear least squares solver to remove outliers and obtain an up-to-scale camera pose,
determining a camera translation magnitude based on the up-to-scale camera pose using at least one of a homography-based reprojection error minimization technique, a triangulated point-based reprojection error minimization technique, and a constant depth-based reprojection error minimization technique,
combining the up-to-scale camera pose and the camera translation magnitude to obtain a full camera pose, and
converting the full camera pose to the full vehicle pose based on a known relationship between the full camera pose and the full vehicle pose;
determining, using the one or more processors, a vehicle position and motion of the carrier vehicle based on positioning data obtained from a positioning unit of the edge device; and
providing the full vehicle pose with respect to the keyframe obtained from the visual odometry measurements and the vehicle position and motion determined from the positioning data to a filter running on the edge device to obtain a fused vehicle pose and motion of the carrier vehicle, wherein the motion of the carrier vehicle comprises a velocity and acceleration of the carrier vehicle.
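The visual odometry steps recited in claim 1 (matching image points, RANSAC-based outlier removal, and recovery of an up-to-scale camera pose) can be illustrated with a brief sketch. The fragment below is not the patented implementation: it assumes OpenCV and a known camera intrinsic matrix K, and it substitutes OpenCV's RANSAC five-point essential-matrix solver for the recited RANSAC-based nonlinear least squares solver. All identifiers (keyframe_gray, frame_gray, up_to_scale_pose) are illustrative.

    # Illustrative sketch only: up-to-scale camera pose between a keyframe and
    # a later frame, in the spirit of the claim's point-matching and RANSAC
    # outlier-removal steps. Assumes OpenCV and a known intrinsic matrix K.
    import cv2
    import numpy as np

    def up_to_scale_pose(keyframe_gray, frame_gray, K):
        # Detect and describe features in both frames, then match them to
        # obtain the claim's "tracked points".
        orb = cv2.ORB_create(2000)
        kp0, des0 = orb.detectAndCompute(keyframe_gray, None)
        kp1, des1 = orb.detectAndCompute(frame_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des0, des1)
        pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
        pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])

        # RANSAC essential-matrix estimation removes outlier matches; this
        # five-point RANSAC solver is only a stand-in for the recited
        # RANSAC-based nonlinear least squares solver.
        E, inlier_mask = cv2.findEssentialMat(
            pts0, pts1, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)

        # Recover rotation R and a unit-norm translation t_unit: the pose is
        # "up to scale" because monocular geometry fixes the direction of
        # translation but not its magnitude.
        _, R, t_unit, pose_mask = cv2.recoverPose(E, pts0, pts1, K, mask=inlier_mask)
        inliers = pose_mask.ravel().astype(bool)
        return R, t_unit, pts0[inliers], pts1[inliers]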
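Recovering the camera translation magnitude requires some metric constraint. The sketch below illustrates only one of the claim's three recited alternatives, a triangulated point-based approach, under the added assumptions (not stated in the claim) that the camera's height above the road is known and that some tracked points lie on the ground plane; the vehicle-from-camera extrinsic T_vc is likewise treated as the claim's "known relationship".

    # Illustrative sketch only: assign a metric magnitude to the up-to-scale
    # translation by triangulating tracked points and using a known camera
    # height above the ground plane, then combine direction and magnitude
    # into a full camera pose and convert it to a vehicle pose.
    import cv2
    import numpy as np

    def full_camera_pose(R, t_unit, pts0, pts1, K, camera_height_m):
        # Triangulate inlier points with projection matrices built from the
        # up-to-scale pose; the 3-D points inherit its unknown scale.
        P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P1 = K @ np.hstack([R, t_unit])
        X_h = cv2.triangulatePoints(P0, P1, pts0.T, pts1.T)
        X = (X_h[:3] / X_h[3]).T              # N x 3, keyframe camera frame

        # Points below the camera (positive Y in a y-down camera convention)
        # serve as ground candidates; their median height fixes the scale.
        ground = X[X[:, 1] > 0]
        scale = camera_height_m / np.median(ground[:, 1])

        # Combine the up-to-scale pose and the translation magnitude into a
        # full camera pose as a 4x4 SE(3) matrix.
        T_cam = np.eye(4)
        T_cam[:3, :3] = R
        T_cam[:3, 3] = (scale * t_unit).ravel()
        return T_cam

    def camera_to_vehicle_pose(T_cam, T_vc):
        # Convert camera motion to vehicle motion via the fixed, known
        # vehicle-from-camera extrinsic T_vc (conjugation in SE(3)).
        return T_vc @ T_cam @ np.linalg.inv(T_vc)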
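The claim leaves the fusion filter unspecified; an extended or error-state Kalman filter is a common choice for combining keyframe-relative visual odometry poses with positioning-unit data on an edge device. The sketch below is a deliberately simplified linear Kalman filter over a planar constant-acceleration state, fusing a position fix and a visual-odometry displacement; the state layout, noise values, and measurement models are all illustrative assumptions rather than the claimed filter.

    # Illustrative sketch only: a simplified planar Kalman filter that fuses
    # a positioning-unit fix with a visual-odometry displacement to produce a
    # fused position, velocity, and acceleration estimate.
    import numpy as np

    class FusionFilter:
        def __init__(self, dt):
            # State: [x, y, vx, vy, ax, ay] under a constant-acceleration model.
            self.x = np.zeros(6)
            self.P = np.eye(6)
            self.F = np.eye(6)
            for i in range(2):
                self.F[i, i + 2] = dt
                self.F[i, i + 4] = 0.5 * dt * dt
                self.F[i + 2, i + 4] = dt
            self.Q = 0.05 * np.eye(6)     # process noise (assumed value)

        def predict(self):
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q

        def _update(self, z, H, R):
            # Standard Kalman measurement update.
            y = z - H @ self.x
            S = H @ self.P @ H.T + R
            K = self.P @ H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ H) @ self.P

        def update_position(self, position_xy, sigma=2.0):
            # Direct observation of position from the positioning unit.
            H = np.zeros((2, 6)); H[0, 0] = H[1, 1] = 1.0
            self._update(np.asarray(position_xy), H, sigma**2 * np.eye(2))

        def update_vo(self, displacement_xy, dt, sigma=0.5):
            # A visual-odometry displacement over dt is treated here as a
            # velocity pseudo-measurement; a production filter would instead
            # keep the keyframe-relative pose in the state.
            H = np.zeros((2, 6)); H[0, 2] = H[1, 3] = 1.0
            self._update(np.asarray(displacement_xy) / dt, H, sigma**2 * np.eye(2))

A typical cycle would call predict() once per time step and then update_position() and update_vo() as each measurement arrives; keeping the full pose (including heading) in the state, as an error-state filter does, is the more realistic design but would obscure the fusion idea the claim describes.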