US 12,136,239 B2
Electro-hydraulic varifocal lens-based method for tracking three-dimensional trajectory of moving object
Shaorong Xie, Shanghai (CN); Hengyu Li, Shanghai (CN); Jingyi Liu, Shanghai (CN); Yueying Wang, Shanghai (CN); Shuang Han, Shanghai (CN); and Jun Luo, Shanghai (CN)
Assigned to SHANGHAI UNIVERSITY, Shanghai (CN)
Filed by SHANGHAI UNIVERSITY, Shanghai (CN)
Filed on Aug. 30, 2022, as Appl. No. 17/898,861.
Claims priority of application No. 202111009365.7 (CN), filed on Aug. 30, 2021; and application No. 202111176063.9 (CN), filed on Oct. 9, 2021.
Prior Publication US 2023/0111657 A1, Apr. 13, 2023
Int. Cl. G06T 7/80 (2017.01); G02B 3/14 (2006.01); G02B 7/28 (2021.01); H04N 23/67 (2023.01)
CPC G06T 7/80 (2017.01) [G02B 3/14 (2013.01); G02B 7/28 (2013.01); H04N 23/67 (2023.01)] 8 Claims
OG exemplary drawing
 
1. An electro-hydraulic varifocal lens-based method for tracking a three-dimensional (3D) trajectory of a moving object, comprising:
step 1, calibrating an electro-hydraulic varifocal lens under different focal distances to obtain a functional relation between a focusing control current and a camera's intrinsic parameters;
step 2, establishing an electro-hydraulic varifocal lens-based optical imaging system model to obtain a functional relation between a focusing control current of the electro-hydraulic varifocal lens and an optimal object distance;
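The functional relations obtained in steps 1 and 2 can be sketched as simple curve fits over calibration data. The sketch below is illustrative only: the currents, focal lengths, and object distances are synthetic placeholders, and the quadratic form is an assumption, since the claim does not specify the functional form.

```python
import numpy as np

# Hypothetical calibration data: focusing control currents (mA) versus the
# measured intrinsic focal length (pixels) and optimal object distance (m).
currents = np.array([40.0, 60.0, 80.0, 100.0, 120.0])
fx_meas = np.array([1180.0, 1150.0, 1115.0, 1080.0, 1040.0])
u_meas = np.array([3.00, 1.80, 1.20, 0.90, 0.70])

# Fit low-order polynomials I -> fx(I) (step 1) and I -> u(I) (step 2);
# the true functional form is whatever the calibration procedure determines.
fx_of_I = np.poly1d(np.polyfit(currents, fx_meas, 2))
u_of_I = np.poly1d(np.polyfit(currents, u_meas, 2))

# Query the fitted relations at an intermediate control current.
print(float(fx_of_I(90.0)), float(u_of_I(90.0)))
```

In use, the two fitted functions let the system map any recorded focusing control current to the intrinsic parameters (step 5) and to the depth Zi (step 2) without re-calibrating per frame.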
step 3, initializing an object tracking algorithm, generating an object tracking box, and selecting a to-be-tracked object;
step 4, carrying out autofocusing, and recording a focusing control current after the autofocusing is completed, as well as a size of the object tracking box in an image and center point coordinates after undistortion; wherein the autofocusing in step 4 comprises first autofocusing and subsequent autofocusing, and the first autofocusing specifically comprises:
searching an initial focusing control current at a certain stride, calculating a sharpness evaluation value of an internal image region of the object tracking box, obtaining a maximum sharpness evaluation value and a focusing control current corresponding to the maximum sharpness evaluation value, and setting a sharpness evaluation threshold:
K=αDmax  (3)
wherein α denotes a preset sharpness confidence level, and α<1; K denotes the sharpness evaluation threshold used in the subsequent autofocusing; and Dmax denotes the maximum sharpness evaluation value; and
after the first autofocusing is finished, recording a size of the object tracking box in an image and center point coordinates after undistortion;
wherein the subsequent autofocusing specifically comprises:
calculating a sharpness evaluation value Di of the internal image region of the object tracking box; and
if Di>K, directly recording the focusing control current Ii at this moment, as well as a size sizei of the object tracking box in an image and center point coordinates after undistortion; or
if Di<K, reading a size sizei of the object tracking box in the image at this moment, comparing the size with a size sizei−1 of the object tracking box at the last successful focusing, and adjusting a focusing control current to complete focusing; and after the focusing is completed, recording the focusing control current and the size of the object tracking box in the image after focusing and center point coordinates after undistortion;
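The two-phase autofocusing of step 4 can be sketched as follows. All function names are illustrative; the sharpness metric (a Tenengrad-style squared gradient) and the sign of the current adjustment are assumptions, since the claim does not fix either choice.

```python
import numpy as np

def sharpness(roi):
    """Squared-gradient sharpness of the image region inside the tracking box."""
    gx = np.diff(roi.astype(float), axis=1)
    gy = np.diff(roi.astype(float), axis=0)
    return float((gx ** 2).mean() + (gy ** 2).mean())

def first_autofocus(capture_roi, currents, alpha=0.8):
    """Coarse search over control currents at a fixed stride.
    Returns the best current and the threshold K = alpha * Dmax of Eq. (3)."""
    values = [sharpness(capture_roi(I)) for I in currents]
    d_max = max(values)
    best_I = currents[int(np.argmax(values))]
    return best_I, alpha * d_max  # alpha < 1

def subsequent_autofocus(roi, K, I_prev, size, size_prev, step=2.0):
    """If sharpness Di stays above K, keep the current; otherwise nudge it,
    using the change in tracking-box size as a depth cue."""
    if sharpness(roi) > K:
        return I_prev
    # Box grew -> object likely moved closer; box shrank -> moved away.
    # The sign of the adjustment depends on the actual lens response
    # (assumed here: larger current focuses nearer).
    return I_prev + step if size > size_prev else I_prev - step
```

A full refocus after `subsequent_autofocus` would iterate the adjustment until the sharpness value again exceeds K, then record the settled current Ii, sizei, and undistorted center coordinates as the claim specifies.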
step 5, calculating and recording, by a camera projection model, coordinates of the object in 3D space; and
wherein the camera projection model in step 5 is:
Zi·[xi, yi, 1]^T = [[fxi, s, cx], [0, fyi, cy], [0, 0, 1]]·[Xi, Yi, Zi]^T
wherein (xi, yi) denote center point coordinates of an object tracking box in an undistorted image, cx and cy denote coordinates of a camera's optical center on a pixel plane, s denotes a slant parameter between horizontal and vertical edges of a camera's photosensitive element, fxi and fyi denote equivalent focal distances of a camera in x and y directions corresponding to a focusing control current Ii at this moment respectively, and (Xi, Yi, Zi) denote 3D coordinates of a center point of a tracked object; and Zi=ui, wherein ui denotes an optimal object distance corresponding to the focusing control current Ii at this moment;
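With Zi fixed by the optimal object distance, the projection model of step 5 inverts directly: from the pinhole relation, Yi = Zi(yi − cy)/fyi and Xi = (Zi(xi − cx) − s·Yi)/fxi. A minimal sketch, assuming `intrinsics_of_I` and `u_of_I` are the calibrated relations of steps 1 and 2 (hypothetical names):

```python
def pixel_to_3d(xi, yi, Ii, intrinsics_of_I, u_of_I):
    """Invert Zi*[xi, yi, 1]^T = K(Ii)*[Xi, Yi, Zi]^T with Zi = u(Ii)."""
    fx, fy, cx, cy, s = intrinsics_of_I(Ii)
    Zi = u_of_I(Ii)                       # optimal object distance for Ii
    Yi = Zi * (yi - cy) / fy              # solve the second row first
    Xi = (Zi * (xi - cx) - s * Yi) / fx   # then the first row, removing skew
    return Xi, Yi, Zi

# Example with fixed illustrative intrinsics (s = 0) and u(I) = 2 m:
X, Y, Z = pixel_to_3d(420.0, 340.0, 50.0,
                      lambda I: (1000.0, 1000.0, 320.0, 240.0, 0.0),
                      lambda I: 2.0)
```

Because the intrinsics and Zi both depend on the same recorded current Ii, each frame's 3D point is recovered from a single current reading plus the undistorted box center.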
step 6, repeating steps 4-5 for the same tracked object, and sequentially connecting the recorded coordinates of the object in 3D space into a trajectory.
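Step 6 amounts to repeating the per-frame recovery and chaining the points. The loop below is a self-contained illustration with synthetic per-frame observations (pixel center, control current); the fixed intrinsics and the linear current-to-distance fit are placeholder assumptions.

```python
import numpy as np

# Illustrative fixed intrinsics and an assumed linear fit u(I) = 0.05 * I.
fx = fy = 1000.0
cx, cy = 320.0, 240.0
u_of_I = lambda I: 0.05 * I

# Synthetic per-frame readings: undistorted box center and recorded current.
observations = [((330.0, 250.0), 40.0),
                ((340.0, 260.0), 42.0),
                ((350.0, 270.0), 44.0)]

trajectory = []
for (xi, yi), Ii in observations:         # steps 4-5 repeated per frame
    Zi = u_of_I(Ii)
    trajectory.append((Zi * (xi - cx) / fx, Zi * (yi - cy) / fy, Zi))

# Sequentially connecting the recorded 3D points yields the trajectory;
# its polyline length is the summed distance between successive points.
pts = np.array(trajectory)
length = float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
print(trajectory, length)
```

Each loop iteration corresponds to one autofocus-plus-projection cycle of steps 4 and 5, so the trajectory resolution is set by how often refocusing completes.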