US 12,078,475 B2
Three-dimensional measurement method, device, and storage medium
Jian Gao, Guangzhou (CN); Yizhong Zhuang, Guangzhou (CN); Lanyu Zhang, Guangzhou (CN); Haixiang Deng, Guangzhou (CN); Yun Chen, Guangzhou (CN); and Xin Chen, Guangzhou (CN)
Assigned to GUANGDONG UNIVERSITY OF TECHNOLOGY, Guangzhou (CN)
Filed by GUANGDONG UNIVERSITY OF TECHNOLOGY, Guangzhou (CN)
Filed on Jan. 25, 2024, as Appl. No. 18/423,054.
Application 18/423,054 is a continuation of application No. PCT/CN2023/093885, filed on May 12, 2023.
Claims priority of application No. 202210649253.6 (CN), filed on Jun. 9, 2022.
Prior Publication US 2024/0159521 A1, May 16, 2024
Int. Cl. G01B 11/25 (2006.01); G06T 7/73 (2017.01); G06T 7/80 (2017.01); G06T 17/00 (2006.01); H04N 13/239 (2018.01)
CPC G01B 11/2545 (2013.01) [G06T 7/75 (2017.01); G06T 7/85 (2017.01); G06T 17/00 (2013.01); H04N 13/239 (2018.05); G06T 2207/10012 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/30204 (2013.01); G06T 2210/56 (2013.01)] 9 Claims
OG exemplary drawing
 
1. A three-dimensional measurement method, wherein the three-dimensional measurement method comprises the following steps of:
in combination with a three-step phase shift method, embedding marker line information into several sinusoidal stripe patterns to obtain several target stripe patterns;
projecting each of the target stripe patterns onto a surface of an object to be measured through a projector, and collecting stripe patterns on the surface of the object to be measured through left and right cameras;
calculating wrapped phase images, mean intensity images and modulated intensity images of the stripe patterns collected by the left and right cameras;
calculating mask images corresponding to the left and right cameras based on the mean intensity images and modulated intensity images of the stripe patterns collected by the left and right cameras, and performing a global search on the mask images corresponding to the left and right cameras through a minimum value filter, to extract marker lines corresponding to the left and right cameras;
according to the marker lines corresponding to the left and right cameras, adopting the marker lines as starting lines and performing a spatial phase unwrapping on the wrapped phase images corresponding to the left and right cameras to obtain the spatial phase of the left and right cameras based on the marker lines;
according to a unique correspondence between the spatial phase of the left and right cameras based on the marker lines, performing a coarse matching of wrapped phases based on geometric constraints to obtain candidate points of a right camera, and performing a fine spatial phase matching on the candidate points of the right camera to obtain a matching point of the right camera;
according to a conversion relationship between pixel coordinates of the right camera and pixel coordinates of the projector, obtaining an abscissa value of the matching point under the pixel coordinates of the projector, and calculating an absolute phase of the left camera based on the abscissa value of the matching point under the pixel coordinates of the projector;
based on the absolute phase of the left camera, reconstructing a three-dimensional point cloud according to a triangulation ranging technology to obtain a three-dimensional model of the object to be measured.
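The claim steps above describe a structured-light pipeline. The sketches that follow illustrate how some of them might be realized; they are minimal Python examples written for this summary, not code from the patent, and all function and variable names are illustrative. This first sketch computes the wrapped phase image, mean intensity image, and modulated intensity image from three fringe images, assuming a three-step phase shift of -2*pi/3, 0, +2*pi/3 (i.e. i_k = A + B*cos(phi + (k - 2)*2*pi/3)).

import numpy as np

def three_step_phase(i1, i2, i3):
    """Wrapped phase, mean intensity and modulation from three fringe images
    shifted by -2*pi/3, 0 and +2*pi/3 (i_k = A + B*cos(phi + (k - 2)*2*pi/3))."""
    i1, i2, i3 = (np.asarray(x, dtype=np.float64) for x in (i1, i2, i3))
    # Wrapped phase in (-pi, pi]
    wrapped = np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
    # Mean (background) intensity A and fringe modulation B
    mean = (i1 + i2 + i3) / 3.0
    modulation = np.sqrt(3.0 * (i1 - i3) ** 2 + (2.0 * i2 - i1 - i3) ** 2) / 3.0
    return wrapped, mean, modulation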
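The claim does not spell out how the mask images are formed; a common choice is the modulation-to-mean ratio (fringe contrast), under which an embedded marker line that suppresses the sinusoidal fringe appears as a narrow low-contrast valley. The sketch below locates one marker-line column per row with a one-dimensional minimum value filter (scipy.ndimage.minimum_filter1d is used as a stand-in for the claimed minimum value filter); the mask definition and window size are assumptions.

import numpy as np
from scipy.ndimage import minimum_filter1d

def extract_marker_line(mean, modulation, win=5, eps=1e-6):
    # Mask image assumed to be the fringe contrast B / A
    mask = modulation / (mean + eps)
    # Minimum value filter along each row suppresses isolated dark pixels
    filtered = minimum_filter1d(mask, size=win, axis=1)
    # Global search: column of the smallest filtered response in every row
    marker_cols = np.argmin(filtered, axis=1)
    return marker_cols  # one marker-line column index per image row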
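The fifth step uses the marker lines as starting lines for spatial phase unwrapping, which gives the two cameras a shared phase reference. A row-wise sketch under that reading is given below; the patent's actual unwrapping path may be more elaborate, and numpy.unwrap is used here purely as a convenient 1-D unwrapper.

import numpy as np

def unwrap_from_marker(wrapped, marker_cols):
    """Row-wise spatial unwrapping that starts at the marker-line column, so
    the marker line carries phase zero in both left and right images."""
    rows, _ = wrapped.shape
    unwrapped = np.empty_like(wrapped)
    for r in range(rows):
        c0 = marker_cols[r]
        # Unwrap to the right of the marker column...
        unwrapped[r, c0:] = np.unwrap(wrapped[r, c0:])
        # ...and to the left (reverse so unwrapping also starts at c0)
        unwrapped[r, :c0 + 1] = np.unwrap(wrapped[r, :c0 + 1][::-1])[::-1]
    # Reference the phase to the marker line itself
    unwrapped -= unwrapped[np.arange(rows), marker_cols][:, None]
    return unwrapped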
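Once left-camera pixels have been paired with right-camera matching points (coarse geometric-constraint matching followed by fine spatial-phase matching in the claim), the final reconstruction is ordinary stereo triangulation. The sketch below uses OpenCV's generic cv2.triangulatePoints with the 3x4 projection matrices from stereo calibration; it does not reproduce the patent's projector-coordinate absolute-phase step.

import numpy as np
import cv2

def reconstruct_point_cloud(pts_left, pts_right, P_left, P_right):
    """Triangulate matched (N, 2) left/right pixel arrays into an (N, 3) cloud."""
    pts4d = cv2.triangulatePoints(P_left, P_right,
                                  pts_left.T.astype(np.float64),
                                  pts_right.T.astype(np.float64))
    # Homogeneous -> Euclidean coordinates
    return (pts4d[:3] / pts4d[3]).T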