US 12,322,142 B2
Method of synthesizing 3D joint data based on multi-view RGB-D camera
Jong Sung Kim, Daejeon (KR); Seong Il Yang, Daejeon (KR); and Minsung Yoon, Daejeon (KR)
Assigned to ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE, Daejeon (KR)
Filed by Electronics and Telecommunications Research Institute, Daejeon (KR)
Filed on Nov. 29, 2022, as Appl. No. 18/070,924.
Claims priority of application No. 10-2022-0091761 (KR), filed on Jul. 25, 2022.
Prior Publication US 2024/0029307 A1, Jan. 25, 2024
Int. Cl. G06T 7/80 (2017.01); G06V 40/10 (2022.01)
CPC G06T 7/85 (2017.01) [G06V 40/10 (2022.01); G06T 2207/10024 (2013.01)] 15 Claims
OG exemplary drawing
 
1. A method of automatically calibrating a multi-view red-green-blue-depth (RGB-D) camera, the method comprising:
converting joint data for calibration collected from a depth camera of each of a plurality of RGB-D cameras from a depth camera coordinate system of each of the RGB-D cameras to a color camera coordinate system of each of the RGB-D cameras;
calculating a binary gate value of the converted joint data for calibration based on a confidence level and a confidence threshold; and
based on the converted joint data for calibration and the binary gate value, calculating a rotation matrix and a translation vector for converting joint data collected from the depth camera of each of the RGB-D cameras from the color camera coordinate system of each of the RGB-D cameras to a predetermined reference coordinate system.
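The sketch below illustrates one possible reading of the three steps recited in claim 1, assuming per-camera depth-to-color extrinsics (R_dc, t_dc) and a weighted Kabsch/Procrustes solution for the rotation matrix and translation vector. The claim does not prescribe any particular algorithm; the function names, parameters, and use of NumPy are illustrative assumptions, not the patented method.

    import numpy as np

    def depth_to_color(joints_depth, R_dc, t_dc):
        # Step 1 (assumed form): map Nx3 joint positions from a camera's depth
        # coordinate system to its color coordinate system using that camera's
        # depth-to-color extrinsics (rotation R_dc, translation t_dc).
        return joints_depth @ R_dc.T + t_dc

    def binary_gate(confidence, threshold):
        # Step 2 (assumed form): gate value is 1 for joints whose confidence
        # level meets the confidence threshold, 0 otherwise.
        return (confidence >= threshold).astype(np.float64)

    def rigid_transform_to_reference(joints_color, joints_ref, gate):
        # Step 3 (one common technique, not necessarily the claimed one):
        # weighted Kabsch alignment that finds R, t minimizing
        #   sum_i gate_i * || R @ joints_color_i + t - joints_ref_i ||^2,
        # so joints gated out by low confidence do not influence the result.
        w = gate / max(gate.sum(), 1e-9)                  # normalized per-joint weights
        c_src = (w[:, None] * joints_color).sum(axis=0)   # weighted centroid (source)
        c_dst = (w[:, None] * joints_ref).sum(axis=0)     # weighted centroid (reference)
        src_c = joints_color - c_src
        dst_c = joints_ref - c_dst
        H = (w[:, None] * src_c).T @ dst_c                # 3x3 weighted cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_dst - R @ c_src
        return R, t

Under these assumptions, each RGB-D camera would be calibrated by mapping its calibration joints through depth_to_color, gating them against the confidence threshold, and solving rigid_transform_to_reference against the same joints expressed in the predetermined reference coordinate system (for example, the color coordinate system of a designated reference camera).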