US 12,142,012 B2
Method and system for re-projecting and combining sensor data for visualization
Dae Hyun Lee, Etobicoke (CA); and Tyler James Doyle, Toronto (CA)
Assigned to INTERAPTIX INC., Toronto (CA)
Filed by INTERAPTIX INC., Toronto (CA)
Filed on Jun. 13, 2023, as Appl. No. 18/208,937.
Application 18/208,937 is a continuation of application No. 17/680,802, filed on Feb. 25, 2022, granted, now 11,715,236.
Application 17/680,802 is a continuation of application No. 16/791,203, filed on Feb. 14, 2020, granted, now 11,288,842, issued on Mar. 29, 2022.
Claims priority of provisional application 62/806,324, filed on Feb. 15, 2019.
Prior Publication US 2023/0326078 A1, Oct. 12, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 17/00 (2006.01); G06T 7/00 (2017.01); G06T 7/80 (2017.01)
CPC G06T 7/80 (2017.01) [G06T 7/97 (2017.01); H04N 17/002 (2013.01)] 14 Claims
OG exemplary drawing
 
1. A computer-implemented method of re-projecting and combining sensor data of a scene from a plurality of sensors for visualization, the method comprising:
receiving the sensor data from the plurality of sensors, the sensor data comprising red-green-blue-depth (RGB-D) channel values for each of a plurality of pixels and the values of the RGB-D channels originating from a combination of two or more sensors;
re-projecting the sensor data from each of the sensors into a new viewpoint by:
calibrating each of the sensors to determine one or more calibration values for the respective sensor;
determining a mapping function to obtain an undistorted image representative of the new viewpoint for each sensor;
generating a point cloud for the respective sensor by applying the calibration values to each of the depth (D) channel values for each pixel; and
applying a matrix representative of the new viewpoint to the point cloud for each sensor;
localizing each of the re-projected sensor data;
combining the localized re-projected sensor data into a combined image; and
outputting the combined image.
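The re-projection and combination steps recited in claim 1 can be sketched in code. The following is a minimal, hypothetical illustration only, not the patented implementation: it assumes a pinhole camera model with a 3 × 3 intrinsic matrix `K` standing in for the "calibration values", a 4 × 4 rigid transform `T` standing in for the "matrix representative of the new viewpoint", and a nearest-point (z-buffer) rule standing in for the unspecified combination step. All function names (`depth_to_point_cloud`, `reproject`, `combine`) are invented for this sketch.

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Back-project a depth map (H x W, metres) to an N x 3 point cloud
    using pinhole intrinsics K (3 x 3)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    x = (u.ravel() - K[0, 2]) * z / K[0, 0]
    y = (v.ravel() - K[1, 2]) * z / K[1, 1]
    return np.stack([x, y, z], axis=1)

def reproject(points, colors, T, K, shape):
    """Apply a 4 x 4 new-viewpoint matrix T to the points, then project
    them with intrinsics K into an RGB image of the given (H, W) shape,
    keeping the nearest point per pixel (z-buffer)."""
    h, w = shape
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    cam = (T @ pts_h.T).T[:, :3]
    valid = cam[:, 2] > 0                       # points in front of camera
    cam, colors = cam[valid], colors[valid]
    u = np.round(K[0, 0] * cam[:, 0] / cam[:, 2] + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[:, 1] / cam[:, 2] + K[1, 2]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, z, colors = u[inside], v[inside], cam[inside, 2], colors[inside]
    image = np.zeros((h, w, 3))
    zbuf = np.full((h, w), np.inf)
    for i in np.argsort(-z):                    # far points first; near overwrite
        if z[i] < zbuf[v[i], u[i]]:
            zbuf[v[i], u[i]] = z[i]
            image[v[i], u[i]] = colors[i]
    return image, zbuf

def combine(images, zbufs):
    """Combine re-projected images by taking, per pixel, the colour from
    the sensor whose point lies nearest to the new viewpoint."""
    best = np.argmin(np.stack(zbufs), axis=0)
    return np.take_along_axis(
        np.stack(images), best[None, ..., None], axis=0)[0]
```

Under this sketch, re-projecting a single sensor's data with `T` equal to the identity and the same `K` reproduces the original image wherever depth is valid, which gives a quick sanity check of the geometry.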