US 12,348,698 B2
Capturing and aligning panoramic image and depth data
Kyle Simek, San Jose, CA (US); David Alan Gausebeck, Mountain View, CA (US); and Matthew Tschudy Bell, Palo Alto, CA (US)
Assigned to Matterport, Inc., Sunnyvale, CA (US)
Filed by Matterport, Inc., Sunnyvale, CA (US)
Filed on Apr. 27, 2023, as Appl. No. 18/308,639.
Application 18/308,639 is a continuation of application No. 16/559,135, filed on Sep. 3, 2019, granted, now 11,677,920,
which is a continuation of application No. 15/417,162, filed on Jan. 26, 2017, granted, now 10,848,731, issued on Nov. 24, 2020,
which is a continuation of application No. 14/070,426, filed on Nov. 1, 2013, granted, now 10,482,679, issued on Nov. 19, 2019,
which is a continuation of application No. 13/776,688, filed on Feb. 25, 2013, granted, now 9,324,190, issued on Apr. 26, 2016.
Claims priority of provisional application 61/603,221, filed on Feb. 24, 2012.
Prior Publication US 2023/0269353 A1, Aug. 24, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. H04N 13/106 (2018.01); H04N 5/265 (2006.01); H04N 13/232 (2018.01); H04N 13/239 (2018.01); H04N 13/254 (2018.01); H04N 13/271 (2018.01); H04N 23/45 (2023.01); H04N 23/698 (2023.01)
CPC H04N 13/106 (2018.05) [H04N 5/265 (2013.01); H04N 13/232 (2018.05); H04N 13/239 (2018.05); H04N 13/254 (2018.05); H04N 13/271 (2018.05); H04N 23/45 (2023.01); G06T 2207/20221 (2013.01); H04N 23/698 (2023.01)] 21 Claims
OG exemplary drawing
 
1. A device having a vertical y axis extending through a center point, the device comprising:
a housing including:
at least one camera having a fisheye camera lens configured to capture 2D image data of an environment from a fixed location, the at least one camera positioned at a first fixed distance from the center point;
at least one depth sensor device including at least one light imaging detection and ranging (LiDAR) device configured to capture 3D depth data, the at least one depth sensor device positioned at a second fixed distance from the center point;
a horizontal rotatable mount configured to enable the fisheye camera lens of the at least one camera to rotate about the vertical y axis, the at least one camera being capable of capturing a plurality of images with mutually overlapping fields of view at different viewpoints; and
at least one processor configured to map, during capture of the 2D image data by the at least one camera and the 3D depth data by the at least one depth sensor device, the 2D image data from the at least one camera and the 3D depth data from the at least one depth sensor device to a common spatial 3D coordinate space based on known capture positions and orientations of the at least one camera and the at least one depth sensor device determined based on the first fixed distance and the second fixed distance to facilitate associating 3D coordinates with respective visual features included in the 2D image data relative to the common spatial 3D coordinate space.
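The mapping element of claim 1 rests on a simple geometric idea: because the camera and depth sensor sit at fixed, known distances from the device's center point and rotate about its vertical y axis, their capture positions and orientations at any mount angle are known, so depth returns and image viewing rays can be expressed in one device-centered coordinate space. The sketch below illustrates that idea only; it is not the patented implementation, and the offsets, function names, and frame conventions (y up, offsets along the head's local x axis) are illustrative assumptions rather than details from the patent.

```python
"""Minimal sketch (assumed conventions, not the patented method): placing LiDAR
depth points and fisheye camera viewing rays into a common device-centered
3D coordinate space, given fixed sensor offsets and the mount angle."""
import numpy as np

# Illustrative fixed offsets (meters) from the center point along the rotating
# head's local x axis; real values would come from the device's calibration.
CAMERA_OFFSET = np.array([0.10, 0.0, 0.0])   # "first fixed distance"
LIDAR_OFFSET = np.array([0.15, 0.0, 0.0])    # "second fixed distance"


def rotation_y(theta: float) -> np.ndarray:
    """Rotation matrix for an angle theta (radians) about the vertical y axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])


def depth_point_to_common(point_sensor: np.ndarray, theta: float) -> np.ndarray:
    """Map a 3D point measured in the depth sensor's frame into the common
    device-centered frame, given the mount angle theta at capture time."""
    return rotation_y(theta) @ (point_sensor + LIDAR_OFFSET)


def pixel_ray_to_common(direction_cam: np.ndarray, theta: float):
    """Map a viewing ray (unit direction derived from a fisheye pixel, in the
    camera frame) into the common frame as an (origin, direction) pair."""
    R = rotation_y(theta)
    origin = R @ CAMERA_OFFSET
    direction = R @ (direction_cam / np.linalg.norm(direction_cam))
    return origin, direction


if __name__ == "__main__":
    # A depth return 2 m in front of the sensor, captured at a 90-degree mount angle.
    p = depth_point_to_common(np.array([0.0, 0.0, 2.0]), np.pi / 2)
    o, d = pixel_ray_to_common(np.array([0.0, 0.0, 1.0]), np.pi / 2)
    # With both expressed in the same coordinate space, a 3D coordinate can be
    # associated with the visual feature whose viewing ray passes closest to it.
    print(p, o, d)
```

In this reading, associating 3D coordinates with visual features reduces to finding, for each depth point, the image ray in the common frame that passes nearest to it; the claim itself does not prescribe any particular association algorithm.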