US 11,734,861 B2
2D and 3D floor plan generation
Shreya Goyal, Jaipur (IN); Chiranjoy Chattopadhyay, Jodhpur (IN); Naimul Mefraz Khan, Toronto (CA); Gaurav Bhatnagar, Jodhpur (IN); Srinivas Krishna, Toronto (CA); Laura Thomas, Toronto (CA); and Daniel Chantal Mills, Toronto (CA)
Assigned to AWE Company Limited, Toronto (CA)
Filed by AWE Company Limited, Toronto (CA)
Filed on Sep. 22, 2021, as Appl. No. 17/482,111.
Prior Publication US 2023/0106339 A1, Apr. 6, 2023
Int. Cl. G06T 11/00 (2006.01); G06T 7/13 (2017.01); G06T 7/00 (2017.01); G06T 7/30 (2017.01); G06T 7/50 (2017.01); G06T 3/00 (2006.01); G06F 30/13 (2020.01); G06T 7/70 (2017.01); G06V 10/44 (2022.01); G06V 20/10 (2022.01); G06F 18/23 (2023.01)
CPC G06T 11/00 (2013.01) [G06F 18/23 (2023.01); G06F 30/13 (2020.01); G06T 3/0031 (2013.01); G06T 7/13 (2017.01); G06T 7/30 (2017.01); G06T 7/50 (2017.01); G06T 7/70 (2017.01); G06T 7/97 (2017.01); G06V 10/44 (2022.01); G06V 20/10 (2022.01); G06T 2207/10016 (2013.01); G06T 2207/10024 (2013.01); G06T 2207/10028 (2013.01); G06T 2207/20084 (2013.01); G06T 2207/30244 (2013.01); G06T 2210/04 (2013.01)] 28 Claims
OG exemplary drawing
 
1. A modelling method comprising:
receiving two-dimensional (2D) images of corners of an interior space captured by a camera;
generating, using a positioning module, a corresponding camera position and camera orientation in a three-dimensional (3D) coordinate system in the interior space for each 2D image;
generating a corresponding depth map for each 2D image by using a depth module to estimate depth for each pixel in each 2D image;
generating a corresponding edge map for each 2D image by using an edge module to identify whether each pixel in each 2D image is a wall or an edge;
generating, using a reconstruction module, a 3D point cloud for each 2D image using the corresponding depth map and a focal length and center coordinates of the camera;
transforming, using a transformation module, the 3D point clouds with the corresponding edge map into a 2D space in the 3D coordinate system from a perspective of the camera;
regularizing, using a regularization module, the 3D point clouds in the 2D space into boundary lines; and
generating a 2D plan of the interior space from the boundary lines;
wherein coordinates for each pixel in each 3D point cloud are generated by:

X = (u − Cx) · Z / f
Y = (v − Cy) · Z / f
Z = Du,v / S
wherein X, Y are coordinates corresponding to a real world,
Z is a depth coordinate,
Du,v is a depth value corresponding to the (u,v) pixel in the depth map,
S is a scaling factor of each corresponding 2D image,
f is the focal length of the camera, and
Cx, Cy are the center coordinates of the camera.
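
The back-projection recited in claim 1 is the standard pinhole-camera relation between a per-pixel depth map and a 3D point cloud. The following is a minimal sketch of that step, assuming a NumPy implementation; the function name depth_to_point_cloud, the array layout, and the placement of the scaling factor S as a divisor on the depth value are illustrative assumptions, not taken from the patent text.

```python
import numpy as np

def depth_to_point_cloud(depth_map, f, cx, cy, scaling_factor=1.0):
    """Back-project a depth map into a 3D point cloud using the pinhole
    relations recited in claim 1 (sketch; names and S placement are assumed).

    depth_map      : (H, W) array, Du,v depth value for each pixel (u, v)
    f              : camera focal length in pixels
    cx, cy         : camera center (principal point) coordinates in pixels
    scaling_factor : S, scaling factor of the corresponding 2D image
    """
    h, w = depth_map.shape
    # Pixel coordinate grids: u runs along the image width, v along the height.
    u, v = np.meshgrid(np.arange(w), np.arange(h))

    # Z = Du,v / S  (depth coordinate; divisor placement is an assumption).
    Z = depth_map / scaling_factor
    # X = (u - Cx) * Z / f and Y = (v - Cy) * Z / f  (real-world coordinates).
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f

    return np.stack([X, Y, Z], axis=-1).reshape(-1, 3)


if __name__ == "__main__":
    # Toy example: a flat surface 3 m from a 640x480 camera with f = 500 px.
    depth = np.full((480, 640), 3.0)
    cloud = depth_to_point_cloud(depth, f=500.0, cx=320.0, cy=240.0)
    print(cloud.shape)  # (307200, 3)
```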
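The transformation and regularization steps of claim 1 (projecting the point clouds into a 2D space and regularizing them into boundary lines) are not detailed in this excerpt. The sketch below shows one plausible reading under stated assumptions: wall points are projected onto the floor plane and snapped to axis-aligned boundary segments. The helper name regularize_to_boundary_lines, the grid resolution, the choice of which axis is vertical, and the axis-alignment assumption are all illustrative, not taken from the patent.

```python
import numpy as np

def regularize_to_boundary_lines(points_3d, grid=0.05):
    """Project 3D wall points onto the floor plane and snap them to
    axis-aligned boundary segments (one plausible regularization sketch).

    points_3d : (N, 3) array of X, Y, Z wall points from the point cloud
    grid      : snapping resolution in the same units as the point cloud
    """
    # Top-down projection: keep the two horizontal axes (X and Z here;
    # which axis is "up" depends on the camera convention assumed).
    xz = points_3d[:, [0, 2]]

    # Snap to a coarse grid so noisy wall points collapse onto shared lines.
    snapped = np.round(xz / grid) * grid

    # Emit one vertical boundary segment per well-supported wall column;
    # an analogous pass over the Z axis would give the horizontal walls.
    segments = []
    for x in np.unique(snapped[:, 0]):
        zs = snapped[snapped[:, 0] == x, 1]
        if len(zs) > 50:  # keep only columns with enough supporting points
            segments.append(((x, zs.min()), (x, zs.max())))
    return segments
```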