US 11,842,444 B2
Visualization of camera location in a real-time synchronized 3D mesh
Sean M. Adkinson, North Plains, OR (US); Teressa Chizeck, Portland, OR (US); and Ryan R. Fink, Vancouver, WA (US)
Assigned to STREEM, LLC, Portland, OR (US)
Filed by STREEM, LLC, Portland, OR (US)
Filed on Jun. 2, 2021, as Appl. No. 17/336,797.
Prior Publication US 2022/0392167 A1, Dec. 8, 2022
Int. Cl. G06T 17/20 (2006.01); G06T 7/593 (2017.01); G06T 7/73 (2017.01); G06T 19/00 (2011.01)
CPC G06T 17/205 (2013.01) [G06T 7/593 (2017.01); G06T 7/73 (2017.01); G06T 19/006 (2013.01)] 16 Claims
OG exemplary drawing
 
1. A method for representing a capturing device location within a 3D mesh, comprising:
receiving, at a receiving device, a video stream and augmented reality (AR) data correlated to the video stream from the capturing device, the video stream and AR data representing an environment around the capturing device;
generating incrementally, from the video stream and AR data, a 3D mesh representation of the environment;
determining, from the video stream and AR data, a 3D location and a spatial orientation of the capturing device within the environment;
indicating, on a display of the 3D mesh with a graphic widget, the 3D location and spatial orientation of the capturing device within the 3D mesh, the spatial orientation indicating the rotation and tilt of the capturing device on each of an X, Y, and Z axis; and
updating, iteratively, the 3D location and spatial orientation of the capturing device within the 3D mesh.
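The claimed steps can be illustrated with a minimal sketch. This is not the patented implementation — `CameraPose`, `widget_vertices`, and `update_pose` are illustrative names, and the per-axis Euler-angle representation is an assumption matching the claim's "rotation and tilt ... on each of an X, Y, and Z axis":

```python
from dataclasses import dataclass
import math


@dataclass
class CameraPose:
    """3D location and spatial orientation of the capturing device
    within the 3D mesh's coordinate frame (illustrative structure)."""
    x: float
    y: float
    z: float
    rx: float  # rotation about the X axis, radians
    ry: float  # rotation about the Y axis, radians
    rz: float  # rotation about the Z axis, radians


def rotation_matrix(rx: float, ry: float, rz: float):
    """Compose per-axis rotations (X first, then Y, then Z) into a 3x3 matrix."""
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    Rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    Ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    Rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(Rz, matmul(Ry, Rx))


def widget_vertices(pose: CameraPose, template):
    """Place a graphic widget at the device pose: rotate the widget's
    template vertices (defined in camera space) by the device orientation,
    then translate them to the device's 3D location in the mesh."""
    R = rotation_matrix(pose.rx, pose.ry, pose.rz)
    placed = []
    for vx, vy, vz in template:
        placed.append((
            R[0][0] * vx + R[0][1] * vy + R[0][2] * vz + pose.x,
            R[1][0] * vx + R[1][1] * vy + R[1][2] * vz + pose.y,
            R[2][0] * vx + R[2][1] * vy + R[2][2] * vz + pose.z,
        ))
    return placed


def update_pose(ar_frame) -> CameraPose:
    """Iterative update: each incoming frame of AR data is assumed here to
    carry a fresh position and per-axis orientation for the device."""
    px, py, pz = ar_frame["position"]
    rx, ry, rz = ar_frame["orientation"]
    return CameraPose(px, py, pz, rx, ry, rz)
```

On each received frame, a viewer would call `update_pose` and re-render the widget at `widget_vertices(pose, template)`, so the marker tracks the capturing device within the incrementally generated mesh.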