US 12,229,977 B2
Augmented reality guided depth estimation
Georgios Evangelidis, Vienna (AT); Branislav Micusik, St. Andrae-Woerdern (AT); and Sagi Katz, Yokneam Ilit (IL)
Assigned to SNAP INC., Santa Monica, CA (US)
Filed by Snap Inc., Santa Monica, CA (US)
Filed on Nov. 18, 2021, as Appl. No. 17/529,527.
Claims priority of provisional application 63/189,980, filed on May 18, 2021.
Prior Publication US 2022/0375110 A1, Nov. 24, 2022
Int. Cl. G06T 7/50 (2017.01); G06F 3/01 (2006.01); G06T 3/18 (2024.01); G06T 19/00 (2011.01)
CPC G06T 7/50 (2017.01) [G06F 3/012 (2013.01); G06T 3/18 (2024.01); G06T 19/006 (2013.01)] 19 Claims
OG exemplary drawing
 
1. A method comprising:
identifying a virtual object rendered in a first frame, a location of the virtual object in the first frame based on a first pose of an augmented reality (AR) device;
displaying the virtual object in the first frame in an optically-transparent screen of the AR device;
determining a second pose of the AR device, the second pose following the first pose;
applying a warping transformation of the virtual object based on the location of the virtual object rendered in the first frame and the second pose of the AR device to identify a projected location of the virtual object in a second frame;
identifying a projected path of the virtual object based on a preconfigured dynamics behavior of the virtual object;
identifying, using a processor of the AR device, an augmentation area in the second frame, the augmentation area comprising the projected location based on the projected path of the virtual object in the second frame;
determining depth information limited to the augmentation area in the second frame;
rendering, using a graphical processing unit of the AR device, the virtual object in the second frame based on the depth information; and
displaying the virtual object in the second frame in the optically-transparent screen of the AR device.
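The steps of claim 1 describe a reprojection-and-limited-depth pipeline: warp the virtual object's first-frame location into the second frame using the new device pose, grow an augmentation area around the projected path, and compute depth only inside that area before rendering. The sketch below is a minimal, hypothetical illustration of that flow, not the patented implementation; the intrinsics `K`, the constant `depth_guess`, the `margin`, and the `depth_fn` callback are all illustrative assumptions.

```python
import numpy as np

# Hypothetical pinhole intrinsics (focal length and principal point are assumed).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(pose, point_world):
    # pose: 4x4 world-to-camera transform; returns the pixel (u, v).
    p_cam = (pose @ np.append(point_world, 1.0))[:3]
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

def warp_location(loc_first, pose_first, pose_second, depth_guess):
    # "Warping transformation": back-project the first-frame pixel at an
    # assumed depth, then re-project the 3-D point with the second pose.
    ray = np.linalg.inv(K) @ np.array([loc_first[0], loc_first[1], 1.0])
    p_cam1 = ray * depth_guess
    p_world = (np.linalg.inv(pose_first) @ np.append(p_cam1, 1.0))[:3]
    return project(pose_second, p_world)

def augmentation_area(projected_loc, path_offset, margin=20.0):
    # Bounding box covering the projected location and the point it is
    # expected to reach along its preconfigured dynamics path.
    pts = np.stack([projected_loc, projected_loc + path_offset])
    return pts.min(axis=0) - margin, pts.max(axis=0) + margin

def depth_in_area(depth_fn, lo, hi, step=8):
    # Evaluate depth only inside the augmentation area, not the full frame.
    us = np.arange(lo[0], hi[0], step)
    vs = np.arange(lo[1], hi[1], step)
    return np.array([[depth_fn(u, v) for u in us] for v in vs])
```

With identical first and second poses the warp is the identity, and the depth pass touches only the small grid inside the augmentation area rather than every pixel of the frame, which is the efficiency the claim's "depth information limited to the augmentation area" step is directed at.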