US 12,347,027 B2
Systems and methods for rendering virtual objects using editable light-source parameter estimation
Mathieu Garon, Montreal (CA); Henrique Weber, Quebec (CA); and Jean-Francois Lalonde, Quebec (CA)
Assigned to Technologies Depix Inc., Montreal (CA)
Filed by TECHNOLOGIES DEPIX INC., Montreal (CA)
Filed on May 12, 2023, as Appl. No. 18/196,887.
Claims priority of provisional application 63/364,588, filed on May 12, 2022.
Prior Publication US 2023/0368459 A1, Nov. 16, 2023
Int. Cl. G06T 15/50 (2011.01); G06T 15/04 (2011.01); G06T 15/60 (2006.01); G06V 10/82 (2022.01)
CPC G06T 15/506 (2013.01) [G06T 15/04 (2013.01); G06T 15/60 (2013.01); G06V 10/82 (2022.01)] 20 Claims
OG exemplary drawing
 
1. A method for rendering a virtual object at a designated position in an input digital image corresponding to a perspective of a scene, the method comprising:
estimating a set of lighting parameters representing a light source in the scene, the lighting parameters being estimated using a lighting neural network trained to map the input digital image to the set of lighting parameters;
estimating a scene layout corresponding to a parametric representation of the scene, the scene layout being estimated using a layout neural network trained to map at least the input digital image to the parametric representation of the scene;
generating an environment texture map corresponding to predicted textures of surfaces in an environment of the scene, the environment texture map being generated using a texture neural network trained to predict a texture conditioned on an input comprising the input digital image, the lighting parameters, and the scene layout;
rendering the virtual object in a virtual scene constructed using the estimated lighting parameters, the scene layout, and the environment texture map; and
compositing the rendered virtual object on the input digital image at the designated position.
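The following is a minimal PyTorch sketch of the pipeline recited in claim 1, included only to illustrate the data flow among the three networks and the final composite. Every concrete detail is an assumption made for the example, not the patented implementation: the editable light parameterization (3-D direction, RGB color, scalar intensity), the 9-value cuboid layout encoding, the equirectangular texture-map resolution, the tiny network architectures, and the placeholder renderer and alpha compositor.

```python
# Illustrative sketch only; all parameterizations, architectures, and the
# stand-in renderer are assumptions, not the claimed implementation.
import torch
import torch.nn as nn


def small_encoder(out_dim: int) -> nn.Module:
    """Tiny CNN image encoder shared by the example networks (assumption)."""
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(64, out_dim),
    )


class LightingNet(nn.Module):
    """Image -> editable light-source parameters (assumed 7-D: direction, RGB, intensity)."""
    def __init__(self):
        super().__init__()
        self.net = small_encoder(7)

    def forward(self, image):
        return self.net(image)


class LayoutNet(nn.Module):
    """Image -> parametric scene layout (assumed 9-D cuboid: center, extents, yaw)."""
    def __init__(self):
        super().__init__()
        self.net = small_encoder(9)

    def forward(self, image):
        return self.net(image)


class TextureNet(nn.Module):
    """(image, lighting params, layout) -> environment texture map.

    The conditioning scheme is an assumption: image features and the
    parameter vectors are embedded, concatenated, and decoded into a small
    equirectangular RGB texture.
    """
    def __init__(self, param_dim=7 + 9, tex_hw=(64, 128)):
        super().__init__()
        self.tex_hw = tex_hw
        self.encode = small_encoder(128)
        self.cond = nn.Linear(param_dim, 128)
        self.decode = nn.Linear(256, 3 * tex_hw[0] * tex_hw[1])

    def forward(self, image, lighting, layout):
        params = torch.cat([lighting, layout], dim=1)
        z = torch.cat([self.encode(image), self.cond(params)], dim=1)
        return torch.sigmoid(self.decode(z).view(-1, 3, *self.tex_hw))


def render_virtual_object(lighting, layout, texture, hw):
    """Placeholder for a real renderer: returns an RGBA layer of the virtual
    object whose brightness follows the predicted light intensity."""
    b = lighting.shape[0]
    rgba = torch.zeros(b, 4, *hw)
    intensity = torch.sigmoid(lighting[:, 6]).view(b, 1, 1, 1)
    rgba[:, :3, 40:80, 60:100] = 0.5 * intensity  # shaded object patch
    rgba[:, 3:, 40:80, 60:100] = 1.0              # alpha mask of the object
    return rgba


def composite(rendered_rgba, image):
    """Alpha-composite the rendered object layer over the input image."""
    rgb, alpha = rendered_rgba[:, :3], rendered_rgba[:, 3:]
    return alpha * rgb + (1.0 - alpha) * image


if __name__ == "__main__":
    image = torch.rand(1, 3, 128, 160)               # stand-in input photograph
    lighting = LightingNet()(image)                  # estimate lighting parameters
    layout = LayoutNet()(image)                      # estimate parametric scene layout
    texture = TextureNet()(image, lighting, layout)  # generate environment texture map
    rendered = render_virtual_object(lighting, layout, texture, image.shape[-2:])
    out = composite(rendered, image)                 # composite at the designated position
    print(out.shape)                                 # torch.Size([1, 3, 128, 160])
```

In this sketch the conditioning of the texture network on both the lighting parameters and the layout mirrors the ordering of the claim elements: the lighting and layout estimates are produced first and then supplied, together with the input image, to the texture prediction step; a production system would replace the placeholder renderer and compositor with a physically based renderer and a proper alpha-blending stage.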