US 12,340,457 B2
Radiance field gradient scaling for unbiased near-camera training
Julien Philip, London (GB); and Valentin Deschaintre, London (GB)
Assigned to Adobe Inc., San Jose, CA (US)
Filed by Adobe Inc., San Jose, CA (US)
Filed on Jun. 9, 2023, as Appl. No. 18/207,923.
Prior Publication US 2024/0412444 A1, Dec. 12, 2024
Int. Cl. G06T 15/06 (2011.01); G06T 3/40 (2006.01); G06T 7/70 (2017.01); G06T 7/90 (2017.01); G06V 10/74 (2022.01)
CPC G06T 15/06 (2013.01) [G06T 3/40 (2013.01); G06T 7/70 (2017.01); G06T 7/90 (2017.01); G06V 10/761 (2022.01); G06T 2207/20081 (2013.01); G06T 2207/20084 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising one or more computing devices performing operations comprising:
accessing an input image of a three-dimensional (3D) environment, the input image comprising a plurality of pixels, wherein each pixel of the plurality of pixels comprises a pixel color;
determining a camera location based on the input image of the 3D environment;
determining a ray from the camera location in a direction of a pixel of the plurality of pixels;
integrating sampled information from a volumetric representation of the 3D environment along the ray from the camera location to obtain an integrated color corresponding to the pixel;
training a machine learning (ML) model configured to predict a density and a color of the 3D environment, the training comprising minimizing a loss function using a scaling factor that is determined based on a distance between the camera location and a point along the ray, the loss function defined based on a difference between the integrated color and the pixel color of the pixel; and
outputting the trained ML model for use in rendering an output image of the 3D environment.
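The following is a minimal, illustrative PyTorch sketch of the kind of pipeline recited in claim 1: sampled colors and densities are integrated along a camera ray by alpha compositing, and a photometric loss between the integrated color and the observed pixel color is minimized while per-sample gradients are scaled by a factor derived from the distance between the camera location and each sample point along the ray. It is not the patented implementation; the GradientScaler helper, the field(points, direction) signature, the near/far bounds, and the particular scaling rule min(distance^2, 1) are assumptions made for the example.

```python
# Hedged sketch, not the patented implementation: (a) volume-rendering
# integration of sampled densities/colors along a ray, and (b) a training
# loss whose per-sample gradients are scaled by a distance-based factor.
import torch


class GradientScaler(torch.autograd.Function):
    """Identity in the forward pass; scales gradients on the backward pass
    by a factor based on each sample's distance from the camera
    (here min(distance^2, 1), an assumed choice for illustration)."""

    @staticmethod
    def forward(ctx, colors, densities, distances):
        ctx.save_for_backward(distances)
        return colors, densities

    @staticmethod
    def backward(ctx, grad_colors, grad_densities):
        (distances,) = ctx.saved_tensors
        scale = distances.square().clamp(max=1.0)  # assumed scaling rule
        return grad_colors * scale.unsqueeze(-1), grad_densities * scale, None


def render_ray(field, origin, direction, near=0.1, far=6.0, n_samples=64):
    """Integrate sampled color along one ray via alpha compositing."""
    t = torch.linspace(near, far, n_samples)           # sample depths along the ray
    points = origin + t.unsqueeze(-1) * direction      # (n_samples, 3) sample locations
    colors, densities = field(points, direction)       # assumed model signature

    # Scale gradients of samples by their distance to the camera location.
    colors, densities = GradientScaler.apply(colors, densities, t)

    delta = t[1:] - t[:-1]                              # spacing between samples
    delta = torch.cat([delta, delta[-1:]])
    alpha = 1.0 - torch.exp(-densities * delta)         # per-segment opacity
    trans = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1], dim=0
    )                                                    # transmittance up to each sample
    weights = alpha * trans
    return (weights.unsqueeze(-1) * colors).sum(dim=0)  # integrated pixel color


def photometric_loss(field, origin, direction, pixel_color):
    """Loss defined on the difference between integrated and observed pixel color."""
    integrated = render_ray(field, origin, direction)
    return torch.nn.functional.mse_loss(integrated, pixel_color)
```

Because the scaling is applied only in the backward pass, the rendered color is unchanged while gradient contributions from samples close to the camera are damped in proportion to their distance, which is one way to realize the distance-dependent scaling factor recited in the training step of claim 1.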