US 12,223,611 B2
Generating styles for neural style transfer in three-dimensional shapes
Hooman Shayani, Longfield (GB); Marco Fumero, Rome (IT); and Aditya Sanghi, Toronto (CA)
Assigned to AUTODESK, INC., San Francisco, CA (US)
Filed by AUTODESK, INC., San Francisco, CA (US)
Filed on Jan. 3, 2023, as Appl. No. 18/149,609.
Claims priority of provisional application 63/328,658, filed on Apr. 7, 2022.
Prior Publication US 2023/0326159 A1, Oct. 12, 2023
Int. Cl. G06T 19/20 (2011.01); G06N 3/0455 (2023.01); G06N 3/0475 (2023.01); G06N 3/08 (2023.01); G06N 3/092 (2023.01); G06T 17/00 (2006.01); G06T 17/10 (2006.01)
CPC G06T 19/20 (2013.01) [G06N 3/0455 (2023.01); G06N 3/0475 (2023.01); G06N 3/08 (2013.01); G06N 3/092 (2023.01); G06T 17/00 (2013.01); G06T 17/10 (2013.01); G06T 2210/56 (2013.01); G06T 2219/2021 (2013.01); G06T 2219/2024 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A computer-implemented method for performing style transfer, the method comprising:
determining a distribution associated with a plurality of style codes for a plurality of three-dimensional (3D) shapes, wherein each style code included in the plurality of style codes represents a difference between a first 3D shape and a second 3D shape, and wherein the second 3D shape is generated by applying one or more augmentations to the first 3D shape;
sampling from the distribution to generate an additional style code;
executing a first trained machine learning model based on the additional style code to generate an output 3D shape having one or more style-based attributes associated with the additional style code and one or more content-based attributes associated with an object; and
generating a 3D model of the object based on the output 3D shape.
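
Claim 1 above reads as a pipeline: encode an original shape and an augmented copy of it, take their latent difference as a style code, fit a distribution over many such codes, sample a fresh code, and condition a trained generator on that code together with a content code for an object. The sketch below illustrates that flow in NumPy under loose assumptions; every name (encode, decode, augment, the random-projection stand-ins for trained networks) is hypothetical and is not the patented implementation.

```python
"""Minimal sketch of the claimed style-transfer flow.
All components are hypothetical stand-ins; a real system would use trained
neural encoders/decoders operating on 3D shape representations."""
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 32
GRID = 16  # toy voxel resolution

# Stand-ins for trained networks (fixed random projections, illustration only).
W_enc = rng.normal(scale=0.01, size=(GRID**3, LATENT_DIM))      # "shape encoder"
W_dec = rng.normal(scale=0.01, size=(2 * LATENT_DIM, GRID**3))  # "conditional generator"

def encode(voxels):
    """Map a voxel grid to a latent code (placeholder for a trained 3D encoder)."""
    return voxels.reshape(-1) @ W_enc

def decode(content_code, style_code):
    """Generate an output voxel grid from a content code plus a style code
    (placeholder for the 'first trained machine learning model' of the claim)."""
    z = np.concatenate([content_code, style_code])
    return (z @ W_dec > 0.0).astype(np.float32).reshape(GRID, GRID, GRID)

def augment(voxels):
    """One simple augmentation (axis flip) producing the 'second 3D shape'."""
    return voxels[::-1, :, :].copy()

# 1. Style codes: difference between each shape and its augmented counterpart.
shapes = [(rng.random((GRID, GRID, GRID)) > 0.7).astype(np.float32) for _ in range(50)]
style_codes = np.stack([encode(augment(s)) - encode(s) for s in shapes])

# 2. Fit a distribution over the style codes (a diagonal Gaussian here).
mu, sigma = style_codes.mean(axis=0), style_codes.std(axis=0) + 1e-6

# 3. Sample an additional style code from that distribution.
new_style = rng.normal(mu, sigma)

# 4. Generate the output 3D shape for a chosen object.
content_code = encode(shapes[0])                 # content-based attributes of the object
output_voxels = decode(content_code, new_style)  # style-based attributes from new_style

# A real pipeline would then convert the output representation (voxels, SDF,
# point cloud) into a mesh, e.g. via marching cubes, to obtain the 3D model.
print("output grid:", output_voxels.shape, "occupied cells:", int(output_voxels.sum()))
```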