US 11,954,779 B2
Animation generation method for tracking facial expression and neural network training method thereof
Chin-Yu Chien, Central (HK); Yu-Hsien Li, Central (HK); and Yi-Chi Cheng, Central (HK)
Assigned to DIGITAL DOMAIN ENTERPRISES GROUP LIMITED, Hong Kong (CN)
Filed by Digital Domain Enterprises Group Limited, Central (HK)
Filed on Mar. 8, 2022, as Appl. No. 17/689,495.
Claims priority of provisional application 63/158,384, filed on Mar. 9, 2021.
Prior Publication US 2022/0292753 A1, Sep. 15, 2022
Int. Cl. G06T 13/40 (2011.01); G06F 18/214 (2023.01); G06V 10/82 (2022.01); G06V 40/16 (2022.01)
CPC G06T 13/40 (2013.01) [G06F 18/214 (2023.01); G06V 10/82 (2022.01); G06V 40/174 (2022.01)] 9 Claims
OG exemplary drawing
 
1. An animation generation method for tracking a facial expression, comprising: applying an expression parameter set to a first 3D role model, by a 3D engine, so that the first 3D role model presents a facial expression corresponding to the expression parameter set;
rendering the first 3D role model to obtain a virtual expression image corresponding to the expression parameter set, the virtual expression image presenting the facial expression of the first 3D role model;
applying a plurality of real facial images respectively to the virtual expression image corresponding to the facial expression, to generate a plurality of real expression images having the same facial expression as the virtual expression image;
training a tracking neural network according to the expression parameter set, which corresponds to the facial expression of the first 3D role model, and the real expression images having the same facial expression as the virtual expression image;
inputting a target facial image to the trained tracking neural network to obtain a predicted expression parameter set; and
applying the predicted expression parameter set to a second 3D role model, by the 3D engine, so that the second 3D role model presents an expression similar to that of the target facial image.
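
A minimal sketch of the claimed training-and-inference pipeline in PyTorch follows. The patent does not specify a framework, network architecture, parameter count, or loss function; TrackingNet, its layer sizes, the MSE loss, and the placeholder tensors below are illustrative assumptions only.

```python
# Illustrative sketch only: the patent does not disclose an architecture or
# framework. TrackingNet and its hyperparameters are hypothetical stand-ins
# for the claimed "tracking neural network".
import torch
import torch.nn as nn

class TrackingNet(nn.Module):
    """Maps a face image to a predicted expression parameter set
    (e.g., blendshape-style weights applied by a 3D engine)."""
    def __init__(self, num_params: int = 52):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, num_params)

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(images))

def train_step(model, optimizer, real_expression_images, expression_params):
    """One supervised step: real expression images (generated to share the
    facial expression of the rendered virtual image) are regressed onto the
    expression parameter set that produced that virtual image."""
    optimizer.zero_grad()
    predicted = model(real_expression_images)
    loss = nn.functional.mse_loss(predicted, expression_params)
    loss.backward()
    optimizer.step()
    return loss.item()

# Inference: a target facial image yields a predicted expression parameter
# set, which a 3D engine would then apply to the second 3D role model.
model = TrackingNet()
target_face = torch.rand(1, 3, 128, 128)    # placeholder target facial image
predicted_params = model(target_face)       # predicted expression parameter set
```

In this reading of the claim, the rendered virtual expression images serve only to synthesize labeled real-looking training pairs; the tracking network itself sees real expression images as input and expression parameter sets as targets.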