US 12,322,015 B2
Dynamic locomotion adaptation in runtime generated environments
Wolfram Sebastian Starke, Edinburgh (GB); and Harold Henry Chaput, Castro Valley, CA (US)
Assigned to Electronic Arts Inc., Redwood City, CA (US)
Filed by Electronic Arts Inc., Redwood City, CA (US)
Filed on Dec. 14, 2021, as Appl. No. 17/550,973.
Prior Publication US 2023/0186543 A1, Jun. 15, 2023
Int. Cl. G06T 13/40 (2011.01); A63F 13/56 (2014.01); G06N 20/00 (2019.01); G06T 7/20 (2017.01); G06T 7/70 (2017.01)
CPC G06T 13/40 (2013.01) [A63F 13/56 (2014.09); G06T 7/20 (2013.01); G06T 7/70 (2017.01)] 19 Claims
OG exemplary drawing
 
1. A computer-implemented method comprising:
as implemented by one or more hardware processors of a computing system configured with specific computer-executable instructions,
receiving a first character pose associated with a first frame of an animation depicting a character in a virtual environment of a video game;
receiving one or more virtual environment labels associated with the virtual environment of the video game;
applying at least the first character pose and the one or more virtual environment labels to a pose prediction model to obtain a second character pose associated with the character in the virtual environment from the pose prediction model; and
generating a second frame of the animation based at least in part on the second character pose, wherein the second frame occurs subsequent to the first frame,
wherein obtaining the second character pose further comprises receiving an indication of one or more contacts of the character with a surface within the virtual environment,
wherein generating the second frame comprises post-processing the second frame based at least in part on the one or more contacts to resolve motion artifacts of the second character pose with the virtual environment.
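
The exemplary claim describes a pose-prediction loop: a current pose and one or more virtual environment labels are applied to a pose prediction model to obtain the next pose, and the generated frame is post-processed against reported surface contacts to remove motion artifacts. The following is a minimal Python sketch of that flow under stated assumptions; the names CharacterPose, Contact, predict_next_pose, and resolve_contacts are illustrative inventions, and the trivial prediction step stands in for whatever trained model the patent contemplates. It is a sketch of the claimed structure, not the patentee's implementation.

    from dataclasses import dataclass
    from typing import List, Sequence

    # Hypothetical containers; names and fields are illustrative, not from the patent.
    @dataclass
    class CharacterPose:
        joint_positions: List[List[float]]   # per-joint [x, y, z]

    @dataclass
    class Contact:
        joint_index: int       # joint reported as touching a surface
        surface_height: float  # height (y) of the contacted surface

    def predict_next_pose(pose: CharacterPose,
                          env_labels: Sequence[str],
                          contacts: Sequence[Contact]) -> CharacterPose:
        # Stand-in for the pose prediction model: in the claimed method this
        # would be a trained model conditioned on the first character pose,
        # the virtual environment labels, and the contact indications.
        step = 0.05 if "stairs" in env_labels else 0.1  # illustrative use of a label
        next_joints = [[x + step, y, z] for x, y, z in pose.joint_positions]
        return CharacterPose(next_joints)

    def resolve_contacts(pose: CharacterPose,
                         contacts: Sequence[Contact]) -> CharacterPose:
        # Post-processing pass: snap contacting joints to the surface so the
        # predicted pose neither floats above nor sinks into the geometry.
        joints = [list(j) for j in pose.joint_positions]
        for c in contacts:
            joints[c.joint_index][1] = c.surface_height
        return CharacterPose(joints)

    def generate_second_frame(first_pose: CharacterPose,
                              env_labels: Sequence[str],
                              contacts: Sequence[Contact]) -> CharacterPose:
        predicted = predict_next_pose(first_pose, env_labels, contacts)
        return resolve_contacts(predicted, contacts)

    if __name__ == "__main__":
        pose = CharacterPose([[0.0, 0.9, 0.0], [0.2, 0.02, 0.0]])  # e.g. hip, foot
        frame2 = generate_second_frame(pose, ["uneven_terrain"], [Contact(1, 0.0)])
        print(frame2)

In this sketch the contact resolution is a simple vertical snap; a production system would more likely blend or use inverse kinematics, but the ordering mirrors the claim: predict, then post-process against contacts before emitting the second frame.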