US 12,403,400 B2
Learning character motion alignment with periodic autoencoders
Wolfram Sebastian Starke, Edinburgh (GB); and Harold Henry Chaput, Castro Valley, CA (US)
Assigned to Electronic Arts Inc., Redwood City, CA (US)
Filed by Electronic Arts Inc., Redwood City, CA (US)
Filed on Mar. 31, 2022, as Appl. No. 17/657,591.
Prior Publication US 2023/0310998 A1, Oct. 5, 2023
Int. Cl. A63F 13/57 (2014.01); G06T 13/40 (2011.01); G06T 13/80 (2011.01)
CPC A63F 13/57 (2014.09) [G06T 13/40 (2013.01); G06T 13/80 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A computer-implemented method comprising:
as implemented by a computing system having at least one processor configured with specific computer-executable instructions,
accessing first animation control information generated for a first frame of an electronic game, the first animation control information including a first pose of an in-game character model;
executing a motion matching process using a motion phase manifold comprising a plurality of local motion phase channels, wherein each local motion phase channel comprises spatial and temporal data for movement of a segment of the in-game character model, and wherein the motion matching process results in a plurality of matched local poses, the motion matching process comprising:
determining motion matching criteria for matching the local motion phase to existing local poses within a local pose animation dataset for the corresponding local motion phase channel;
performing a search of the local motion phase channel to identify a plurality of local poses within the local pose animation dataset based on the motion matching criteria;
calculating a score for the plurality of local poses based on reference features associated with the local motion phase; and
selecting a local pose from the plurality of local poses corresponding to the local motion phase based on the score;
generating a second pose of the in-game character model based on the plurality of matched local poses for a second frame of the electronic game;
computing second animation control information for the second frame; and
rendering the second frame including at least a portion of the second pose of the in-game character model within an in-game environment based, at least in part, on the second animation control information.
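
The Python sketch below illustrates, in simplified form, the flow recited in claim 1: each local motion phase channel carries spatial (amplitude) and temporal (phase) data for one body segment, its local pose animation dataset is searched against matching criteria, the candidate local poses are scored against reference features derived from the local motion phase, the best local pose per channel is selected, and the per-channel results are assembled into the second-frame pose. This is a minimal illustrative sketch, not the patented implementation; every name (LocalPose, LocalMotionPhaseChannel, match_local_pose, generate_second_pose), the sine/cosine reference features, and the squared-distance scoring are assumptions introduced here for clarity.

from dataclasses import dataclass, field
from typing import List
import math


@dataclass
class LocalPose:
    phase: float                     # local phase at which this pose was captured
    joint_rotations: List[float]     # simplified pose data for one body segment
    reference_features: List[float]  # features used for scoring (e.g., segment velocities)


@dataclass
class LocalMotionPhaseChannel:
    segment: str                     # body segment driven by this channel, e.g. "left_leg"
    amplitude: float                 # spatial component of the local motion phase
    phase: float                     # temporal component of the local motion phase
    dataset: List[LocalPose] = field(default_factory=list)  # local pose animation dataset


def reference_features(channel: LocalMotionPhaseChannel) -> List[float]:
    """Reference features associated with the channel's current local motion phase."""
    return [channel.amplitude * math.sin(channel.phase),
            channel.amplitude * math.cos(channel.phase)]


def match_local_pose(channel: LocalMotionPhaseChannel,
                     phase_window: float = 0.5) -> LocalPose:
    """Search, score, and select one local pose for a single channel."""
    # Matching criteria / search: identify candidate poses whose stored phase
    # lies within a window of the channel's current local motion phase.
    candidates = [p for p in channel.dataset
                  if abs(p.phase - channel.phase) <= phase_window] or channel.dataset

    # Score each candidate against the reference features; lower is better.
    target = reference_features(channel)

    def score(pose: LocalPose) -> float:
        return sum((a - b) ** 2 for a, b in zip(pose.reference_features, target))

    # Selection: the candidate local pose with the best (lowest) score.
    return min(candidates, key=score)


def generate_second_pose(channels: List[LocalMotionPhaseChannel]) -> List[LocalPose]:
    """Assemble the second-frame pose from the matched local pose of every channel."""
    return [match_local_pose(c) for c in channels]

In use, the matched local poses returned by generate_second_pose would be blended into a full-body pose, second animation control information would be computed from that pose, and the second frame rendered from it; those downstream steps are omitted here because they depend on the particular engine and renderer.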