US 12,148,081 B2
Immersive analysis environment for human motion data
Frederik Brudy, Toronto (CA); Fraser Anderson, Camrose (CA); Raimund Dachselt, San Rafael, CA (US); George Fitzmaurice, Toronto (CA); Justin Frank Matejka, Newmarket (CA); and Patrick Reipschläger, San Rafael, CA (US)
Assigned to AUTODESK, INC., San Francisco, CA (US)
Filed by AUTODESK, INC., San Francisco, CA (US)
Filed on Feb. 22, 2022, as Appl. No. 17/677,826.
Prior Publication US 2023/0267667 A1, Aug. 24, 2023
Int. Cl. G06T 13/40 (2011.01); G06F 3/01 (2006.01); G06T 7/70 (2017.01); G06T 19/00 (2011.01); G06V 40/20 (2022.01)
CPC G06T 13/40 (2013.01) [G06F 3/011 (2013.01); G06T 7/70 (2017.01); G06T 19/006 (2013.01); G06V 40/23 (2022.01); G06T 2207/30196 (2013.01)] 18 Claims
OG exemplary drawing
 
1. A computer-implemented method for analyzing human motion data, the method comprising:
receiving a set of motion data that indicates one or more movements of a first person within a real-world environment;
generating a first virtual avatar corresponding to the first person based on the set of motion data;
determining a first position of the first virtual avatar within an extended reality (ER) environment based on the one or more movements;
determining a first orientation of the first virtual avatar within the ER environment based on the one or more movements, wherein the first orientation of the first virtual avatar indicates a direction the first virtual avatar is facing and is different from a pose of the first virtual avatar; and
displaying the first virtual avatar within the ER environment based on the first position and the first orientation.
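The claimed steps (receive motion data, generate an avatar, derive its position and a facing orientation that is distinct from its pose, then display it) can be sketched as follows. This is an illustrative reading of the claim only, not the patent's actual implementation; the sample format, the hip-based position, and the displacement-based heading estimate are all assumptions made for the example.

```python
import math

def analyze_motion(samples):
    """Illustrative sketch of the claimed method steps.

    Each sample is assumed to hold a 'hip' (x, y) location and a
    'joints' list describing the tracked person's pose; both the
    structure and the field names are hypothetical.
    """
    latest, previous = samples[-1], samples[-2]
    # Steps 1-2: generate an avatar whose pose mirrors the tracked joints.
    avatar = {"pose": latest["joints"]}
    # Step 3: position in the ER environment follows the tracked movements,
    # here taken directly from the most recent hip location.
    avatar["position"] = latest["hip"]
    # Step 4: orientation is the direction the avatar is facing, distinct
    # from its pose; here it is estimated from hip displacement between
    # the two most recent samples.
    dx = latest["hip"][0] - previous["hip"][0]
    dy = latest["hip"][1] - previous["hip"][1]
    avatar["orientation_deg"] = math.degrees(math.atan2(dy, dx))
    # Step 5: a renderer would place the avatar in the ER scene using
    # this position and orientation (display is out of scope here).
    return avatar
```

A person walking one unit along the +y axis would, under these assumptions, yield an avatar at the new hip location facing 90 degrees.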