US 12,243,238 B1
Hand pose estimation for machine learning based gesture recognition
Jonathan Marsden, San Mateo, CA (US); Raffi Bedikian, San Francisco, CA (US); and David Samuel Holz, San Francisco, CA (US)
Assigned to ULTRAHAPTICS IP TWO LIMITED, Bristol (GB)
Filed by ULTRAHAPTICS IP TWO LIMITED, Bristol (GB)
Filed on Jul. 20, 2023, as Appl. No. 18/224,551.
Application 18/224,551 is a division of application No. 16/508,231, filed on Jul. 10, 2019, granted, now Pat. No. 11,714,880.
Application 16/508,231 is a continuation of application No. 15/432,872, filed on Feb. 14, 2017, abandoned.
Claims priority of provisional application 62/335,534, filed on May 12, 2016.
Claims priority of provisional application 62/296,561, filed on Feb. 17, 2016.
This patent is subject to a terminal disclaimer.
Int. Cl. G06T 7/13 (2017.01)
CPC G06T 7/13 (2017.01) [G06T 2207/10028 (2013.01)] 22 Claims
OG exemplary drawing
 
1. A method of preparing sample hand positions for training of neural network systems, the method including:
accessing simulation parameters that specify at least one of:
a range of hand positions and position sequences,
a range of hand anatomies, including palm size, fattiness, stubbiness, and skin tone, and
a range of backgrounds;
accessing a camera perspective specification that specifies one or more of:
a focal length,
a field of view of the camera,
a wavelength sensitivity, and
artificial lighting conditions;
generating a plurality of hand position-hand anatomy-background simulations, each simulation labeled with hand position parameters, including joint locations of joints of the hand in three dimensions (3D), in a ground truth vector for training a convolutional neural network, the simulations organized in sequences;
applying the camera perspective specification to render from the simulations at least a corresponding set of simulated hand position images; and
saving the simulated hand position images with one or more labeled hand position parameters from the corresponding simulations.
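
Claim 1 recites a synthetic-data pipeline: sample hand position, anatomy, and background parameters; label each simulation with a ground-truth vector of 3D joint locations; render images through a camera perspective specification; and save the images with their labels. The Python sketch below illustrates one possible reading of that pipeline. It is not the patent's disclosed implementation: every concrete name and value (SimulationParameters, CameraPerspective, NUM_JOINTS = 21, the image size, the noise-image render stub, the file name hand_sims.npz) is an illustrative assumption.

```python
"""Hedged sketch of the claim-1 pipeline; all names/values are assumptions."""
from dataclasses import dataclass
import numpy as np

NUM_JOINTS = 21  # assumption: a common hand-skeleton joint count


@dataclass
class SimulationParameters:
    # Ranges of hand positions, anatomies, and backgrounds (claim 1, first clause).
    # Each positional range is a (low, high) bound the generator samples from.
    position_range: tuple = ((-0.3, 0.3), (-0.3, 0.3), (0.2, 0.8))  # meters, xyz
    palm_size_range: tuple = (0.07, 0.11)   # meters
    fattiness_range: tuple = (0.0, 1.0)     # normalized shape factor
    stubbiness_range: tuple = (0.0, 1.0)    # normalized finger-length factor
    skin_tones: tuple = ("light", "medium", "dark")
    backgrounds: tuple = ("office", "outdoor", "plain")


@dataclass
class CameraPerspective:
    # Camera perspective specification (claim 1, second clause).
    focal_length_mm: float = 3.6
    field_of_view_deg: float = 120.0
    wavelength_nm: float = 850.0            # e.g., near-infrared sensitivity
    lighting: str = "artificial-diffuse"


def generate_simulation(params: SimulationParameters, rng: np.random.Generator):
    """Sample one hand position/anatomy/background simulation and build its
    ground-truth vector of 3D joint locations. A real generator would pose a
    hand model and interpolate across a sequence; here joints are sampled
    independently within the configured ranges."""
    joints = np.stack(
        [rng.uniform(low, high, size=NUM_JOINTS)
         for (low, high) in params.position_range],
        axis=1,
    )                                        # (NUM_JOINTS, 3) in camera space
    return {
        "joints": joints,
        "label": joints.reshape(-1),         # flat ground-truth vector for the CNN
        "anatomy": {
            "palm_size": rng.uniform(*params.palm_size_range),
            "fattiness": rng.uniform(*params.fattiness_range),
            "stubbiness": rng.uniform(*params.stubbiness_range),
            "skin_tone": rng.choice(params.skin_tones),
        },
        "background": rng.choice(params.backgrounds),
    }


def render(sim, camera: CameraPerspective, rng: np.random.Generator):
    """Placeholder renderer: a production pipeline would rasterize the hand
    model under the given focal length, field of view, wavelength response,
    and lighting; this stub emits a noise image of the expected shape."""
    return rng.integers(0, 256, size=(480, 640), dtype=np.uint8)


def build_dataset(n_sequences: int, seq_len: int, out_path: str = "hand_sims.npz"):
    rng = np.random.default_rng(0)
    params, camera = SimulationParameters(), CameraPerspective()
    images, labels = [], []
    for _ in range(n_sequences):
        for _ in range(seq_len):             # simulations organized in sequences
            sim = generate_simulation(params, rng)
            images.append(render(sim, camera, rng))
            labels.append(sim["label"])
    # Save the simulated images together with their labeled hand position
    # parameters (claim 1, final clause).
    np.savez_compressed(out_path, images=np.stack(images), labels=np.stack(labels))


if __name__ == "__main__":
    build_dataset(n_sequences=4, seq_len=8)
```

Note the design reading: the labels are stored alongside the rendered images so that a convolutional neural network can later regress the 3D joint locations directly from pixels, which is the training setup the claim's "ground truth vector" language suggests.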