CPC G06F 3/0383 (2013.01) [G06F 3/017 (2013.01); G06F 3/0346 (2013.01); G06F 3/011 (2013.01); G06F 3/014 (2013.01); G06F 2203/0383 (2013.01); G06F 2203/0384 (2013.01); G06N 20/00 (2019.01)] — 20 Claims

1. A computer-implemented method for handheld interactive object gesture recognition, comprising:
receiving, by a handheld interactive object via one or more inertial measurement units, movement data indicative of a movement of the handheld interactive object;
selecting, by the handheld interactive object, one or more local machine-learned models of the handheld interactive object or one or more remote machine-learned models of a remote computing device communicatively coupled to the handheld interactive object for processing the movement data, wherein the one or more local machine-learned models are trained to identify a first subset of gesture actions performed with the handheld interactive object and the one or more remote machine-learned models are trained to identify a second subset of gesture actions performed with the handheld interactive object;
in response to selecting the one or more local machine-learned models, processing, by the handheld interactive object, the movement data according to the one or more local machine-learned models and communicating, by the handheld interactive object, a first message to the remote computing device based at least in part on processing the movement data according to the one or more local machine-learned models; and
in response to selecting the one or more remote machine-learned models, communicating, by the handheld interactive object, a second message to the remote computing device, the second message comprising the movement data for processing by the remote computing device according to the one or more remote machine-learned models.
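The claimed dispatch between on-device and remote inference can be sketched as follows. This is a minimal illustration, not the patented implementation: the selection predicate, the `Message` type, and the model callables (`use_local`, `local_model`) are all hypothetical stand-ins, since the claim does not specify how the selection is made.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class Message:
    # Kind of message sent to the remote computing device: a recognized
    # gesture result (the "first message") or raw movement data for
    # remote processing (the "second message").
    kind: str
    payload: object

def process_movement(
    movement_data: Sequence[float],
    use_local: Callable[[Sequence[float]], bool],
    local_model: Callable[[Sequence[float]], str],
) -> Message:
    """Dispatch IMU movement data to local or remote inference (sketch)."""
    if use_local(movement_data):
        # Local path: run the on-device machine-learned model and
        # communicate a message based on its output.
        gesture = local_model(movement_data)
        return Message(kind="gesture_result", payload=gesture)
    # Remote path: forward the raw movement data so the remote
    # device can apply its own machine-learned models.
    return Message(kind="raw_movement", payload=list(movement_data))

# Toy usage: pick the local model for low-energy movements only.
use_local = lambda d: max(abs(x) for x in d) < 1.0
local_model = lambda d: "tap" if len(d) < 8 else "swipe"

local_msg = process_movement([0.1, 0.2, 0.1], use_local, local_model)
remote_msg = process_movement([2.0] * 10, use_local, local_model)
```

Here `local_msg` carries only the recognized gesture label, while `remote_msg` carries the full movement data, mirroring the two branches of the claim.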