US 12,216,822 B2
Biopotential-based gesture interpretation with machine labeling
Dexter W. Ang, Boston, MA (US); and David O. Cipoletta, Boston, MA (US)
Assigned to Pison Technology, Inc., Boston, MA (US)
Filed by Pison Technology, Inc., Boston, MA (US)
Filed on Oct. 17, 2022, as Appl. No. 18/047,023.
Application 18/047,023 is a continuation of application No. 17/404,075, filed on Aug. 17, 2021, granted, now 11,474,604.
Application 17/404,075 is a continuation of application No. 16/246,964, filed on Jan. 14, 2019, granted, now 11,099,647, issued on Aug. 24, 2021.
Application 16/246,964 is a continuation-in-part of application No. 16/055,777, filed on Aug. 6, 2018, granted, now 10,627,914, issued on Apr. 21, 2020.
Application 16/055,777 is a continuation of application No. 16/055,123, filed on Aug. 5, 2018, granted, now 10,802,598, issued on Oct. 13, 2020.
Prior Publication US 2023/0113991 A1, Apr. 13, 2023
Int. Cl. G06F 3/01 (2006.01); G02B 27/01 (2006.01); G06F 3/16 (2006.01); H04W 4/80 (2018.01)
CPC G06F 3/015 (2013.01) [G02B 27/0172 (2013.01); G06F 3/016 (2013.01); G06F 3/017 (2013.01); G06F 3/167 (2013.01); H04W 4/80 (2018.02); G02B 2027/0178 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A system for gesture-based control, the system comprising:
a wearable device configured to be worn at a wrist of a person, the wearable device comprising:
a biopotential sensor, the biopotential sensor being configured to detect biopotentials indicating a state of the hand of the person; and
a wrist motion sensor, the wrist motion sensor being configured to obtain wrist motion data indicating a motion of the wrist;
wherein the biopotential sensor is configured to output a first data stream, the first data stream being configured to indicate actions of the hand of the person; and
a second device configured to output a second data stream, the second data stream also being configured to indicate the actions of the hand of the person;
wherein the system is configured to:
store first event data comprising (i) a plurality of biopotential data points from the first data stream and (ii) biopotential timestamps indicating a time that a respective biopotential data point was generated in the first data stream;
store second event data comprising (i) one or more data points from the second data stream and (ii) one or more second device timestamps indicating when the one or more data points from the second device were generated in the second data stream;
determine, based on an analysis of the second event data, that the hand of the person performed a first action;
determine, based on the one or more second device timestamps, a first action time at which the first action occurred;
based on the first action time and the biopotential timestamps, associate a label with a subset of the first event data, the label indicating that the first action occurred while the first event data was collected; and
using at least the label and the first event data, train a machine learning interpreter to generate interpreted outputs based on at least biopotential data.
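The labeling steps recited in claim 1 can be illustrated with a minimal sketch. This is not code from the patent: the sample structure, the "pinch" action, the timestamps, and the ±0.1 s labeling window are all hypothetical choices made for illustration. The idea is that a second device (e.g. a camera or touchscreen) independently detects a hand action at a known time, and that time is used to label the biopotential samples collected around it, producing training data for a machine-learning interpreter.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    t: float                      # timestamp in seconds
    value: float                  # biopotential reading
    label: Optional[str] = None   # machine-applied label, if any

# First event data: biopotential data points with timestamps
# (hypothetical values; 100 samples at 100 Hz).
biopotential = [Sample(t=i * 0.01, value=0.1 * i) for i in range(100)]

# Second event data: the second device reports that the hand
# performed a "pinch" action, with its own timestamp.
second_device_events = [{"t": 0.50, "action": "pinch"}]

def label_window(samples, action_time, action, half_width=0.1):
    """Associate the action label with every biopotential sample whose
    timestamp falls within +/- half_width seconds of the action time."""
    for s in samples:
        if abs(s.t - action_time) <= half_width:
            s.label = action
    return samples

for ev in second_device_events:
    label_window(biopotential, ev["t"], ev["action"])

# The labeled subset can now be used to train an interpreter that
# generates outputs from biopotential data alone.
labeled = [s for s in biopotential if s.label is not None]
```

In this sketch the samples with timestamps from roughly 0.40 s to 0.60 s receive the "pinch" label; the window width would in practice depend on the latency between muscle activation and the action the second device observes.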