US 12,033,056 B2
Multi-task recurrent neural networks
Milad Olia Hashemi, San Francisco, CA (US); Jamie Alexander Smith, Boston, MA (US); and Kevin Jordan Swersky, Toronto (CA)
Assigned to Google LLC, Mountain View, CA (US)
Filed by Google LLC, Mountain View, CA (US)
Filed on Aug. 15, 2022, as Appl. No. 17/887,745.
Application 17/887,745 is a continuation of application No. 16/262,785, filed on Jan. 30, 2019, granted, now 11,416,733.
Claims priority of provisional application 62/769,512, filed on Nov. 19, 2018.
Prior Publication US 2023/0033000 A1, Feb. 2, 2023
Int. Cl. G06N 3/044 (2023.01); G06F 3/06 (2006.01); G06N 3/08 (2023.01)
CPC G06N 3/044 (2023.01) [G06F 3/0604 (2013.01); G06F 3/0659 (2013.01); G06F 3/0673 (2013.01); G06N 3/08 (2013.01)] 20 Claims
OG exemplary drawing
 
12. A system comprising an application-specific hardware integrated circuit, one or more storage devices storing instructions that, when executed by a processor of the integrated circuit, cause performance of operations comprising:
receiving a current input at a recurrent neural network (RNN) implemented on the integrated circuit;
generating, for the current input, a feature representation that represents an embedding vector derived from an embedding space of the RNN based on the current input;
providing the feature representation to an RNN cell of the RNN, wherein the RNN cell processes the feature representation in hardware;
selecting, from a plurality of state registers and based on sequence-identifying data, a particular internal state that the RNN uses; and
generating, using the feature representation, the particular internal state, and a set of fixed weights, a prediction and a next state of the RNN for a particular task the RNN is to perform.
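
The claim describes an RNN cell with fixed weights whose internal state for a given step is drawn from a bank of state registers indexed by sequence-identifying data, so that interleaved sequences or tasks do not disturb one another's state. The following Python sketch is illustrative only and is not the patented hardware implementation; the class name MultiTaskRNN, the Elman-style cell update, and the softmax prediction head are assumptions introduced for the example.

# A minimal sketch (assumptions noted above) of the claimed flow: an RNN cell
# with fixed weights and a bank of per-sequence state registers, where the
# register used for a step is selected by sequence-identifying data.
import numpy as np


class MultiTaskRNN:
    """Toy multi-task RNN with one state register per sequence identifier."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_registers, seed=0):
        rng = np.random.default_rng(seed)
        # Embedding table and cell weights are fixed after construction
        # (the claim's "set of fixed weights").
        self.embedding = rng.normal(0, 0.1, (vocab_size, embed_dim))
        self.w_xh = rng.normal(0, 0.1, (embed_dim, hidden_dim))
        self.w_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
        self.w_hy = rng.normal(0, 0.1, (hidden_dim, vocab_size))
        # Plurality of state registers: one internal state per sequence id.
        self.state_registers = np.zeros((num_registers, hidden_dim))

    def step(self, current_input, sequence_id):
        # Feature representation: the embedding vector for the current input.
        feature = self.embedding[current_input]
        # Select the particular internal state using the sequence-identifying data.
        state = self.state_registers[sequence_id]
        # Elman-style RNN cell update using the fixed weights.
        next_state = np.tanh(feature @ self.w_xh + state @ self.w_hh)
        # Prediction for the task (here, a distribution over the vocabulary).
        logits = next_state @ self.w_hy
        prediction = np.exp(logits - logits.max())
        prediction /= prediction.sum()
        # Write the next state back into the selected register.
        self.state_registers[sequence_id] = next_state
        return prediction, next_state


# Example: interleave steps from two independent sequences; the states do not
# interfere because each sequence id selects its own register.
rnn = MultiTaskRNN(vocab_size=32, embed_dim=8, hidden_dim=16, num_registers=2)
for token, seq_id in [(3, 0), (7, 1), (4, 0), (9, 1)]:
    pred, _ = rnn.step(token, seq_id)
    print(f"seq {seq_id}: argmax prediction = {pred.argmax()}")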