US 12,293,261 B2
K-LSTM architecture for purchase prediction
Yuanqiao Wu, Montreal (CA); Janahan Ramanan, Montreal (CA); Jaspreet Sahota, Montreal (CA); Cathal Smyth, Montreal (CA); and Yik Chau Lui, Montreal (CA)
Assigned to ROYAL BANK OF CANADA, Montreal (CA)
Filed by ROYAL BANK OF CANADA, Montreal (CA)
Filed on Jun. 13, 2019, as Appl. No. 16/440,116.
Claims priority of provisional application 62/684,545, filed on Jun. 13, 2018.
Prior Publication US 2019/0385080 A1, Dec. 19, 2019
Int. Cl. G06N 20/00 (2019.01); G06Q 30/0601 (2023.01)
CPC G06N 20/00 (2019.01) [G06Q 30/0631 (2013.01)] 9 Claims
OG exemplary drawing
 
1. A computer system for preserving long-term memory of a machine learning architecture, the system comprising:
a processor;
a memory in communication with the processor, the memory storing instructions that, when executed by the processor, cause the processor to:
receive event data associated with a series of events occurring over a period of time;
create structured data based on the event data;
identify events in the structured data that are associated with an event category;
label the events associated with the event category;
instantiate or access from the memory a recurrent neural network (RNN) architecture comprising:
a series of nodes, each node in the series of nodes including a plurality of neural network layers for storing a hidden state and a cell state; and
a pair of kronos gates between each pair of sequential nodes in the series of nodes, the pair of kronos gates configured to toggle between preserving a current hidden or cell state and updating the hidden or cell state based on at least one parameterized input;
train the recurrent neural network with the structured data and the labels; wherein training the RNN includes:
based on a toggling of a first kronos gate of the corresponding pair of kronos gates, updating a hidden state for a current time node in the series of nodes, based on a weighted average of the hidden state at a previous time node and the hidden state at the current time node;
based on a toggling of a second kronos gate of the corresponding pair of kronos gates, updating a cell state for the current time node in the RNN architecture, based on a weighted average of the cell state of a previous time node and the cell state of the current time node;
store the RNN architecture including the updated hidden state and the updated cell state; and
input to the RNN architecture data representing a second series of events to generate output data for communicating a likelihood of a subsequent occurrence of an event associated with the event category.
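The kronos-gate update recited in the claim can be illustrated with a minimal sketch. The claim specifies only that each gate toggles between preserving a state and updating it as a weighted average of the previous and current node states, driven by at least one parameterized input; the periodic gate parameterization below (period, shift, open ratio, leak) and all function names are illustrative assumptions in the style of time-gated LSTMs, not the patent's definitive implementation.

```python
def kronos_gate(t, period, shift, r_on, leak=1e-3):
    # Hypothetical parameterized time gate: maps timestamp t to an
    # openness value in [0, 1] based on its phase within a learned
    # period. A small leak keeps the gate slightly open when "closed".
    phase = ((t - shift) % period) / period
    if phase < 0.5 * r_on:
        return 2.0 * phase / r_on          # gate opening
    elif phase < r_on:
        return 2.0 - 2.0 * phase / r_on    # gate closing
    else:
        return leak * phase                # gate (mostly) closed

def kronos_update(h_prev, c_prev, h_cand, c_cand, k_h, k_c):
    # Weighted average of previous and candidate states, as in the claim:
    # a gate value near 0 preserves the previous hidden/cell state,
    # a value near 1 adopts the current node's candidate state.
    h_t = k_h * h_cand + (1.0 - k_h) * h_prev
    c_t = k_c * c_cand + (1.0 - k_c) * c_prev
    return h_t, c_t
```

Used this way, long-term memory is preserved across nodes whose gates stay closed, because their hidden and cell states pass through nearly unchanged rather than being overwritten at every time step.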