US 12,333,005 B2
Efficient transformer for content-aware anomaly detection in event sequences
Yanchi Liu, Monmouth Junction, NJ (US); Xuchao Zhang, Elkridge, MD (US); Haifeng Chen, West Windsor, NJ (US); Wei Cheng, Princeton Junction, NJ (US); and Shengming Zhang, Kearny, NJ (US)
Assigned to NEC Corporation, Tokyo (JP)
Filed by NEC Laboratories America, Inc., Princeton, NJ (US)
Filed on Jan. 20, 2023, as Appl. No. 18/157,180.
Claims priority of provisional application 63/308,512, filed on Feb. 10, 2022.
Prior Publication US 2023/0252139 A1, Aug. 10, 2023
Int. Cl. G06F 21/55 (2013.01)
CPC G06F 21/554 (2013.01) 20 Claims
OG exemplary drawing
 
1. A method for implementing a self-attentive encoder-decoder transformer framework for anomaly detection in event sequences, the method comprising:
feeding event content information into a content-awareness layer to generate event representations;
inputting, into an encoder, event sequences of two hierarchies to capture long-term and short-term patterns and to generate feature maps;
adding, in a decoder, a special sequence token at a beginning of an input sequence under detection;
during a training stage, applying a one-class objective to bound the decoded special sequence token, with a reconstruction loss for sequence forecasting using the generated feature maps from the encoder; and
during a testing stage, labeling, as an anomaly, any event representation whose decoded special sequence token lies outside a hypersphere.
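
For illustration only, below is a minimal sketch of how the claimed steps could be realized in PyTorch: a content-awareness layer (here a plain token embedding), an encoder fed with the two sequence hierarchies, a decoder whose input is prefixed with a learnable special sequence token, a training loss combining a one-class (hypersphere) term on the decoded token with a reconstruction loss for forecasting, and a test-stage check of whether that token falls outside the hypersphere. The module sizes, the soft-boundary form of the one-class term, the alpha weighting, and the center/radius handling are assumptions of the sketch, not details taken from the patent.

```python
import torch
import torch.nn as nn


class ContentAwareEncoderDecoder(nn.Module):
    """Encoder-decoder transformer with a special sequence token for one-class detection."""

    def __init__(self, vocab_size: int, d_model: int = 128, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        # Content-awareness layer: a plain embedding stands in for the event-content encoder.
        self.content_layer = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers
        )
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers
        )
        # Learnable special sequence token prepended to the decoder input.
        self.seq_token = nn.Parameter(torch.randn(1, 1, d_model))
        self.forecast_head = nn.Linear(d_model, vocab_size)

    def forward(self, long_seq, short_seq, dec_seq):
        # Encoder receives both hierarchies (long-term and short-term event sequences).
        memory = self.encoder(
            torch.cat([self.content_layer(long_seq), self.content_layer(short_seq)], dim=1)
        )
        # Decoder input: special sequence token followed by the sequence under detection.
        token = self.seq_token.expand(dec_seq.size(0), -1, -1)
        out = self.decoder(torch.cat([token, self.content_layer(dec_seq)], dim=1), memory)
        seq_repr = out[:, 0]                      # decoded special sequence token
        logits = self.forecast_head(out[:, 1:])   # per-event forecasting logits
        return seq_repr, logits


def training_loss(seq_repr, logits, targets, center, radius=1.0, alpha=0.5):
    # Training stage (assumed weighting): one-class hypersphere term on the decoded
    # special token plus a reconstruction loss over the forecast logits.
    one_class = torch.clamp((seq_repr - center).pow(2).sum(-1) - radius ** 2, min=0.0).mean()
    recon = nn.functional.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
    return alpha * one_class + (1.0 - alpha) * recon


def is_anomaly(seq_repr, center, radius=1.0):
    # Testing stage: flag sequences whose decoded special token lies outside the hypersphere.
    return (seq_repr - center).pow(2).sum(-1).sqrt() > radius
```

In practice the hypersphere center would typically be estimated from the decoded special tokens of normal training sequences (for example, their mean), with the radius fixed or tuned on held-out data; these choices, like the embedding standing in for the content-awareness layer and the use of the input sequence itself as the reconstruction target, are assumptions of the sketch rather than requirements of the claim.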