US 12,086,708 B2
Methods and systems for producing neural sequential models
Tetiana Parshakova, Stanford, CA (US); Marc Dymetman, Grenoble (FR); and Jean-Marc Andréoli, Meylan (FR)
Assigned to NAVER CORPORATION, Seongnam-si (KR)
Filed by NAVER CORPORATION, Seongnam-si (KR)
Filed on Sep. 11, 2020, as Appl. No. 17/018,754.
Prior Publication US 2022/0083852 A1, Mar. 17, 2022
Int. Cl. G06F 40/10 (2020.01); G06F 18/214 (2023.01); G06F 18/2321 (2023.01); G06F 40/44 (2020.01); G06N 3/08 (2023.01)
CPC G06N 3/08 (2013.01) [G06F 18/2148 (2023.01); G06F 18/2321 (2023.01); G06F 40/10 (2020.01); G06F 40/44 (2020.01)] 48 Claims
OG exemplary drawing
 
1. A natural language processing method for producing a normalized sequential model using a processor, the method comprising:
providing a sequential energy-based model computed by a parameterized neural network, the sequential energy-based model defining an unnormalized probability distribution over a target sequence of text for a context sequence of text; and
producing the normalized sequential model by projecting the sequential energy-based model onto a target autoregressive model that approximates a normalized distribution associated with the sequential energy-based model;
wherein the normalized sequential model is adapted to generate a target sequence of text from a context sequence of text; and
wherein the normalized sequential model is configured to perform one of language modeling, dialog, natural language generation, and machine translation.
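The core of the claim is projecting an unnormalized, sequential energy-based model onto a target autoregressive model that approximates its normalized distribution. The following is a minimal sketch, not the patented training procedure: it assumes a toy energy function over short binary sequences and uses exact enumeration (feasible only for tiny spaces) in place of the learned approximation, exploiting the fact that the KL-optimal autoregressive factors are the exact conditionals of the normalized distribution.

```python
from itertools import product
import math

# Hypothetical toy energy: penalize adjacent equal tokens (illustrative only;
# the patent's energy is computed by a parameterized neural network).
def energy(seq):
    return sum(1.0 for a, b in zip(seq, seq[1:]) if a == b)

L, VOCAB = 3, (0, 1)
seqs = list(product(VOCAB, repeat=L))

# Sequential EBM: unnormalized weights exp(-E(x)), then normalize by Z.
unnorm = {s: math.exp(-energy(s)) for s in seqs}
Z = sum(unnorm.values())
p = {s: w / Z for s, w in unnorm.items()}  # normalized distribution

# "Project" onto an autoregressive model: for an exactly enumerable space,
# the KL-optimal autoregressive factors are the conditionals p(x_t | x_<t).
def conditional(prefix, token):
    num = sum(p[s] for s in seqs if s[: len(prefix) + 1] == prefix + (token,))
    den = sum(p[s] for s in seqs if s[: len(prefix)] == prefix)
    return num / den

def autoregressive_prob(seq):
    prob = 1.0
    for t, tok in enumerate(seq):
        prob *= conditional(seq[:t], tok)
    return prob

# The projected autoregressive model reproduces the normalized EBM exactly.
max_err = max(abs(autoregressive_prob(s) - p[s]) for s in seqs)
print(max_err)
```

In realistic settings the target space cannot be enumerated, which is why the claimed method instead trains an autoregressive network to approximate the normalized distribution; the closed-form conditionals above only illustrate what that projection converges toward.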