US 11,816,136 B2
Natural question generation via reinforcement learning based graph-to-sequence model
Lingfei Wu, Elmsford, NY (US); Yu Chen, Troy, NY (US); and Mohammed J. Zaki, Troy, NY (US)
Assigned to International Business Machines Corporation, Armonk, NY (US); and Rensselaer Polytechnic Institute, Troy, NY (US)
Filed by International Business Machines Corporation, Armonk, NY (US); and Rensselaer Polytechnic Institute, Troy, NY (US)
Filed on Oct. 23, 2022, as Appl. No. 17/971,635.
Application 17/971,635 is a continuation of application No. 16/843,975, filed on Apr. 9, 2020, granted, now Pat. No. 11,481,418.
Claims priority of provisional application 62/956,488, filed on Jan. 2, 2020.
Prior Publication US 2023/0055666 A1, Feb. 23, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 16/95 (2019.01); G06F 16/9032 (2019.01); G06N 5/02 (2023.01); G06F 16/332 (2019.01); G06N 20/00 (2019.01); G06F 40/30 (2020.01); G06F 17/16 (2006.01); G06F 16/901 (2019.01); G06F 17/18 (2006.01); G06N 3/04 (2023.01)
CPC G06F 16/3329 (2019.01) [G06F 16/9024 (2019.01); G06F 17/16 (2013.01); G06F 17/18 (2013.01); G06F 40/30 (2020.01); G06N 3/04 (2013.01); G06N 20/00 (2019.01)] 15 Claims
OG exemplary drawing
 
1. A method comprising:
obtaining contextualized passage embeddings and contextualized answer embeddings for a text pair;
obtaining a passage embedding matrix;
constructing a corresponding passage graph based on said passage embedding matrix;
applying a bidirectional gated graph neural network to said corresponding passage graph until a final state embedding is determined, during which application, intermediate node embeddings are fused from both incoming and outgoing edges of said graph;
obtaining a graph-level embedding from said final state embedding;
decoding said final state embedding to generate an output sequence; and
training a machine learning model to generate at least one question corresponding to said text pair by evaluating said output sequence.
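The claim above walks through a graph-to-sequence question generation pipeline: build a passage graph from passage embeddings, run a bidirectional gated graph neural network over it while fusing messages arriving on incoming and outgoing edges, pool a graph-level embedding from the final node states, and decode an output sequence that is evaluated to train the model. The following is a minimal PyTorch-style sketch of the bidirectional gated update and graph-level pooling steps only; the class name BiGGNNLayerSketch, the linear message and fusion layers, the max-pooling readout, and the toy adjacency matrix are illustrative assumptions and not the claimed or patented implementation.

# Minimal sketch (assumption: not the patented implementation) of the claim's
# bidirectional gated graph neural network step: node embeddings are updated by
# aggregating messages along incoming and outgoing edges separately, fusing the
# two directions, and applying a GRU-style gated update until a final state is
# reached; a graph-level embedding is then pooled for the decoder.
import torch
import torch.nn as nn


class BiGGNNLayerSketch(nn.Module):
    """One hop of bidirectional gated message passing (illustrative only)."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg_in = nn.Linear(dim, dim)    # transform for incoming-edge messages
        self.msg_out = nn.Linear(dim, dim)   # transform for outgoing-edge messages
        self.fuse = nn.Linear(2 * dim, dim)  # fuse the two directions
        self.gru = nn.GRUCell(dim, dim)      # gated node-state update

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, dim) current node embeddings
        # adj: (num_nodes, num_nodes) adjacency, adj[i, j] = 1 for edge i -> j
        agg_in = adj.t() @ self.msg_in(h)    # messages arriving on incoming edges
        agg_out = adj @ self.msg_out(h)      # messages arriving on outgoing edges
        fused = torch.relu(self.fuse(torch.cat([agg_in, agg_out], dim=-1)))
        return self.gru(fused, h)            # gated update of node states


def graph_level_embedding(h: torch.Tensor) -> torch.Tensor:
    # Simple max pooling over nodes, standing in for the claim's graph-level embedding.
    return h.max(dim=0).values


if __name__ == "__main__":
    num_nodes, dim, hops = 5, 16, 3
    h = torch.randn(num_nodes, dim)                          # contextualized passage node embeddings
    adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()   # toy passage graph
    layer = BiGGNNLayerSketch(dim)
    for _ in range(hops):                                    # iterate toward a final state embedding
        h = layer(h, adj)
    g = graph_level_embedding(h)                             # graph-level embedding fed to the decoder
    print(g.shape)  # torch.Size([16])

Aggregating incoming-edge and outgoing-edge messages separately before fusing them is what makes the update bidirectional; a unidirectional gated graph network would use only one of the two aggregations. The decoding and reward-based evaluation recited in the final claim steps are not shown here.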