US 12,260,185 B2
Coarse-to-fine abstractive dialogue summarization with controllable granularity
Chien-Sheng Wu, Mountain View, CA (US); Wenhao Liu, Redwood City, CA (US); Caiming Xiong, Menlo Park, CA (US); and Linqing Liu, Menlo Park, CA (US)
Assigned to Salesforce, Inc., San Francisco, CA (US)
Filed by Salesforce, Inc., San Francisco, CA (US)
Filed on Jan. 27, 2021, as Appl. No. 17/159,625.
Claims priority of provisional application 63/087,024, filed on Oct. 2, 2020.
Prior Publication US 2022/0108086 A1, Apr. 7, 2022
Int. Cl. G06F 40/56 (2020.01); G06F 40/205 (2020.01)
CPC G06F 40/56 (2020.01) [G06F 40/205 (2020.01)] 20 Claims
OG exemplary drawing
 
1. A method for training a neural network model to generate a dialogue summary, comprising:
dividing, using a similarity module executing on a processor, a dialogue conversation history into dialogue segments, wherein dividing the dialogue conversation history into the dialogue segments further comprises:
matching, using the similarity module, a plurality of dialogue segments in the dialogue conversation history against segment summaries associated with a training summary, wherein the matching generates similarity scores;
selecting the dialogue segments from the plurality of dialogue segments that correspond to highest similarity scores from the similarity scores, wherein a dialogue segment in the selected dialogue segments includes at least one dialogue turn from dialogue turns in the dialogue conversation history;
dividing, using the similarity module, the training summary into training segment summaries based on dialogue turns in the dialogue conversation history, wherein the training summary summarizes the dialogue conversation history;
generating a summary draft from the dialogue segments in the dialogue conversation history and the training segment summaries, wherein the summary draft includes turn indexes corresponding to the dialogue turns, labels for action categories, and key phrases associated with a subset of the dialogue turns;
generating, using an encoder of a generative language model executing on the processor, encodings from the dialogue segments; and
generating, using a decoder of the generative language model, segment summaries for the dialogue conversation history from the encodings from the dialogue segments and the summary draft.
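 
The following is a minimal Python sketch of the coarse segmentation recited in claim 1: the training summary is divided into segment summaries, each dialogue turn is matched against each segment summary, and the highest-scoring turns are selected as the aligned dialogue segment. The sentence split and the token-overlap similarity are illustrative assumptions; the claim only requires that the matching generate similarity scores.

def token_overlap(a: str, b: str) -> float:
    """Jaccard overlap between the token sets of two strings (assumed similarity measure)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

def split_training_summary(summary: str) -> list[str]:
    """Divide the training summary into segment summaries (here: one per sentence, an assumption)."""
    return [s.strip() for s in summary.split(".") if s.strip()]

def align_segments(dialogue_turns: list[str], training_summary: str, turns_per_segment: int = 2) -> list[dict]:
    """Match dialogue turns against each segment summary and keep the best-scoring turn indexes."""
    segments = []
    for seg_summary in split_training_summary(training_summary):
        scores = [(token_overlap(turn, seg_summary), idx) for idx, turn in enumerate(dialogue_turns)]
        top = sorted(scores, reverse=True)[:turns_per_segment]
        turn_indexes = sorted(idx for _, idx in top)
        segments.append({"segment_summary": seg_summary, "turn_indexes": turn_indexes})
    return segments

For example, align_segments(["Hi, can you reset my password?", "Sure, I sent you a reset link."], "Agent resets the customer's password.") returns one segment whose turn indexes point at the turns that overlap most with that segment summary.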
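 
The next sketch illustrates assembling the claimed summary draft from turn indexes, action-category labels, and key phrases for the selected dialogue segments. The label heuristic and the key-phrase extraction below are assumptions for illustration only, not the patented implementation.

def label_action(turn: str) -> str:
    """Crude action-category label (assumption): a trailing '?' marks a question."""
    return "ask_question" if turn.rstrip().endswith("?") else "other"

def key_phrase(turn: str, max_words: int = 4) -> str:
    """Illustrative key phrase: the first few words of the turn."""
    return " ".join(turn.split()[:max_words])

def build_summary_draft(dialogue_turns: list[str], segments: list[dict]) -> str:
    """Concatenate turn indexes, action labels, and key phrases into a single draft string."""
    parts = []
    for seg in segments:
        for idx in seg["turn_indexes"]:
            turn = dialogue_turns[idx]
            parts.append(f"{idx} {label_action(turn)} {key_phrase(turn)}")
    return " | ".join(parts)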
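 
Finally, a hedged sketch of the generative-language-model step, using a generic pretrained encoder-decoder (facebook/bart-base from the Hugging Face transformers library is an assumption; the claim requires only an encoder and a decoder of a generative language model). Conditioning generation on both the encodings and the summary draft is shown here by prefixing the decoder target with the draft, one possible realization.

import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")  # assumed checkpoint
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def training_step(segment_text: str, summary_draft: str, segment_summary: str) -> torch.Tensor:
    """One teacher-forced step: encode the dialogue segment, decode draft-then-summary, return the loss."""
    inputs = tokenizer(segment_text, return_tensors="pt", truncation=True, max_length=512)
    target = f"{summary_draft} => {segment_summary}"  # draft-then-summary target format is an assumption
    labels = tokenizer(target, return_tensors="pt", truncation=True, max_length=128).input_ids
    outputs = model(**inputs, labels=labels)
    return outputs.loss  # backpropagate with an optimizer of choice

Training the decoder to emit the draft before the segment summary is one simple way to make the generated summary depend on the turn indexes, action labels, and key phrases of the draft; the patent does not limit the method to this particular conditioning scheme.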