US 12,314,670 B2
Semantic parsing of utterance using contractive paraphrasing
Benjamin David Van Durme, Baltimore, MD (US); Adam D. Pauls, San Francisco, CA (US); Daniel Louis Klein, Orinda, CA (US); Eui Chul Shin, San Francisco, CA (US); Christopher H. Lin, Bellevue, WA (US); Pengyu Chen, Union City, CA (US); Subhro Roy, Walnut Creek, CA (US); Emmanouil Antonios Platanios, Pittsburgh, PA (US); Jason Michael Eisner, Baltimore, MD (US); Benjamin Lev Snyder, Bellevue, WA (US); and Samuel McIntire Thomson, Berkeley, CA (US)
Assigned to Microsoft Technology Licensing, LLC, Redmond, WA (US)
Filed by Microsoft Technology Licensing, LLC, Redmond, WA (US)
Filed on Apr. 13, 2021, as Appl. No. 17/229,637.
Prior Publication US 2022/0327288 A1, Oct. 13, 2022
Int. Cl. G06F 40/30 (2020.01); G06F 40/205 (2020.01); G06F 40/55 (2020.01); G06F 40/58 (2020.01)
CPC G06F 40/30 (2020.01) [G06F 40/205 (2020.01); G06F 40/55 (2020.01); G06F 40/58 (2020.01)] 20 Claims
OG exemplary drawing
 
1. A computer-implemented method of automatically generating an instruction code based on a natural language utterance using a pre-trained natural language model, the method comprising:
receiving the natural language utterance;
identifying, based on relevance to the received natural language utterance, a pair of a query utterance and an answer utterance, wherein the query utterance is distinct from the received natural language utterance;
generating, based at least on a combination of the received natural language utterance and the identified pair of the query utterance and the answer utterance as input to the pre-trained natural language model, a canonical utterance using the pre-trained natural language model, wherein:
the pre-trained natural language model performs prediction of words based on the received natural language utterance;
the identified pair of the query utterance and the answer utterance represents an example of a natural language utterance input and its output, influencing the prediction of the words by the pre-trained natural language model without further training of the pre-trained natural language model;
the output of the pair is in canonical form and was previously generated, at least in part, by the pre-trained natural language model in response to the query utterance; and
the canonical utterance includes, at least in part, a sequence of words based at least on the words predicted by the pre-trained natural language model and a constraint associated with the sequence of words;
generating, based on the generated canonical utterance, the instruction code, wherein the instruction code is executable by a processor;
executing, by the processor, the instruction code;
generating, based on a result of executing the instruction code, a response; and
transmitting the response.
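
The identifying step of claim 1 amounts to retrieving, from a stored bank of exemplars, the (query utterance, answer utterance) pair most relevant to the incoming utterance. The Python sketch below illustrates one way such retrieval could work; the exemplar_bank contents, the relevance function, and the bag-of-words cosine scoring are illustrative assumptions, not the patented mechanism, which leaves the relevance measure open (a deployed system might use dense-embedding similarity instead).

```python
# Hypothetical exemplar retrieval for the claim's "identifying ... based on
# relevance" step. All names here (relevance, exemplar_bank, identify_pair)
# are invented for illustration.
from collections import Counter
from math import sqrt

def relevance(a: str, b: str) -> float:
    """Cosine similarity over bag-of-words token counts."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

# Each exemplar pairs a past query utterance with its canonical-form answer.
exemplar_bank = [
    ("create a meeting with Alice tomorrow at 3 pm",
     "create event with attendee Alice starting tomorrow at 3 pm"),
    ("what is on my calendar on Friday",
     "list events on Friday"),
]

def identify_pair(utterance: str) -> tuple[str, str]:
    """Return the (query, answer) exemplar most relevant to the utterance."""
    return max(exemplar_bank, key=lambda pair: relevance(utterance, pair[0]))
```

For the utterance "set up a meeting with Bob at noon", identify_pair returns the meeting exemplar, which then serves as the in-context example for the generation step.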
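The generating step combines the retrieved pair with the new utterance as a prompt, so the pair acts as an input/output example that steers the model's word predictions without any further training, while a constraint restricts the predicted word sequence to canonical form. Below is a minimal sketch of such constrained, prompt-based decoding; the Scorer interface is an assumed stand-in for a pre-trained language model, and the toy grammar of canonical utterances is invented for illustration.

```python
# Minimal sketch of constrained, in-context ("few-shot") generation of a
# canonical utterance. next-word scoring is abstracted behind Scorer; no
# fine-tuning occurs, matching the claim's "without further training".
from typing import Callable, Iterable

Scorer = Callable[[str], dict[str, float]]

def build_prompt(query: str, answer: str, utterance: str) -> str:
    # The exemplar pair is prepended as an input/output example so it can
    # influence word prediction purely through the prompt.
    return f"Q: {query}\nA: {answer}\nQ: {utterance}\nA:"

def allowed_continuations(prefix: list[str], grammar: Iterable[tuple[str, ...]]) -> set[str]:
    """Words that keep the partial output a prefix of some canonical utterance."""
    n = len(prefix)
    return {c[n] for c in grammar if len(c) > n and list(c[:n]) == prefix}

def generate_canonical(utterance: str, query: str, answer: str,
                       score: Scorer, grammar: list[tuple[str, ...]]) -> str:
    prompt, out = build_prompt(query, answer, utterance), []
    while True:
        allowed = allowed_continuations(out, grammar)
        if not allowed:  # the constraint admits no further extension: done
            return " ".join(out)
        scores = score(prompt + " " + " ".join(out))
        # Constraint: rank only words the canonical grammar allows next.
        out.append(max(allowed, key=lambda w: scores.get(w, float("-inf"))))
```

Note that even a degenerate Scorer returning uniform scores yields a well-formed canonical utterance here, since the constraint alone guarantees grammaticality; the model's predictions only choose among grammatical continuations.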
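The remaining steps translate the canonical utterance into instruction code executable by a processor, execute it, and generate and transmit a response based on the result. The sketch below assumes a hypothetical canonical pattern ("list events on ...") and handler (list_events); the claim does not fix a target instruction language, so the translation table here is purely illustrative.

```python
# Illustrative back end for the claim's final steps: canonical utterance ->
# instruction code -> execution -> response. Handlers and patterns are
# invented for this sketch.
def list_events(day: str) -> list[str]:
    # Stand-in for a real calendar query.
    return [f"standup on {day} at 9 am"]

def to_instruction_code(canonical: str) -> str:
    """Generate instruction code from the canonical utterance."""
    if canonical.startswith("list events on "):
        day = canonical.removeprefix("list events on ")
        return f"list_events({day!r})"
    raise ValueError(f"no translation for: {canonical}")

def run(canonical: str) -> str:
    code = to_instruction_code(canonical)   # generate the instruction code
    result = eval(code)                     # execute it (sketch only; a real
                                            # system would not use eval)
    return f"Found {len(result)} event(s): " + "; ".join(result)

# Printing stands in for transmitting the response.
print(run("list events on Friday"))
```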