US 12,086,539 B2
System and method for natural language processing using neural network with cross-task training
Wenpeng Yin, Palo Alto, CA (US); Nazneen Rajani, Mountain View, CA (US); Richard Socher, Menlo Park, CA (US); and Caiming Xiong, Menlo Park, CA (US)
Assigned to Salesforce, Inc., San Francisco, CA (US)
Filed by Salesforce, Inc., San Francisco, CA (US)
Filed on Nov. 9, 2020, as Appl. No. 17/093,478.
Claims priority of provisional application 62/945,789, filed on Dec. 9, 2019.
Prior Publication US 2021/0174204 A1, Jun. 10, 2021
Int. Cl. G06F 40/20 (2020.01); G06F 16/33 (2019.01); G06F 16/332 (2019.01); G06F 40/279 (2020.01); G06F 40/30 (2020.01); G06N 3/08 (2023.01)
CPC G06F 40/20 (2020.01) [G06F 16/3329 (2019.01); G06F 16/3344 (2019.01); G06F 40/279 (2020.01); G06F 40/30 (2020.01); G06N 3/08 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method for using a neural network model for natural language processing (NLP), comprising:
receiving training data associated with a source domain and a target domain;
generating one or more query batches,
wherein a source subquery batch includes one or more source tasks associated with the source domain, and
wherein a target subquery batch includes one or more target tasks associated with the target domain;
for each query batch, generating combination class representations associated with a combination of classes in the source domain and the target domain;
generating a source loss using the source subquery batch and the combination class representations;
generating a target loss using the target subquery batch and the combination class representations;
generating a query batch loss using the source loss and the target loss; and
performing an optimization on the neural network model by adjusting its network parameters based on the query batch loss,
wherein the optimized neural network model is used to perform one or more new NLP tasks.
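The claimed training flow (combination class representations over source and target classes, per-subquery losses, and a combined query batch loss driving optimization) can be illustrated with a prototype-style sketch. Everything here is an illustrative assumption, not the patented implementation: the helper names (`class_representations`, `cross_entropy_loss`), the use of mean-embedding prototypes as the "combination class representations", the squared-distance logits, and the unweighted sum of the two losses are all stand-ins.

```python
import numpy as np

def class_representations(embeddings, labels):
    """Mean embedding per class -- a prototype-style stand-in for the
    patent's 'combination class representations' over the union of
    source-domain and target-domain classes."""
    classes = sorted(set(labels))
    return {c: np.mean([e for e, l in zip(embeddings, labels) if l == c], axis=0)
            for c in classes}

def cross_entropy_loss(query_emb, query_labels, protos):
    """Cross-entropy of a softmax over negative squared distances
    from each query embedding to every class representation."""
    classes = sorted(protos)
    losses = []
    for e, y in zip(query_emb, query_labels):
        logits = np.array([-np.sum((e - protos[c]) ** 2) for c in classes])
        # numerically stable log-softmax
        logp = logits - logits.max() - np.log(np.sum(np.exp(logits - logits.max())))
        losses.append(-logp[classes.index(y)])
    return float(np.mean(losses))

# Toy query batch: support examples carry both source ("s*") and
# target ("t*") classes, so the prototypes cover the class combination.
rng = np.random.default_rng(0)
support_emb = rng.normal(size=(6, 4))
support_lab = ["s0", "s0", "s1", "t0", "t0", "t1"]
protos = class_representations(support_emb, support_lab)

# Source and target subquery batches scored against the same prototypes.
source_q, source_y = rng.normal(size=(2, 4)), ["s0", "s1"]
target_q, target_y = rng.normal(size=(2, 4)), ["t0", "t1"]

source_loss = cross_entropy_loss(source_q, source_y, protos)
target_loss = cross_entropy_loss(target_q, target_y, protos)

# Combined query batch loss; a gradient step on the encoder's
# parameters would follow here. The unweighted sum is one choice.
query_batch_loss = source_loss + target_loss
```

In a real system the embeddings would come from a trainable encoder, and the optimization step of the claim would backpropagate `query_batch_loss` through that encoder's parameters; how the source and target losses are weighted when combined is a design choice the claim leaves open.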