US 12,204,857 B2
Systems and methods for text classification using label modular prompts
Hailin Chen, Singapore (SG); Amrita Saha, Singapore (SG); Shafiq Rayhan Joty, Singapore (SG); and Chu Hong Hoi, Singapore (SG)
Assigned to Salesforce, Inc., San Francisco, CA (US)
Filed by Salesforce, Inc., San Francisco, CA (US)
Filed on Nov. 28, 2022, as Appl. No. 18/059,234.
Claims priority of provisional application 63/355,476, filed on Jun. 24, 2022.
Prior Publication US 2023/0419049 A1, Dec. 28, 2023
Int. Cl. G06F 40/284 (2020.01); G06F 18/214 (2023.01); G06F 18/2413 (2023.01); G06F 40/295 (2020.01); G06F 40/40 (2020.01)
CPC G06F 40/284 (2020.01) [G06F 18/214 (2023.01); G06F 18/2413 (2023.01); G06F 40/295 (2020.01); G06F 40/40 (2020.01)] 20 Claims
OG exemplary drawing
 
15. A system, comprising:
a non-transitory memory; and
one or more hardware processors coupled to the non-transitory memory and configured to read instructions from the non-transitory memory to cause the system to perform a method comprising:
receiving, via a data interface, a first training dataset associated with a first plurality of class labels for a first training process;
generating, for a first instance of the first training dataset, a set of labels of interest by sampling from a set of possible class labels including the first plurality of class labels;
generating, by a prompt generator, a first prompt based on the set of labels of interest;
generating, by a pretrained language model, a task output in response to an input of the first instance prepended with the first prompt;
computing a loss objective based on the task output and the set of labels of interest; and
updating parameters of the prompt generator based on the computed loss objective via backpropagation while the pretrained language model is frozen.
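
The training loop recited in claim 15 can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch/Transformers example and not the patented implementation: the PromptGenerator design (one trainable soft-prompt block per label), the T5 backbone, the label set ALL_LABELS, and the choice to score the task output against the gold label drawn from the sampled labels of interest are all assumptions made for clarity.

```python
import random
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Frozen pretrained language model (T5 chosen here only for illustration).
tokenizer = AutoTokenizer.from_pretrained("t5-base")
lm = AutoModelForSeq2SeqLM.from_pretrained("t5-base")
for p in lm.parameters():
    p.requires_grad = False          # the pretrained language model stays frozen

EMBED_DIM = lm.config.d_model        # dimensionality of the LM's token embeddings


class PromptGenerator(nn.Module):
    """Hypothetical label-modular prompt generator: one trainable soft-prompt
    block per class label; the prompt is the concatenation of the blocks for
    the sampled labels of interest."""

    def __init__(self, num_labels: int, tokens_per_label: int = 4):
        super().__init__()
        self.label_prompts = nn.Parameter(
            torch.randn(num_labels, tokens_per_label, EMBED_DIM) * 0.02
        )

    def forward(self, label_ids: list[int]) -> torch.Tensor:
        # Returns (len(label_ids) * tokens_per_label, EMBED_DIM).
        return self.label_prompts[label_ids].reshape(-1, EMBED_DIM)


ALL_LABELS = ["sports", "politics", "business", "technology"]   # illustrative
prompt_gen = PromptGenerator(num_labels=len(ALL_LABELS))
optimizer = torch.optim.AdamW(prompt_gen.parameters(), lr=1e-3)


def training_step(text: str, gold_label: str, k: int = 3) -> float:
    """One update for a single training instance."""
    # 1. Sample a set of labels of interest that contains the gold label.
    negatives = [l for l in ALL_LABELS if l != gold_label]
    labels_of_interest = [gold_label] + random.sample(negatives, k - 1)
    random.shuffle(labels_of_interest)

    # 2. Generate a (soft) prompt from the sampled labels of interest.
    prompt_embeds = prompt_gen([ALL_LABELS.index(l) for l in labels_of_interest])

    # 3. Prepend the prompt to the embedded instance and run the frozen LM.
    enc = tokenizer(text, return_tensors="pt")
    input_embeds = lm.get_input_embeddings()(enc.input_ids)          # (1, T, D)
    inputs = torch.cat([prompt_embeds.unsqueeze(0), input_embeds], dim=1)
    target = tokenizer(gold_label, return_tensors="pt").input_ids

    # 4. Loss on the task output; gradients flow only into the prompt generator.
    loss = lm(inputs_embeds=inputs, labels=target).loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the backward pass updates only the prompt generator's parameters, since every parameter of the language model has requires_grad set to False, mirroring the "frozen" condition of the claim.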