The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor(s), assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Jan. 21, 2025
Filed: Nov. 28, 2022
Applicant: Salesforce, Inc., San Francisco, CA (US)
Inventors: Hailin Chen, Singapore (SG); Amrita Saha, Singapore (SG); Shafiq Rayhan Joty, Singapore (SG); Chu Hong Hoi, Singapore (SG)
Assignee: Salesforce, Inc., San Francisco, CA (US)
Attorney:
Primary Examiner:
Int. Cl.: G06F 40/284 (2020.01); G06F 18/214 (2023.01); G06F 18/2413 (2023.01); G06F 40/295 (2020.01); G06F 40/40 (2020.01)
U.S. Cl.: CPC ... G06F 40/284 (2020.01); G06F 18/214 (2023.01); G06F 18/2413 (2023.01); G06F 40/295 (2020.01); G06F 40/40 (2020.01)
Abstract

Embodiments described herein provide a method for training a prompt generator for text classification. A first training dataset associated with a first plurality of class labels is received for a first training process. For a first instance of the first training dataset, a set of labels of interest is generated by sampling from a set of possible class labels including the first plurality of class labels. The prompt generator generates a first prompt based on the set of labels of interest. A pretrained language model generates a task output in response to an input of the first instance prepended with the first prompt. A loss objective is computed based on the task output and the set of labels of interest. Parameters of the prompt generator are updated based on the computed loss objective via backpropagation while the pretrained language model is kept frozen.
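The abstract's training loop can be illustrated with a deliberately tiny sketch. This is not the patent's implementation: the "language model" below is a fixed linear head, the prompt generator is a single trainable matrix conditioned on an indicator vector of the sampled labels of interest, and the prompt is added to (rather than prepended to) the instance embedding. All names and shapes here are illustrative assumptions; only the structure — sample a label set, generate a prompt from it, forward through a frozen model, backpropagate into the prompt generator alone — mirrors the described method.

```python
import numpy as np

rng = np.random.default_rng(0)

d, num_labels = 8, 4                   # embedding dim, size of the label space
W = rng.normal(size=(num_labels, d))   # frozen pretrained-LM head (never updated)
G = np.zeros((d, num_labels))          # prompt-generator parameters (trainable)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def step(G, x, labels_of_interest, y, lr=0.1):
    """One training step: forward through the frozen head, backprop into G only."""
    s = np.zeros(num_labels)
    s[labels_of_interest] = 1.0        # indicator of the sampled labels of interest
    prompt = G @ s                     # prompt conditioned on the label set
    logits = W @ (x + prompt)          # frozen model consumes the prompted input
    p = softmax(logits)
    loss = -np.log(p[y])               # cross-entropy against the true label
    # gradient of the loss w.r.t. the logits, pushed back into G; W stays frozen
    dlogits = p.copy()
    dlogits[y] -= 1.0
    dG = np.outer(W.T @ dlogits, s)
    return G - lr * dG, loss

x = rng.normal(size=d)                 # fixed embedding of one training instance
labels_of_interest, y = [0, 2], 2      # sampled label set; true label
losses = []
for _ in range(50):
    G, loss = step(G, x, labels_of_interest, y)
    losses.append(loss)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Because only `G` is updated, the sketch shows the key property of the scheme: the loss decreases over the steps even though the "pretrained" weights `W` never change.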

