
The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: Patent number, Date patent was issued, Date patent was filed, Title of the patent, Applicant, Inventor, Assignee, Attorney firm, Primary examiner, Assistant examiner, CPCs, and Abstract. The patent badge also contains a link to the full patent document (in Adobe Acrobat format, i.e., PDF).

Date of Patent: May 20, 2025

Filed: Aug. 16, 2022

Applicant: Salesforce, Inc., San Francisco, CA (US)

Inventors: Rishabh Bhardwaj, Singapore (SG); Amrita Saha, Singapore (SG); Chu Hong Hoi, Singapore (SG)

Assignee: Salesforce, Inc., San Francisco, CA (US)

Attorney:

Primary Examiner:

Int. Cl.
CPC ... G06F 40/40 (2020.01); G06F 40/12 (2020.01); G06F 40/284 (2020.01); G06F 40/289 (2020.01); G06N 20/00 (2019.01)

U.S. Cl.
CPC ... G06F 40/284 (2020.01); G06F 40/12 (2020.01); G06F 40/289 (2020.01); G06F 40/40 (2020.01); G06N 20/00 (2019.01)
Abstract

Embodiments described herein provide a soft prompt tuning technique referred to as the Vector quantized Input-contextualized Prompt (VIP). The VIP technique has two integral properties: i) instead of learning a fixed set of prompt tokens irrespective of the input, it generates a contextualized version of the soft prompts, conditional on the input text; ii) it further passes the input-contextualized prompt tokens through a quantization network, inspired by Vector Quantized Transformers. The quantization network uses nearest neighbor search over a learnable codebook to train a discrete latent variable model over the prompt space, thus generating a quantized version of the contextual prompt tokens. These quantized contextual prompt tokens are finally fed into the frozen language model along with the original input text.
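The pipeline described in the abstract (contextualize soft prompts on the input, quantize them against a learnable codebook by nearest-neighbor lookup, then prepend them to the frozen model's input) can be sketched roughly as below. This is a minimal illustration only; the class name, module choices, shapes, and hyperparameters are hypothetical stand-ins and not the patented implementation.

```python
# Hypothetical sketch of the VIP idea from the abstract; all names, shapes,
# and hyperparameters are illustrative assumptions, not the patented design.
import torch
import torch.nn as nn


class VIPPromptTuner(nn.Module):
    """Generates input-contextualized soft prompts and quantizes them against
    a learnable codebook before prepending them to a frozen LM's input."""

    def __init__(self, hidden_dim: int, num_prompt_tokens: int, codebook_size: int):
        super().__init__()
        # Base soft prompt tokens (learned, input-independent starting point).
        self.soft_prompt = nn.Parameter(torch.randn(num_prompt_tokens, hidden_dim))
        # Small contextualizer that conditions the prompts on the input text
        # (hidden_dim assumed divisible by the number of attention heads).
        self.contextualizer = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        # Learnable codebook used for nearest-neighbor quantization.
        self.codebook = nn.Parameter(torch.randn(codebook_size, hidden_dim))

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        batch = input_embeds.size(0)
        prompts = self.soft_prompt.unsqueeze(0).expand(batch, -1, -1)
        # 1) Contextualize: soft prompts attend over the input embeddings.
        ctx_prompts, _ = self.contextualizer(prompts, input_embeds, input_embeds)
        # 2) Quantize: replace each contextual prompt token with its nearest
        #    codebook entry; a straight-through estimator keeps gradients flowing.
        dists = torch.cdist(ctx_prompts, self.codebook.unsqueeze(0).expand(batch, -1, -1))
        nearest = dists.argmin(dim=-1)                  # [batch, num_prompt_tokens]
        quantized = self.codebook[nearest]              # nearest codebook vectors
        quantized = ctx_prompts + (quantized - ctx_prompts).detach()
        # 3) Prepend the quantized contextual prompts to the (frozen) LM input.
        return torch.cat([quantized, input_embeds], dim=1)
```

In such a setup, the returned embeddings would be passed to a language model whose weights are kept frozen; only the soft prompt, contextualizer, and codebook parameters would receive gradient updates, consistent with the soft prompt tuning framing in the abstract.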

