The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, from which the patent can be downloaded or printed.

Date of Patent: Oct. 10, 2023
Filed: Aug. 27, 2021
Applicant: Salesforce.com, Inc., San Francisco, CA (US)
Inventors: Yue Wang, Singapore (SG); Weishi Wang, Singapore (SG); Shafiq Rayhan Joty, Singapore (SG); Chu Hong Hoi, Singapore (SG)
Assignee: SALESFORCE.COM, INC., San Francisco, CA (US)
Attorney:
Primary Examiner:
Int. Cl.: G06F 9/44 (2018.01); G06F 8/41 (2018.01); G06F 40/20 (2020.01); G06N 3/084 (2023.01); G06F 18/214 (2023.01); G06N 3/047 (2023.01)
U.S. Cl.: CPC G06F 8/427 (2013.01); G06F 18/214 (2023.01); G06F 40/20 (2020.01); G06N 3/047 (2023.01); G06N 3/084 (2013.01)
Abstract

Embodiments described herein provide a code generation and understanding model that builds on a Transformer-based encoder-decoder framework. The model is configured to derive generic representations for programming language (PL) and natural language (NL) in the code domain via pre-training on an unlabeled code corpus, and then to benefit many code-related downstream tasks through fine-tuning. Apart from the denoising sequence-to-sequence objectives widely adopted for pre-training on natural language, an identifier tagging and prediction pre-training objective is adopted to enable the model to better leverage the crucial token type information from PL, specifically the identifiers assigned by developers.
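The identifier tagging objective in the abstract relies on labeling each code token as a developer-assigned identifier or not. A minimal sketch of that label-construction step is shown below, using Python's standard-library tokenizer on Python source; the function name and the choice of tokenizer are illustrative assumptions, not details taken from the patent:

```python
# Hypothetical sketch of building identifier-tagging labels for pre-training.
# Assumption: Python's stdlib `tokenize` stands in for the PL tokenizer;
# a NAME token that is not a language keyword is treated as an identifier.
import io
import keyword
import tokenize

def identifier_tags(source: str):
    """Return (token, tag) pairs: tag 1 for developer-assigned identifiers,
    0 for keywords, operators, literals, and other tokens."""
    pairs = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Skip purely structural tokens with no surface text to tag.
        if tok.type in (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
                        tokenize.DEDENT, tokenize.ENDMARKER):
            continue
        is_ident = tok.type == tokenize.NAME and not keyword.iskeyword(tok.string)
        pairs.append((tok.string, int(is_ident)))
    return pairs

print(identifier_tags("def add(a, b):\n    return a + b\n"))
```

In the patent's framing, these binary tags would supervise a token-level prediction head during pre-training, alongside the denoising sequence-to-sequence objectives.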

