
The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in PDF (Adobe Acrobat) format, which can be downloaded or printed.

Date of Patent: Feb. 6, 2024
Filed: May 5, 2021
Applicant: International Business Machines Corporation, Armonk, NY (US)
Inventors: Hui Wan, White Plains, NY (US); Xiaodong Cui, Chappaqua, NY (US); Luis A. Lastras-Montano, Cortlandt Manor, NY (US)

Attorneys:
Primary Examiner:
Assistant Examiner:
Int. Cl.: G06F 40/284 (2020.01); G06F 40/205 (2020.01); G06F 40/30 (2020.01); G06F 40/42 (2020.01); G06F 40/237 (2020.01); G06V 30/194 (2022.01)
U.S. Cl. (CPC): G06F 40/284 (2020.01); G06F 40/205 (2020.01); G06F 40/237 (2020.01); G06F 40/30 (2020.01); G06F 40/42 (2020.01); G06V 30/194 (2022.01)
Abstract

From metadata of a corpus of natural language text documents, a relativity matrix is constructed, a row-column intersection in the relativity matrix corresponding to a relationship between two instances of a type of metadata. An encoder model is trained, generating a trained encoder model, to compute an embedding corresponding to a token of a natural language text document within the corpus and the relativity matrix, the encoder model comprising a first encoder layer, the first encoder layer comprising a token embedding portion, a relativity embedding portion, a token self-attention portion, a metadata self-attention portion, and a fusion portion, the training comprising adjusting a set of parameters of the encoder model.
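The abstract describes a relativity matrix built from metadata co-occurrence and an encoder layer with token self-attention, metadata self-attention, and a fusion portion. The sketch below illustrates that structure only; it is not the patented implementation. The co-occurrence counting, the identity attention projections, and fusion by simple averaging are all assumptions made for brevity, and the function names are invented for illustration.

```python
import numpy as np

def build_relativity_matrix(doc_metadata, instances):
    """Relativity matrix over metadata instances: each row-column
    intersection counts how many documents the two instances co-occur
    in (a simple assumed relationship measure)."""
    idx = {v: i for i, v in enumerate(instances)}
    M = np.zeros((len(instances), len(instances)))
    for meta in doc_metadata:
        for a in meta:
            for b in meta:
                M[idx[a], idx[b]] += 1
    return M

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X):
    # Single-head scaled dot-product self-attention with identity
    # query/key/value projections, to keep the sketch short.
    scores = X @ X.T / np.sqrt(X.shape[1])
    return softmax(scores) @ X

def encoder_layer(token_emb, relativity_emb):
    """One encoder layer combining the portions named in the abstract."""
    t = self_attention(token_emb)       # token self-attention portion
    m = self_attention(relativity_emb)  # metadata self-attention portion
    return (t + m) / 2.0                # fusion portion (assumed: averaging)

# Toy corpus: each document lists its metadata instances (e.g. authors).
docs_meta = [["Wan", "Cui"], ["Wan", "Lastras-Montano"]]
R = build_relativity_matrix(docs_meta, ["Wan", "Cui", "Lastras-Montano"])
```

In a trained model the fusion portion and the embedding portions would have learned parameters adjusted during training; here they are fixed so the data flow stays visible.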

