
The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in PDF format.

Date of Patent: Apr. 02, 2024

Filed: Apr. 07, 2021

Applicant: Baidu USA LLC, Sunnyvale, CA (US)

Inventors: Shaogang Ren, Redmond, WA (US); Ping Li, Bellevue, WA (US)

Assignee: Baidu USA LLC, Sunnyvale, CA (US)

Attorney:
Primary Examiner:
Assistant Examiner:
Int. Cl.:
G06F 40/216 (2020.01); G06F 16/901 (2019.01); G06F 40/211 (2020.01); G06F 40/284 (2020.01); G06F 40/30 (2020.01)

U.S. Cl. (CPC):
G06F 40/216 (2020.01); G06F 16/9024 (2019.01); G06F 40/211 (2020.01); G06F 40/284 (2020.01); G06F 40/30 (2020.01)
Abstract

Described herein are system and method embodiments to improve word representation learning. Embodiments of a probabilistic prior may seamlessly integrate statistical disentanglement with word embedding. Different from previous deterministic methods, word embedding may be taken as a probabilistic generative model, and it enables imposing a prior that may identify independent factors generating word representation vectors. The probabilistic prior not only enhances the representation of word embedding, but also improves the model's robustness and stability. Furthermore, embodiments of the disclosed method may be flexibly plugged in various word embedding models. Extensive experimental results show that embodiments of the presented method may improve word representation on different tasks.
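The abstract describes treating word embedding as a probabilistic generative model and imposing a prior that encourages independent generating factors. The patent does not specify the prior here, so the sketch below is only an illustrative stand-in: a toy skip-gram with negative sampling whose embedding coordinates are regularized by a Laplace (L1) prior, one simple way a factor-sparsifying prior can be "plugged in" to an existing embedding objective. The function name, hyperparameters, and choice of prior are all assumptions, not the patented method.

```python
import numpy as np

def train_sgns_with_prior(pairs, vocab_size, dim=8, lr=0.05,
                          prior_weight=0.01, epochs=50, seed=0):
    """Toy skip-gram with negative sampling plus a Laplace (L1) prior
    on embedding coordinates -- an illustrative stand-in for the
    independence-encouraging prior described in the abstract."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(vocab_size, dim))  # word vectors
    C = rng.normal(scale=0.1, size=(vocab_size, dim))  # context vectors
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    for _ in range(epochs):
        for w, c in pairs:
            neg = rng.integers(vocab_size)  # one random negative sample
            for ctx, label in ((c, 1.0), (neg, 0.0)):
                # Logistic-loss gradient for the (word, context) pair.
                g = sigmoid(W[w] @ C[ctx]) - label
                # The prior enters as an extra gradient term: the MAP
                # gradient of a Laplace prior is a sign (L1) penalty.
                grad_w = g * C[ctx] + prior_weight * np.sign(W[w])
                grad_c = g * W[w]
                W[w] -= lr * grad_w
                C[ctx] -= lr * grad_c
    return W, C

# Tiny usage example on a hypothetical 4-word vocabulary.
W, C = train_sgns_with_prior([(0, 1), (1, 2), (2, 0)], vocab_size=4)
```

Because the prior term is additive in the gradient, the same regularizer can be attached to other embedding objectives (CBOW, GloVe-style losses) without changing their structure, which mirrors the abstract's claim that the prior can be flexibly plugged into various word embedding models.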

