
The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in PDF format.

Date of Patent:

Mar. 04, 2025

Filed:

Jan. 27, 2021
Applicant:

Google LLC, Mountain View, CA (US);

Inventors:

Yanping Huang, Mountain View, CA (US);

Dmitry Lepikhin, Sunnyvale, CA (US);

Maxim Krikun, Castro Valley, CA (US);

Orhan Firat, Mountain View, CA (US);

Ankur Bapna, Sunnyvale, CA (US);

Thang Luong, Santa Clara, CA (US);

Sneha Kudugunta, Sunnyvale, CA (US);

Assignee:

GOOGLE LLC, Mountain View, CA (US);

Attorney:
Primary Examiner:
Assistant Examiner:
Int. Cl.:

G06N 3/045 (2023.01); G06N 3/08 (2023.01)

U.S. Cl.:

CPC: G06N 3/045 (2023.01); G06N 3/08 (2013.01)
Abstract

Systems and methods for routing in mixture-of-expert models. In some aspects of the technology, a transformer may have at least one Mixture-of-Experts ('MoE') layer in each of its encoder and decoder, with the at least one MoE layer of the encoder having a learned gating function configured to route each token of a task to two or more selected expert feed-forward networks, and the at least one MoE layer of the decoder having a learned gating function configured to route each task to two or more selected expert feed-forward networks.
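To make the routing mechanism in the abstract concrete, the sketch below shows a top-2 gating function in plain NumPy: a learned weight matrix scores each token against every expert, and each token is routed to its two highest-scoring expert feed-forward networks with normalized mixing weights. All names, shapes, and the softmax scoring here are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def top2_gate(tokens: np.ndarray, gate_weights: np.ndarray):
    """Route each token to its two highest-scoring experts.

    tokens:       (num_tokens, d_model) token representations
    gate_weights: (d_model, num_experts) learned gating parameters
    Returns (expert_indices, combine_weights): for each token, the
    indices of its two selected experts and two normalized weights
    for combining those experts' outputs.
    """
    logits = tokens @ gate_weights                       # (num_tokens, num_experts)
    # Softmax over experts (numerically stabilized).
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    # Pick the two highest-probability experts per token, best first.
    top2 = np.argsort(probs, axis=-1)[:, -2:][:, ::-1]
    top2_probs = np.take_along_axis(probs, top2, axis=-1)
    # Renormalize the two selected probabilities into mixing weights.
    combine = top2_probs / top2_probs.sum(axis=-1, keepdims=True)
    return top2, combine

rng = np.random.default_rng(0)
idx, w = top2_gate(rng.normal(size=(4, 8)), rng.normal(size=(8, 16)))
print(idx.shape, w.shape)  # (4, 2) (4, 2)
```

In a full MoE layer, each token would then be dispatched to its two selected expert feed-forward networks and the expert outputs combined using these weights.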

