The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, which can be downloaded or printed.
Patent No.:
Date of Patent: Jul. 11, 2023
Filed: Jan. 22, 2020
Applicants: Andreas Spanias, Tempe, AZ (US);
Huan Song, Tempe, AZ (US);
Jayaraman J. Thiagarajan, Milpitas, CA (US);
Deepta Rajan, Bellevue, WA (US)
Inventors: Andreas Spanias, Tempe, AZ (US);
Huan Song, Tempe, AZ (US);
Jayaraman J. Thiagarajan, Milpitas, CA (US);
Deepta Rajan, Bellevue, WA (US)
Assignees: Arizona Board of Regents On Behalf Of Arizona State University, Scottsdale, AZ (US);
Lawrence Livermore National Security, LLC, Livermore, CA (US)
Abstract
A system for time series analysis using attention models is disclosed. The system may capture dependencies across different variables through input embedding and may map the order of a sample appearance to a randomized lookup table via positional encoding. The system may capture dependencies within a single sequence through a self-attention mechanism and determine a range of dependency to consider for each position being analyzed. The system may obtain an attention weighting to other positions in the sequence through computation of an inner product and utilize the attention weighting to acquire a vector representation for a position and mask the sequence to enable causality. The system may employ a dense interpolation technique for encoding partial temporal ordering to obtain a single vector representation and a linear layer to obtain logits from the single vector representation. The system may use a type-dependent final prediction layer.
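The sketch below is a minimal, illustrative reading of the pipeline the abstract describes (input embedding, positional encoding, causally masked self-attention, dense interpolation, and a linear logits layer), written in PyTorch. It is not the patented implementation: the class names, hyperparameters (d_model, n_heads, factors), the learned positional encoding, and the specific dense-interpolation weighting are all assumptions made for illustration.

```python
# Hypothetical sketch of an attention-based time-series model; names and
# hyperparameters are assumptions, not taken from the patent text.
import torch
import torch.nn as nn


class DenseInterpolation(nn.Module):
    """Collapse a (batch, time, features) sequence into one vector by weighting
    each time step toward a small number of interpolation anchors (a simplified
    reading of 'dense interpolation for encoding partial temporal ordering')."""
    def __init__(self, seq_len: int, factors: int):
        super().__init__()
        t = torch.arange(1, seq_len + 1, dtype=torch.float32) / seq_len
        m = torch.arange(1, factors + 1, dtype=torch.float32).unsqueeze(1)
        # w[m, t] grows as time step t approaches anchor m.
        w = (1.0 - torch.abs(m / factors - t)) ** 2           # (factors, seq_len)
        self.register_buffer("w", w)

    def forward(self, x):                                      # x: (B, T, D)
        return torch.einsum("mt,btd->bmd", self.w, x).flatten(1)  # (B, factors*D)


class AttentionTimeSeriesModel(nn.Module):
    def __init__(self, n_vars, d_model=64, n_heads=4, seq_len=48,
                 factors=5, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_vars, d_model)                # input embedding across variables
        self.pos = nn.Embedding(seq_len, d_model)              # positional encoding (learned; assumption)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.interp = DenseInterpolation(seq_len, factors)
        self.out = nn.Linear(factors * d_model, n_classes)     # linear layer producing logits

    def forward(self, x):                                      # x: (B, T, n_vars)
        B, T, _ = x.shape
        h = self.embed(x) + self.pos(torch.arange(T, device=x.device))
        # Causal mask: each position may attend only to itself and earlier steps.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        h, _ = self.attn(h, h, h, attn_mask=mask)              # self-attention via inner products
        return self.out(self.interp(h))                        # logits


# Example: binary prediction over 48-step sequences of 10 variables.
# model = AttentionTimeSeriesModel(n_vars=10, seq_len=48)
# logits = model(torch.randn(8, 48, 10))                       # shape (8, 2)
```

In this reading, the "type dependent final prediction layer" would amount to choosing what follows the logits, for example a softmax for classification or a different output head for regression.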