The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, which can be used to download or print the patent.

Date of Patent: Dec. 30, 2025
Filed: Jul. 26, 2023
Applicant: NEC Laboratories America, Inc., Princeton, NJ (US)
Inventors: Yuncong Chen, Plainsboro, NJ (US); Yanchi Liu, Monmouth Junction, NJ (US); Wenchao Yu, Plainsboro, NJ (US); Haifeng Chen, West Windsor, NJ (US)
Assignee: NEC Corporation, Tokyo, JP
Attorneys:
Primary Examiner:
Int. Cl.: G06F 40/242 (2020.01); G06F 40/205 (2020.01); G06F 40/284 (2020.01); G06F 40/56 (2020.01)
U.S. Cl.: CPC G06F 40/242 (2020.01); G06F 40/205 (2020.01); G06F 40/284 (2020.01); G06F 40/56 (2020.01)
Abstract

A computer-implemented method for employing a time-series-to-text generation model to generate accurate description texts is provided. The method includes passing time series data through a time series encoder and a multilayer perceptron (MLP) classifier to obtain predicted concept labels, converting the predicted concept labels, by a serializer, to a text token sequence by concatenating an aspect term and an option term of every aspect, inputting the text token sequence into a pretrained language model including a bidirectional encoder and an autoregressive decoder, and using adapter layers to fine-tune the pretrained language model to generate description texts.
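
The upstream stages of the pipeline described in the abstract (time series encoder → MLP concept classifier → serializer) can be sketched in plain PyTorch. The sketch below is illustrative only and makes several assumptions not stated in the patent: the `ASPECTS` concept schema, the toy Transformer encoder, the pooling strategy, and the tensor shapes are all hypothetical, and the final stage (feeding the serialized tokens into a pretrained language model with a bidirectional encoder and autoregressive decoder, fine-tuned through adapter layers) is omitted.

```python
# Minimal, hypothetical sketch of the encoder -> classifier -> serializer stages.
# The aspect/option schema and model sizes are assumptions, not from the patent.
import torch
import torch.nn as nn

ASPECTS = {"trend": ["rising", "falling", "flat"],   # hypothetical concept schema
           "volatility": ["high", "low"]}

class TimeSeriesEncoder(nn.Module):
    """Encodes a (batch, length, channels) series into a pooled vector."""
    def __init__(self, in_channels=1, d_model=64):
        super().__init__()
        self.proj = nn.Linear(in_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):
        h = self.encoder(self.proj(x))
        return h.mean(dim=1)                 # simple mean pooling over time

class ConceptClassifier(nn.Module):
    """MLP head that predicts one option label per aspect."""
    def __init__(self, d_model=64):
        super().__init__()
        self.heads = nn.ModuleDict({
            a: nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                             nn.Linear(d_model, len(opts)))
            for a, opts in ASPECTS.items()})

    def forward(self, z):
        return {a: head(z).argmax(dim=-1) for a, head in self.heads.items()}

def serialize(labels, batch_index=0):
    """Concatenate '<aspect> <option>' terms into one text token sequence."""
    parts = [f"{a} {ASPECTS[a][labels[a][batch_index].item()]}" for a in ASPECTS]
    return "; ".join(parts)

if __name__ == "__main__":
    x = torch.randn(1, 128, 1)               # toy univariate time series
    z = TimeSeriesEncoder()(x)
    labels = ConceptClassifier()(z)
    print(serialize(labels))                  # e.g. "trend rising; volatility high"
```

In the claimed method, the serialized sequence (e.g., "trend rising; volatility high") would then be tokenized and passed to a pretrained encoder-decoder language model (a BART-style architecture matches the bidirectional encoder and autoregressive decoder named in the abstract), with lightweight adapter layers updated during fine-tuning rather than the full model.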

