The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Dec. 17, 2024
Filed: Jun. 18, 2021
Applicant: EMC IP Holding Company LLC, Hopkinton, MA (US)
Inventors: Shiri Gaber, Beer Sheva, IL; Ohad Arnon, Beit-Nir, IL; Dany Shapiro, Alfei Menashe, IL
Assignee: EMC IP HOLDING COMPANY LLC, Hopkinton, MA (US)

Attorneys:
Primary Examiner:
Int. Cl.: H04L 9/00 (2022.01); G06F 21/55 (2013.01); G06N 3/045 (2023.01); G06N 3/08 (2023.01); G06N 20/20 (2019.01)
U.S. Cl. CPC: G06F 21/55 (2013.01); G06N 3/045 (2023.01); G06N 3/08 (2013.01); G06N 20/20 (2019.01)
Abstract

Techniques described herein relate to a method for predicting results using ensemble models. The method may include receiving trained model data sets from model source nodes, each trained model data set comprising a trained model, an important feature list, and a missing feature generator; receiving a prediction request data set; making a determination that the prediction request data set does not include an input feature for a trained model; generating, based on the determination and using a missing feature generator, a substitute feature to replace the input feature; executing the trained model using the prediction request data set and the substitute feature to obtain a first prediction; executing a second trained model using the prediction request data set to obtain a second prediction; and obtaining a final prediction using the first prediction, the second prediction, and an ensemble model.
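The flow described in the abstract can be sketched in code. The Python sketch below is only an illustration of that flow, not the patent's actual implementation; every name in it (TrainedModelDataSet, predict_with_substitutes, ensemble_predict, the toy models, and the mean-based ensemble) is a hypothetical assumption introduced here for clarity.

```python
# Illustrative sketch of the abstract's workflow: each source node contributes
# a trained model, a list of its important input features, and a missing
# feature generator; at prediction time, any feature a model needs but the
# request lacks is filled in by that generator, and the per-model predictions
# are then combined by an ensemble model. All names here are hypothetical.

from dataclasses import dataclass
from typing import Callable, Dict, List

Features = Dict[str, float]


@dataclass
class TrainedModelDataSet:
    """What a model source node provides: model, feature list, generator."""
    model: Callable[[Features], float]                # trained model
    important_features: List[str]                     # inputs the model expects
    missing_feature_generator: Callable[[str, Features], float]  # fills gaps


def predict_with_substitutes(entry: TrainedModelDataSet, request: Features) -> float:
    """Run one trained model, generating substitutes for any missing inputs."""
    inputs = dict(request)
    for name in entry.important_features:
        if name not in inputs:
            # Determination: the prediction request lacks this input feature,
            # so generate a substitute value rather than failing.
            inputs[name] = entry.missing_feature_generator(name, request)
    return entry.model(inputs)


def ensemble_predict(entries: List[TrainedModelDataSet],
                     request: Features,
                     ensemble_model: Callable[[List[float]], float]) -> float:
    """Obtain a final prediction from the individual model predictions."""
    predictions = [predict_with_substitutes(e, request) for e in entries]
    return ensemble_model(predictions)


if __name__ == "__main__":
    # Toy usage: two source nodes, one missing feature, mean-based ensemble.
    node_a = TrainedModelDataSet(
        model=lambda f: 2.0 * f["x"] + f["y"],
        important_features=["x", "y"],
        missing_feature_generator=lambda name, req: 0.5,  # toy substitute
    )
    node_b = TrainedModelDataSet(
        model=lambda f: f["x"] - 1.0,
        important_features=["x"],
        missing_feature_generator=lambda name, req: 0.0,
    )
    request = {"x": 3.0}  # "y" is missing and will be substituted for node_a
    final = ensemble_predict([node_a, node_b], request,
                             ensemble_model=lambda ps: sum(ps) / len(ps))
    print(final)
```

In this toy run, the first model receives a generated substitute for the missing "y" feature, the second model runs on the request as given, and the ensemble averages the two predictions to obtain the final result.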

