The patent badge is an abbreviated version of the USPTO patent document. It covers the patent number, issue date, filing date, title, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract, and includes a link to the full patent document in PDF (Adobe Acrobat) format.

Date of Patent: Mar. 07, 2023

Filed: Jun. 22, 2020

Applicant: International Business Machines Corporation, Armonk, NY (US)

Inventors:
Kanthi Sarpatwar, Elmsford, NY (US)
Nalini K. Ratha, Yorktown Heights, NY (US)
Karthikeyan Shanmugam, Elmsford, NY (US)
Karthik Nandakumar, Singapore (SG)
Sharathchandra Pankanti, Darien, CT (US)
Roman Vaculin, Larchmont, NY (US)
James Thomas Rayfield, Ridgefield, CT (US)

Attorneys:
Primary Examiner:
Int. Cl.:
G06N 5/04 (2006.01); H04L 9/00 (2022.01); G06N 3/04 (2023.01); G06K 9/62 (2022.01)

U.S. Cl.:
CPC ... G06N 5/04 (2013.01); G06K 9/6256 (2013.01); G06K 9/6262 (2013.01); G06N 3/04 (2013.01); H04L 9/008 (2013.01)
Abstract

This disclosure provides a method, apparatus and computer program product to create a fully homomorphic encryption (FHE)-friendly machine learning model. The approach leverages a knowledge distillation framework in which the FHE-friendly (student) ML model closely mimics the predictions of a more complex (teacher) model that is pre-trained on large datasets. In this framework, the teacher model facilitates training of the FHE-friendly student model, but using synthetically-generated training data in lieu of the original datasets used to train the teacher.
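The knowledge-distillation idea described in the abstract can be illustrated with a minimal sketch. The code below is not from the patent; it is a standard temperature-scaled distillation loss (KL divergence between softened teacher and student outputs), applied to randomly generated inputs standing in for the synthetic training data the abstract mentions. All names, shapes, and the toy linear "models" are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the softened teacher targets and the
    # student predictions, scaled by T^2 as in standard distillation.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return T * T * kl.mean()

# Synthetic inputs stand in for the original training data, per the
# approach above; both "models" are hypothetical linear maps.
rng = np.random.default_rng(0)
synthetic_batch = rng.normal(size=(8, 16))                   # 8 synthetic samples
teacher_logits = synthetic_batch @ rng.normal(size=(16, 4))  # toy teacher
student_logits = synthetic_batch @ rng.normal(size=(16, 4))  # toy student
loss = distillation_loss(student_logits, teacher_logits)
```

In a full training loop, `loss` would be minimized with respect to the student's parameters, driving the FHE-friendly student toward the teacher's predictive behavior without ever touching the teacher's original training data.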

