The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventors, assignee, attorney firm, primary examiner, assistant examiner, CPC classifications, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Jun. 9, 2020

Filed: Jan. 27, 2020

Applicant: King Abdulaziz University, Jeddah (SA)

Inventors: Yusuf Al-Turki, Jeddah (SA); Abdullah Abusorrah, Jeddah (SA); XuDong Shi, Shanghai (CN); Qi Kang, Shanghai (CN); MengChu Zhou, Newark, NJ (US)

Assignee:
Attorney:
Primary Examiner:
Int. Cl.: G05B 13/04 (2006.01); G05B 13/02 (2006.01); G06N 20/00 (2019.01); G06K 9/62 (2006.01)
U.S. Cl. CPC: G05B 13/041 (2013.01); G05B 13/0265 (2013.01); G06K 9/6259 (2013.01); G06N 20/00 (2019.01)
Abstract

Soft sensing of nonlinear and multimode industrial processes given a limited number of labeled data samples is disclosed. The methods include a semi-supervised probabilistic density-based regression approach called Semi-supervised Weighted Gaussian Regression (SWGR). In SWGR, each training sample is assigned a weight based on its similarity to a query sample. A locally weighted Gaussian density is then built to capture the joint probability of historical samples around the query sample. The parameters of SWGR are trained on both labeled and unlabeled data samples via a maximum likelihood estimation algorithm. In this way, the soft sensor model can approximate the nonlinear relationship between input and output variables and remedy the insufficiency of labeled samples. Finally, the output prediction, as well as its uncertainty, is obtained from the conditional distribution.
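The core idea in the abstract can be sketched in code. The following is a simplified, illustrative sketch only, not the patented implementation: it handles a single query using labeled data alone (the maximum-likelihood incorporation of unlabeled samples is omitted), and the function name, Gaussian-kernel weighting, and `bandwidth` parameter are assumptions introduced for illustration.

```python
import numpy as np

def swgr_predict(X_lab, y_lab, x_query, bandwidth=1.0):
    """Simplified weighted-Gaussian regression prediction for one query.

    Assigns similarity-based weights to training samples, fits a weighted
    Gaussian joint density over [x, y], and returns the conditional mean
    and variance of y given x = x_query.
    """
    # Similarity-based weights: Gaussian kernel on squared distance to query.
    d2 = np.sum((X_lab - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w /= w.sum()

    # Weighted joint Gaussian over z = [x, y].
    Z = np.hstack([X_lab, y_lab.reshape(-1, 1)])
    mu = w @ Z
    Zc = Z - mu
    Sigma = (Zc * w[:, None]).T @ Zc + 1e-8 * np.eye(Z.shape[1])  # jitter for stability

    # Conditional distribution of y given x = x_query.
    d = X_lab.shape[1]
    Sxx, Sxy, Syy = Sigma[:d, :d], Sigma[:d, d:], Sigma[d:, d:]
    k = np.linalg.solve(Sxx, Sxy)
    mean = mu[d:] + (x_query - mu[:d]) @ k
    var = Syy - Sxy.T @ k
    return mean.item(), var.item()
```

Because the conditional of a joint Gaussian is itself Gaussian, the prediction comes with an uncertainty estimate (the conditional variance) for free, matching the abstract's claim that both the output prediction and its uncertainty are obtained from the conditional distribution.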

