The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in PDF format.

Date of Patent: Aug. 11, 2015
Filed: Jul. 30, 2012
Applicants:
Yi Wu, San Jose, CA (US);
Wei Sun, San Jose, CA (US);
Michael M. Chu, Cupertino, CA (US);
Ermal Dreshaj, Santa Clara, CA (US);
Philip Muse, Folsom, CA (US);
Lucas B. Ainsworth, Portland, OR (US);
Garth Shoemaker, Sunnyvale, CA (US);
Igor V. Kozintsev, San Jose, CA (US)

Inventors:
Yi Wu, San Jose, CA (US);
Wei Sun, San Jose, CA (US);
Michael M. Chu, Cupertino, CA (US);
Ermal Dreshaj, Santa Clara, CA (US);
Philip Muse, Folsom, CA (US);
Lucas B. Ainsworth, Portland, OR (US);
Garth Shoemaker, Sunnyvale, CA (US);
Igor V. Kozintsev, San Jose, CA (US)

Assignee: Intel Corporation, Santa Clara, CA (US)

Attorney:
Primary Examiner:
Assistant Examiner:
Int. Cl.: H04N 15/00 (2006.01); H04N 13/00 (2006.01); H04N 13/04 (2006.01); H04N 5/222 (2006.01); H04N 5/232 (2006.01)
U.S. Cl. (CPC): H04N 13/0477 (2013.01); H04N 5/2226 (2013.01); H04N 5/23219 (2013.01); H04N 13/004 (2013.01); H04N 13/0048 (2013.01); H04N 13/0059 (2013.01); H04N 2013/0092 (2013.01); H04N 2213/006 (2013.01)
Abstract

Generally, this disclosure provides methods and systems for real-time video communication with three-dimensional perception image rendering through generated parallax effects based on identification, segmentation, and tracking of foreground and background layers of an image. The system may include an image segmentation module configured to segment a current local video frame into a local foreground layer and a local background layer and to generate a local foreground mask based on an estimated boundary between the local foreground layer and the local background layer; a face tracking module configured to track a position of a local user's face; a background layer estimation module configured to estimate a remote background layer; and an image rendering module configured to render a 3D perception image based on the estimated remote background layer, the current remote video frame, and the remote foreground mask.
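The modular architecture the abstract describes (segmentation, face tracking, background estimation, and parallax rendering) can be sketched in a few dozen lines. The Python outline below is not the patent's implementation: every class and function name is hypothetical, and the segmentation and face-tracking internals are replaced with trivial placeholders so that only the data flow between the modules is illustrated.

import numpy as np


class ImageSegmentationModule:
    """Splits a video frame into foreground/background via a foreground mask."""

    def segment(self, frame):
        # Placeholder boundary estimate: treat the central region as foreground.
        h, w = frame.shape[:2]
        mask = np.zeros((h, w), dtype=bool)
        mask[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = True
        return mask


class FaceTrackingModule:
    """Tracks the local user's face position (placeholder: frame centre)."""

    def track(self, frame):
        h, w = frame.shape[:2]
        return (w // 2, h // 2)


class BackgroundLayerEstimationModule:
    """Maintains a running estimate of the remote background layer."""

    def __init__(self):
        self._background = None

    def update(self, frame, foreground_mask):
        if self._background is None:
            self._background = frame.astype(np.float32)
        bg = ~foreground_mask
        # Blend currently visible background pixels into the running estimate.
        self._background[bg] = 0.9 * self._background[bg] + 0.1 * frame[bg]
        return self._background.astype(frame.dtype)


class ImageRenderingModule:
    """Composites the foreground over a parallax-shifted background."""

    def render(self, background, frame, foreground_mask, face_pos, strength=0.05):
        h, w = frame.shape[:2]
        # Shift the background opposite to the viewer's head offset to fake parallax.
        dx = int((face_pos[0] - w / 2) * strength)
        out = np.roll(background, -dx, axis=1)
        out[foreground_mask] = frame[foreground_mask]  # foreground stays anchored
        return out


if __name__ == "__main__":
    # Synthetic frames stand in for the local and remote video streams.
    remote_frame = np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8)
    local_frame = np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8)

    remote_mask = ImageSegmentationModule().segment(remote_frame)
    face_pos = FaceTrackingModule().track(local_frame)
    remote_bg = BackgroundLayerEstimationModule().update(remote_frame, remote_mask)
    composite = ImageRenderingModule().render(remote_bg, remote_frame, remote_mask, face_pos)
    print("3D-perception composite rendered:", composite.shape, composite.dtype)

In this sketch the rendered frame shifts the estimated remote background according to the tracked position of the local viewer's face while the remote foreground stays fixed, which is the parallax cue the abstract attributes to the image rendering module.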

