The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in PDF (Adobe Acrobat) format, which can be used to download or print the patent.

Date of Patent: Jul. 12, 2022
Filed: Feb. 20, 2020
Applicant: Wipro Limited, Bangalore, IN
Inventors: Rahul Yadav, Alwar, IN; Gopichand Agnihotram, Bangalore, IN
Assignee: Wipro Limited, Bangalore, IN
Attorney:
Primary Examiner:
Int. Cl. (CPC): G06K 9/00 (2022.01); G06V 40/20 (2022.01); G10L 25/63 (2013.01); G10L 25/21 (2013.01); G10L 15/22 (2006.01); G10L 15/26 (2006.01); G06V 20/40 (2022.01); G06V 40/16 (2022.01)
U.S. Cl. (CPC): G06V 40/20 (2022.01); G06V 20/46 (2022.01); G06V 40/174 (2022.01); G10L 15/22 (2013.01); G10L 15/26 (2013.01); G10L 25/21 (2013.01); G10L 25/63 (2013.01)
Abstract

The present invention discloses a method and system for multimodal-analysis-based emotion recognition. The method comprises segmenting video data of a user into a plurality of video segments. A plurality of visual features, voice features, and text features is extracted from the plurality of video segments. Autocorrelation values among each of the plurality of visual features, the voice features, and the text features are determined. Each of the plurality of visual features, the voice features, and the text features is aligned based on a video segment identifier and the autocorrelation values to obtain a plurality of aligned multimodal features. One of two classes of emotions is determined for each of the plurality of aligned multimodal features. The determined emotion for each of the plurality of aligned multimodal features is compared with historic multimodal features from a database, and the emotion of the user is determined in real time based on the comparison.
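
The abstract outlines a pipeline: segment the video, extract per-segment visual, voice, and text features, compute autocorrelation values, align the modalities per segment, classify each aligned feature into one of two emotion classes, and compare against historic features. The sketch below is a minimal, hypothetical reading of that pipeline, not the patented implementation: the feature extractors, the lag-1 autocorrelation weighting, the threshold classifier, and the historic-feature database are all stand-in assumptions, since the abstract does not specify them.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# Extractors, classifier, and historic database are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def segment_video(num_frames: int, segment_len: int) -> list[range]:
    """Split a video (given as a frame count) into fixed-length segments."""
    return [range(s, min(s + segment_len, num_frames))
            for s in range(0, num_frames, segment_len)]

def extract_features(segment: range) -> dict[str, np.ndarray]:
    """Placeholder extractors; a real system would use face, prosody, and ASR/text models."""
    return {
        "visual": rng.normal(size=8),  # e.g. facial-expression embedding
        "voice": rng.normal(size=8),   # e.g. prosodic features
        "text": rng.normal(size=8),    # e.g. transcript embedding
    }

def lag1_autocorrelation(x: np.ndarray) -> float:
    """Lag-1 autocorrelation of a feature vector (one plausible reading of the abstract)."""
    x = x - x.mean()
    denom = float(np.dot(x, x)) or 1.0
    return float(np.dot(x[:-1], x[1:]) / denom)

def align_features(per_segment: list[dict[str, np.ndarray]]) -> list[np.ndarray]:
    """Align modalities by segment order (segment identifier), weighting by autocorrelation."""
    aligned = []
    for feats in per_segment:
        weighted = [feats[m] * lag1_autocorrelation(feats[m])
                    for m in ("visual", "voice", "text")]
        aligned.append(np.concatenate(weighted))
    return aligned

def classify_emotion(feature: np.ndarray) -> str:
    """Two-class decision; a trained classifier would replace this threshold rule."""
    return "positive" if feature.mean() >= 0.0 else "negative"

def match_historic(feature: np.ndarray, historic: dict[str, np.ndarray]) -> str:
    """Compare against historic multimodal features by cosine similarity (nearest neighbour)."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return max(historic, key=lambda label: cos(feature, historic[label]))

if __name__ == "__main__":
    segments = segment_video(num_frames=300, segment_len=60)
    per_segment = [extract_features(seg) for seg in segments]
    aligned = align_features(per_segment)
    # Hypothetical historic database: one reference vector per emotion label.
    historic = {"positive": np.ones(24), "negative": -np.ones(24)}
    for seg_id, feat in enumerate(aligned):
        print(seg_id, classify_emotion(feat), match_historic(feat, historic))
```

The per-segment loop keeps the three modalities tied to the same segment identifier before concatenation, which is how this sketch interprets "aligned based on video segment identifier and the autocorrelation values".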
