
The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document in Adobe Acrobat (PDF) format, which can be downloaded or printed.

Date of Patent:

Nov. 14, 2023

Filed:

May 28, 2021
Applicant:

Humane, Inc., San Francisco, CA (US)

Inventors:

Imran A. Chaudhri, San Francisco, CA (US);
Bethany Bongiorno, San Francisco, CA (US);
Patrick Gates, San Francisco, CA (US);
Wangju Tsai, Pleasanton, CA (US);
Monique Relova, South San Francisco, CA (US);
Nathan Lord, San Jose, CA (US);
Yanir Nulman, San Francisco, CA (US);
Ralph Brunner, Los Gatos, CA (US);
Lilynaz Hashemi, San Francisco, CA (US);
Britt Nelson, Sausalito, CA (US)

Assignee:

Humane, Inc., San Francisco, CA (US)

Attorney:
Primary Examiner:
Int. Cl.
CPC ...
G06F 3/01 (2006.01); G06V 40/20 (2022.01); G06V 40/10 (2022.01); G06V 10/762 (2022.01); G06T 7/50 (2017.01); G06F 17/17 (2006.01); G06F 1/16 (2006.01); G06N 5/04 (2023.01); G06T 7/70 (2017.01);
U.S. Cl.
CPC ...
G06F 3/017 (2013.01); G06F 1/163 (2013.01); G06F 17/17 (2013.01); G06N 5/04 (2013.01); G06T 7/50 (2017.01); G06T 7/70 (2017.01); G06V 10/762 (2022.01); G06V 40/10 (2022.01); G06V 40/28 (2022.01); G06T 2207/10028 (2013.01); G06T 2210/12 (2013.01);
Abstract

Systems, methods, devices and non-transitory, computer-readable storage mediums are disclosed for gesture recognition for a wearable multimedia device using real-time data streams. In an embodiment, a method comprises: detecting a trigger event from one or more real-time data streams running on a wearable multimedia device; taking one or more data snapshots of the one or more real-time data streams; inferring user intent from the one or more data snapshots; and selecting a service or preparing content for the user based on the inferred user intent. In an embodiment, a hand and finger pointing direction is determined from a depth image, a 2D bounding box for the hand/finger is projected into a 2D image space and compared to bounding boxes for identified/labeled objects in the 2D image space to identify an object that the hand is holding or the finger is pointing toward.
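The second embodiment in the abstract can be illustrated with a short sketch. This is a minimal, hypothetical reconstruction (all names and the IoU-based matching heuristic are assumptions, not taken from the patent claims): a hand/finger bounding box projected into 2D image space is compared against the boxes of identified/labeled objects, and the object whose box best overlaps is taken as the one being held or pointed toward.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def select_pointed_object(finger_box, labeled_objects, threshold=0.05):
    """Return the label of the object box that best overlaps the projected
    finger box, or None if no overlap exceeds the threshold.

    labeled_objects is a list of (label, box) pairs in the same 2D image space.
    """
    best_label, best_score = None, threshold
    for label, box in labeled_objects:
        score = iou(finger_box, box)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

For example, a finger box of `(10, 10, 30, 30)` matched against a nearby "cup" at `(15, 15, 45, 45)` and a distant "book" at `(200, 0, 260, 40)` would select the cup, since only its box overlaps the finger box. A production system would likely also use the 3D pointing ray from the depth image rather than overlap alone, but the comparison-in-2D-image-space step is what the abstract describes.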

