The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The badge also contains a link to the full patent document (in Adobe Acrobat/PDF format), which can be downloaded or printed.
Patent No.:
Date of Patent: Oct. 22, 2019
Filed: Feb. 28, 2017
Applicant: California Institute of Technology, Pasadena, CA (US)
Inventors: Christopher Assad, Monrovia, CA (US); Jaakko T. Karras, Pasadena, CA (US); Michael T. Wolf, La Crescenta, CA (US); Adrian Stoica, Pasadena, CA (US)
Assignee: CALIFORNIA INSTITUTE OF TECHNOLOGY, Pasadena, CA (US)
Attorney:
Primary Examiner:
Int. Cl.: B25J 9/16 (2006.01); A61B 5/00 (2006.01); A61B 5/0488 (2006.01); G06F 19/00 (2018.01); A61B 5/0492 (2006.01); A61B 17/00 (2006.01); B25J 13/08 (2006.01); G06N 20/00 (2019.01); B25J 13/02 (2006.01); A61B 5/11 (2006.01); G16H 40/67 (2018.01); G05B 19/409 (2006.01); A61B 34/00 (2016.01)
U.S. Cl.: CPC B25J 9/163 (2013.01); A61B 5/0022 (2013.01); A61B 5/0492 (2013.01); A61B 5/04888 (2013.01); A61B 5/1122 (2013.01); A61B 5/486 (2013.01); A61B 5/6806 (2013.01); A61B 5/6824 (2013.01); A61B 5/7267 (2013.01); A61B 34/74 (2016.02); B25J 13/02 (2013.01); B25J 13/087 (2013.01); G05B 19/409 (2013.01); G06N 20/00 (2019.01); G16H 40/67 (2018.01); A61B 2017/00039 (2013.01); A61B 2017/00207 (2013.01); A61B 2034/741 (2016.02); A61B 2560/0223 (2013.01); A61B 2562/0219 (2013.01); G05B 2219/35448 (2013.01); G05B 2219/35464 (2013.01); G05B 2219/36418 (2013.01); G05B 2219/40195 (2013.01)
Abstract
A sleeve worn on an arm allows detection of gestures by an array of sensors. Electromyography, inertial, and magnetic field sensors provide data that is processed to categorize gestures and translate the gestures into commands for robotic systems. Machine learning allows training of gestures to increase accuracy of detection for different users.
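For illustration only, the sketch below shows one way a pipeline like the one the abstract describes could be structured: windowed multi-channel sensor data (EMG plus inertial and magnetic readings) is reduced to simple features, a classifier is trained on labeled gestures, and predicted labels are mapped to robot commands. The channel count, window length, feature set, classifier choice, and command table are all assumptions for the example, not details taken from the patent.

```python
# Illustrative sketch only -- not the patent's implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_CHANNELS = 16   # assumed number of sensor channels (EMG + inertial + magnetic)
WINDOW = 200      # assumed samples per analysis window

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean absolute value and RMS -- common surface-EMG features."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, rms])

# Hypothetical gesture-label-to-command table; the labels are illustrative.
COMMANDS = {0: "STOP", 1: "MOVE_FORWARD", 2: "TURN_LEFT", 3: "TURN_RIGHT"}

def train(windows: list[np.ndarray], labels: list[int]) -> RandomForestClassifier:
    """Fit a classifier on feature vectors extracted from labeled gesture windows."""
    X = np.stack([extract_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf

def gesture_to_command(clf: RandomForestClassifier, window: np.ndarray) -> str:
    """Classify one window of sensor data and look up the mapped robot command."""
    label = int(clf.predict(extract_features(window).reshape(1, -1))[0])
    return COMMANDS.get(label, "UNKNOWN")

if __name__ == "__main__":
    # Synthetic stand-in data: 10 random windows for each of 4 gesture classes.
    rng = np.random.default_rng(0)
    windows = [rng.normal(size=(WINDOW, N_CHANNELS)) + lbl
               for lbl in range(4) for _ in range(10)]
    labels = [lbl for lbl in range(4) for _ in range(10)]
    clf = train(windows, labels)
    print(gesture_to_command(clf, windows[0]))
```

The abstract notes that machine learning is used to train gestures per user; in a setup like this sketch, that would correspond to fitting the classifier on windows recorded from the individual wearer rather than on generic data.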