Innovations of David Jangraw

David Jangraw
New York, NY, United States of America

Patents (USPTO, granted): 3
Years active: 2014-2023
Average co-inventor count: 5.6
ph-index: 2
Forward citations: 71 (granted patents)

[Company filing history chart omitted; its filed-patent counts were based on already granted patents.]

Introduction

David Jangraw is an accomplished inventor based in New York, NY (US). He has made significant contributions to the fields of artificial intelligence and brain-computer interfaces. His three granted patents focus on enhancing the interaction between humans and machines.

Latest Patents

One of his latest patents is titled "Systems and methods for deep reinforcement learning using a brain-artificial intelligence interface." This patent describes a hybrid brain-computer interface (hBCI) that detects an individual's reinforcement signals, such as emotional reactivity and cognitive state, and uses them to improve AI agents, particularly in settings such as autonomous vehicles. Another notable patent, "Rapid image annotation via brain state decoding and visual pattern mining," combines human visual perception with machine vision to improve multimedia labeling, categorization, search, and navigation.
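
To make the first idea concrete, the sketch below shows one way a decoded human reinforcement signal could be blended into a reinforcement-learning reward. This is a minimal illustration under stated assumptions, not the patent's actual method: the decode_brain_signal stub, the blending weight, and the toy corridor environment are all hypothetical names and parameters invented for this example.

```python
import random

def decode_brain_signal():
    """Hypothetical stand-in for an hBCI decoder that maps neural or
    physiological measurements (e.g., emotional reactivity, cognitive
    state) to a scalar reinforcement signal in [-1, 1]. A real system
    would classify recorded signals; here we simulate with noise."""
    return random.uniform(-1.0, 1.0)

def blended_reward(env_reward, brain_signal, weight=0.5):
    """Combine the environment's reward with the human-derived
    reinforcement signal. The weight is an illustrative hyperparameter,
    not a value from the patent."""
    return (1 - weight) * env_reward + weight * brain_signal

# Toy tabular Q-learning loop over a 1-D corridor environment,
# using the blended reward in place of the raw environment reward.
n_states, n_actions = 5, 2  # actions: 0 = move left, 1 = move right
q = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma, epsilon = 0.1, 0.9, 0.2

for episode in range(200):
    state = 0
    while state < n_states - 1:
        # Epsilon-greedy action selection.
        if random.random() < epsilon:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: q[state][a])
        next_state = min(max(state + (1 if action == 1 else -1), 0),
                         n_states - 1)
        env_reward = 1.0 if next_state == n_states - 1 else 0.0
        reward = blended_reward(env_reward, decode_brain_signal())
        # Standard Q-learning update driven by the blended reward.
        q[state][action] += alpha * (
            reward + gamma * max(q[next_state]) - q[state][action])
        state = next_state
```

In a real hBCI, decode_brain_signal would be replaced by a decoder trained on EEG or similar recordings; the random stub here only keeps the sketch self-contained and runnable.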

Career Highlights

David Jangraw is affiliated with Columbia University, where he continues to push the boundaries of research in artificial intelligence and human-computer interaction. His innovative work has garnered attention in both academic and industry circles.

Collaborations

He has collaborated with notable colleagues such as Paul Sajda and Shih-Fu Chang on his patented work.

Conclusion

David Jangraw's innovative patents and research at Columbia University highlight his significant role in the evolution of artificial intelligence and brain-computer interfaces. His work continues to pave the way for future advancements in technology.
