New York, NY, United States of America

Barbara Hanna


Average Co-Inventor Count = 6.0

ph-index = 2

Forward Citations = 71 (Granted Patents)


Company Filing History:


Years Active: 2014-2017

2 patents (USPTO)

Barbara Hanna: Innovator in Image Annotation Technology

Introduction

Barbara Hanna is an inventor based in New York, NY (US), working in image processing and machine vision. Her two granted patents focus on improving multimedia labeling and categorization.

Latest Patents

Hanna's latest patents include "Rapid image annotation via brain state decoding and visual pattern mining." The invention addresses the complementary limitations of human and machine vision: human visual perception can recognize a wide range of targets but has limited throughput, while machine vision can process images quickly but often lacks adequate recognition accuracy for general target classes. The patented systems and methods combine the strengths of both, yielding multimedia processing systems with improved labeling, categorization, searching, and navigation.

Career Highlights

Barbara Hanna is affiliated with Columbia University, where she continues her research and development in image processing technologies. Her work has drawn attention for its potential applications in fields including artificial intelligence and multimedia systems.

Collaborations

Hanna has collaborated with colleagues such as Shih-Fu Chang and Jun Wang on her patented work in image annotation.

Conclusion

Barbara Hanna's contributions to image annotation technology exemplify the intersection of human perception and machine efficiency. Her work continues to influence the field and pave the way for future innovations in multimedia processing.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com