Ugo Jardonnet
New York, NY, United States of America

Patents (USPTO): 7
Average Co-Inventor Count: 4.0
ph-index: 3
Forward Citations: 42 (Granted Patents)

Company Filing History:
Years Active: 2018-2022

Ugo Jardonnet: Innovator in Video Data Processing

Introduction

Ugo Jardonnet is a prominent inventor based in New York, NY, known for his significant contributions to the field of video data processing. With a total of seven patents to his name, Jardonnet has developed innovative systems and methods that enhance the monitoring of activities through advanced video analysis.

Latest Patents

One of Jardonnet's latest patents, "Systems and methods for processing video data for activity monitoring," covers the capture of video data streams from multiple sources and their efficient processing. The system merges various data protocols, allowing seamless processing across different operating systems. Another notable patent, "Robust, adaptive and efficient object detection, classification and tracking," describes a method that captures video data streams and automatically detects and tracks moving objects, providing insights into urban environments.

Career Highlights

Jardonnet is currently associated with Placemeter Inc., where he applies his expertise in video data processing. His work has significantly impacted how video data is utilized for monitoring and analysis, making urban spaces safer and more efficient.

Collaborations

Jardonnet has collaborated with notable colleagues, including Alexandre Winter and Tuan Hue Thi, contributing to various projects that leverage their combined expertise in video technology.

Conclusion

Ugo Jardonnet's innovative work in video data processing has led to advancements that enhance activity monitoring and object detection. His contributions continue to shape the future of video analysis technology.
