San Jose, CA, United States of America

Hongxu Yin

USPTO Granted Patents = 2 

Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History: 2 patents (USPTO); years active: 2025.

Innovations of Hongxu Yin

Introduction

Hongxu Yin is an inventor based in San Jose, CA (US), whose work centers on video processing and neural networks. His two granted USPTO patents apply learned models to problems in image and video analysis.

Latest Patents

One of Hongxu Yin's latest patents is titled "Hallucinating details for over-exposed pixels in videos using learned reference frame selection." The method receives frames of a live video captured by a device, identifies reference frames with varying exposure levels, and uses neural networks to infer the detail missing from over-exposed pixels in the current frame; an updated version of the current frame is then generated and output in real time. A minimal sketch of this kind of pipeline follows. His other granted patent, "Neural network training method," covers training neural networks to identify objects within images and to generate images of those objects.
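To make the exposure-recovery flow concrete, here is a minimal PyTorch sketch of the kind of pipeline the abstract describes: pick a lower-exposure reference frame from recent history, mask the saturated pixels in the current frame, and let a small network predict replacement detail. All names (DetailNet, select_reference, restore_frame) and the saturation-count selection heuristic are hypothetical illustrations, not the patented implementation, which uses a learned reference-frame selector.

import torch
import torch.nn as nn

class DetailNet(nn.Module):
    """Toy CNN: predicts replacement RGB detail from the current frame
    concatenated with a lower-exposure reference frame (6 input channels)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, current, reference):
        # current, reference: (N, 3, H, W) tensors with values in [0, 1]
        return self.net(torch.cat([current, reference], dim=1))

def select_reference(history, threshold=0.98):
    """Stand-in for the patent's learned reference-frame selection:
    pick the past frame with the fewest saturated (over-exposed) pixels."""
    saturation = [(f >= threshold).float().mean().item() for f in history]
    return history[min(range(len(history)), key=saturation.__getitem__)]

def restore_frame(model, current, history, threshold=0.98):
    """Replace over-exposed pixels of `current` with predicted detail."""
    reference = select_reference(history, threshold)
    # A pixel counts as over-exposed where all three channels saturate.
    mask = (current >= threshold).all(dim=0, keepdim=True).float()  # (1, H, W)
    with torch.no_grad():
        detail = model(current.unsqueeze(0), reference.unsqueeze(0)).squeeze(0)
    # Keep well-exposed pixels; substitute predicted detail where saturated.
    return current * (1.0 - mask) + detail * mask

if __name__ == "__main__":
    model = DetailNet().eval()
    history = [torch.rand(3, 64, 64) * s for s in (0.5, 0.8, 1.0)]  # past frames
    current = torch.rand(3, 64, 64)
    out = restore_frame(model, current, history)
    print(out.shape)  # torch.Size([3, 64, 64])

In a real-time setting the history buffer would be a rolling window of recent frames at varying exposures, and the masking and prediction steps would run once per incoming frame.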

Career Highlights

Hongxu Yin is currently employed at Nvidia Corporation, a leading company in graphics processing and AI technology. His work at Nvidia has allowed him to push the boundaries of what is possible in video processing and machine learning.

Collaborations

He has collaborated with talented individuals such as Jose Manuel Alvarez Lopez and Akshay Chawla, contributing to the advancement of technology through teamwork and shared expertise.

Conclusion

Hongxu Yin's innovative patents and contributions to the field of technology highlight his role as a significant inventor. His work continues to influence advancements in video processing and neural networks, showcasing the importance of innovation in today's digital landscape.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com