Beijing, China

Weihong Deng

USPTO Granted Patents = 3 

Average Co-Inventor Count = 4.6

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2021-2024


Weihong Deng - Innovator in Object Recognition Technology

Introduction

Weihong Deng is a prominent inventor based in Beijing, China. He has made significant contributions to object recognition technology, holding three granted USPTO patents. His work focuses on improving how neural networks are trained for object recognition.

Latest Patents

Weihong Deng's latest patents include "Method and apparatus for training an object recognition model." This patent describes a training-sample optimization apparatus for a neural network model comprising a fluctuation determination unit and an optimization unit, which together assess and optimize training samples to improve model performance. Another notable patent, "Training method and training apparatus for a neural network for object recognition," outlines a training method in which a training image set is input, the image samples are classified, and the neural network's parameters are updated based on the calculated losses.
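The patents are summarized here only in functional terms, so the sketch below is an illustration rather than the patented method. It combines the two ideas in a minimal PyTorch training loop: image samples are classified and parameters updated from a computed loss, while a hypothetical "fluctuation determination" step tracks how much each sample's loss changes between epochs and down-weights unstable samples. The class name FluctuationTracker, the weighting rule, and the loader format (which yields sample indices alongside images and labels) are all assumptions, not details from the patents.

```python
# Illustrative sketch only; not the patented implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FluctuationTracker:
    """Hypothetical fluctuation-determination unit.

    Tracks each sample's loss across epochs; samples whose loss jumps
    more than a threshold are treated as unstable and down-weighted
    (an assumed optimization criterion, not one from the patents).
    """

    def __init__(self, num_samples: int, threshold: float = 0.5):
        self.prev_loss = torch.zeros(num_samples)
        self.threshold = threshold

    def weights(self, idx: torch.Tensor, loss: torch.Tensor) -> torch.Tensor:
        # Per-sample change in loss since the previous epoch.
        fluctuation = (loss.detach() - self.prev_loss[idx]).abs()
        self.prev_loss[idx] = loss.detach()
        # Halve the weight of samples whose loss fluctuated strongly.
        return torch.where(
            fluctuation > self.threshold,
            torch.full_like(loss, 0.5),
            torch.ones_like(loss),
        )


def train(model: nn.Module, loader, num_samples: int, epochs: int = 10):
    tracker = FluctuationTracker(num_samples)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    for _ in range(epochs):
        # Assumed loader format: (sample indices, images, labels).
        for idx, images, labels in loader:
            logits = model(images)              # classify image samples
            loss = F.cross_entropy(logits, labels, reduction="none")
            weighted = (tracker.weights(idx, loss) * loss).mean()
            opt.zero_grad()
            weighted.backward()                 # update network parameters
            opt.step()
```

Keeping per-sample losses (reduction="none") is what makes the fluctuation check possible; a standard mean-reduced loss would hide which individual training samples are unstable.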

Career Highlights

Weihong Deng is currently employed at Canon Kabushiki Kaisha, where he continues to innovate in object recognition. His expertise in neural networks has made him a key contributor to advances in this field.

Collaborations

Weihong Deng has collaborated with notable colleagues, including Jiani Hu and Dongyue Zhao, who contribute to his research and development efforts.

Conclusion

Weihong Deng's work in object recognition technology exemplifies the innovative spirit of modern inventors. His patents and contributions continue to shape the future of artificial intelligence and machine learning.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com