Shenzhen, China

Dengke Dong


Average Co-Inventor Count = 7.0

ph-index = 1

Forward Citations = 2 (Granted Patents)


Company Filing History:


Years Active: 2021

1 patent (USPTO)

The Innovative Mind of Dengke Dong

Introduction

Dengke Dong is an inventor based in Shenzhen, China. He has contributed to the field of machine vision through a granted patent on the training of machine vision models, with a particular focus on learning from noisy datasets in which some labels are incorrect.

Latest Patents

Dengke Dong holds a patent titled "Complexity-based progressive training for machine vision models." The patent describes methods and systems for training machine vision models (MVMs) on noisy training datasets: sets of images in which some labels may be incorrect. A progressively-sequenced learning curriculum is constructed so that the MVM learns from the easiest examples first and gradually moves on to more complex images. The intent is to let the model accumulate knowledge incrementally instead of being exposed to the hardest images at the outset.
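The patent's central mechanism, ordering training examples from easy to hard and expanding the training set in stages, resembles curriculum learning. The sketch below is a loose illustration of that pattern in Python with NumPy and scikit-learn, not the patented method itself: the synthetic data, the per-example-loss complexity score, the 20% label-flip rate, and the four-stage schedule are all assumptions made for this example.

```python
# Illustrative sketch of complexity-based progressive (curriculum) training on
# noisy labels. Everything below is an assumption for demonstration and is not
# taken from the patent: the dataset, the difficulty score, and the schedule.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression, SGDClassifier

rng = np.random.default_rng(0)

# Synthetic "noisy" dataset: flip 20% of the labels to simulate annotation errors.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
flip = rng.random(len(y)) < 0.2
y_noisy = np.where(flip, 1 - y, y)

# Step 1: score each example's "complexity" with a preliminary model, using its
# per-example log-loss as a rough difficulty proxy (higher loss = harder example).
prelim = LogisticRegression(max_iter=500).fit(X, y_noisy)
p_label = prelim.predict_proba(X)[np.arange(len(y_noisy)), y_noisy]
complexity = -np.log(np.clip(p_label, 1e-12, None))

# Step 2: progressively-sequenced curriculum. Train on the easiest fraction
# first, then enlarge the visible subset stage by stage so knowledge accumulates.
order = np.argsort(complexity)          # easiest examples first
model = SGDClassifier(loss="log_loss", random_state=0)
classes = np.unique(y_noisy)
n_stages = 4
for stage in range(1, n_stages + 1):
    subset = order[: len(order) * stage // n_stages]
    for _ in range(3):                  # a few passes over the current subset
        model.partial_fit(X[subset], y_noisy[subset], classes=classes)

print("accuracy against the clean labels:", round(model.score(X, y), 3))
```

In a real machine vision setting the model would be a deep network and the complexity measure would be defined over images; the sketch only mirrors the general easy-to-hard progression described in the patent abstract.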

Career Highlights

Dengke Dong is associated with Shenzhen Malong Technologies Co., Ltd., where he applies his expertise in machine vision. His work on model training contributes to the company's technology and to ongoing developments in the field.

Collaborations

Dengke Dong collaborates with inventors such as Sheng Guo and Weilin Huang on his machine vision work.

Conclusion

Dengke Dong's contributions to machine vision through his patent demonstrate his innovative spirit and dedication to advancing technology. His work is a testament to the potential of progressive learning in artificial intelligence.
