Beijing, China

Shao Zeng


Average Co-Inventor Count = 16.0

ph-index = 1


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Title: The Innovations of Shao Zeng

Introduction

Shao Zeng is an inventor based in Beijing, China. He has contributed to the field of artificial intelligence, particularly in deep learning and image recognition technologies. His work has led to a patented method that enhances the capabilities of feature extraction models.

Latest Patents

Shao Zeng holds a patent for a "Method for training feature extraction model, method for classifying image, and related apparatuses." This patent outlines a comprehensive approach to training feature extraction models, which includes extracting image features from sample images, normalizing these features, and guiding the training process through a high discriminative loss function. This method aims to improve the accuracy and efficiency of image classification tasks.
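The patent's full claims are not reproduced here, but the described pipeline (extract features, normalize them, train under a discriminative loss) can be illustrated with a minimal NumPy sketch. This is a generic illustration, not the patented method itself: the function names, the L2 normalization, and the cosine-margin softmax chosen as the discriminative loss are all assumptions for the sake of the example.

```python
import numpy as np

def l2_normalize(features, eps=1e-12):
    # Step 2 of the described pipeline: project each extracted
    # feature vector onto the unit hypersphere.
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return features / np.maximum(norms, eps)

def discriminative_loss(features, class_weights, labels, scale=16.0, margin=0.2):
    # Step 3: a margin-based softmax on normalized features, one common
    # way to make a loss "highly discriminative" (hypothetical choice;
    # the patent's actual loss formulation may differ).
    f = l2_normalize(features)
    w = l2_normalize(class_weights)
    logits = f @ w.T  # cosine similarity, shape (batch, num_classes)
    # Penalize the target class by a margin to widen class separation.
    margin_logits = logits.copy()
    margin_logits[np.arange(len(labels)), labels] -= margin
    z = scale * margin_logits
    z -= z.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()
```

In practice the gradient of such a loss would be backpropagated through the feature extractor (e.g. a convolutional network) during training; the sketch only shows the normalization and loss computation themselves.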

Career Highlights

Shao Zeng is currently employed at Beijing Baidu Netcom Science Technology Co., Ltd. His role at this leading technology company allows him to work on cutting-edge projects that push the boundaries of artificial intelligence. His expertise in feature extraction and image classification has positioned him as a valuable asset in the tech industry.

Collaborations

Shao Zeng has collaborated with notable colleagues, including Shuilong Dong and Sensen He. These partnerships have fostered an environment of innovation and creativity, leading to advancements in their respective fields.

Conclusion

Shao Zeng's contributions to artificial intelligence through his patent and work at Baidu highlight his role as a key innovator in the technology sector. His methods for training feature extraction models are paving the way for future advancements in image recognition.

This text is generated by artificial intelligence and may not be accurate.