Anlin Zheng
Beijing, China

Average Co-Inventor Count: 5.8
ph-index: 1
Years Active: 2019
Patents (USPTO): 2

Anlin Zheng: Innovator in Image Processing and Video Object Detection

Introduction

Anlin Zheng is an inventor based in Beijing, China, who has contributed to the fields of image processing and video object detection. His two USPTO patents focus on extracting salient information from images and videos more efficiently.

Latest Patents

Zheng's patents cover methods and apparatuses for salient object detection in images and video. One patent describes extracting a saliency map from an original image through a series of convolution and pooling operations, using eye-fixation information to indicate regions of interest and thereby improving the efficiency of saliency-map extraction.

The other patent detects and segments primary video objects using neighborhood reversibility: each video frame is divided into superpixel blocks, and a deep neural network predicts a foreground value for each block, which makes the method well suited to complex datasets.
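To make the convolution-and-pooling idea concrete, here is a minimal sketch of such a saliency network in PyTorch. The architecture, layer sizes, and the use of a random eye-fixation map as a training target are illustrative assumptions, not the design claimed in the patent.

```python
# A toy encoder: stacked convolution + pooling stages shrink the image,
# then upsampling restores a single-channel saliency map.
import torch
import torch.nn as nn

class ToySaliencyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 1st pooling: 1/2 resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 2nd pooling: 1/4 resolution
        )
        self.head = nn.Conv2d(32, 1, kernel_size=1)  # per-pixel saliency logit
        self.upsample = nn.Upsample(scale_factor=4, mode="bilinear",
                                    align_corners=False)

    def forward(self, x):
        logits = self.upsample(self.head(self.encoder(x)))
        return torch.sigmoid(logits)            # saliency map in [0, 1]

# Eye-fixation maps (random stand-ins here) can serve as training targets so
# the network learns to highlight regions that attract human gaze.
net = ToySaliencyNet()
image = torch.rand(1, 3, 64, 64)      # dummy RGB image
fixation = torch.rand(1, 1, 64, 64)   # dummy eye-fixation density map
loss = nn.functional.binary_cross_entropy(net(image), fixation)
loss.backward()
```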
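The video method can be sketched in the same spirit: split each frame into superpixel blocks, then assign each block a foreground value. The sketch below uses SLIC superpixels from scikit-image and a stand-in scoring function in place of the patent's deep neural network; the neighborhood-reversibility criterion itself is not reproduced here.

```python
# Superpixel-based foreground scoring for one video frame; the feature
# choice (mean color) and scoring function are illustrative assumptions.
import numpy as np
from skimage.segmentation import slic

def superpixel_foreground_scores(frame, predict_fn, n_segments=100):
    """frame: H x W x 3 float image in [0, 1].
    predict_fn: maps an (N, 3) feature array to (N,) foreground scores;
    it stands in for the deep neural network described in the patent."""
    labels = slic(frame, n_segments=n_segments, start_label=0)
    n = labels.max() + 1
    # Mean color of each superpixel block as a stand-in feature vector.
    feats = np.stack([frame[labels == i].mean(axis=0) for i in range(n)])
    scores = predict_fn(feats)        # one foreground value per block
    return scores[labels]             # broadcast back onto the pixel grid

# Usage with a dummy "network": brighter blocks score as more foreground-like.
frame = np.random.rand(120, 160, 3)
fg_map = superpixel_foreground_scores(frame, lambda f: f.mean(axis=1))
mask = fg_map > 0.5                   # crude foreground segmentation
```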

Career Highlights

Anlin Zheng is affiliated with Beihang University, where he continues to advance research in his field. His work has garnered attention for its practical applications in various industries, particularly in enhancing visual data processing.

Collaborations

Zheng has collaborated with colleagues including Xiaowu Chen and Jia Qi Li.

Conclusion

Anlin Zheng's patents in image processing and video object detection reflect a sustained focus on improving the efficiency and accuracy of visual data analysis.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com