Liangrui Peng
Beijing, China

Patents (USPTO): 3
Years Active: 2007-2024
Average Co-Inventor Count: 6.2
ph-index: 1
Forward Citations: 46 (granted patents)

The Innovative Contributions of Liangrui Peng

Introduction

Liangrui Peng is an inventor based in Beijing, China, who has made significant contributions to scene text detection and recognition. His three USPTO patents center on image-processing methods for locating and reading text in natural scenes.

Latest Patents

Liangrui Peng's latest patents include "Scene text detection method and system based on sequential deformation." This method extracts a first feature map from a scene image using a convolutional neural network, then processes the feature map through a sequential deformation module to improve text detection accuracy. Another notable patent is "Multi-directional scene text recognition method and system based on multi-element attention mechanism," which uses a deep convolutional neural network to normalize and extract features from text images and then converts those features into recognized text output. Hedged sketches of both pipelines follow.
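The patent is only summarized above and its module internals are not specified here, so the following PyTorch sketch is just one plausible reading of the detection pipeline: a small convolutional backbone produces the first feature map, and each "sequential deformation" step is approximated as predicting per-pixel offsets and resampling the features along them. All class names, layer sizes, the number of steps, and the 0.1 offset scale are illustrative assumptions, not the patented design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequentialDeformationStep(nn.Module):
    """One hypothetical deformation step: predict per-pixel (dx, dy)
    offsets from the current feature map and resample the map along
    the shifted sampling grid."""
    def __init__(self, channels):
        super().__init__()
        self.offset_head = nn.Conv2d(channels, 2, kernel_size=3, padding=1)

    def forward(self, feat):
        n, _, h, w = feat.shape
        # Offsets in normalized [-1, 1] grid units; 0.1 keeps the warp small.
        offsets = 0.1 * torch.tanh(self.offset_head(feat)).permute(0, 2, 3, 1)
        # Identity sampling grid in grid_sample's normalized coordinates.
        ys = torch.linspace(-1, 1, h, device=feat.device)
        xs = torch.linspace(-1, 1, w, device=feat.device)
        gy, gx = torch.meshgrid(ys, xs, indexing="ij")
        base = torch.stack((gx, gy), dim=-1).expand(n, -1, -1, -1)
        # Resample the feature map along the deformed grid.
        return F.grid_sample(feat, base + offsets, align_corners=True)

class SequentialDeformationDetector(nn.Module):
    """Toy pipeline: conv backbone -> K deformation steps -> text score map."""
    def __init__(self, steps=3, channels=32):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        self.deform = nn.ModuleList(
            [SequentialDeformationStep(channels) for _ in range(steps)])
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, image):
        feat = self.backbone(image)      # the "first feature map"
        for step in self.deform:         # deformations applied in sequence
            feat = step(feat)
        return torch.sigmoid(self.score(feat))  # per-pixel text probability

model = SequentialDeformationDetector()
print(model(torch.randn(1, 3, 64, 64)).shape)  # torch.Size([1, 1, 64, 64])
```

The recognition patent is likewise only summarized above, so this second sketch stands in for the "multi-element attention" idea with a standard multi-head attention decoder over CNN feature columns; the 37-class alphabet, pooled width of 32, and fixed 25-step output are assumptions for illustration, not the patented mechanism.

```python
import torch
import torch.nn as nn

class AttentionTextRecognizer(nn.Module):
    """Toy recognizer: CNN feature extraction over a normalized text image,
    then learned per-step queries attend to the feature columns and each
    attended vector is classified into a character."""
    def __init__(self, num_classes=37, channels=64, max_len=25):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 32)),  # collapse height; keep 32 columns
        )
        self.attn = nn.MultiheadAttention(channels, num_heads=4, batch_first=True)
        self.query = nn.Embedding(max_len, channels)  # one query per output step
        self.classify = nn.Linear(channels, num_classes)

    def forward(self, image):
        # image: (N, 1, H, W), assumed already normalized (zero mean, unit std)
        feat = self.cnn(image).squeeze(2).permute(0, 2, 1)  # (N, 32, C) columns
        q = self.query.weight.unsqueeze(0).expand(image.size(0), -1, -1)
        out, _ = self.attn(q, feat, feat)   # each step attends to the columns
        return self.classify(out)           # (N, max_len, num_classes) logits

model = AttentionTextRecognizer()
print(model(torch.randn(1, 1, 32, 128)).shape)  # torch.Size([1, 25, 37])
```

Taking the argmax over the class dimension at each output step would yield a character string; in practice such decoders are trained with cross-entropy against padded ground-truth transcriptions.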

Career Highlights

Throughout his career, Liangrui Peng has worked with organizations such as Tsinghua University and Hyundai Motor Company, where he developed and refined the ideas behind his patents.

Collaborations

Liangrui Peng has collaborated with co-inventors such as Ruijie Yan and Shanyu Xiao. These partnerships have fostered the exchange of ideas and expertise reflected in his patents.

Conclusion

Liangrui Peng's contributions to scene text detection and recognition highlight his innovative spirit and dedication to advancing technology. His patents reflect a commitment to improving the efficiency and accuracy of text recognition systems.
