Shanghai, China

Fupeng Chen


Average Co-Inventor Count = 3.8

ph-index = 1


Company Filing History:


Years Active: 2021-2025

3 patents (USPTO)

Fupeng Chen: Innovator in Adaptive Stereo Matching and Parallel Computing

Introduction

Fupeng Chen is an inventor based in Shanghai, China, who has made notable contributions to adaptive stereo matching and efficient parallel computing. His patents reflect an innovative approach to solving complex problems in image processing.

Latest Patents

Fupeng Chen's latest patents include an "Adaptive Stereo Matching Optimization Method and Apparatus" and an "Efficient Parallel Computing Method for Box Filter." The stereo matching patent describes a method that acquires images from multiple perspectives and optimizes depth value ranges in real time, using an adaptive stereo matching model that adjusts its execution cycles according to resource constraints. The box filter patent focuses on parallel computing architectures that calculate per-pixel window averages while minimizing resource consumption.
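The patent texts themselves are not reproduced here, but the box filter idea can be illustrated with a standard technique. The sketch below (Python with NumPy, chosen purely for illustration; the function name box_filter_mean and all implementation details are assumptions, not the patented architecture) computes each pixel's window average in constant time per pixel using a summed-area table, the kind of constant-cost aggregation that also appears in local stereo matching pipelines.

import numpy as np

def box_filter_mean(image, radius):
    # Average over a (2*radius + 1) x (2*radius + 1) window around each
    # pixel, clipped at the image borders. Cost is O(1) per pixel thanks
    # to the summed-area table (integral image).
    img = image.astype(np.float64)
    h, w = img.shape

    # Integral image with a leading row/column of zeros, so that
    # integral[i, j] = sum of img[0:i, 0:j].
    integral = np.zeros((h + 1, w + 1))
    integral[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)

    ys = np.arange(h)
    xs = np.arange(w)
    y0 = np.clip(ys - radius, 0, h)[:, None]
    y1 = np.clip(ys + radius + 1, 0, h)[:, None]
    x0 = np.clip(xs - radius, 0, w)[None, :]
    x1 = np.clip(xs + radius + 1, 0, w)[None, :]

    # Window sum from four integral-image lookups per pixel.
    window_sum = (integral[y1, x1] - integral[y0, x1]
                  - integral[y1, x0] + integral[y0, x0])
    window_area = (y1 - y0) * (x1 - x0)
    return window_sum / window_area

# Example: 5x5 averaging (radius 2) over a random 8-bit image.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
smoothed = box_filter_mean(frame, radius=2)

Each output pixel needs only four table lookups and one division regardless of the window radius, which is the property that makes box filters attractive for resource-constrained parallel hardware.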

Career Highlights

Chen is currently affiliated with ShanghaiTech University, where he continues to advance his research and development efforts. His work has garnered attention for its practical applications in various technological fields, particularly in enhancing image processing techniques.

Collaborations

Fupeng Chen collaborates with notable colleagues, including Yajun Ha and Heng Yu. Their combined expertise contributes to the innovative research environment at ShanghaiTech University.

Conclusion

Fupeng Chen's contributions to adaptive stereo matching and parallel computing highlight his role as a leading inventor in the field. His patents reflect a commitment to advancing technology and improving computational efficiency.
