Guangzhou, China

Liangyi Chen


 

Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 4 (Granted Patents)


Company Filing History:


Years Active: 2021-2025

3 patents (USPTO)

Title: Innovations of Liangyi Chen in Image Processing

Introduction

Liangyi Chen is a notable inventor based in Guangzhou, China. He has made significant contributions to the field of image processing, holding a total of 3 patents. His work focuses on developing systems and methods that enhance image quality through advanced processing techniques.

Latest Patents

Liangyi Chen's latest patents describe systems and methods for image processing. A preliminary image is first generated by filtering data from an image acquisition device. An intermediate image is then produced through iterative operations driven by a first objective function, whose terms penalize the difference between images, enforce continuity, and promote sparsity. Finally, a target image is obtained by applying a second objective function associated with the system matrix of the image acquisition device.
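The two-stage pipeline described above can be sketched in code. This is a minimal 1-D illustration, not the patented method itself: the penalty weights, the use of proximal gradient descent, and the least-squares second stage are all illustrative assumptions layered on the general structure (fidelity + continuity + sparsity terms, then a solve against the system matrix).

```python
import numpy as np

def first_objective(x, y, lam_c, lam_s):
    # Illustrative first objective: data fidelity + continuity
    # (second-difference smoothness) + sparsity (L1 penalty).
    fidelity = np.sum((x - y) ** 2)
    continuity = lam_c * np.sum(np.diff(x, n=2) ** 2)
    sparsity = lam_s * np.sum(np.abs(x))
    return fidelity + continuity + sparsity

def minimize_first(y, lam_c=0.1, lam_s=0.05, step=0.1, iters=200):
    # Proximal gradient descent: gradient step on the smooth terms,
    # then a soft-threshold step for the L1 sparsity term.
    x = y.copy()
    for _ in range(iters):
        grad = 2.0 * (x - y)
        d2 = np.diff(x, n=2)
        # Gradient of the continuity term, distributed over the
        # three samples each second difference touches.
        grad[:-2] += 2.0 * lam_c * d2
        grad[1:-1] -= 4.0 * lam_c * d2
        grad[2:] += 2.0 * lam_c * d2
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam_s, 0.0)
    return x

def second_stage(z, A, step=0.05, iters=300):
    # Illustrative second objective: least-squares fit against the
    # system matrix A of the acquisition device, min ||A t - z||^2.
    t = z.copy()
    for _ in range(iters):
        t -= step * (A.T @ (A @ t - z))
    return t
```

In this sketch the intermediate image from `minimize_first` would be fed to `second_stage` together with a system matrix `A` modeling the acquisition device's blur; the patents do not specify these particular solvers.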

Career Highlights

Throughout his career, Liangyi Chen has worked with Guangzhou Computational Super-resolution Biotech Co., Ltd. (whose filings also appear under the spelling Guangzhou Computational Super-resolutions Biotech Co., Ltd.). His expertise in image processing has positioned him as a key player in the biotechnology sector.

Collaborations

Liangyi Chen has collaborated with talented individuals in his field, including Haoyu Li and Weisong Zhao. These partnerships have contributed to the advancement of his innovative projects.

Conclusion

Liangyi Chen's contributions to image processing through his patents and collaborations highlight his role as a significant inventor in the field. His work continues to influence advancements in technology and image quality enhancement.

This text is generated by artificial intelligence and may not be accurate.