Hongyi Chen
Beijing, China

Average Co-Inventor Count: 2.1
ph-index: 5
Forward Citations: 65 (Granted Patents)


Company Filing History:


Years Active: 1997-2007

11 patents (USPTO)

Innovations of Hongyi Chen

Introduction

Hongyi Chen is a prominent inventor based in Beijing, China. He holds a total of 11 patents that showcase his contributions to the field of motion estimation and image processing. His work has significantly impacted the efficiency and quality of image encoding.

Latest Patents

Among his latest patents, one notable invention is the "Method for performing motion estimation with Walsh-Hadamard transform (WHT)." This method applies the Walsh-Hadamard transform to pixel blocks from the current and reference images and computes a matching criterion in the transform domain, improving the accuracy of motion detection. Another significant patent is the "Method for motion estimation using a low-bit edge image." This invention employs a low-bit-resolution integrated edge image to optimize motion estimation, improving encoding quality while reducing computational cost.
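The patent text is summarized above only at a high level. Purely as an illustration of the general technique, the following is a minimal sketch of transform-domain block matching: each candidate residual block is transformed with the Walsh-Hadamard transform and the sum of absolute transformed differences (SATD) serves as the matching criterion. All function names, block sizes, and search parameters here are assumptions for the sketch, not details taken from the patent.

```python
import numpy as np

def hadamard(n: int) -> np.ndarray:
    """Build an n x n Walsh-Hadamard matrix via the Sylvester
    construction (n must be a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def satd(block_a: np.ndarray, block_b: np.ndarray) -> float:
    """Sum of absolute transformed differences: transform the
    residual block with the WHT and sum the absolute coefficients."""
    n = block_a.shape[0]
    H = hadamard(n)
    diff = block_a.astype(np.int64) - block_b.astype(np.int64)
    return float(np.abs(H @ diff @ H.T).sum())

def estimate_motion(cur, ref, bx, by, n=4, search=2):
    """Full search over displacements in [-search, search] around
    (bx, by), returning the (dx, dy) with the lowest SATD."""
    cur_block = cur[by:by + n, bx:bx + n]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + n > ref.shape[0] or x + n > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            cost = satd(cur_block, ref[y:y + n, x:x + n])
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best, best_cost
```

Because the WHT is an orthogonal transform computed with additions and subtractions only, SATD is cheap to evaluate yet correlates better with coding cost than a plain pixel-domain difference, which is why transform-domain criteria of this kind are widely used in video encoders.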

Career Highlights

Hongyi Chen has worked with several notable companies, including United Microelectronics Corporation and Winbond Electronics Corporation. His experience in these organizations has allowed him to refine his skills and contribute to various innovative projects.

Collaborations

Throughout his career, Hongyi has collaborated with talented individuals such as Qingming Shu and Weixin Gai. These partnerships have fostered a creative environment that has led to the development of groundbreaking technologies.

Conclusion

Hongyi Chen's innovative work in motion estimation and image processing has made a significant impact in the field. His patents reflect a commitment to enhancing technology and improving efficiency in image encoding.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com