Yao Chen
Hangzhou, China

Average Co-Inventor Count: 5.5
ph-index: 1
Years Active: 2025
Patents (USPTO): 2

Yao Chen: Innovator in Video Processing Technologies

Introduction

Yao Chen is an inventor based in Hangzhou, China, known for his contributions to video processing technologies. With two patents to his name, his work centers on image encoding and prediction methods.

Latest Patents

Yao Chen's patents are "Systems and method for inter prediction based on a merge mode" and "Systems and methods for video processing." The first describes determining whether a current block in an image frame meets a division condition; if the condition is satisfied, the block is divided into two sub-blocks, allowing for a more accurate prediction result. The second addresses video encoding: the systems determine a current string in a coding unit of an image frame, calculate costs based on pixel values, and select a prediction string for the current string.
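The block-division idea in the first patent can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the patent does not disclose its actual division condition or split geometry, so the `min_side` threshold, the split-along-longer-side rule, and all names (`Block`, `meets_division_condition`, `divide_block`) are hypothetical stand-ins for the general pattern of "check a condition, then split a block into two sub-blocks."

```python
from dataclasses import dataclass

@dataclass
class Block:
    """A rectangular region of an image frame (hypothetical representation)."""
    x: int
    y: int
    width: int
    height: int

def meets_division_condition(block: Block, min_side: int = 8) -> bool:
    """Hypothetical division condition (not from the patent): split only
    if at least one side is large enough to yield two viable halves."""
    return block.width >= 2 * min_side or block.height >= 2 * min_side

def divide_block(block: Block) -> list[Block]:
    """Split the block into two sub-blocks along its longer side,
    mirroring the general idea that finer sub-blocks can give a more
    accurate inter-prediction result."""
    if not meets_division_condition(block):
        return [block]  # condition not met: predict the block as a whole
    if block.width >= block.height:
        half = block.width // 2
        return [Block(block.x, block.y, half, block.height),
                Block(block.x + half, block.y, block.width - half, block.height)]
    half = block.height // 2
    return [Block(block.x, block.y, block.width, half),
            Block(block.x, block.y + half, block.width, block.height - half)]
```

For example, a 32x16 block is split into two 16x16 sub-blocks, while an 8x8 block fails the hypothetical condition and is predicted whole.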

Career Highlights

Yao Chen is currently employed at Zhejiang Dahua Technology Co., Ltd., where he continues to innovate in the field of video processing. His work has contributed to advancements in how video data is encoded and processed, enhancing the efficiency and quality of video transmission.

Collaborations

Yao Chen collaborates with talented coworkers, including Dong Jiang and Jucai Lin, who contribute to the innovative environment at Zhejiang Dahua Technology Co., Ltd.

Conclusion

Yao Chen's work in video processing technologies showcases his dedication to innovation and improvement in the field. His patents reflect a commitment to advancing the capabilities of video encoding and prediction methods, making a significant impact in the industry.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com