Beijing, China

Ning Yan

USPTO Granted Patents = 2 

Average Co-Inventor Count = 8.0

ph-index = 1


Company Filing History:


Years Active: 2025


Ning Yan: Innovator in Video Decoding Technologies

Introduction

Ning Yan is an inventor based in Beijing, China, known for his contributions to video decoding technologies. With two granted US patents, he has advanced the field, particularly in enhancing coding methods for video data.

Latest Patents

Ning Yan's two patents are "Coding enhancement in cross-component sample adaptive offset" and "Methods and devices for geometric partition mode with motion vector refinement." The first describes an electronic apparatus that decodes video data by reconstructing samples of different components and applying cross-component offsets and filters to improve the reconstructed output. The second outlines video decoding methods that adaptively switch between motion vector refinement offset sets, producing improved prediction samples from the refined motion vectors.
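The cross-component idea behind the first patent can be illustrated with a minimal sketch: a chroma sample is corrected by an offset selected from the intensity of its co-located luma sample. The band count, offsets, and function name below are illustrative assumptions, not details taken from the patent.

```python
# Illustrative sketch of a cross-component sample-adaptive-offset style
# filter: each reconstructed chroma sample receives an offset chosen by
# classifying its co-located luma sample into intensity bands.
# Band layout and offset values here are hypothetical.

def ccsao_band_offset(chroma, luma, offsets, bit_depth=8):
    """Apply a per-band offset to each chroma sample.

    chroma, luma: 2D lists of equal size (co-located samples)
    offsets: one signed offset per luma band
    """
    num_bands = len(offsets)
    max_val = (1 << bit_depth) - 1
    out = []
    for row_c, row_y in zip(chroma, luma):
        out_row = []
        for c, y in zip(row_c, row_y):
            # Classify by co-located luma intensity.
            band = y * num_bands // (max_val + 1)
            # Add the band's offset and clip to the valid sample range.
            out_row.append(min(max(c + offsets[band], 0), max_val))
        out.append(out_row)
    return out
```

For example, with two bands and offsets `[1, -2]`, a dark co-located luma sample selects the first offset and a bright one selects the second.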

Career Highlights

Ning Yan is currently employed at Beijing Dajia Internet Information Technology Co., Ltd., where he continues to innovate in the realm of video technology. His work has been instrumental in developing more efficient video decoding methods that enhance the quality and performance of video playback.

Collaborations

Ning has collaborated with colleagues including Che-Wei Kuo and Hong-Jheng Jhu. Their combined expertise fosters a collaborative environment that drives technological advancement in the field.

Conclusion

Ning Yan's contributions to video decoding technologies exemplify the impact of innovation in enhancing digital media experiences. His patents reflect a commitment to improving video quality and efficiency, marking him as a significant figure in the industry.

This text is generated by artificial intelligence and may not be accurate.