Shenzhen, China

Bocheng Xin


Average Co-Inventor Count = 7.0

ph-index = 1

Forward Citations = 2 (Granted Patents)


Company Filing History:


Years Active: 2018

1 patent (USPTO)

Title: Bocheng Xin - Innovator in Video Encoding Technology

Introduction

Bocheng Xin is an inventor based in Shenzhen, China, working in the field of video technology, particularly video encoding and decoding methods. His work has led to a granted patent that improves video processing efficiency.

Latest Patents

Bocheng Xin holds a patent for a "Hybrid-resolution encoding and decoding method and a video apparatus using the same." The method performs full-resolution standard coding on an I frame (a frame that uses only intra-frame coding) in a video frame sequence, reconstructs the I frame, and down-samples the reconstructed frame to obtain a first sampled image. Non-I frames are then standard-coded at the reduced resolution using the first sampled image as a reference frame. The result is a video code stream that combines full-resolution and reduced-resolution coded frames, making video processing more efficient.
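The pipeline described above can be sketched in a few lines of Python. This is a hypothetical toy model, not the patented codec: the function names are illustrative, "standard coding" is replaced by lossless residual storage, and down-sampling is simple pixel decimation. It only shows the data flow — full-resolution I frame, reconstruction, down-sampling, then low-resolution coding of non-I frames against the down-sampled reference.

```python
# Toy sketch of a hybrid-resolution encoding pipeline (illustrative only;
# names and "coding" steps are stand-ins, not from the patent or any codec).

def downsample(frame, factor=2):
    """Down-sample a 2-D frame (list of rows) by keeping every
    `factor`-th pixel in each dimension."""
    return [row[::factor] for row in frame[::factor]]

def encode_sequence(frames, frame_types):
    """Encode a sequence: I frames at full resolution, non-I frames at
    reduced resolution against the down-sampled reconstructed I frame.
    Returns a list of (tag, payload) entries standing in for the stream."""
    stream = []
    reference = None
    for frame, ftype in zip(frames, frame_types):
        if ftype == "I":
            # Full-resolution intra coding; in this lossless toy model
            # the reconstructed frame equals the input frame.
            reconstructed = frame
            # The down-sampled reconstruction becomes the reference
            # for subsequent non-I frames.
            reference = downsample(reconstructed)
            stream.append(("I-full", frame))
        else:
            # Non-I frames are coded at the reduced resolution, stored
            # here as a residual against the down-sampled reference.
            low = downsample(frame)
            residual = [[p - r for p, r in zip(prow, rrow)]
                        for prow, rrow in zip(low, reference)]
            stream.append(("P-low", residual))
    return stream
```

The resulting stream mixes full-resolution I-frame data with smaller non-I-frame payloads, which is the source of the efficiency gain the patent describes.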

Career Highlights

Bocheng Xin is currently employed at Tencent Technology (Shenzhen) Company Limited, where he continues to work on advancements in video technology. His expertise in hybrid-resolution methods has positioned him as a key player in the industry.

Collaborations

He collaborates with co-inventors Chenchen Gu and Xunan Mao, who contribute to video technology projects at Tencent.

Conclusion

Bocheng Xin's work in hybrid-resolution encoding and decoding methods showcases his commitment to advancing video technology. His contributions are paving the way for more efficient video processing solutions in the industry.

This text is generated by artificial intelligence and may not be accurate.