Nanjing, China

Baocheng Gu


Average Co-Inventor Count = 6.2

ph-index = 1


Company Filing History:

Years Active: 2023–2025
3 patents (USPTO)

Innovations of Baocheng Gu

Introduction

Baocheng Gu is an inventor based in Nanjing, China, who has contributed to the fields of image rendering and scene recognition. His three USPTO patents concern rendering and recognition techniques for advanced electronic devices.

Latest Patents

Gu's most recent patent is "Image rendering method and apparatus, and electronic device." It describes a method in which a central processing unit (CPU) obtains the image command stream for a frame and compares it with the stream of the previous frame; when similar drawing commands are detected, the CPU instructs the graphics processing unit (GPU) to reuse previously generated drawing targets rather than re-render them. Another significant patent, "Scene recognition method and apparatus, terminal, and storage medium," focuses on game scene recognition: the method obtains drawing instructions and determines target objects in the scene from their description parameters.
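The reuse idea in the rendering patent can be sketched in a few lines: cache the drawing target produced for a frame, and when the next frame's command stream matches, return the cached target instead of issuing GPU work. This is a minimal illustrative sketch, not the patented implementation; the class, the string stand-in for a GPU render target, and the exact-match comparison (the patent speaks of "similar" commands) are all assumptions for clarity.

```python
from dataclasses import dataclass


@dataclass
class FrameRenderer:
    """Hypothetical sketch of command-stream-based render-target reuse."""
    prev_commands: tuple = ()   # draw-command stream of the previous frame
    cached_target: object = None  # drawing target generated for that frame
    renders: int = 0            # how many full "GPU" renders were performed

    def render(self, commands):
        commands = tuple(commands)
        # Reuse the cached drawing target when the stream is unchanged.
        if commands == self.prev_commands and self.cached_target is not None:
            return self.cached_target
        # Otherwise perform a full render (a string stands in for the GPU output).
        self.cached_target = f"target({self.renders})"
        self.renders += 1
        self.prev_commands = commands
        return self.cached_target


r = FrameRenderer()
a = r.render(["clear", "draw_mesh:hud"])
b = r.render(["clear", "draw_mesh:hud"])  # identical stream: cached target reused
```

In this toy run the second frame returns the first frame's target and `renders` stays at 1, which is the saving the patent aims for.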

Career Highlights

Gu is currently employed at Huawei Technologies Co., Ltd., a leading global provider of information and communications technology (ICT) infrastructure and smart devices, where his work centers on image processing and scene recognition.

Collaborations

Gu collaborates with talented coworkers, including Fan Zhang and Dong Wei, who contribute to his innovative projects and research.

Conclusion

Baocheng Gu's contributions to image rendering and scene recognition are noteworthy, showcasing his expertise and innovative spirit in the technology sector. His patents reflect a commitment to advancing electronic devices and enhancing user experiences.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com