Hubei, China

Qihuang Zhong


Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2023

1 patent (USPTO)

Qihuang Zhong: Innovator in Semantic Segmentation Technology

Introduction

Qihuang Zhong is an inventor based in Hubei, China, working in image processing and semantic segmentation. His research focuses on bridging the gap between different domains in image analysis.

Latest Patents

Qihuang Zhong holds a patent for an "Attention-based joint image and feature adaptive semantic segmentation method." This invention addresses the challenges of domain gaps in cross-modal image semantic segmentation. The method utilizes an image adaptation procedure to transform source domain images into target-domain-like images, thereby reducing the domain gap at the appearance level. Additionally, it employs a feature adaptation procedure to align features in both the semantic prediction space and the image generation space. The introduction of an attention module enhances the focus on significant image regions during the adaptation process. This innovative approach has shown improved performance across multiple public datasets.
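To make the pipeline described above more concrete, the following is a minimal, hypothetical PyTorch sketch of how its three ingredients, image-level adaptation, feature/output-level alignment, and an attention module, could fit together. The module names, network sizes, and loss weighting here are illustrative assumptions, not the design claimed in the patent.

# Hypothetical sketch of attention-based joint image and feature adaptation:
# (1) image adaptation: translate a source-domain image toward the target style,
# (2) feature adaptation: a discriminator aligns segmentation outputs across domains,
# (3) an attention map re-weights significant image regions during adaptation.
import torch
import torch.nn as nn

class AttentionModule(nn.Module):
    """Produces a per-pixel attention map highlighting significant regions."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Sequential(
            nn.Conv2d(channels, channels // 2, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels // 2, 1, 1), nn.Sigmoid())
    def forward(self, feats):
        return self.score(feats)  # (N, 1, H, W) attention weights

class Segmenter(nn.Module):
    """Toy encoder-decoder standing in for the segmentation network."""
    def __init__(self, num_classes=19):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU())
        self.attention = AttentionModule(64)
        self.classifier = nn.Conv2d(64, num_classes, 1)
    def forward(self, x):
        feats = self.encoder(x)
        attn = self.attention(feats)
        return self.classifier(feats * attn), attn  # attention-weighted prediction

class Discriminator(nn.Module):
    """Output-space discriminator used to align source and target predictions."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1))
    def forward(self, x):
        return self.net(x)

def adaptation_step(translator, segmenter, disc, src_img, src_lbl, tgt_img):
    """One illustrative training step; translator() is a placeholder for the
    image-adaptation generator that renders source images in target style."""
    seg_loss_fn, adv_loss_fn = nn.CrossEntropyLoss(), nn.BCEWithLogitsLoss()
    src_like_tgt = translator(src_img)                     # image-level adaptation
    src_pred, _ = segmenter(src_like_tgt)
    tgt_pred, _ = segmenter(tgt_img)
    seg_loss = seg_loss_fn(src_pred, src_lbl)              # supervised on source labels (long tensor)
    d_tgt = disc(torch.softmax(tgt_pred, dim=1))           # prediction-space alignment
    adv_loss = adv_loss_fn(d_tgt, torch.ones_like(d_tgt))  # push target outputs toward source-like
    return seg_loss + 0.01 * adv_loss                      # weight is an assumed hyperparameter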

Career Highlights

Qihuang Zhong is affiliated with Wuhan University, where he conducts research in image processing technologies. His work has practical applications in computer vision and artificial intelligence.

Collaborations

Qihuang Zhong collaborates with Juhua Liu on research in image processing and semantic segmentation.

Conclusion

Qihuang Zhong's contributions to semantic segmentation technology exemplify the impact of innovative thinking in image processing. His work not only addresses existing challenges but also paves the way for future advancements in the field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com