Yu Qiao
Liaoning, China


Average Co-Inventor Count = 5.8

ph-index = 1


Company Filing History:


Years Active: 2021-2023

2 patents (USPTO)

Innovations of Yu Qiao in Computer Vision

Introduction

Yu Qiao is an inventor based in Liaoning, China, working in computer vision. His two USPTO patents, filed between 2021 and 2023, center on image processing techniques, specifically video semantic segmentation and natural image matting.

Latest Patents

One of Yu Qiao's latest patents is a "Video semantic segmentation method based on active learning." The invention describes a pipeline for video semantic segmentation built from three parts: an image semantic segmentation module, a data selection module based on active learning, and a label propagation module. The method aims to generate weakly supervised data sets quickly, reducing the cost of producing them while preserving the performance of the semantic segmentation network.
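As a rough illustration of the active-learning data selection step, the sketch below scores each frame of a clip by predictive entropy and picks the most uncertain frames for annotation. The entropy-based acquisition function and the model interface are assumptions chosen for illustration; the patent does not necessarily use this exact selection criterion.

```python
# Minimal sketch of active-learning frame selection for video semantic
# segmentation. Entropy-based scoring is an illustrative assumption,
# not the patented method.
import torch
import torch.nn.functional as F

def frame_uncertainty(logits: torch.Tensor) -> float:
    """Mean per-pixel predictive entropy for one frame's logits (C, H, W)."""
    probs = F.softmax(logits, dim=0)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=0)  # (H, W)
    return entropy.mean().item()

def select_frames_for_labeling(model: torch.nn.Module,
                               frames: torch.Tensor,
                               budget: int) -> list[int]:
    """Pick the `budget` most uncertain frames from a clip (T, 3, H, W)."""
    model.eval()
    with torch.no_grad():
        scores = [frame_uncertainty(model(f.unsqueeze(0))[0]) for f in frames]
    ranked = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return ranked[:budget]
```

Frames selected this way would be annotated, and a label propagation step (for example, warping the new labels onto neighboring frames) could then expand them into the weakly supervised data set the patent describes.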

Another notable patent is the "Fully automatic natural image matting method." The invention streamlines image matting by extracting both high-level semantic features and low-level structural features from the input image. Because it requires no auxiliary input (such as a user-supplied trimap), it saves researchers time and simplifies user interaction.
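The sketch below illustrates the general idea of fusing a high-level semantic branch with a low-level structural branch to predict an alpha matte directly from the image, with no auxiliary input. The layer sizes and the `TrimapFreeMatting` name are arbitrary assumptions for illustration, not the patented architecture.

```python
# Illustrative two-branch trimap-free matting sketch (PyTorch).
# Layer shapes are placeholder assumptions, not the patented design.
import torch
import torch.nn as nn

class TrimapFreeMatting(nn.Module):
    def __init__(self):
        super().__init__()
        # Downsampled branch for high-level semantics (what is foreground).
        self.semantic = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
        )
        # Full-resolution branch for low-level structure (edges, fine detail).
        self.structural = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        # Fuse both feature maps into a single-channel alpha matte in [0, 1].
        self.fuse = nn.Sequential(nn.Conv2d(128, 1, 3, padding=1), nn.Sigmoid())

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.semantic(image), self.structural(image)], dim=1)
        return self.fuse(feats)

# Usage: alpha = TrimapFreeMatting()(torch.rand(1, 3, 256, 256))  # (1,1,256,256)
```

The key design point is that the semantic branch decides roughly where the foreground is, while the structural branch recovers the fine boundaries that matting depends on; combining the two removes the need for a user-drawn trimap.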

Career Highlights

Yu Qiao is affiliated with Dalian University of Technology, where he continues his research in computer vision; his work is noted for its practical applications.

Collaborations

Yu Qiao collaborates with colleagues including Xin Yang and Xiaopeng Wei.

Conclusion

Yu Qiao's patents reflect a sustained effort to advance computer vision. His methods improve image processing while reducing the manual effort required of researchers and users, whether in building labeled data sets or in matting images without auxiliary input.
