Title: Innovations of Haiyin Piao in Video Semantic Segmentation
Introduction
Haiyin Piao is a notable inventor based in Liaoning, China. He has made significant contributions to the field of computer vision, particularly in video semantic segmentation. His approach leverages active learning to reduce the labeling effort required for video data while preserving segmentation quality.
Latest Patents
Haiyin Piao holds a patent for a "Video semantic segmentation method based on active learning." The invention comprises three components: an image semantic segmentation module, a data selection module based on active learning, and a label propagation module. The image semantic segmentation module produces segmentation results and extracts the high-level features required by the data selection module. The data selection module chooses an information-rich subset of images at the image level and identifies the pixel blocks to be labeled at the pixel level. The label propagation module handles the migration from image tasks to video tasks, quickly completing video segmentation results to obtain weakly-supervised data. The invention can rapidly generate weakly-supervised data sets, reduce the cost of data set creation, and improve the performance of a semantic segmentation network.
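To illustrate the data selection idea, the sketch below shows one common way such a module can be realized: ranking images by mean predictive entropy (image level), then picking the most uncertain pixel blocks within the selected images (pixel level). This is a minimal illustration of uncertainty-based active learning, not the patented method itself; the function names and parameters (`select_informative`, `block`, `k_images`) are hypothetical.

```python
import numpy as np

def pixel_entropy(probs):
    """Per-pixel predictive entropy from softmax outputs of shape (H, W, C)."""
    return -np.sum(probs * np.log(probs + 1e-12), axis=-1)

def select_informative(batch_probs, k_images=2, block=4, k_blocks=3):
    """Illustrative two-stage active-learning selection.

    Image level: rank images by mean entropy and keep the top k_images.
    Pixel level: within each kept image, average entropy over non-overlapping
    block x block regions and return the k_blocks most uncertain block indices.
    """
    # Image-level selection: higher mean entropy = more informative.
    scores = np.array([pixel_entropy(p).mean() for p in batch_probs])
    chosen = np.argsort(scores)[::-1][:k_images]

    picks = {}
    for i in chosen:
        ent = pixel_entropy(batch_probs[i])
        h, w = ent.shape
        bh, bw = h // block, w // block
        # Average entropy over each block x block region (crop any remainder).
        blocks = ent[:bh * block, :bw * block] \
            .reshape(bh, block, bw, block).mean(axis=(1, 3))
        flat = np.argsort(blocks, axis=None)[::-1][:k_blocks]
        picks[int(i)] = [tuple(int(v) for v in np.unravel_index(j, blocks.shape))
                         for j in flat]
    return picks
```

A confident prediction (probability mass concentrated on one class) yields near-zero entropy, so such images and blocks are skipped; uniform predictions score highest and are routed to annotators first.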
Career Highlights
Haiyin Piao is affiliated with Dalian University of Technology, where he continues to advance his research in computer vision. His work has garnered attention for its practical applications and innovative methodologies.
Collaborations
Some of his notable coworkers include Xin Yang and Xiaopeng Wei, who contribute to the collaborative research environment at Dalian University of Technology.
Conclusion
Haiyin Piao's contributions to video semantic segmentation through his innovative patent demonstrate his expertise and commitment to advancing technology in computer vision. His work not only enhances the efficiency of data processing but also paves the way for future innovations in the field.