Hunan, China

Yin Zou


Average Co-Inventor Count = 7.0

ph-index = 1

Forward Citations = 5 (Granted Patents)


Company Filing History:


Years Active: 2022

1 patent (USPTO)

Title: Innovations of Yin Zou in Visual Localization Technology

Introduction

Yin Zou is an inventor based in Hunan, China, working in visual localization technology. His work focuses on improving the accuracy of localization methods through advanced image-processing techniques.

Latest Patents

Yin Zou holds a patent for a "Visual localization method and apparatus based on semantic error image." The patent describes a method that extracts features from a target image and obtains matching pairs through feature matching. Semantic segmentation produces a two-dimensional semantic image, which supplies the semantic information for each matching pair. The method then constructs a pool of hypothesized poses and selects the optimal pose estimate by jointly minimizing reprojection and semantic errors, enabling robust localization even in scenarios with significant scene changes.
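The pose-selection step described above can be sketched in code: each candidate pose in the pool is scored by combining the mean reprojection error of the matched points with a semantic-consistency penalty, and the lowest-cost pose wins. This is a minimal illustrative sketch, not the patent's actual implementation; all function names, the 4x4 pose convention, and the `sem_weight` parameter are assumptions.

```python
import numpy as np

def project(points_3d, pose, K):
    """Project 3D world points (N, 3) into the image using a 4x4 pose
    (world-to-camera) and a 3x3 intrinsics matrix K."""
    pts_h = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    cam = (pose @ pts_h.T).T[:, :3]        # points in the camera frame
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]          # perspective divide -> pixels

def semantic_error(uv, sem_image, labels_3d):
    """Fraction of points whose projected pixel falls on a semantic label
    that disagrees with the label of the matched 3D point."""
    h, w = sem_image.shape
    px = np.clip(np.round(uv).astype(int), [0, 0], [w - 1, h - 1])
    proj_labels = sem_image[px[:, 1], px[:, 0]]
    return float(np.mean(proj_labels != labels_3d))

def best_pose(pose_pool, points_3d, obs_2d, sem_image, labels_3d, K,
              sem_weight=10.0):
    """Pick the pose from the hypothesized pool that minimizes the combined
    reprojection + semantic cost."""
    best, best_cost = None, np.inf
    for pose in pose_pool:
        uv = project(points_3d, pose, K)
        reproj = np.linalg.norm(uv - obs_2d, axis=1).mean()
        cost = reproj + sem_weight * semantic_error(uv, sem_image, labels_3d)
        if cost < best_cost:
            best, best_cost = pose, cost
    return best, best_cost
```

In practice the pool would be generated from multiple matching hypotheses, and the semantic image would come from a segmentation network; here both are left as inputs to keep the sketch self-contained.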

Career Highlights

Yin Zou is affiliated with the National University of Defense Technology, where he applies his expertise in visual localization. His work has been instrumental in advancing the capabilities of localization technologies, making them more reliable and efficient.

Collaborations

Yin Zou collaborates with notable colleagues, including Jie Jiang and Xing Xin. Their combined efforts contribute to the development of innovative solutions in the field of visual localization.

Conclusion

Yin Zou's patent in visual localization combines semantic information with geometric pose estimation, representing a concrete step toward more accurate and robust localization methods.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com