Shenzhen, China

Yuying Ge

USPTO Granted Patents = 1 

Average Co-Inventor Count = 7.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2022

Note: 'Filed Patents' counts only patents that were subsequently granted.

1 patent (USPTO): "Target matching method and apparatus, electronic device, and storage medium"

Yuying Ge: Innovator in Target Matching Technology

Introduction

Yuying Ge is an inventor based in Shenzhen, China, working in image processing and matching technologies. His work has produced a granted patent that improves the efficiency of target matching methods.

Latest Patents

Yuying Ge holds a patent for a "Target matching method and apparatus, electronic device, and storage medium." The patented method extracts feature vectors from both a query image sequence and a candidate image sequence, determines self-expression and collaborative-expression feature vectors for each sequence, computes a similarity feature vector from them, and returns a matching result based on that similarity.
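The patent summary above describes only the high-level flow. The Python sketch below is a minimal illustration of one way such a pipeline could be wired together; the function names (extract_features, self_expression, collaborative_expression, match), the random-projection feature extractor, and the cosine-similarity scoring step are hypothetical stand-ins for illustration, not the method actually claimed in the patent.

import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def extract_features(frames, dim=128, seed=0):
    # Stand-in per-frame feature extractor: random projection plus L2 norm.
    # A real system would use a trained network; this is only illustrative.
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((frames.shape[1], dim))
    feats = frames @ proj
    return feats / np.linalg.norm(feats, axis=1, keepdims=True)

def self_expression(feats):
    # Aggregate a sequence's frames using weights derived from the sequence itself.
    weights = softmax(feats @ feats.mean(axis=0))
    return weights @ feats

def collaborative_expression(feats, other_feats):
    # Aggregate a sequence's frames using weights conditioned on the other sequence.
    weights = softmax(feats @ other_feats.mean(axis=0))
    return weights @ feats

def match(query_frames, candidate_frames, threshold=0.7):
    # Return (similarity, is_match) for a query/candidate sequence pair.
    q = extract_features(query_frames)
    c = extract_features(candidate_frames)
    # Combine self- and collaborative-expression vectors for each sequence.
    q_vec = np.concatenate([self_expression(q), collaborative_expression(q, c)])
    c_vec = np.concatenate([self_expression(c), collaborative_expression(c, q)])
    # Cosine similarity stands in for the patent's similarity feature vector.
    sim = float(q_vec @ c_vec / (np.linalg.norm(q_vec) * np.linalg.norm(c_vec)))
    return sim, sim >= threshold

# Example: two sequences of 8 frames, each frame flattened to 1024 values.
rng = np.random.default_rng(1)
query = rng.standard_normal((8, 1024))
candidate = rng.standard_normal((8, 1024))
print(match(query, candidate))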

Career Highlights

Yuying Ge is employed at Shenzhen Sensetime Technology Co., Ltd., where his work focuses on image recognition and processing technologies.

Collaborations

Yuying Ge collaborates with co-inventors including Ruimao Zhang and Hongbin Sun.

Conclusion

Yuying Ge's patent in target matching technology and his work at Shenzhen Sensetime Technology Co., Ltd. reflect a focus on advancing image processing and improving the efficiency of target matching.
