Bellevue, WA, United States of America

Xi Chen


Average Co-Inventor Count = 10.0

ph-index = 1


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Innovations by Xi Chen in Visual Intent Classification

Introduction

Xi Chen is an inventor based in Bellevue, WA (US) who has contributed to the field of visual search technology. His work focuses on enhancing the user experience of visual search through machine learning techniques.

Latest Patents

Xi Chen holds a patent for "Visual intent triggering for visual search." The patent describes mechanisms for performing visual intent classification and detection on images: a trained machine learning model classifies the subjects of an image according to a taxonomy, and this classification serves as a pre-triggering step that initiates further actions only when warranted, saving processing time. The patent also covers user scenarios, query formulation, and user-experience enhancements. To achieve its objectives, the model employs multiple feature detectors, multi-layer predictions, multilabel classifiers, and bounding box regression.
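The pre-triggering idea described above can be sketched in a few lines: classify an image's subjects against a taxonomy, then launch further (expensive) visual-search actions only for labels that clear a confidence threshold. This is a minimal illustrative sketch, not the patent's implementation; the taxonomy, the function names, the mock scores, and the threshold are all hypothetical assumptions, and a real system would use a trained multilabel model with feature detectors and bounding box regression rather than the stub below.

```python
# Hypothetical sketch of taxonomy-based pre-triggering for visual search.
# Names (TAXONOMY, classify_subjects, should_trigger_visual_search) and all
# values are illustrative assumptions, not taken from the patent itself.

TAXONOMY = ["person", "product", "landmark", "text", "animal"]


def classify_subjects(image_pixels):
    """Stand-in for a trained multilabel classifier over the taxonomy.

    A real model would run feature detectors and multi-layer predictions on
    the image and return a confidence score per taxonomy label (plus bounding
    boxes). Here we return fixed mock scores so the sketch stays runnable.
    """
    return {"product": 0.92, "text": 0.15, "person": 0.05}


def should_trigger_visual_search(scores, threshold=0.5):
    """Pre-trigger gate: only labels above the threshold initiate further
    actions, so no downstream processing runs when nothing actionable is
    detected."""
    return {label for label, score in scores.items() if score >= threshold}


if __name__ == "__main__":
    scores = classify_subjects(image_pixels=None)
    triggered = should_trigger_visual_search(scores)
    print(sorted(triggered))  # with the mock scores above: ['product']
```

The design point is that the cheap classification runs on every image, while the expensive visual-search pipeline runs only for triggered labels.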

Career Highlights

Xi Chen is currently associated with Microsoft Technology Licensing, LLC. His work at Microsoft has allowed him to explore and develop cutting-edge technologies that push the boundaries of visual search capabilities. His innovative approach has garnered attention in the tech community.

Collaborations

Xi Chen has collaborated with notable colleagues, including Houdong Hu and Li Huang. These collaborations have contributed to the advancement of their shared goals in technology and innovation.

Conclusion

Xi Chen's contributions to visual intent classification represent a significant advancement in the field of visual search technology. His innovative patent and work at Microsoft highlight his role as a leading inventor in this domain.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com