Jiapei Huang
Seattle, WA, United States of America

USPTO Granted Patents: 1
Average Co-Inventor Count: 10.0
ph-index: 1


Company Filing History: Microsoft Technology Licensing, LLC

Years Active: 2025


The Innovative Contributions of Jiapei Huang

Introduction

Jiapei Huang is an inventor based in Seattle, WA (US), working in visual search technology. His patented work applies machine learning techniques to improve the user experience of image-based search.

Latest Patents

Jiapei Huang holds a patent for "Visual intent triggering for visual search," which describes mechanisms for visual intent classification and detection on images. A trained machine learning model classifies the subjects of an image according to a taxonomy, and this classification acts as a pre-triggering mechanism: further actions, such as a full visual search, are initiated only when the classification warrants them, saving processing time otherwise. The patent also covers user scenarios, query formulation, and user-experience enhancements. The model employs multiple feature detectors, multi-layer predictions, multilabel classifiers, and bounding box regression.
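To make the pre-triggering idea concrete, here is a minimal Python sketch of the gating pattern described above. It is not the patented implementation: the taxonomy, class names, threshold value, and stub classifier are all assumptions introduced for illustration.

```python
# Hypothetical sketch of a visual-intent pre-trigger, not the patented
# implementation: the taxonomy, threshold, and classifier below are
# invented for illustration. The idea follows the description above:
# a multilabel classifier scores an image against a taxonomy of intent
# classes, and the expensive visual-search pipeline runs only when at
# least one actionable class clears a confidence threshold.

from dataclasses import dataclass

# Assumed taxonomy: classes considered worth triggering a search for.
ACTIONABLE_CLASSES = {"product", "landmark", "plant", "barcode"}
TRIGGER_THRESHOLD = 0.6  # assumed confidence cutoff, tuned in practice

@dataclass
class IntentPrediction:
    label: str         # taxonomy class, e.g. "product"
    confidence: float  # multilabel score in [0, 1]

def classify_intents(image_bytes: bytes) -> list[IntentPrediction]:
    """Stand-in for the trained multilabel classifier.

    A real system would run feature detectors and multi-layer
    predictions here; this stub returns fixed scores so that the
    gating logic below is runnable.
    """
    return [
        IntentPrediction("product", 0.82),
        IntentPrediction("text", 0.34),
    ]

def should_trigger_visual_search(preds: list[IntentPrediction]) -> bool:
    """Pre-trigger gate: a cheap check before any heavy processing."""
    return any(
        p.label in ACTIONABLE_CLASSES and p.confidence >= TRIGGER_THRESHOLD
        for p in preds
    )

def handle_image(image_bytes: bytes) -> str:
    preds = classify_intents(image_bytes)
    if should_trigger_visual_search(preds):
        # Only now pay for the full visual-search pipeline
        # (detection, bounding-box regression, retrieval, ...).
        return "visual search triggered"
    return "no actionable intent; skipped visual search"

if __name__ == "__main__":
    print(handle_image(b"..."))  # prints: visual search triggered
```

The point of the design is cost: the classifier is cheap relative to the full detection-and-retrieval pipeline, so gating on its output saves processing time whenever no actionable intent is present, as the patent describes.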

Career Highlights

Jiapei Huang is currently associated with Microsoft Technology Licensing, LLC, the assignee of his patent. His work at Microsoft has focused on visual search technology and its machine learning applications.

Collaborations

Jiapei Huang has collaborated with colleagues including Xi Chen and Houdong Hu on his patented work in visual search.

Conclusion

Jiapei Huang's work sits at the intersection of machine learning and user experience. His patent on visual intent triggering reflects that focus: intent is classified up front so that a visual search runs only when it is likely to be useful.
