Fairfax Station, VA, United States of America

Xiaowen Zhang


Average Co-Inventor Count = 3.0

ph-index = 1


Company Filing History:


Years Active: 2024

1 patent (USPTO)

Title: Innovations of Xiaowen Zhang in Natural Language Processing

Introduction

Xiaowen Zhang is an inventor based in Fairfax Station, VA (US). He has contributed to the field of natural language processing through a patent focused on machine learning methods for understanding and classifying text documents.

Latest Patents

Xiaowen Zhang holds a patent for "Systems and methods for natural language processing." The patent describes a method that receives a corpus of unlabeled text documents and generates an initial positive/negative classification for each document. Subsets of the documents are then defined, and multiple machine learning models are applied in succession to refine the labels, ultimately producing a more accurate fourth (final) classification.
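The patent text is not reproduced here, but the described flow resembles iterative pseudo-labeling: start from an initial labeling, train a model on the confidently labeled subset, relabel, and repeat. The sketch below is an illustrative assumption, not the patented method; the library (scikit-learn), the keyword-based seed labeler, the function names, and the confidence threshold are all stand-ins.

# Minimal sketch of iterative label refinement over an unlabeled corpus.
# The seed heuristic, thresholds, and round count are illustrative
# assumptions and do not come from the patent.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def refine_classifications(docs, seed_label_fn, rounds=3, confidence=0.8):
    """Start from heuristic labels, then let successive models relabel
    the corpus, training each round on the current labels and keeping
    only confident predictions."""
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(docs)

    # Initial (first) classification from a simple heuristic, e.g. keywords.
    labels = np.array([seed_label_fn(d) for d in docs])

    for _ in range(rounds):
        model = LogisticRegression(max_iter=1000)
        model.fit(X, labels)                      # train on current labels
        proba = model.predict_proba(X)[:, 1]      # P(positive) per document
        confident = (proba >= confidence) | (proba <= 1 - confidence)
        # Keep confident predictions; leave the rest unchanged.
        labels = np.where(confident, (proba >= 0.5).astype(int), labels)
    return labels

# Toy usage: a keyword heuristic supplies the seed labels.
docs = ["great product, works well", "terrible support, very slow",
        "happy with the purchase", "slow and disappointing"]
seed = lambda d: 1 if ("great" in d or "happy" in d) else 0
print(refine_classifications(docs, seed, rounds=2))

Each pass corresponds loosely to one of the successive classifications in the patent's claim language (first, second, third, fourth), with later models trained on the output of earlier ones.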

Career Highlights

Zhang is currently employed at Capital One Services, LLC, where he applies machine learning and natural language processing to problems in the financial sector, including improving customer experience and operational efficiency.

Collaborations

Xiaowen Zhang collaborates with co-inventors including Joseph Ford III and Cody Stancil. These collaborations support the exchange of ideas behind his patented work.

Conclusion

Xiaowen Zhang's contributions to natural language processing exemplify the impact of innovative thinking in technology. His patent and work at Capital One highlight the importance of machine learning in transforming how we interact with text data.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com