Burlington, MA, United States of America

Qinlan Shen

USPTO Granted Patents = 1 

Average Co-Inventor Count = 5.0

ph-index = 1


Company Filing History:


Years Active: 2025


Title: Innovations by Qinlan Shen in Machine Learning

Introduction

Qinlan Shen is an inventor based in Burlington, MA, United States. He has contributed to the field of machine learning, particularly data augmentation techniques intended to make the training of machine learning models more effective and efficient.

Latest Patents

Qinlan Shen holds one USPTO patent, "Guided augmentation of data sets for machine learning models." The patent discloses techniques for augmenting the data sets used to train machine learning models and to generate predictions. The described methods increase the number and diversity of examples in an initial training dataset by extracting subsets of words from existing sentences. They are also designed to conserve scarce sample data in few-shot settings by training a data generation model on general data obtained from a broader data source.
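The patent itself is not reproduced here, but the core idea of the first technique (growing a training set by extracting subsets of words from existing sentences) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function name, the choice of contiguous word spans, and all parameters are hypothetical and not taken from the patent.

```python
import random

def augment_by_word_subsets(sentence, num_variants=3, min_len=3, seed=0):
    """Illustrative augmentation: derive new training examples by
    extracting contiguous subsets (spans) of words from a sentence.

    This is a simplified stand-in for the general idea of word-subset
    extraction, not the patented method itself.
    """
    rng = random.Random(seed)  # fixed seed for reproducible variants
    words = sentence.split()
    variants = set()
    attempts = 0
    # Sample random spans until we have enough distinct variants.
    while len(variants) < num_variants and attempts < 100:
        attempts += 1
        if len(words) <= min_len:
            break  # sentence too short to extract a proper subset
        start = rng.randrange(0, len(words) - min_len + 1)
        length = rng.randrange(min_len, len(words) - start + 1)
        span = " ".join(words[start:start + length])
        if span != sentence:  # keep only proper subsets
            variants.add(span)
    return sorted(variants)

example = "machine learning models learn from augmented training data"
for variant in augment_by_word_subsets(example):
    print(variant)
```

Each variant is a shorter sentence fragment drawn from the original, so a small labeled set can be expanded without collecting new data; a real system would additionally need to check that each extracted subset still supports the original label.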

Career Highlights

Qinlan Shen is currently employed at Oracle International Corporation, where he continues to develop machine learning solutions. His work centers on data augmentation techniques, which are important for improving the performance of machine learning models trained on limited data.

Collaborations

His co-inventors include Ariel Gedaliah Kobren and Swetasudha Panda, collaborators at Oracle International Corporation.

Conclusion

Qinlan Shen's contributions to machine learning through his patent and work at Oracle International Corporation highlight his role as a significant inventor in the field. His techniques for data augmentation are paving the way for more robust machine learning applications.

This text is generated by artificial intelligence and may not be accurate.