Pittsburgh, PA, United States of America

Shreyan Bakshi

USPTO Granted Patents = 1 

Average Co-Inventor Count = 9.0

ph-index = 1


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Shreyan Bakshi: Innovator in Voice-Based Assistant Technologies

Introduction

Shreyan Bakshi is an inventor based in Pittsburgh, PA (US), working on voice-based technologies for assistant systems. He holds a granted USPTO patent aimed at making user interaction with voice assistants faster and more intuitive.

Latest Patents

Shreyan Bakshi holds a patent for "Voice-based auto-completions and auto-responses for assistant systems." The described method receives a first input from a user in a voice modality, analyzes it to generate candidate hypotheses (possible completions or responses), and then determines how to present the resulting output to the user. The effect is a more intuitive and efficient interaction with assistant systems, since the user is offered suggested auto-completions based on a partial voice input.
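To make that three-stage flow concrete, the sketch below mocks it in plain Python: transcribe a partial voice input, generate scored candidate hypotheses, and decide which suggestions to surface. It is a hypothetical illustration only; every function name, threshold, and candidate string is invented for this sketch and does not reflect the patent's actual claims or Meta's implementation.

```python
from dataclasses import dataclass

# Hypothetical, simplified illustration of a voice auto-completion pipeline.
# The names (transcribe, generate_hypotheses, present) and all values here are
# invented for this sketch; they are not the patented method or any Meta API.

@dataclass
class Hypothesis:
    text: str     # candidate completion of the partial utterance
    score: float  # confidence assigned by the (stubbed) generator

def transcribe(audio_chunk: bytes) -> str:
    """Stub ASR step: a real system would call a speech recognizer here."""
    return "remind me to call"  # pretend partial transcript

def generate_hypotheses(partial_text: str) -> list[Hypothesis]:
    """Stub candidate generation: a real system would query a language model."""
    candidates = [
        Hypothesis(partial_text + " mom tomorrow at 9am", 0.82),
        Hypothesis(partial_text + " the dentist", 0.61),
        Hypothesis(partial_text + " back after the meeting", 0.44),
    ]
    return sorted(candidates, key=lambda h: h.score, reverse=True)

def present(hypotheses: list[Hypothesis], max_suggestions: int = 2) -> list[str]:
    """Decide how much output to surface, e.g. via a confidence threshold."""
    return [h.text for h in hypotheses[:max_suggestions] if h.score > 0.5]

if __name__ == "__main__":
    partial = transcribe(b"\x00\x01")          # 1. receive voice-modality input
    hypotheses = generate_hypotheses(partial)  # 2. generate candidate hypotheses
    for suggestion in present(hypotheses):     # 3. choose how to present output
        print("Suggested completion:", suggestion)
```

Running the sketch prints the two highest-confidence completions; in a production assistant the thresholding and ranking would of course be far more involved.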

Career Highlights

Shreyan Bakshi is currently employed at Meta Platforms, Inc., where he continues to work on voice technology for assistant systems, collaborating with other engineers and researchers on projects in this area.

Collaborations

Among his coworkers is Fadi Botros, with whom he collaborates on projects at Meta.

Conclusion

Shreyan Bakshi's work in voice-based technologies exemplifies the impact of innovation in enhancing user experiences with assistant systems. His patent and contributions at Meta Platforms, Inc. highlight his role as a key player in the tech industry.
