Mountain View, CA, United States of America

Junwen Bai

USPTO Granted Patents = 1 


Average Co-Inventor Count = 7.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Innovations by Junwen Bai in Multilingual Automatic Speech Recognition

Introduction

Junwen Bai is an inventor based in Mountain View, CA, who works on automatic speech recognition (ASR). His work focuses on improving multilingual ASR systems, which are central to making speech technology work across different languages.

Latest Patents

Junwen Bai holds a patent for a method titled "Joint unsupervised and supervised training for multilingual ASR." The method trains a multilingual ASR model on both unsupervised and supervised losses, and proceeds as follows (a code sketch of the loss combination appears after the list):

1. Receive audio features for an utterance and generate a latent speech representation from them.
2. Generate a target quantized vector token and a target token index for the corresponding latent speech representation.
3. Generate a contrastive context vector for the corresponding unmasked or masked latent speech representation, and derive a contrastive self-supervised loss from the contrastive context vector and the target quantized vector token.
4. Generate a high-level context vector from the contrastive context vector and, at each time step, learn to predict the target token index using a cross-entropy loss.
5. Predict speech recognition hypotheses for the utterance and train the multilingual ASR model using both the unsupervised and supervised losses.
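The loss combination in steps 3-5 can be illustrated with a short PyTorch sketch. This is a minimal illustration of the general technique based only on the description above, not the patented implementation; the function names, tensor shapes, negative-sampling scheme, and loss weights here are all assumptions.

```python
# Minimal sketch of a joint unsupervised + supervised loss, assuming a
# wav2vec 2.0-style contrastive objective and an MLM-style cross-entropy
# objective. All names and shapes are hypothetical, not from the patent.
import torch
import torch.nn.functional as F


def contrastive_loss(context, targets, negatives, temperature=0.1):
    """Contrastive self-supervised loss: each contrastive context vector
    must identify its own target quantized vector among K distractors.

    context:   (T, D) contrastive context vectors at masked positions
    targets:   (T, D) target quantized vectors for the same positions
    negatives: (T, K, D) K distractor targets sampled per position
    """
    # Place the true target in front of the K negatives: (T, K+1, D)
    candidates = torch.cat([targets.unsqueeze(1), negatives], dim=1)
    # Cosine similarity of each context vector with its candidates: (T, K+1)
    logits = F.cosine_similarity(context.unsqueeze(1), candidates, dim=-1)
    logits = logits / temperature
    # The correct candidate is always at index 0
    labels = torch.zeros(context.size(0), dtype=torch.long,
                         device=context.device)
    return F.cross_entropy(logits, labels)


def joint_loss(contrastive_ctx, quantized_targets, negatives,
               high_level_logits, target_token_idx,
               supervised_loss, alpha=1.0, beta=1.0, gamma=1.0):
    """Weighted sum of the three losses named in the patent abstract:
    contrastive self-supervised loss, cross-entropy token-index
    prediction, and a supervised ASR loss. Weights are assumptions.
    """
    l_contrastive = contrastive_loss(contrastive_ctx, quantized_targets,
                                     negatives)
    # Cross-entropy: predict the target token index at each time step
    # from logits computed off the high-level context vectors.
    l_token = F.cross_entropy(high_level_logits, target_token_idx)
    return alpha * l_contrastive + beta * l_token + gamma * supervised_loss


# Example with random tensors (shapes are purely illustrative):
T, D, K, V = 8, 16, 4, 32
loss = joint_loss(
    contrastive_ctx=torch.randn(T, D),
    quantized_targets=torch.randn(T, D),
    negatives=torch.randn(T, K, D),
    high_level_logits=torch.randn(T, V),
    target_token_idx=torch.randint(0, V, (T,)),
    supervised_loss=torch.tensor(1.0),
)
```

The point of the weighted sum is that the contrastive and token-prediction objectives need no transcripts and can be computed on unlabeled audio, while the supervised loss (for example, an RNN-T or CTC loss on the predicted hypotheses) applies only where transcripts exist, so a single multilingual model can learn from both kinds of data.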

Career Highlights

Junwen Bai is currently employed at Google Inc., where he continues to push the boundaries of speech recognition technology. His work is instrumental in developing systems that can understand and process multiple languages, making communication more accessible across them.
