Seattle, WA, United States of America

Hye Jin Jang


 

Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:


Years Active: 2024

1 patent (USPTO)

Hye Jin Jang: Innovator in Neural Network Technology

Introduction

Hye Jin Jang is an inventor based in Seattle, WA, known for her contributions to the field of neural networks. Her patented work applies attention-based neural network methods to improving performance on speech tasks.

Latest Patents

Hye Jin Jang holds a patent for "Orthogonally constrained multi-head attention for speech tasks." This patent describes a method for operating a neural network that involves receiving an input sequence at an encoder. The input sequence is encoded to produce a set of hidden representations. Attention-heads of the neural network calculate attention weights based on these hidden representations. A context vector is then calculated for each attention-head based on the attention weights and the hidden representations. Each of the context vectors corresponds to a portion of the input sequence, and an inference is output based on these context vectors.
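The structure described above (an encoder produces hidden representations, multiple attention-heads compute attention weights and per-head context vectors, and an inference is made from those context vectors) can be illustrated with a short code sketch. The following is a minimal, hypothetical rendering in PyTorch, not the patented method: the module names, dimensions, the softmax-based scoring, and the specific form of the orthogonality penalty on the per-head attention weights are all assumptions made for this example.

```python
# Minimal sketch (illustrative only, not the patented method): multi-head
# attention pooling over an encoder's hidden representations, with one
# context vector per attention-head and an orthogonality penalty that
# discourages heads from attending to the same portions of the input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttentionPooling(nn.Module):
    def __init__(self, hidden_dim: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        # One scoring projection per attention-head (assumed form).
        self.score = nn.Linear(hidden_dim, num_heads)

    def forward(self, hidden: torch.Tensor):
        # hidden: (batch, seq_len, hidden_dim) -- encoder's hidden representations
        scores = self.score(hidden)                # (batch, seq_len, num_heads)
        weights = F.softmax(scores, dim=1)         # attention weights over the sequence, per head
        # Context vector per head: attention-weighted sum over the sequence.
        contexts = torch.einsum("bsh,bsd->bhd", weights, hidden)  # (batch, num_heads, hidden_dim)
        return contexts, weights

def orthogonality_penalty(weights: torch.Tensor) -> torch.Tensor:
    # Encourage different heads to attend to different parts of the sequence:
    # penalize deviation of the heads' weight Gram matrix from the identity.
    gram = torch.einsum("bsh,bsk->bhk", weights, weights)  # (batch, num_heads, num_heads)
    eye = torch.eye(gram.size(-1), device=gram.device)
    return ((gram - eye) ** 2).mean()

# Usage: pool an encoded input sequence with the attention-heads, then feed the
# concatenated context vectors to a classifier that outputs the inference.
batch, seq_len, hidden_dim, num_heads, num_classes = 2, 50, 256, 4, 10
hidden = torch.randn(batch, seq_len, hidden_dim)   # stand-in for encoder output
pool = MultiHeadAttentionPooling(hidden_dim, num_heads)
classifier = nn.Linear(num_heads * hidden_dim, num_classes)

contexts, weights = pool(hidden)
logits = classifier(contexts.flatten(start_dim=1))  # inference from the context vectors
loss = F.cross_entropy(logits, torch.randint(0, num_classes, (batch,))) \
       + 0.1 * orthogonality_penalty(weights)
```

In this sketch each head attends over the whole sequence and yields its own context vector, while the penalty term pushes the heads toward covering different portions of the input; the penalty weight (0.1 here) and the exact constraint are placeholder choices.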

Career Highlights

Hye Jin Jang is currently employed at Qualcomm Incorporated, where she applies her expertise in neural networks to develop cutting-edge technologies. Her work has been instrumental in advancing the capabilities of speech recognition systems.

Collaborations

Hye Jin Jang collaborates with colleagues including Mingu Lee and Jinkyu Lee, who contribute to her projects and research.

Conclusion

Hye Jin Jang is a trailblazer in the field of neural networks, with a patent showcasing her approach to improving neural network performance on speech tasks. Her work at Qualcomm Incorporated and her collaborations with esteemed colleagues further solidify her impact in the technology sector.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com