London, United Kingdom

Yazhe Li

USPTO Granted Patents = 4 

Average Co-Inventor Count = 4.2

ph-index = 2

Forward Citations = 11 (Granted Patents)


Company Filing History:


Years Active: 2022-2024

4 patents (USPTO)

Yazhe Li: Innovator in Latent Space Representations

Introduction

Yazhe Li is an inventor based in London, United Kingdom, known for his contributions to machine learning and audio processing. With four granted US patents to his name, he has developed methods and systems that extend the capabilities of neural networks.

Latest Patents

Yazhe Li's latest patents include "Learning observation representations by predicting the future in latent space" and "Speech coding using content latent embedding vectors and speaker latent embedding vectors." The first covers training an encoder neural network to map input observations to latent representations: given a sequence of observations, the method builds context latent representations that summarize the sequence so far and uses them to predict future observations in latent space. The second addresses generating discrete latent representations of input audio data, which can be transmitted efficiently from an encoder system to a decoder system.
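To make the first method concrete, here is a minimal sketch of predicting future observations in latent space, in the spirit of contrastive predictive approaches. It is written in PyTorch; the module names, dimensions, and the InfoNCE-style loss below are illustrative assumptions, not the patent's actual design.

```python
import torch
import torch.nn as nn

class LatentPredictor(nn.Module):
    def __init__(self, obs_dim=64, latent_dim=128, context_dim=256, horizon=3):
        super().__init__()
        # Encoder maps each observation to a latent representation z_t.
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, latent_dim), nn.ReLU(),
            nn.Linear(latent_dim, latent_dim))
        # Autoregressive network summarizes z_1..z_t into a context c_t.
        self.context = nn.GRU(latent_dim, context_dim, batch_first=True)
        # One head per future step k predicts z_{t+k} from c_t.
        self.heads = nn.ModuleList(
            [nn.Linear(context_dim, latent_dim) for _ in range(horizon)])

    def forward(self, obs):                        # obs: (batch, time, obs_dim)
        z = self.encoder(obs)                      # (batch, time, latent_dim)
        c, _ = self.context(z)                     # (batch, time, context_dim)
        return z, [head(c) for head in self.heads]

def future_prediction_loss(z, preds):
    """InfoNCE-style objective: each predicted future latent must score its
    true target above the other latents in the sequence (illustrative only)."""
    total = 0.0
    for k, pred in enumerate(preds, start=1):
        anchor = pred[:, :-k]                      # predictions made at time t
        target = z[:, k:]                          # true latents at time t + k
        logits = torch.einsum('btd,bsd->bts', anchor, target)
        labels = torch.arange(logits.size(1)).expand(logits.size(0), -1)
        total = total + nn.functional.cross_entropy(
            logits.flatten(0, 1), labels.flatten())
    return total / len(preds)

model = LatentPredictor()
obs = torch.randn(8, 20, 64)                       # toy observation sequences
z, preds = model(obs)
loss = future_prediction_loss(z, preds)
loss.backward()
```

Predicting in latent space rather than in observation space pushes the encoder to keep only the information useful for anticipating the future, which tends to yield compact, reusable representations.

The second patent's idea of transmitting discrete latent codes can be sketched along similar lines: a content encoder produces per-frame latents that are snapped to the nearest entries of a learned codebook, while a separate utterance-level speaker embedding travels alongside them. Again, every name and shape below is a hypothetical illustration, and training details such as straight-through gradient estimation are omitted.

```python
class DiscreteSpeechCoder(nn.Module):
    """Hypothetical sketch: speech coded as discrete content codes plus one
    speaker embedding vector per utterance."""
    def __init__(self, n_codes=512, latent_dim=128):
        super().__init__()
        self.content_enc = nn.GRU(1, latent_dim, batch_first=True)
        self.speaker_enc = nn.GRU(1, latent_dim, batch_first=True)
        self.codebook = nn.Embedding(n_codes, latent_dim)  # learned discrete codes
        self.decoder = nn.GRU(2 * latent_dim, 1, batch_first=True)

    def encode(self, audio):                       # audio: (batch, time, 1)
        content, _ = self.content_enc(audio)       # per-frame content latents
        speaker = self.speaker_enc(audio)[0].mean(dim=1)  # utterance-level vector
        # Snap each content latent to its nearest codebook entry; only the
        # integer indices and the one speaker vector need to be transmitted.
        dists = (content.unsqueeze(-2) - self.codebook.weight).pow(2).sum(-1)
        return dists.argmin(dim=-1), speaker       # (batch, time), (batch, latent_dim)

    def decode(self, codes, speaker):
        content = self.codebook(codes)             # look the quantized latents back up
        speaker = speaker.unsqueeze(1).expand(-1, content.size(1), -1)
        audio, _ = self.decoder(torch.cat([content, speaker], dim=-1))
        return audio                               # reconstructed frames

coder = DiscreteSpeechCoder()
frames = torch.randn(4, 100, 1)                    # toy audio frames
codes, spk = coder.encode(frames)
recon = coder.decode(codes, spk)
```

Because only integer code indices and a single speaker vector cross the channel, the representation is far cheaper to transmit than raw audio, which is what makes this style of coding attractive.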

Career Highlights

Yazhe Li is currently employed at DeepMind Technologies Limited, where he works on machine learning. His patented methods have contributed to advances in artificial intelligence and audio processing technologies.

Collaborations

Yazhe Li has collaborated with notable co-inventors, including Aaron Gerard Antonius Van Den Oord and Oriol Vinyals.

Conclusion

Yazhe Li's work in latent space representations and audio processing exemplifies the innovative spirit of modern inventors. His contributions continue to shape the future of technology and artificial intelligence.
