Tokyo, Japan

Heiga Zen


Average Co-Inventor Count = 8.0

ph-index = 1

Forward Citations = 4 (Granted Patents)


Location History:

  • Epsom, GB (2020)
  • Tokyo, JP (2022)

Company Filing History:


Years Active: 2020-2022

2 patents (USPTO)

Heiga Zen: Innovator in Adaptive Audio-Generation Technology

Introduction

Heiga Zen is a prominent inventor based in Tokyo, Japan. He has made significant contributions to audio technology, particularly in the development of adaptive text-to-speech systems. He holds 2 granted patents, and his work focuses on improving the efficiency and adaptability of audio-generation models.

Latest Patents

Zen's latest patents cover methods, systems, and apparatus for generating an adaptive audio-generation model. They describe learning a set of embedding vectors, together with the parameters of a neural network, from training data comprising text and audio from multiple individual speakers. The resulting model can then be tailored to a new individual speaker by learning a new embedding vector that represents that speaker's unique voice characteristics.
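
To make the adaptation idea concrete, the snippet below is a minimal, hypothetical sketch in plain Python/NumPy, not taken from the patents: the names, shapes, toy loss, and learning rate are all assumptions for illustration. It shows per-speaker embedding vectors kept alongside shared network parameters, with a new speaker added later by optimizing only a fresh embedding vector while the shared parameters stay frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 16   # size of each speaker embedding vector (assumed)
FEAT_DIM = 32  # size of the acoustic feature the toy "network" predicts (assumed)

# Shared parameters, learned once from the multi-speaker training data.
W_shared = rng.normal(scale=0.1, size=(EMB_DIM, FEAT_DIM))

# One embedding vector per training speaker, learned jointly with W_shared.
speaker_embeddings = {
    "speaker_a": rng.normal(size=EMB_DIM),
    "speaker_b": rng.normal(size=EMB_DIM),
}

def predict(embedding):
    """Toy stand-in for the audio-generation network: embedding -> features."""
    return embedding @ W_shared

def adapt_new_speaker(target_features, steps=200, lr=0.1):
    """Fit ONLY a new embedding vector to a new speaker's data.

    W_shared stays frozen, mirroring the idea of tailoring the model to a new
    individual speaker by learning an embedding for their voice characteristics.
    """
    emb = np.zeros(EMB_DIM)
    for _ in range(steps):
        error = predict(emb) - target_features  # prediction error on the new data
        grad = W_shared @ error                 # gradient w.r.t. the embedding only
        emb -= lr * grad
    return emb

# Pretend we extracted features from a few recordings of a new speaker.
new_speaker_features = rng.normal(size=FEAT_DIM)
speaker_embeddings["new_speaker"] = adapt_new_speaker(new_speaker_features)

print(np.round(speaker_embeddings["new_speaker"][:4], 3))
```

In a real system the shared parameters would belong to a neural text-to-speech model and the new embedding would be fit to a small amount of the new speaker's recorded speech; the sketch only mirrors the overall structure of that workflow.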

Career Highlights

Heiga Zen is currently associated with DeepMind Technologies Limited, where he continues to push the boundaries of audio technology. His work has garnered attention for its potential applications in various fields, including artificial intelligence and human-computer interaction.

Collaborations

Zen collaborates with notable colleagues such as Yutian Chen and Scott Ellison Reed, contributing to a dynamic research environment that fosters innovation and creativity.

Conclusion

Heiga Zen's contributions to adaptive audio-generation technology exemplify the impact of innovative thinking in the field of audio processing. His patents reflect a commitment to advancing technology that enhances user experience and accessibility.
