Austin, TX, United States of America

Jay Ha Whang

USPTO Granted Patents = 4 

Average Co-Inventor Count = 5.7

ph-index = 2

Forward Citations = 8 (Granted Patents)


Company Filing History (chart): Years Active 2024-2025, 4 patents (USPTO)

Jay Ha Whang: Innovator in Generative Neural Networks

Introduction

Jay Ha Whang is an inventor based in Austin, TX (US). He holds 4 USPTO-granted patents in the field of generative neural networks, focused on methods for generating videos and images from text prompts.

Latest Patents

Whang's latest patents include "Generating videos using sequences of generative neural networks" and "Generating images using sequences of generative neural networks." The video-generation patent describes a method that receives a text prompt, processes it with a text encoder neural network, and passes the result through a sequence of generative neural networks to produce a final video depicting the described scene. The image-generation patent outlines a similar process: an input text prompt is converted into contextual embeddings, which are then processed by a sequence of generative neural networks to produce a final output image representing the scene described by the prompt.
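To make the described pipeline concrete, the sketch below shows the overall structure: a text encoder produces an embedding, a first generative stage produces a low-resolution output, and later stages in the sequence refine and upsample it. This is only a minimal structural illustration; the function names (encode_text, base_generator, upsample_stage, generate_video) and the random/upsampling placeholders are hypothetical stand-ins, not the actual patented models.

```python
import numpy as np

# Hypothetical placeholders for the components named in the patent abstracts:
# a text encoder producing contextual embeddings, followed by a sequence of
# generative neural networks, each refining the previous stage's output.

def encode_text(prompt: str, dim: int = 64) -> np.ndarray:
    """Placeholder text encoder: maps a prompt to a fixed-size embedding."""
    rng = np.random.default_rng(abs(hash(prompt)) % (2**32))
    return rng.standard_normal(dim)

def base_generator(embedding: np.ndarray, frames: int = 4, size: int = 16) -> np.ndarray:
    """First generative stage: produces a low-resolution 'video' (frames, H, W, 3)
    conditioned on the text embedding. Random data stands in for a real model."""
    rng = np.random.default_rng(int(abs(embedding.sum()) * 1e6) % (2**32))
    return rng.standard_normal((frames, size, size, 3))

def upsample_stage(video: np.ndarray, embedding: np.ndarray, scale: int = 2) -> np.ndarray:
    """Later generative stage: conditions on the previous output (and the text
    embedding) to produce a higher-resolution video. Here: nearest-neighbour
    upsampling plus a small embedding-dependent perturbation."""
    up = video.repeat(scale, axis=1).repeat(scale, axis=2)
    return up + 0.01 * embedding.mean()

def generate_video(prompt: str, num_stages: int = 2) -> np.ndarray:
    """Run the cascaded pipeline: text -> embedding -> sequence of generators."""
    embedding = encode_text(prompt)
    video = base_generator(embedding)
    for _ in range(num_stages):
        video = upsample_stage(video, embedding)
    return video

if __name__ == "__main__":
    out = generate_video("a corgi surfing at sunset")
    print(out.shape)  # (4, 64, 64, 3) after two 2x upsampling stages
```

The image-generation case follows the same pattern with a single frame instead of a video tensor: the text embedding conditions every stage in the sequence, while each stage only has to add detail at its own resolution.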

Career Highlights

Jay Ha Whang is currently employed at Google, where his work applies neural networks to creative applications such as text-conditioned image and video generation.

Collaborations

Whang's co-inventors on these patents include Jonathan Ho and William Chan.

Conclusion

Jay Ha Whang's patented work centers on generative neural networks for video and image synthesis from text prompts, an area with growing influence on how digital content is created.
