Berkeley, CA, United States of America

Qiancheng Wu


Average Co-Inventor Count = 5.0

ph-index = 1

Forward Citations = 3 (Granted Patents)


Company Filing History:


Years Active: 2024

1 patent (USPTO)

Qiancheng Wu: Innovator in Text-Based Image Generation

Introduction

Qiancheng Wu is an inventor based in Berkeley, California, whose work sits at the intersection of artificial intelligence and visual content creation. His patented contributions center on methods and devices for generating images from text.

Latest Patents

Qiancheng Wu holds one patent, "Method and device for text-based image generation." The patent describes a process that begins by obtaining a text description of the image to be generated. The method extracts a text feature vector using a text encoder, determines a semantic mask that serves as a spatial constraint for the image, and automatically generates the image with a generative adversarial network (GAN) model conditioned on both the semantic mask and the text feature vector.
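The three stages described in the patent can be sketched as a data-flow pipeline. The code below is a minimal illustration, not the patented implementation: every function name and rule here is a hypothetical stand-in (a real system would use a trained text encoder and a trained GAN generator), chosen only to make the encode → mask → generate flow concrete.

```python
# Hypothetical sketch of a text-to-image pipeline with three stages:
# (1) encode the text into a feature vector, (2) derive a semantic mask
# as a spatial constraint, (3) condition a generator on both.
# All logic below is a toy stand-in for trained models.

def encode_text(text: str, dim: int = 8) -> list:
    """Stand-in text encoder: maps a description to a fixed-size vector."""
    vec = [0.0] * dim
    for i, byte in enumerate(text.encode("utf-8")):
        vec[i % dim] += byte / 255.0
    return vec

def semantic_mask(text: str, size: int = 4) -> list:
    """Stand-in semantic mask: marks cells where content should appear."""
    # Toy spatial rule: "sky" content occupies the top half of the
    # image grid, anything else the bottom half.
    top = "sky" in text.lower()
    rows = range(size // 2) if top else range(size // 2, size)
    mask = [[0] * size for _ in range(size)]
    for r in rows:
        for c in range(size):
            mask[r][c] = 1
    return mask

def generate_image(mask: list, feature: list) -> list:
    """Stand-in generator: fills masked cells using the text feature."""
    mean = sum(feature) / len(feature)
    return [[mean if cell else 0.0 for cell in row] for row in mask]

description = "a bird in the sky"
image = generate_image(semantic_mask(description), encode_text(description))
```

In the patented method, the mask constrains *where* content may appear while the feature vector controls *what* is drawn there; the toy generator above mimics that split by writing the text-derived value only into masked cells.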

Career Highlights

Qiancheng Wu is currently employed at Ping An Technology (Shenzhen) Co., Ltd., where he develops image-generation technology. His work has drawn attention for its potential applications in industries such as entertainment and marketing.

Collaborations

Qiancheng Wu has collaborated with co-inventors including Yuchuan Gou and Minghao Li. These collaborations contributed to the development of his patented technology.

Conclusion

Qiancheng Wu is a notable inventor whose work in text-based image generation showcases the potential of artificial intelligence in creative fields. His contributions continue to influence the way images are generated and utilized in various applications.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com