Qingchang Hao
Beijing, China

Average Co-Inventor Count: 5.2
ph-index: 1


Years Active: 2019-2022
Patents (USPTO): 3

Qingchang Hao: Innovator in Acoustic Model Training and Synchronization Technologies

Introduction

Qingchang Hao is an inventor based in Beijing, China, whose work centers on acoustic model training and audio-visual synchronization. He holds 3 USPTO patents in these areas.

Latest Patents

Hao's latest patents include a "Method and device for training an acoustic model." This invention provides a method for determining multiple tasks for training an acoustic model, optimizing resource usage across various nodes, and enhancing training efficiency through parallel processing. Another notable patent is the "Method and apparatus for synchronously playing image and audio." This invention focuses on acquiring play service requests and synchronizing audio and image data to ensure a seamless playback experience.

Career Highlights

Qingchang Hao is currently employed at Baidu Online Network Technology (Beijing) Co., Ltd., where he develops technologies in acoustic modeling and audio-visual synchronization with practical applications.

Collaborations

Hao collaborates with co-inventors such as Yunfeng Li and Yutao Gai, partnerships that are reflected across his patent filings.

Conclusion

Qingchang Hao's work in acoustic model training and audio-visual synchronization, documented in his patents and collaborations, continues to contribute to these fields.
