Beijing, China

Zeng-Guang Hou


Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 5 (Granted Patents)


Company Filing History:


Years Active: 2020

1 patent (USPTO)

Title: Zeng-Guang Hou: Innovator in Spatio/Spectro-Temporal Data Prediction

Introduction

Zeng-Guang Hou is an inventor based in Beijing, China. His patented work centers on methods and systems that use spatio/spectro-temporal data for the early classification and prediction of outcomes in a range of applications.

Latest Patents

Zeng-Guang Hou holds a patent for a "Method and system for predicting outcomes based on spatio/spectro-temporal data." The invention uses temporal or spatio/spectro-temporal data (SSTD) to classify outcomes arising from spatio-temporal patterns. Its classification models are built on spiking neural networks (SNN), which are well suited to learning and classifying SSTD. The approach enables early prediction of events across domains such as engineering, bioinformatics, neuroinformatics, ecology, medicine, and economics.
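To illustrate why spiking neurons fit temporal data, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic building block of SNNs. This is a generic textbook model, not the patented method: the neuron integrates its input over time and fires as soon as enough evidence accumulates, which is what makes early classification of a time series possible.

```python
# Minimal LIF neuron sketch (illustrative only, not the patented system).
# The membrane potential leaks each step and accumulates the input;
# crossing the threshold emits a spike and resets the potential.

def lif_spikes(input_current, threshold=1.0, leak=0.9):
    """Run an LIF neuron over a time series; return the spike times."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        potential = leak * potential + current  # leaky integration
        if potential >= threshold:              # threshold crossing
            spikes.append(t)
            potential = 0.0                     # reset after a spike
    return spikes

# A strong early burst fires the neuron near the start of the series,
# so an outcome could be flagged before the full series is observed.
early_burst = [0.6, 0.6, 0.0, 0.0, 0.0, 0.0]
late_burst  = [0.0, 0.0, 0.0, 0.0, 0.6, 0.6]
print(lif_spikes(early_burst))  # spikes near the start
print(lif_spikes(late_burst))   # spikes only near the end
```

In an SNN classifier, populations of such neurons are trained so that different spatio-temporal input patterns drive different output neurons to spike first; the timing of the first spikes then serves as the early classification signal.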

Career Highlights

Zeng-Guang Hou is associated with Aut Ventures Limited, where he applies his expertise in data prediction. His work has drawn attention in the research community and has implications for several industries.

Collaborations

Zeng-Guang Hou has collaborated with notable researchers in his field, including Nikola Kirilov Kasabov and Valery Feigin. These collaborations have enriched his research and development efforts.

Conclusion

Zeng-Guang Hou's contributions to the field of spatio/spectro-temporal data prediction highlight his innovative spirit and dedication to advancing technology. His work continues to influence various sectors, paving the way for future advancements in data classification and prediction.
