Dong Xing
Hangzhou, China

Average Co-Inventor Count = 8.5
ph-index = 1


Company Filing History:
Years Active: 2022-2025
Patents (USPTO): 2

Innovations of Dong Xing

Introduction

Dong Xing is an inventor based in Hangzhou, China, who has made significant contributions to the fields of object recognition and neural signal prediction. His two USPTO patents sit at the intersection of computing technology and neuroscience.

Latest Patents

Dong Xing's latest patents are "Object recognition method and apparatus" and "Method for prediction of cortical spiking trains." In the object recognition method, a device obtains address-event representation (AER) data of the object to be recognized; the data comprise multiple AER events, each carrying a timestamp and address information. The device extracts feature maps from the AER data that encode both spatial and temporal information, enabling accurate recognition of the object. The spiking-train prediction method incorporates natural features of spiking neural signals into the optimization of prediction models, improving their ability to predict neural spike trains.
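The patent summary above does not disclose implementation details, but the idea of a feature map that keeps both spatial and temporal information from AER events can be illustrated with an event-count channel plus an exponentially decayed time surface, a common construction in event-based vision. The function name, field layout, and decay constant below are illustrative assumptions, not the patented method.

```python
import numpy as np

def aer_to_feature_map(timestamps, xs, ys, height, width, tau=50e-3):
    """Build a 2-channel feature map from an AER event stream.

    Channel 0 (spatial): normalized per-pixel event counts.
    Channel 1 (temporal): exponentially decayed "time surface" built from
    the most recent event at each pixel (decay constant tau, in seconds).
    Field names and the decay model are illustrative assumptions.
    """
    counts = np.zeros((height, width), dtype=np.float32)
    last_t = np.full((height, width), -np.inf, dtype=np.float32)

    for t, x, y in zip(timestamps, xs, ys):
        counts[y, x] += 1.0
        last_t[y, x] = max(last_t[y, x], t)

    t_end = timestamps[-1]
    # Pixels that never fired decay to exactly zero.
    time_surface = np.where(
        np.isfinite(last_t),
        np.exp(-(t_end - last_t) / tau),
        0.0,
    ).astype(np.float32)

    return np.stack([counts / max(counts.max(), 1.0), time_surface])

# Toy event stream: each AER event is (timestamp, x address, y address).
ts = np.array([0.001, 0.002, 0.010, 0.020], dtype=np.float32)
xs = np.array([3, 3, 7, 7])
ys = np.array([5, 5, 2, 2])
fmap = aer_to_feature_map(ts, xs, ys, height=16, width=16)
print(fmap.shape)  # (2, 16, 16)
```

The resulting two-channel map could then be fed to any conventional classifier, which is one plausible reading of how such feature maps support recognition.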
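Similarly, the abstract only states that natural features of spiking neural signals are incorporated into model optimization. One standard way to encode such a feature, refractoriness via a spike-history filter in a Poisson generalized linear model, is sketched below; the model form, toy data, and parameters are assumptions for illustration and are not drawn from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: predict a binned spike train from a stimulus covariate plus a
# spike-history term. The history filter stands in for one "natural feature"
# of spiking signals (refractoriness); the whole model is a generic Poisson
# GLM sketch, not the method described in the patent.
T, H = 2000, 5                                     # time bins, history length
stim = rng.normal(size=T)                          # hypothetical stimulus
spikes = (rng.random(T) < 0.1).astype(np.float64)  # placeholder spike train

def history_matrix(s, H):
    """Column h holds the spike that occurred h+1 bins in the past."""
    X = np.zeros((len(s), H))
    for h in range(H):
        X[h + 1:, h] = s[:len(s) - h - 1]
    return X

Xh = history_matrix(spikes, H)
w_stim, w_hist, b = 0.0, np.zeros(H), -2.0
lr = 0.05

for step in range(500):
    # Conditional intensity under the Poisson assumption.
    eta = b + w_stim * stim + Xh @ w_hist
    lam = np.exp(np.clip(eta, -10.0, 10.0))
    # Gradient ascent on the Poisson log-likelihood sum(y * eta - lam).
    err = spikes - lam
    b += lr * err.mean()
    w_stim += lr * (err * stim).mean()
    w_hist += lr * (Xh * err[:, None]).mean(axis=0)

# On real recordings a negative history filter captures refractoriness;
# with this random toy data the fitted filter is simply near zero.
print("history filter:", np.round(w_hist, 3))
```

In this framing, "optimizing the prediction model" amounts to maximizing the spike-train likelihood while the history term biases predictions toward physiologically plausible spike timing.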

Career Highlights

Throughout his career, Dong Xing has worked with prominent organizations such as Huawei Technologies Co., Ltd. and Zhejiang University, giving his work a footing in both industrial and academic research.

Collaborations

Dong Xing has collaborated with colleagues including Gang Pan and Qianhui Liu on the work behind these patents.

Conclusion

Dong Xing's innovative work in object recognition and neural signal prediction highlights his significant contributions to technology and neuroscience. His patents reflect a commitment to advancing these fields and improving our understanding of complex systems.
