Hanwen Gao
Beijing, China

Inventor statistics:
Average co-inventor count: 11.0
ph-index: 1
Years active: 2022
Patents: 1 (USPTO)

Innovations of Hanwen Gao in Vision-LiDAR Fusion Technology

Introduction

Hanwen Gao is an inventor based in Beijing, China, working in vision and LiDAR technology. His work focuses on improving object detection systems through multimodal data fusion.

Latest Patents

Hanwen Gao holds a patent for a "Vision-LiDAR fusion method and system based on deep canonical correlation analysis." The method synchronously collects RGB images and point cloud data of a road surface, extracts features from the RGB images to obtain RGB features, and performs coordinate-system conversion and rasterization on the point cloud data. The fused point cloud features are then fed into an object detection network to perform object detection. This is his only patent to date.
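The pipeline stages named in the patent (synchronous capture, RGB feature extraction, point cloud rasterization, fusion, detection) can be sketched roughly as below. This is a minimal illustration, not the patented method: the histogram feature extractor, the bird's-eye-view rasterizer, and all function names are simplifying assumptions; a real system would use learned backbones and a trained detector.

```python
import numpy as np

def rasterize_point_cloud(points, grid_size=(32, 32),
                          x_range=(0.0, 40.0), y_range=(-20.0, 20.0)):
    """Rasterize an (N, 3) point cloud into a bird's-eye-view grid;
    each cell keeps the maximum height (z) of the points falling in it."""
    h, w = grid_size
    grid = np.zeros(grid_size, dtype=np.float32)
    xs = ((points[:, 0] - x_range[0]) / (x_range[1] - x_range[0]) * h).astype(int)
    ys = ((points[:, 1] - y_range[0]) / (y_range[1] - y_range[0]) * w).astype(int)
    valid = (xs >= 0) & (xs < h) & (ys >= 0) & (ys < w)
    for x, y, z in zip(xs[valid], ys[valid], points[valid, 2]):
        grid[x, y] = max(grid[x, y], z)
    return grid

def extract_rgb_features(rgb, n_bins=16):
    """Placeholder RGB feature extractor: per-channel intensity histograms
    (a real pipeline would use a CNN backbone here)."""
    feats = [np.histogram(rgb[..., c], bins=n_bins, range=(0, 256))[0]
             for c in range(3)]
    return np.concatenate(feats).astype(np.float32)

def fuse(rgb_feat, bev_grid):
    """Concatenate L2-normalized RGB features with the flattened BEV grid;
    the result would be the input to an object detection network."""
    rgb_norm = rgb_feat / (np.linalg.norm(rgb_feat) + 1e-8)
    bev_flat = bev_grid.ravel()
    bev_norm = bev_flat / (np.linalg.norm(bev_flat) + 1e-8)
    return np.concatenate([rgb_norm, bev_norm])

# Simulated synchronous capture of one RGB frame and one LiDAR sweep.
rng = np.random.default_rng(0)
rgb = rng.integers(0, 256, size=(48, 64, 3))
points = rng.uniform([0.0, -20.0, -2.0], [40.0, 20.0, 3.0], size=(500, 3))
fused = fuse(extract_rgb_features(rgb), rasterize_point_cloud(points))
```

The deep canonical correlation analysis named in the patent title would replace the simple concatenation step with features projected so that the two modalities are maximally correlated; that learned projection is omitted here.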

Career Highlights

Hanwen Gao is affiliated with Tsinghua University, where he continues to push the boundaries of research in his field. His work is characterized by a commitment to advancing the capabilities of autonomous systems through innovative data processing techniques.

Collaborations

Hanwen Gao has collaborated with colleagues including Xinyu Zhang and Li Wang. Their joint efforts contribute to ongoing research and development in vision and LiDAR technologies.

Conclusion

Hanwen Gao's contributions to the field of vision-LiDAR fusion technology exemplify the impact of innovative thinking in enhancing object detection systems. His work at Tsinghua University and his patented methods are paving the way for future advancements in this critical area of research.
