Hong Kong, China

Bolei Zhou

USPTO Granted Patents = 1 

Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 2 (Granted Patents)


Company Filing History:


Years Active: 2021

1 patent (USPTO)

Bolei Zhou: Innovator in Context-Aware Action Recognition

Introduction

Bolei Zhou is a notable inventor based in Hong Kong, China. He has contributed to the field of action recognition through his patented work, which focuses on improving the understanding of video data and has applications across various technological domains.

Latest Patents

Bolei Zhou holds a patent titled "Context-aware action recognition by dual attention networks." The patent describes a computer-implemented method that receives video data and executes a pre-attention prediction to generate first prediction priors. A dual attention module then produces attention maps that highlight regions of interest within the video frames, and these maps are used to build enhanced feature representations that support accurate post-attention predictions.
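The flow described above (pre-attention prior, dual attention maps, enhanced features, post-attention prediction) can be illustrated with a minimal numpy sketch. All shapes, function names, and scoring choices here are illustrative assumptions for exposition; this is not the patented implementation, which would use learned neural network components.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dual_attention_sketch(frames):
    """Hypothetical sketch of a dual-attention recognition flow.

    frames: (T, H, W, C) array of per-frame feature maps.
    Returns spatial attention maps, channel attention weights,
    and a pooled post-attention feature per frame.
    """
    T, H, W, C = frames.shape

    # Pre-attention prediction prior: global average pooling (a crude stand-in
    # for a learned pre-attention predictor).
    prior = frames.mean(axis=(1, 2))                          # (T, C)

    # Spatial attention map: score each location by its feature energy,
    # normalized over all locations in the frame.
    scores = (frames ** 2).sum(axis=-1)                       # (T, H, W)
    spatial_attn = softmax(scores.reshape(T, -1)).reshape(T, H, W)

    # Channel attention weights: derived from the prediction prior.
    channel_attn = softmax(prior)                             # (T, C)

    # Enhanced feature representation: reweight features by both maps.
    enhanced = (frames
                * spatial_attn[..., None]                     # broadcast over C
                * channel_attn[:, None, None, :])             # broadcast over H, W

    # Post-attention prediction input: pool the enhanced features.
    post = enhanced.sum(axis=(1, 2))                          # (T, C)
    return spatial_attn, channel_attn, post

# Example usage with random features standing in for video frames.
frames = np.random.default_rng(0).random((2, 4, 4, 8))
spatial_attn, channel_attn, post = dual_attention_sketch(frames)
```

Each spatial map sums to one per frame, so the enhanced features are a convex reweighting of the original locations; a real dual-attention network would learn these maps end-to-end rather than derive them from feature energy.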

Career Highlights

Throughout his career, Bolei Zhou has worked with prestigious organizations, including IBM and the Massachusetts Institute of Technology. His experience in these institutions has allowed him to collaborate with leading experts in the field and contribute to groundbreaking research.

Collaborations

Bolei Zhou has collaborated with notable individuals such as Quanfu Fan and Dan Gutfreund. These partnerships have enriched his work and expanded the impact of his innovations.

Conclusion

Bolei Zhou's contributions to context-aware action recognition demonstrate his expertise and commitment to advancing technology. His innovative patent and collaborations with esteemed organizations and individuals highlight his significant role in the field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com