Bo He
Sunnyvale, CA, United States of America

Average Co-Inventor Count = 7.0
ph-index = 1


Company Filing History:


Years Active: 2024

1 patent (USPTO)

Bo He - Innovator in Video Event Detection

Introduction

Bo He is an accomplished inventor based in Sunnyvale, California. He has made significant contributions to video processing, particularly the automation of sports video editing and highlight generation. His work focuses on precisely recognizing and localizing events of interest in video.

Latest Patents

Bo He holds a patent for "Transformer-based temporal detection in video." The invention addresses the growing need for automated sports video editing as the volume of sports-related video posted online continues to rise. The patent describes a two-stage approach: multiple action recognition models first extract high-level semantic features from the video, and a transformer-based temporal detection module then uses those features to determine which events occur and precisely when they occur. The approach achieved state-of-the-art performance on both action spotting and replay grounding, showing its potential for video content beyond sports.
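The patent's implementation details are not reproduced here, so the following is only a minimal, hypothetical sketch of what the second stage of such a two-stage pipeline might look like in PyTorch. It assumes snippet-level features have already been extracted and concatenated by one or more action recognition backbones (stage one); the class name TemporalEventDetector, the feature and class dimensions, and the per-snippet classification head are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn


class TemporalEventDetector(nn.Module):
    """Illustrative second stage: a transformer encoder over per-snippet
    features that predicts an event class (plus background) per time step."""

    def __init__(self, feature_dim=2048, model_dim=256, num_classes=17,
                 num_layers=4, num_heads=8, max_len=512):
        super().__init__()
        self.input_proj = nn.Linear(feature_dim, model_dim)
        # Learned positional embedding so the encoder knows snippet order.
        self.pos_embed = nn.Parameter(torch.zeros(1, max_len, model_dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=model_dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.classifier = nn.Linear(model_dim, num_classes + 1)  # +1 = background

    def forward(self, features):
        # features: (batch, time, feature_dim) -- high-level semantic features
        # assumed to come from the stage-one action recognition backbones.
        x = self.input_proj(features)
        x = x + self.pos_embed[:, : x.size(1)]
        x = self.encoder(x)
        return self.classifier(x)  # (batch, time, num_classes + 1) logits


if __name__ == "__main__":
    clip_features = torch.randn(2, 120, 2048)  # 2 videos, 120 snippets each
    detector = TemporalEventDetector()
    logits = detector(clip_features)
    print(logits.shape)  # torch.Size([2, 120, 18])
```

In practice, a system like this would also need a post-processing step (for example, temporal non-maximum suppression over the per-snippet scores) to turn the dense predictions into discrete event timestamps; that step is omitted from the sketch above.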

Career Highlights

Bo He is currently employed at Baidu USA LLC, where he continues to develop solutions in video technology. His work has drawn attention for its practical applications in video processing.

Collaborations

He has collaborated with colleagues including Zhiyu Cheng and Le Kang, contributing to advances in video event detection.

Conclusion

Bo He's contributions to video event detection exemplify the intersection of technology and innovation. His work not only enhances sports video editing but also has broader implications for various video content.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com