College Park, MD, United States of America

Yue Hei Ng


Average Co-Inventor Count = 6.0

ph-index = 1

Forward Citations = 6 (Granted Patents)


Company Filing History:


Years Active: 2019-2021

2 patents (USPTO)

Innovations of Yue Hei Ng in Video Classification

Introduction

Yue Hei Ng is an accomplished inventor based in College Park, MD (US). He has made significant contributions to the field of video classification through his innovative use of neural networks. With a total of 2 patents, his work is paving the way for advancements in how videos are analyzed and categorized.

Latest Patents

Ng's latest patents describe methods, systems, and apparatus for classifying videos using neural networks. One key method obtains a temporal sequence of video frames from a particular video, one frame per time step. At each time step, the frame is processed by a convolutional neural network to generate features, and those features are then processed by an LSTM neural network to produce a set of label scores. From these scores, the video is classified as relating to one or more topics represented by the labels.
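The per-frame CNN-features-into-LSTM pipeline described above can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation: the weights are random and untrained, the "CNN" is stood in for by a fixed linear projection, and all names, shapes, and the 0.5 score threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the patent).
FRAME_H, FRAME_W, FEAT, HID, LABELS, T = 8, 8, 16, 32, 5, 10

W_cnn = rng.normal(size=(FRAME_H * FRAME_W, FEAT)) * 0.1     # stand-in "CNN"
W_lstm = rng.normal(size=(FEAT + HID, 4 * HID)) * 0.1        # LSTM gate weights
b_lstm = np.zeros(4 * HID)
W_out = rng.normal(size=(HID, LABELS)) * 0.1                 # label-score head

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cnn_features(frame):
    # Stand-in for a convolutional network: a fixed linear
    # projection of the flattened frame to a feature vector.
    return frame.reshape(-1) @ W_cnn

def lstm_step(x, h, c):
    # One LSTM cell update: input, forget, output gates and candidate.
    z = np.concatenate([x, h]) @ W_lstm + b_lstm
    i, f, o, g = np.split(z, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# A temporal sequence of T video frames (random stand-in data).
frames = rng.normal(size=(T, FRAME_H, FRAME_W))

h, c = np.zeros(HID), np.zeros(HID)
scores_per_step = []
for frame in frames:
    feats = cnn_features(frame)          # CNN features for this time step
    h, c = lstm_step(feats, h, c)        # LSTM consumes the features
    scores_per_step.append(sigmoid(h @ W_out))  # per-label scores

# Aggregate label scores over time; labels above threshold become topics.
scores = np.mean(scores_per_step, axis=0)
topics = np.flatnonzero(scores > 0.5)
```

With trained weights, `topics` would hold the indices of the labels the video is classified as relating to; here it simply demonstrates the data flow from frames to features to time-aggregated label scores.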

Career Highlights

Yue Hei Ng is currently employed at Google Inc., where he continues to develop and refine his innovative approaches to video classification. His work is instrumental in enhancing the capabilities of machine learning applications in the media industry.

Collaborations

Ng has collaborated with notable colleagues such as Sudheendra Vijayanarasimhan and George Dan Toderici. These collaborations have further enriched his research and development efforts in the field of neural networks.

Conclusion

Yue Hei Ng's contributions to video classification through neural networks exemplify the innovative spirit of modern inventors. His patents reflect a deep understanding of technology and its applications in real-world scenarios.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com