Great Neck, NY, United States of America

Chen Wang

USPTO Granted Patents = 2 

Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:


Years Active: 2025


Title: Innovations by Inventor Chen Wang

Introduction

Chen Wang is an accomplished inventor based in Great Neck, NY (US). He has made significant contributions to the fields of machine learning and video processing, holding 2 granted USPTO patents. His work focuses on representing visual augmentations and clustering video content through advanced algorithms.

Latest Patents

Chen Wang's latest patents are "Embeddings Representing Visual Augmentations" and "Clustering Videos Using a Self-Supervised DNN." The first describes accessing an input video item that includes a target visual augmentation; a machine learning model then generates an embedding representing the visual effect of that augmentation, and the model is trained to minimize a loss between representations of training videos drawn from different training sets. The second describes systems and methods for clustering videos by processing both RGB video frames and optical flow frames, generating optimal assignments that organize the video content into clusters.
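The "optimal assignments" step can be illustrated with a common self-supervised clustering technique: Sinkhorn-Knopp normalization, which turns similarity scores between video embeddings and cluster prototypes into balanced soft assignments. This is only a minimal sketch of that general idea, not the patent's actual algorithm; the function name, the input shapes, and the choice of Sinkhorn iteration are all assumptions for illustration.

```python
import numpy as np

def sinkhorn_assignments(scores, n_iters=3, epsilon=0.05):
    """Balanced soft cluster assignments via Sinkhorn-Knopp normalization.

    scores: (n_samples, n_clusters) similarity logits between video
    embeddings and cluster prototypes (hypothetical inputs, not taken
    from the patent).
    """
    q = np.exp(scores / epsilon)
    q /= q.sum()
    n, k = q.shape
    for _ in range(n_iters):
        # Normalize columns so each cluster receives 1/k of the total mass,
        # discouraging the degenerate solution where one cluster absorbs everything.
        q /= q.sum(axis=0, keepdims=True)
        q /= k
        # Normalize rows so each sample's assignment mass is 1/n.
        q /= q.sum(axis=1, keepdims=True)
        q /= n
    # Scale rows back up so each video's soft assignment sums to 1.
    return q * n

rng = np.random.default_rng(0)
scores = rng.standard_normal((6, 3))  # 6 videos, 3 clusters (toy data)
q = sinkhorn_assignments(scores)
```

Each row of `q` is a probability distribution over clusters for one video, while the column sums stay roughly balanced across clusters, which is what makes the assignment "optimal" in the balanced-transport sense.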

Career Highlights

Chen Wang is currently employed at Snap Inc., where he applies his expertise in machine learning to develop innovative solutions for video processing. His work has been instrumental in advancing the capabilities of visual content analysis.

Collaborations

Chen Wang's notable co-inventors include Huseyin Coskun and Alireza Zareian. Their collaborative efforts contribute to the innovative environment at Snap Inc.

Conclusion

Chen Wang's contributions to the field of machine learning and video processing demonstrate his commitment to innovation. His patents reflect a deep understanding of technology and its applications in enhancing visual content.

This text is generated by artificial intelligence and may not be accurate.