Ann Arbor, MI, United States of America

Ryan Szeto


Average Co-Inventor Count = 3.0

ph-index = 1


Company Filing History:


Years Active: 2022-2024

Patents (USPTO): 2

Ryan Szeto: Innovator in Video Processing Technologies

Introduction

Ryan Szeto is an inventor based in Ann Arbor, MI (US). He has contributed to the field of video processing, holding two USPTO patents that apply deep-neural-network techniques to improve image and video quality.

Latest Patents

Ryan's most recent patents include a "System and method for multi-frame contextual attention for multi-frame image and video processing using deep neural networks." This invention describes obtaining a reference frame, identifying context frames, and producing a refined reference frame using information drawn from those context frames. Another notable patent, the "System and method for video processing with enhanced temporal consistency," covers converting an input video from one frame rate to another while maintaining temporal consistency across frames: processed frames are generated and then aggregated to yield the final output video.
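To make the multi-frame contextual attention idea concrete, the following is a minimal illustrative sketch, not the patented method: a reference frame's features attend to features from neighboring context frames using PyTorch's built-in multi-head attention. The module name, feature dimensions, and frame counts are assumptions chosen for demonstration.

    import torch
    import torch.nn as nn

    class MultiFrameContextAttention(nn.Module):
        """Refine reference-frame features by attending to context-frame features."""

        def __init__(self, dim: int = 64, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(embed_dim=dim, num_heads=heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, ref_feats: torch.Tensor, ctx_feats: torch.Tensor) -> torch.Tensor:
            # ref_feats: (batch, ref_tokens, dim)  -- features of the reference frame
            # ctx_feats: (batch, ctx_tokens, dim)  -- features from the context frames
            refined, _ = self.attn(query=ref_feats, key=ctx_feats, value=ctx_feats)
            # Residual connection keeps the original reference content and adds context.
            return self.norm(ref_feats + refined)

    if __name__ == "__main__":
        batch, tokens, dim = 1, 16, 64
        ref = torch.randn(batch, tokens, dim)       # one reference frame
        ctx = torch.randn(batch, tokens * 4, dim)   # e.g., four context frames, concatenated
        refined_ref = MultiFrameContextAttention(dim)(ref, ctx)
        print(refined_ref.shape)                    # torch.Size([1, 16, 64])

The sketch only shows the attention step; the patents describe a broader pipeline around such a step, including frame selection and, in the second patent, frame-rate conversion with aggregation of the processed frames.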

Career Highlights

Ryan Szeto is currently employed at Samsung Electronics Co., Ltd., where he continues to work on video processing technology, developing systems that improve the quality and temporal consistency of video content.

Collaborations

Ryan has collaborated with co-inventors such as Mostafa El-Khamy and Jungwon Lee on his video processing patents.

Conclusion

Ryan Szeto's contributions to video processing technologies through his patents and work at Samsung Electronics Co., Ltd. highlight his role as a key innovator in the field. His advancements are paving the way for future developments in image and video processing.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com