Daejeon, South Korea

Kwanyong Park

USPTO Granted Patents = 1 

Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:


Years Active: 2025


Title: Innovations of Kwanyong Park in Video Object Segmentation

Introduction

Kwanyong Park is an accomplished inventor based in Daejeon, South Korea. He has made significant contributions to the field of machine learning, particularly in video object segmentation. His innovative work has led to the development of a patented technology that enhances the way objects are identified and segmented in video sequences.

Latest Patents

Kwanyong Park holds a patent for "Per-clip video object segmentation using machine learning." This patent describes a method for performing per-clip object segmentation of objects in a video sequence. The system receives a query video sequence and memory data, which includes a memory video frame and an annotated memory video frame with an object mask. The method segments the query video sequence into multiple clips and uses a trained encoder-decoder network to predict object masks for the objects in each clip.
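To illustrate the general shape of this pipeline, here is a minimal sketch in Python. It assumes a generic memory-conditioned predictor standing in for the trained encoder-decoder network; all function names, the clip length, and the `network` callable are illustrative assumptions, not details from the patent itself.

```python
# Illustrative sketch of per-clip video object segmentation.
# "network" is a placeholder for a trained encoder-decoder model that,
# given a query frame plus a memory frame and its annotated object mask,
# predicts an object mask for the query frame.

def split_into_clips(frames, clip_len):
    """Partition a list of frames into consecutive clips of up to clip_len frames."""
    return [frames[i:i + clip_len] for i in range(0, len(frames), clip_len)]

def predict_clip_masks(clip, memory_frame, memory_mask, network):
    """Predict one object mask per frame in the clip, conditioned on memory data."""
    return [network(frame, memory_frame, memory_mask) for frame in clip]

def segment_video(frames, memory_frame, memory_mask, network, clip_len=5):
    """Segment the query video clip by clip, returning a mask per frame."""
    masks = []
    for clip in split_into_clips(frames, clip_len):
        masks.extend(predict_clip_masks(clip, memory_frame, memory_mask, network))
    return masks
```

In a real memory-based segmentation system the memory would typically grow as new frames are segmented; this sketch keeps a single fixed memory frame and mask purely for clarity.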

Career Highlights

Kwanyong Park is currently employed at Adobe Inc., where he continues to push the boundaries of technology in video processing. His work at Adobe has allowed him to collaborate with other talented professionals in the field, further advancing the capabilities of machine learning applications.

Collaborations

Some of his notable collaborators include Joon-Young Lee and Seoung Wug Oh. Their joint efforts contribute to the innovative environment at Adobe, fostering advancements in video technology.

Conclusion

Kwanyong Park's contributions to video object segmentation through machine learning exemplify the impact of innovation in technology. His patent and work at Adobe highlight the importance of collaboration and creativity in advancing the field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com