Gimpo-si, South Korea

MinHyun Lee


Average Co-Inventor Count = 3.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2023

1 patent (USPTO)

MinHyun Lee: Innovator in Weakly Supervised Semantic Segmentation

Introduction

MinHyun Lee is an inventor based in Gimpo-si, South Korea. His contribution to the field of semantic segmentation comes through a patented method for weakly supervised learning, and his work focuses on improving the efficiency and accuracy of image-processing technologies.

Latest Patents

MinHyun Lee holds a patent titled "Weakly supervised semantic segmentation device and method based on pseudo-masks." The invention discloses a device comprising a localization map generator, a saliency map processor, a multi-label processor, and a pseudo-mask generator. The device generates a plurality of first localization maps by providing an image to a first classifier, and calculates a saliency loss through a saliency map to identify boundary lines and co-occurring pixels. It also predicts multi-labels based on the first localization maps and updates the first classifier in order to generate pseudo-masks, as sketched below.
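
The following Python sketch shows, in simplified form, how such a pipeline could be wired together: a classifier yields per-class localization maps and image-level multi-label logits, a saliency map contributes an auxiliary loss, and the two are fused into pseudo-masks. This is not the patented implementation; the toy convolutional classifier, the L1-based saliency term, and the fusion threshold are assumptions made purely for illustration.

```python
# Minimal, illustrative sketch of a weakly supervised pseudo-mask pipeline.
# NOT the patented implementation: the backbone, the exact saliency loss,
# and all thresholds below are assumptions made only for clarity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalizationMapClassifier(nn.Module):
    """Toy classifier whose last conv features act as per-class localization maps."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, num_classes, 1),           # one localization map per class
        )

    def forward(self, image: torch.Tensor):
        cams = F.relu(self.features(image))                  # (B, C, H, W) localization maps
        logits = F.adaptive_avg_pool2d(cams, 1).flatten(1)   # multi-label logits via GAP
        return cams, logits


def saliency_loss(cams: torch.Tensor, saliency: torch.Tensor) -> torch.Tensor:
    """Penalize disagreement between CAM-implied foreground and a saliency map
    (an assumed stand-in for the patent's saliency term)."""
    fg = cams.max(dim=1, keepdim=True).values                # (B, 1, H, W) foreground evidence
    fg = fg / (fg.amax(dim=(2, 3), keepdim=True) + 1e-6)     # normalize per image
    return F.l1_loss(fg, saliency)


def make_pseudo_masks(cams: torch.Tensor, saliency: torch.Tensor,
                      fg_thresh: float = 0.3) -> torch.Tensor:
    """Fuse localization maps with the saliency map into per-pixel pseudo-masks."""
    cams = cams * saliency                                   # suppress non-salient pixels
    scores, labels = cams.max(dim=1)                         # best class per pixel
    labels = labels + 1                                      # reserve 0 for background
    return torch.where(scores > fg_thresh, labels, torch.zeros_like(labels))


if __name__ == "__main__":
    num_classes = 20
    model = LocalizationMapClassifier(num_classes)
    image = torch.rand(2, 3, 64, 64)                         # dummy image batch
    saliency = torch.rand(2, 1, 64, 64)                      # dummy saliency maps
    targets = torch.randint(0, 2, (2, num_classes)).float()  # image-level multi-labels

    cams, logits = model(image)
    loss = F.multilabel_soft_margin_loss(logits, targets) + saliency_loss(cams, saliency)
    loss.backward()                                          # updates the classifier

    pseudo = make_pseudo_masks(cams.detach(), saliency)
    print(pseudo.shape)                                      # (2, 64, 64) integer masks
```

In a typical weakly supervised setup, the classifier would be a deep backbone trained only on image-level labels, and the resulting pseudo-masks would then serve as supervision for a separate segmentation network.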

Career Highlights

MinHyun Lee is affiliated with Yonsei University, where he continues to advance his research and development in the field of semantic segmentation. His work has garnered attention for its innovative approach to weakly supervised learning techniques.

Collaborations

His notable collaborators include Hyunjung Shim and Seungho Lee, who contribute to the collaborative research environment at Yonsei University.

Conclusion

MinHyun Lee's contributions to the field of semantic segmentation through his patented technology demonstrate his commitment to innovation and research excellence. His work is paving the way for advancements in image processing methodologies.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com