Suwon-si, South Korea

Jinyoung Byun


Average Co-Inventor Count = 3.0

ph-index = 1


Company Filing History:


Years Active: 2022

Patents: 1 (USPTO)

Jinyoung Byun: Innovator in Stereo Matching Technology

Introduction

Jinyoung Byun is an inventor based in Suwon-si, South Korea, working in computer vision with a particular focus on stereo matching. His patented approach uses convolutional neural networks to improve the accuracy of disparity maps generated from pairs of stereo images.

Latest Patents

Jinyoung Byun holds a patent titled "Method for stereo matching using end-to-end convolutional neural network," which discloses a method for generating a disparity map from stereo images. The method obtains a cost volume from feature maps extracted by applying a first convolutional neural network (CNN) to the left image and a second CNN to the right image. It then normalizes the cost volume with a third CNN, up-samples the normalized cost volume, and derives the disparity map through regression analysis applied to the up-sampled volume. This is currently his only patent on record at the USPTO.
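The pipeline described above can be sketched in code. The following is a minimal illustration in PyTorch, not the patented design: the layer sizes, the maximum disparity, the concatenation-based cost volume, and the soft-argmin style regression are all illustrative assumptions chosen to show the four stages (feature CNNs, cost volume, 3-D normalization, up-sampling plus regression) end to end.

```python
# Illustrative sketch of a CNN stereo-matching pipeline (not the patented design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureCNN(nn.Module):
    """2-D CNN extracting a half-resolution feature map from one image
    (stands in for the 'first' and 'second' CNNs)."""
    def __init__(self, channels=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def build_cost_volume(left_feat, right_feat, max_disp):
    """Concatenate left features with right features shifted by each candidate
    disparity, producing a (B, 2C, D, H, W) cost volume."""
    b, c, h, w = left_feat.shape
    volume = left_feat.new_zeros(b, 2 * c, max_disp, h, w)
    for d in range(max_disp):
        if d == 0:
            volume[:, :c, d] = left_feat
            volume[:, c:, d] = right_feat
        else:
            volume[:, :c, d, :, d:] = left_feat[:, :, :, d:]
            volume[:, c:, d, :, d:] = right_feat[:, :, :, :-d]
    return volume

class Regularizer3D(nn.Module):
    """3-D CNN that normalizes the cost volume (stands in for the 'third' CNN)."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_ch, 8, 3, padding=1), nn.ReLU(),
            nn.Conv3d(8, 1, 3, padding=1),
        )
    def forward(self, volume):
        return self.net(volume).squeeze(1)  # (B, D, H, W)

def disparity_regression(cost, full_size):
    """Up-sample the cost volume to full resolution, then regress disparity as
    a probability-weighted average over candidates (soft argmin)."""
    cost = F.interpolate(cost.unsqueeze(1), size=full_size,
                         mode="trilinear", align_corners=False).squeeze(1)
    prob = F.softmax(-cost, dim=1)  # softmax over the disparity axis
    disps = torch.arange(cost.size(1), device=cost.device,
                         dtype=cost.dtype).view(1, -1, 1, 1)
    return (prob * disps).sum(dim=1)  # (B, H, W)

if __name__ == "__main__":
    feat = FeatureCNN()
    reg = Regularizer3D(in_ch=32)
    left = torch.randn(1, 3, 64, 64)
    right = torch.randn(1, 3, 64, 64)
    lf, rf = feat(left), feat(right)          # half-resolution feature maps
    cv = build_cost_volume(lf, rf, max_disp=16)
    cost = reg(cv)                            # (1, 16, 32, 32)
    disp = disparity_regression(cost, full_size=(32, 64, 64))
    print(disp.shape)                         # one disparity value per pixel
```

In an end-to-end setup, all three CNNs would be trained jointly against ground-truth disparity; the soft regression keeps the whole pipeline differentiable, which is what makes end-to-end training possible.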

Career Highlights

Jinyoung Byun is affiliated with Sungkyunkwan University, where he continues to advance his research in the field of computer vision. His work has garnered attention for its practical applications in various industries, including robotics and autonomous systems.

Collaborations

His notable collaborators include Jaewook Jeon and Phuoc Tien Nguyen, who have worked with him on related research projects.

Conclusion

Jinyoung Byun's contributions to stereo matching technology exemplify the innovative spirit of modern inventors. His work not only enhances the field of computer vision but also paves the way for future advancements in technology.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com