Changhua County, Taiwan

Chang-Kun Yao


Average Co-Inventor Count = 2.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2017

1 patent (USPTO)

Chang-Kun Yao: Innovator in Object Classification Technology

Introduction

Chang-Kun Yao is an inventor based in Changhua County, Taiwan. His contribution to the field of object classification comes through a single granted patent describing adaptive devices and methods that make the classification of objects in image data more efficient.

Latest Patents

Chang-Kun Yao holds a patent for an "Adaptive device and adaptive method for classifying objects with parallel architecture." The method stores multiple sets of scene parameters and classifier parameters. It retrieves image data, encloses detected obstructions within bounding frames, and determines a frame range from those obstructions. Several image processing units then calculate obstruction characteristic data in parallel. Finally, the method selects the stored parameters relevant to the current scene and computes classification data from the characteristics, enabling real-time detection of obstructions. A sketch of this workflow follows below.
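The workflow described above can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration, not the patented implementation: every name in it (Frame, SCENE_PARAMS, CLASSIFIER_PARAMS, extract_characteristics, classify) is invented for this example, and a thread pool stands in for the patent's dedicated parallel image processing units.

```python
# Hypothetical sketch of a parallel-architecture object classifier, loosely
# modeled on the workflow described above. Not the patented implementation.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    """Bounding frame enclosing a detected obstruction in the image."""
    x: int
    y: int
    w: int
    h: int

# Hypothetical stored parameter tables, indexed by scene type.
SCENE_PARAMS = {"day": {"threshold": 0.6}, "night": {"threshold": 0.4}}
CLASSIFIER_PARAMS = {"day": {"weights": [0.7, 0.3]}, "night": {"weights": [0.5, 0.5]}}

def extract_characteristics(image: List[List[int]], frame: Frame) -> List[float]:
    """One 'image processing unit': compute characteristic data
    (here, simple intensity statistics) for a single obstruction frame."""
    patch = [row[frame.x:frame.x + frame.w] for row in image[frame.y:frame.y + frame.h]]
    pixels = [p for row in patch for p in row]
    mean = sum(pixels) / len(pixels)
    spread = max(pixels) - min(pixels)
    return [mean / 255.0, spread / 255.0]

def classify(image: List[List[int]], frames: List[Frame], scene: str) -> List[Tuple[Frame, str]]:
    """Compute characteristics for all frames in parallel, then apply
    the scene-selected classifier parameters to label each obstruction."""
    threshold = SCENE_PARAMS[scene]["threshold"]
    weights = CLASSIFIER_PARAMS[scene]["weights"]
    # Parallel step: one worker per obstruction frame.
    with ThreadPoolExecutor() as pool:
        feature_sets = list(pool.map(lambda f: extract_characteristics(image, f), frames))
    results = []
    for frame, feats in zip(frames, feature_sets):
        score = sum(w * f for w, f in zip(weights, feats))
        label = "obstruction" if score >= threshold else "background"
        results.append((frame, label))
    return results

if __name__ == "__main__":
    # Toy 8x8 grayscale image with one bright obstruction region.
    image = [[30] * 8 for _ in range(8)]
    for r in range(2, 5):
        for c in range(2, 5):
            image[r][c] = 220
    print(classify(image, [Frame(2, 2, 3, 3)], scene="day"))
```

The thread pool here merely illustrates the parallel step; in the patent, the parallelism comes from multiple hardware image processing units rather than software threads.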

Career Highlights

Chang-Kun Yao is associated with the Automotive Research & Test Center, where he develops technologies for the automotive industry; his patented method applies object classification to that domain.

Collaborations

Chang-Kun Yao's listed collaborator is Zhen-Wei Zhu, with whom he shares credit on his patented work at the Automotive Research & Test Center.

Conclusion

Chang-Kun Yao's work on adaptive object classification demonstrates a practical approach to real-time obstruction detection. His contribution is relevant to the automotive sector and to other fields that depend on fast, reliable image classification.
