Taipei, Taiwan

Cheng-Chou Chen


Average Co-Inventor Count = 3.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2020-2024

2 patents (USPTO)

Cheng-Chou Chen: Innovator in Object Positioning and Video Surveillance Technologies

Introduction

Cheng-Chou Chen is a notable inventor based in Taipei, Taiwan. He has made significant contributions to the fields of object positioning and video surveillance. With a total of 2 patents, his work showcases innovative methods that enhance the efficiency and accuracy of these technologies.

Latest Patents

Cheng-Chou Chen's latest patents include an "Object Positioning Method and System" and a "Video Surveillance System and Video Surveillance Method." The object positioning method acquires an original object image, demagnifies it, and applies a rough-positioning model to the smaller image to identify feature positions, which are then used to determine the precise position of the object to be positioned within the original image. The video surveillance method captures images of monitored areas to obtain video streams and sensing data, determines whether a target object triggers a specific event, and generates notifications based on preset analysis conditions.
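The patent's full details are not reproduced here, but the coarse-to-fine idea it describes can be sketched in a few lines. The following is a hypothetical illustration, not the patented implementation: the `demagnify`, `rough_position`, and `locate` helpers, the nearest-neighbour downscaling, and the feature predicate are all assumptions made for the example.

```python
def demagnify(image, factor):
    """Downscale an image (list of pixel rows) by keeping every
    `factor`-th pixel in each direction (nearest-neighbour)."""
    return [row[::factor] for row in image[::factor]]

def rough_position(image, is_feature):
    """Scan the small image and return the (row, col) of the first
    pixel matching the feature predicate, or None if absent."""
    for r, row in enumerate(image):
        for c, px in enumerate(row):
            if is_feature(px):
                return (r, c)
    return None

def locate(original, factor, is_feature):
    """Rough-position on the demagnified image, then map the hit
    back to (approximate) coordinates in the original image."""
    small = demagnify(original, factor)
    hit = rough_position(small, is_feature)
    if hit is None:
        return None
    r, c = hit
    return (r * factor, c * factor)

# Usage: an 8x8 image with one bright pixel at row 4, column 6.
img = [[0] * 8 for _ in range(8)]
img[4][6] = 255
print(locate(img, 2, lambda p: p > 128))  # → (4, 6)
```

Searching the demagnified image touches far fewer pixels than the original; a real system would then refine the mapped coordinates with a fine-grained search in a small window around them.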

Career Highlights

Cheng-Chou Chen is currently employed at Pegatron Corporation, where he continues to develop and refine his innovative technologies. His work has had a significant impact on the efficiency of video surveillance systems and object positioning methods.

Collaborations

Cheng-Chou has collaborated with notable coworkers, including Tze-Yee Lau and Chin-Yu Ko, contributing to the advancement of their shared projects.

Conclusion

Cheng-Chou Chen's contributions to object positioning and video surveillance technologies highlight his role as an influential inventor. His innovative patents reflect a commitment to enhancing technological capabilities in these fields.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com