Singapore, Singapore

Jinghuan Chen

USPTO Granted Patents = 3 

Average Co-Inventor Count = 4.3

ph-index = 1


Company Filing History:


Years Active: 2023-2024

3 patents (USPTO)

Title: Innovations of Jinghuan Chen

Introduction

Jinghuan Chen is an inventor based in Singapore with three granted USPTO patents. His work centers on methods for detecting objects in images and predicting the correlations between them.

Latest Patents

Chen's latest patents cover methods and apparatuses for predicting the correlation between detected objects. One invention detects multiple objects in a target image, including different body parts and a body object, and uses a joint bounding box to predict correlations between the detected objects. Another patent describes a face-hand correlation degree detection method: an image is acquired, feature sets are determined for the detected faces and hands, and the degree of interaction between them is assessed.
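To make the joint-bounding-box idea concrete, here is a minimal sketch, not taken from the patents themselves: it assumes the "joint bounding box" is the smallest box enclosing a face detection and a hand detection, and approximates a correlation score by how tightly the pair fills that joint box. All function names and the scoring heuristic are illustrative assumptions.

```python
# Hypothetical sketch of a joint-bounding-box correlation score.
# The actual patented methods use learned features; this only illustrates
# the geometric intuition of pairing a face box with a hand box.
from typing import Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def area(b: Box) -> float:
    # Area of an axis-aligned box; zero if the box is degenerate.
    return max(0.0, b[2] - b[0]) * max(0.0, b[3] - b[1])

def joint_box(a: Box, b: Box) -> Box:
    # Smallest axis-aligned box enclosing both detections.
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def correlation_score(face: Box, hand: Box) -> float:
    # Heuristic: ratio of the two objects' combined area to the joint-box
    # area. Nearby face-hand pairs fill their joint box and score higher.
    j_area = area(joint_box(face, hand))
    return (area(face) + area(hand)) / j_area if j_area > 0 else 0.0

face = (10, 10, 50, 50)  # example face detection
hand = (40, 40, 80, 80)  # example nearby hand detection
print(round(correlation_score(face, hand), 3))  # → 0.653
```

In practice a correlation network would score pairs from image features rather than geometry alone; the joint box mainly defines the region from which those pair features are extracted.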

Career Highlights

Jinghuan Chen is currently employed at Sensetime International Pte. Ltd., where he continues to develop innovative solutions in the realm of image processing and object detection. His work has contributed to advancements in technology that enhance the understanding of interactions between various objects.

Collaborations

Chen has collaborated with co-inventors including Chunya Liu and Xuesen Zhang on his research and development efforts. Their combined expertise has contributed to the technologies described in his patents.

Conclusion

Jinghuan Chen's work exemplifies the spirit of innovation and dedication to advancing technology. His patents reflect a commitment to improving object detection and correlation methods, making significant contributions to the field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com