Chunya Liu

Singapore, Singapore

USPTO Granted Patents = 5 

Average Co-Inventor Count = 4.2

ph-index = 1


Company Filing History:

Years Active: 2023-2024

Chunya Liu: Innovator in Object Correlation Prediction

Introduction

Chunya Liu is an inventor based in Singapore known for his work in object correlation prediction. With five granted US patents to his name, Liu has developed methods and devices for predicting how objects detected in an image relate to one another.

Latest Patents

Liu's recent patents cover methods, apparatuses, devices, and storage media for predicting correlations between objects. One patent describes a method that detects multiple objects in a target image, determines a joint bounding box surrounding those objects, and predicts their correlations from the identified regions. Another notable patent covers face-hand correlation detection: an image is acquired, feature sets are determined for the faces and hands it contains, and the degree of correlation between each face-hand pair is assessed from their interaction features. Hedged sketches of both ideas follow.
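The joint-bounding-box step lends itself to a short Python sketch. Everything here is illustrative: the box format, the function names, and the IoU-based score (a toy stand-in for whatever learned predictor the patent actually uses) are assumptions, not details taken from the patent text.

```python
# Minimal sketch of the joint-bounding-box idea, assuming detections
# arrive as (x1, y1, x2, y2) boxes. All names are hypothetical; the
# scoring function is a placeholder, not the patented predictor.

from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def joint_bounding_box(boxes: List[Box]) -> Box:
    """Return the smallest box enclosing every detected object."""
    x1 = min(b[0] for b in boxes)
    y1 = min(b[1] for b in boxes)
    x2 = max(b[2] for b in boxes)
    y2 = max(b[3] for b in boxes)
    return (x1, y1, x2, y2)

def correlation_score(box_a: Box, box_b: Box) -> float:
    """Toy stand-in for a learned correlation head: here, simply the
    intersection-over-union of the two object boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(box_a) + area(box_b) - inter
    return inter / union if union > 0 else 0.0

if __name__ == "__main__":
    detections = [(10, 10, 50, 60), (40, 30, 90, 100)]
    print("joint box:", joint_bounding_box(detections))
    print("correlation:", round(correlation_score(*detections), 3))
```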
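The face-hand method can be sketched in the same hedged way: one feature vector per detected face and hand, and a score for every face-hand pair. Cosine similarity stands in purely as a placeholder for the learned interaction features the patent describes.

```python
# Hedged sketch of the face-hand correlation pipeline. Feature vectors
# would come from a detector/encoder in practice; here they are supplied
# directly, and cosine similarity is only an illustrative score.

import math
from typing import Dict, List, Tuple

def cosine(u: List[float], v: List[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def face_hand_correlations(
    face_feats: Dict[str, List[float]],
    hand_feats: Dict[str, List[float]],
) -> List[Tuple[str, str, float]]:
    """Score every face-hand pair; a higher score suggests the face and
    hand are more likely to belong to the same person."""
    return [
        (face_id, hand_id, cosine(f, h))
        for face_id, f in face_feats.items()
        for hand_id, h in hand_feats.items()
    ]

if __name__ == "__main__":
    faces = {"face_0": [0.9, 0.1, 0.3]}
    hands = {"hand_0": [0.8, 0.2, 0.4], "hand_1": [0.1, 0.9, 0.0]}
    for face_id, hand_id, score in face_hand_correlations(faces, hands):
        print(face_id, hand_id, round(score, 3))
```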

Career Highlights

Chunya Liu is currently employed at Sensetime International Pte. Ltd., where he continues this line of research. His work has drawn attention for its practical applications, particularly in human-computer interaction.

Collaborations

Liu has collaborated with co-inventors including Xuesen Zhang and Bairun Wang, averaging more than four co-inventors per patent.

Conclusion

Chunya Liu's work exemplifies the spirit of innovation in the realm of object correlation prediction. His patents not only advance technological understanding but also pave the way for future developments in the field.
