Hong Kong, China

Dan Su


Average Co-Inventor Count = 2.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2021

1 patent (USPTO)

Title: The Innovative Contributions of Dan Su

Introduction

Dan Su is an inventor based in Hong Kong, China, working in the field of gaze estimation technology. His work focuses on methods that improve the accuracy of gaze tracking, supporting advances in a range of applications.

Latest Patents

Dan Su holds a patent for a "System and method for gaze estimation." The method involves processing training data to determine one or more local-learning-based gaze estimation models. These models can be used to identify both 2D gaze points in a scene image and 3D gaze points in scene camera coordinates. The patent reflects an applied understanding of machine learning in gaze tracking.
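The patent's actual algorithm is not detailed here, but the general idea of a local-learning gaze model can be illustrated with a minimal sketch: store training pairs of eye features and gaze points, then estimate a query's 2D gaze point from its nearest training samples. The feature representation and the k-nearest-neighbor weighting below are illustrative assumptions, not the patented method.

```python
import numpy as np

def fit_local_gaze_model(features, gaze_points):
    """Store training pairs of eye features and 2D gaze points.

    A local-learning model defers computation to prediction time:
    fitting just keeps the data around.
    """
    return np.asarray(features, dtype=float), np.asarray(gaze_points, dtype=float)

def predict_gaze(model, query, k=3):
    """Estimate a 2D gaze point for a query feature vector as a
    distance-weighted average of the k nearest training samples
    (a simple local-learning scheme, used here only for illustration)."""
    feats, gazes = model
    dists = np.linalg.norm(feats - np.asarray(query, dtype=float), axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-9)  # closer samples count more
    return (gazes[nearest] * weights[:, None]).sum(axis=0) / weights.sum()
```

In practice the features might be pupil-center offsets from an eye camera and the targets pixel coordinates in the scene image; extending the targets to 3D scene camera coordinates follows the same pattern with 3-vectors instead of 2-vectors.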

Career Highlights

Currently, Dan Su is employed at Hemy8 SA, where he continues to develop cutting-edge technologies. His work at the company emphasizes the importance of accurate gaze estimation in various fields, including virtual reality and human-computer interaction. Su's contributions are recognized for their potential to enhance user experiences through improved gaze tracking.

Collaborations

Dan Su collaborates with Youfu Li, a researcher in the field. Together, they work on projects that aim to push the boundaries of gaze estimation technology.

Conclusion

Dan Su's contributions to gaze estimation technology are noteworthy and impactful. His patent and ongoing work at Hemy8 SA highlight his commitment to advancing this field. As technology continues to evolve, inventors like Dan Su play a crucial role in shaping the future of human-computer interaction.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com