Guangdong, China

Xinping Yu


Average Co-Inventor Count = 1.0


Company Filing History:


Years Active: 2025

1 patent (USPTO)

The Innovative Contributions of Xinping Yu

Introduction

Xinping Yu is an inventor based in Guangdong, China, working in the field of gaze estimation technology. His work centers on methods and apparatuses for measuring gaze differences from video.

Latest Patents

Xinping Yu holds a patent for a "Method and apparatus for measuring gaze difference based on gaze estimation." The patent discloses a method that begins by obtaining video data of a testee gazing at a visual target. A face image sequence is extracted from the video and passed through a first neural network for key frame extraction. The resulting face images are then analyzed by a second neural network for facial feature point extraction, and the images are cropped around the facial feature point coordinates to obtain a first and a second eye region image. Finally, a gaze difference estimation network is trained to predict the gaze difference between the first and second eye region images.
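The stages described above can be sketched as a simple pipeline. This is a minimal illustration only: the function names, the placeholder logic standing in for each neural network, and the image sizes are all assumptions for demonstration, not details from the patent.

```python
# Hypothetical sketch of the patented pipeline's stages. Each neural
# network is replaced by a trivial stand-in so the flow is runnable;
# none of this reflects the patent's actual models or parameters.
import numpy as np


def extract_key_frames(frames, n_keep=2):
    """Stand-in for the first network: keep the frames with the
    highest pixel variance as a crude 'key frame' proxy."""
    order = sorted(range(len(frames)),
                   key=lambda i: float(np.var(frames[i])),
                   reverse=True)
    return [frames[i] for i in sorted(order[:n_keep])]


def detect_eye_landmarks(face):
    """Stand-in for the second network: return fixed (row, col)
    coordinates for two eye centres (assumed positions)."""
    h, w = face.shape
    return (h // 3, w // 4), (h // 3, 3 * w // 4)


def crop_eye(face, centre, size=8):
    """Crop a square eye-region patch around a landmark coordinate."""
    r, c = centre
    half = size // 2
    return face[max(r - half, 0):r + half, max(c - half, 0):c + half]


def gaze_difference(eye_a, eye_b):
    """Stand-in for the gaze difference estimation network: compare
    mean intensities of the two eye regions (a placeholder metric)."""
    return float(eye_a.mean() - eye_b.mean())


# Simulated video: 10 random 64x64 grayscale face frames.
rng = np.random.default_rng(0)
video = [rng.random((64, 64)) for _ in range(10)]

key_frames = extract_key_frames(video)      # first network
face = key_frames[0]
left, right = detect_eye_landmarks(face)    # second network
eye_l = crop_eye(face, left)                # first eye region image
eye_r = crop_eye(face, right)               # second eye region image
diff = gaze_difference(eye_l, eye_r)        # estimation network
```

In the real method, the final stage is a trained network predicting the gaze difference between the two eye region images rather than a fixed intensity comparison.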

Career Highlights

Throughout his career, Xinping Yu has worked with several prestigious institutions. He has been associated with the Zhongshan Ophthalmic Center, Sun Yat-sen University, and Tsinghua University. His experience in these organizations has contributed to his expertise in gaze estimation technology.

Collaborations

Xinping Yu has collaborated with notable individuals in his field, including Ruixin Wang and Haotian Lin. These collaborations have further enriched his research and development efforts.

Conclusion

Xinping Yu's contributions to gaze estimation technology are noteworthy and reflect his dedication to innovation. His patent and career achievements highlight his role as a significant inventor in this specialized field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com