Mountain View, CA, United States of America

Yichen Pan


Average Co-Inventor Count = 2.0

ph-index = 1

Forward Citations = 15 (Granted Patents)


Company Filing History:


Years Active: 2019

1 patent (USPTO)

Yichen Pan - Innovator in Human-Computer Interaction

Introduction

Yichen Pan is an inventor based in Mountain View, CA (US), who has contributed to the field of human-computer interaction (HCI). His work focuses on enhancing user interaction with computers through gesture recognition techniques.

Latest Patents

Yichen Pan holds a patent titled "Depth-value classification using forests." The patent addresses the use of depth cameras in HCI systems to recognize user gestures such as hand, head, and body movements. The invention employs a classifier trained on vectors that combine a base component and an extended component, yielding more accurate gesture classification. The base vector is a leaf-based assessment of classification results for a candidate depth-value pixel, while the extended vector incorporates additional information from related pixels. This structure, combined with various optimization methods, allows for more efficient in-situ operation.
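
For illustration only, the sketch below shows one generic way a forest-based pixel classifier over depth data can combine a "base" feature vector (depth differences at fixed offsets) with an "extended" vector of neighborhood statistics. It is a minimal example built with scikit-learn on synthetic data; the feature definitions, offsets, and function names are assumptions made for the sketch and do not reproduce the patented method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def base_features(depth, y, x, offsets):
    """Base vector (assumed form): depth differences between the candidate
    pixel and a fixed set of offset locations."""
    h, w = depth.shape
    d0 = depth[y, x]
    feats = []
    for dy, dx in offsets:
        yy = int(np.clip(y + dy, 0, h - 1))
        xx = int(np.clip(x + dx, 0, w - 1))
        feats.append(depth[yy, xx] - d0)
    return feats

def extended_features(depth, y, x, radius=2):
    """Extended vector (assumed form): simple statistics over a small
    neighborhood of related pixels."""
    h, w = depth.shape
    patch = depth[max(0, y - radius):min(h, y + radius + 1),
                  max(0, x - radius):min(w, x + radius + 1)]
    return [patch.mean(), patch.std(), patch.min(), patch.max()]

# Synthetic scene: a "hand" region closer to the camera than the background.
rng = np.random.default_rng(0)
depth = np.full((64, 64), 2000.0) + rng.normal(0, 5, (64, 64))
depth[20:40, 20:40] = 800.0 + rng.normal(0, 5, (20, 20))
labels = np.zeros((64, 64), dtype=int)
labels[20:40, 20:40] = 1  # 1 = hand, 0 = background

offsets = [(-4, 0), (4, 0), (0, -4), (0, 4), (-8, -8), (8, 8)]
X, y_vec = [], []
for y in range(0, 64, 2):
    for x in range(0, 64, 2):
        X.append(base_features(depth, y, x, offsets) +
                 extended_features(depth, y, x))
        y_vec.append(labels[y, x])

# A random forest stands in for the trained classifier.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y_vec)
print("training accuracy:", clf.score(X, y_vec))
```

On this toy data the forest separates the near "hand" pixels from the background almost perfectly; the point is only to show how per-pixel base and extended features can be concatenated into one input vector for a forest classifier.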

Career Highlights

Yichen Pan is currently associated with Youspace, Inc., where he continues to develop HCI solutions aimed at making interaction with technology more intuitive and responsive.

Collaborations

Yichen collaborates with Ralph T Brunner, contributing to the advancement of HCI technologies.

Conclusion

Yichen Pan's contributions to the field of Human-Computer Interaction through his innovative patent demonstrate his commitment to enhancing user experience. His work is paving the way for more intuitive and efficient interactions between humans and computers.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com