Menlo Park, CA, United States of America

Haowen Ruan


 

Average Co-Inventor Count = 1.0

ph-index = 1


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Haowen Ruan - Innovator in User Pose Inference Technology

Introduction

Haowen Ruan is an inventor based in Menlo Park, CA, working in user interaction technology. His work focuses on inferring user poses and gestures with optical tracking devices.

Latest Patents

Ruan holds a patent titled "Inferring user pose using optical data." It describes a tracking device that monitors a portion of a user's skin to infer a pose or gesture made by a body part; for example, the device can track skin on the user's forearm to determine gestures made by the user's hand. The device includes an illumination source that lights the skin area and an optical sensor that captures images of the illuminated skin. A controller then applies a machine-learned model to those images to infer the user's pose or gesture. This is currently Ruan's only USPTO patent.
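
To illustrate the pipeline the patent describes (illuminate a skin patch, capture images with an optical sensor, and infer a gesture from those images with a machine-learned model), below is a minimal, hypothetical Python sketch. All names, gesture labels, and logic here are placeholders for illustration; they are not taken from the patent or from Meta's implementation.

```python
import random
from dataclasses import dataclass

# Hypothetical gesture labels; the patent does not enumerate specific gestures.
GESTURES = ["fist", "open_hand", "pinch"]


@dataclass
class Frame:
    """A captured image of the illuminated skin patch (flattened grayscale pixels)."""
    pixels: list


def illuminate_and_capture(width: int = 32, height: int = 32) -> Frame:
    """Stand-in for the illumination source plus optical sensor: returns a synthetic frame."""
    return Frame(pixels=[random.random() for _ in range(width * height)])


def infer_gesture(frame: Frame) -> str:
    """Stand-in for the machine-learned model running on the controller.

    A real system would run a trained classifier on features of the imaged skin;
    here we simply map a basic image statistic to one of the placeholder labels.
    """
    mean_intensity = sum(frame.pixels) / len(frame.pixels)
    return GESTURES[int(mean_intensity * 1000) % len(GESTURES)]


if __name__ == "__main__":
    frame = illuminate_and_capture()
    print("Inferred gesture:", infer_gesture(frame))
```

The sketch only mirrors the structure named in the patent summary (illumination, image capture, model-based inference); the actual sensing hardware, features, and model are not described in this profile.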

Career Highlights

Haowen Ruan is currently employed at Meta Platforms Technologies, LLC, where he continues to develop innovative technologies. His work at Meta focuses on enhancing user interaction through advanced tracking and gesture recognition systems.

Collaborations

Ruan collaborates with colleagues in his field, including Francesco Marsili. Their combined expertise contributes to the development of new technologies at Meta.

Conclusion

Haowen Ruan is a pioneering inventor whose work in user pose inference technology is shaping the future of human-computer interaction. His contributions are paving the way for more intuitive and responsive devices.
