Kitchener, Canada

Zhe Chen



 

Average Co-Inventor Count = 2.4

ph-index = 2

Forward Citations = 12 (Granted Patents)


Location History:

  • Waterloo, CN (2015)
  • Waterloo, CA (2015 - 2018)
  • Kitchener, CA (2017 - 2019)
  • Breslau, CA (2019 - 2022)

Company Filing History:


Years Active: 2015-2022

10 patents (USPTO)

Zhe Chen: Innovator in Motion-Based Input Technology

Introduction

Zhe Chen is an inventor based in Kitchener, Canada, known for his contributions to motion-based input technology. He holds 10 US patents covering methods that make user interaction with computing devices more intuitive.

Latest Patents

One of Chen's most recent patents is titled "Performing an action associated with a motion based input." The claimed method measures motion data with a computing device's motion sensor and proximity data with a proximity sensor, matches the combined readings against a stored gesture, and then performs the action associated with that gesture. A second patent under the same title describes a method in which the device detects motion, matches it against an input model stored in memory, computes a confidence level for the match, and performs the associated action only when that confidence clears a threshold.
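
To illustrate the confidence-threshold matching that the second patent describes, here is a minimal sketch in Python. All names (GestureModel, confidence, match_gesture) and the similarity measure are illustrative assumptions; the patent does not disclose an implementation.

from dataclasses import dataclass

# Minimal sketch of confidence-threshold gesture matching, as described
# in the patent summary above. All names and the similarity measure are
# illustrative assumptions, not taken from the patent itself.

@dataclass
class GestureModel:
    name: str                # gesture identifier, e.g. "flip"
    template: list[float]    # reference motion samples for this gesture
    threshold: float         # minimum confidence required to act

def confidence(motion: list[float], template: list[float]) -> float:
    """Toy similarity score in [0, 1]; 1.0 is an exact match."""
    if len(motion) != len(template):
        return 0.0
    error = sum((m - t) ** 2 for m, t in zip(motion, template))
    return 1.0 / (1.0 + error)

def match_gesture(motion: list[float], models: list[GestureModel]) -> str | None:
    """Return the best-matching gesture whose confidence clears its threshold."""
    best_name, best_score = None, 0.0
    for model in models:
        score = confidence(motion, model.template)
        if score >= model.threshold and score > best_score:
            best_name, best_score = model.name, score
    return best_name  # None: confidence too low, so no action is performed

# Example: a noisy sensor reading still matches the stored "flip" template.
models = [GestureModel("flip", [0.0, 1.0, 0.0], threshold=0.5)]
print(match_gesture([0.1, 0.9, 0.0], models))  # -> flip

In the patented method, the input model would come from device memory and the motion data from a physical motion sensor; the threshold gates whether any action is performed at all, which is what keeps spurious movements from triggering actions.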

Career Highlights

Zhe Chen is currently employed at BlackBerry Corporation, where he continues to work on motion-based input technology. His patents focus on making interaction with mobile devices more intuitive and responsive.

Collaborations

Chen's co-inventors include Nazih Almalki and Marcin Cietwierkowski; on average, his patents list 2.4 co-inventors.

Conclusion

Zhe Chen's patents in motion-based input technology, centered on gesture recognition and sensor-driven interaction, continue to shape how users interact with their devices.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com