Beijing, China

Zhongwei Cheng


Average Co-Inventor Count = 2.9

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2017-2019

2 patents (USPTO)

Innovations of Zhongwei Cheng

Introduction

Zhongwei Cheng is an inventor based in Beijing, China. He has contributed to the field of technology, particularly in detecting abnormal situations and recognizing the dangerousness of objects. With a total of 2 patents, his work describes methods aimed at improving safety and efficiency.

Latest Patents

Zhongwei Cheng's latest patents include a method and apparatus for detecting abnormal situations. This invention involves detecting whether a first target exists in an obtained image and recognizing whether the target holds an object. The method then obtains motion information of the object and determines, based on this information, whether an abnormal situation exists.

Another patent focuses on recognizing the dangerousness of an object. This method generates a heterogeneous point cloud from an image captured by a stereo camera, allowing the object's solid shape and surface features to be assessed in order to determine its dangerousness.
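The abnormal-situation flow described above (detect a target, check whether it holds an object, examine the object's motion, then decide) can be sketched in outline. This is a minimal illustrative sketch only: the data structure, decision rule, and speed threshold below are assumptions for exposition, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical per-frame detection result (illustrative fields)."""
    target_found: bool    # was a first target detected in the image?
    holds_object: bool    # does the target hold an object?
    object_speed: float   # assumed motion measure, e.g. pixels per frame

def is_abnormal(det: Detection, speed_threshold: float = 5.0) -> bool:
    """Flag an abnormal situation when a detected target holds an object
    whose motion exceeds a threshold (an assumed example rule)."""
    if not det.target_found or not det.holds_object:
        return False
    return det.object_speed > speed_threshold

# A target holding a fast-moving object is flagged; one without is not.
print(is_abnormal(Detection(True, True, 8.0)))   # True
print(is_abnormal(Detection(True, False, 8.0)))  # False
```

In practice the detection and motion-estimation steps would come from a vision pipeline; the sketch only shows how the three pieces of information combine into a decision.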

Career Highlights

Zhongwei Cheng is employed at Ricoh Company, Ltd., where he continues to develop solutions that address safety concerns in a range of applications.

Collaborations

He collaborates with coworkers such as Shengyin Fan and Xin Wang.

Conclusion

Zhongwei Cheng's patents reflect his work on object recognition and on methods for detecting abnormal situations.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com