Kaizhang Kang
Hangzhou, China

Average Co-Inventor Count: 3.0
ph-index: 1
Years Active: 2023
Patents: 1 (USPTO)

Kaizhang Kang: Innovator in 3D Object Analysis

Introduction

Kaizhang Kang is an inventor based in Hangzhou, China, working in the field of three-dimensional object analysis. His patented work uses neural networks to improve the accuracy and efficiency of recovering the geometric and material properties of 3D objects.

Latest Patents

Kaizhang Kang holds one patent, titled "Methods for obtaining normal vector, geometry and material of three-dimensional objects based on neural network." The method actively illuminates an object with specific light patterns while simultaneously capturing images of it. From the captured images it computes per-point normal vectors, which are then used to optimize the object's geometric model. The approach yields high-quality geometry while also capturing material feature information, and it is notable for its accuracy and versatility, as it is not tied to any specific acquisition device.
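The patent summary above is high level, so the following is only an illustration of the general principle it builds on: recovering a surface normal from images captured under known, controlled illumination. This sketch uses classical Lambertian photometric stereo (a least-squares solve), not the patented neural-network method; the function name and the synthetic light directions are illustrative assumptions.

```python
import numpy as np

def estimate_normal(light_dirs, intensities):
    """Recover a unit surface normal from pixel intensities observed
    under known directional lights (classical Lambertian photometric
    stereo, shown only to illustrate the underlying idea).

    light_dirs: (m, 3) array of unit light directions.
    intensities: (m,) array of observed intensities at one pixel.
    Returns a (3,) unit normal vector.
    """
    # Lambertian model: I = albedo * (L @ n); solve for the scaled
    # normal in the least-squares sense, then normalize.
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)
    return g / np.linalg.norm(g)

# Synthetic example: three light directions and a known ground-truth
# normal, from which we generate intensities and recover the normal.
L = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
L = L / np.linalg.norm(L, axis=1, keepdims=True)
true_n = np.array([0.2, 0.1, 1.0])
true_n = true_n / np.linalg.norm(true_n)
I = L @ true_n          # simulated observations (unit albedo)
n_hat = estimate_normal(L, I)
```

A neural approach such as the one the patent describes replaces this closed-form solve with a learned mapping from images to normals (and material parameters), which can handle non-Lambertian materials that break the simple linear model used here.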

Career Highlights

Throughout his career, Kaizhang Kang has worked with esteemed institutions such as Zhejiang University and Faceunity Technology Co., Ltd. His experience in these organizations has contributed to his expertise in the field of neural networks and 3D object analysis.

Collaborations

Kaizhang has collaborated with notable individuals in his field, including Hongzhi Wu and Kun Zhou. These partnerships have further enriched his research and development efforts.

Conclusion

Kaizhang Kang's innovative methods in 3D object analysis demonstrate his commitment to advancing technology through neural networks. His contributions are paving the way for more accurate and efficient methods in the field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com