Pittsburgh, PA, United States of America

Kang Li


Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 18 (Granted Patents)


Company Filing History:


Years Active: 2011

1 patent (USPTO)

Title: Kang Li - Innovator in Image Segmentation Technology

Introduction

Kang Li is a notable inventor based in Pittsburgh, PA, who has made significant contributions to the field of image segmentation. Focused on developing systems and methods for efficient segmentation of globally optimal surfaces, his work applies to a range of problems involving volumetric datasets.

Latest Patents

Kang Li holds a patent for a "System and methods for image segmentation in n-dimensional space." This system provides efficient segmentation of object boundaries in volumetric datasets. The optimal surface detection system he developed can simultaneously detect multiple interacting surfaces, with optimality controlled by cost functions designed for individual surfaces and by geometric constraints that define surface smoothness and interrelations. He has 1 patent to his name.
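The idea of finding a surface whose total cost is minimal subject to a smoothness constraint can be illustrated with a much-simplified sketch. The patent describes a multi-surface, n-dimensional formulation; the toy below handles only a single surface over a 2D cost array using dynamic programming, with `segment_surface` and the `smoothness` parameter being illustrative names, not the patented method:

```python
# Illustrative sketch (not the patented algorithm): pick one surface
# height z[x] per column x of a 2D cost array, minimizing total cost
# while enforcing the geometric constraint |z[x] - z[x+1]| <= smoothness.

def segment_surface(cost, smoothness=1):
    """cost[x][z] is the cost of placing the surface at height z in
    column x. Returns the per-column heights of a minimum-cost surface
    satisfying the smoothness constraint."""
    n_cols, n_z = len(cost), len(cost[0])
    # dp[z]: best total cost of columns 0..x ending at height z
    dp = list(cost[0])
    back = []  # back[x-1][z]: chosen height in column x-1 for height z in x
    for x in range(1, n_cols):
        new_dp, choices = [], []
        for z in range(n_z):
            # Only heights within `smoothness` of z are reachable
            lo, hi = max(0, z - smoothness), min(n_z, z + smoothness + 1)
            prev = min(range(lo, hi), key=lambda p: dp[p])
            new_dp.append(dp[prev] + cost[x][z])
            choices.append(prev)
        dp = new_dp
        back.append(choices)
    # Trace back from the cheapest final height to recover the surface
    z = min(range(n_z), key=lambda h: dp[h])
    surface = [z]
    for choices in reversed(back):
        z = choices[z]
        surface.append(z)
    return surface[::-1]
```

A low-cost valley that zigzags between heights, for example, is followed exactly when the smoothness bound allows a step of one: `segment_surface([[0, 5], [5, 0], [0, 5]], smoothness=1)` yields `[0, 1, 0]`. The patented approach instead transforms such problems into a minimum s-t cut on a geometric graph, which scales to multiple interacting surfaces in n dimensions.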

Career Highlights

Throughout his career, Kang Li has worked with esteemed institutions such as the University of Iowa Research Foundation and the University of Notre Dame. His experience in these organizations has allowed him to refine his skills and contribute to groundbreaking research in image processing.

Collaborations

Kang Li has collaborated with notable colleagues, including Xiaodong Wu and Danny Ziyi Chen. Their combined expertise has further advanced the field of image segmentation and enhanced the impact of their research.

Conclusion

Kang Li's innovative work in image segmentation technology showcases his dedication to advancing the field. His contributions, particularly through his patent, highlight the importance of efficient systems in processing complex volumetric data.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com