Xinju Li

San Jose, CA, United States of America

Average Co-Inventor Count = 3.0

ph-index = 1

Forward Citations = 7 (Granted Patents)


Location History:

  • Ann Arbor, MI (US) (2012)
  • San Jose, CA (US) (2013)

Company Filing History:


Years Active: 2012-2013

2 patents (USPTO)

Title: Innovations of Inventor Xinju Li

Introduction

Xinju Li is an inventor based in San Jose, CA. He holds 2 US patents in image processing, with work focused on improving image feature detection through Hough-transform-based methods.

Latest Patents

One of Xinju Li's latest patents is the "Hough transform method for linear ribbon and circular ring detection in the gradient domain." The method converts a portion of an image from a first domain to a second domain (the gradient domain, per the title) and applies a Hough transform to the converted image, calculating a range of angles for each tested pixel relative to a center pixel. It quantizes that range of angles into multiple bins and uses a weighted voting scheme to detect features in the image. The method can be implemented by program instructions executing in parallel on CPUs or GPUs.
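
The granted claims are the authoritative description of this method. As a rough illustration of the general idea only (gradient-domain conversion, an angle range per tested pixel, quantized angle bins, and weighted voting), the NumPy sketch below implements a classical gradient-constrained Hough line accumulator. The function name, the central-difference gradient, the magnitude threshold, and the 10-degree angle window are illustrative assumptions, not details of Li's patent, and the parallel CPU/GPU execution noted above is omitted.

```python
import numpy as np

def gradient_hough_lines(image, num_angle_bins=180, angle_window_deg=10.0):
    """Weighted-voting Hough line accumulation in the gradient domain.

    Illustrative sketch only: each pixel with a strong enough gradient
    casts votes for (rho, theta) line candidates, but only within a narrow
    range of angle bins around its own gradient direction, and each vote
    is weighted by the gradient magnitude.
    """
    image = np.asarray(image, dtype=float)
    h, w = image.shape

    # Convert to the gradient domain with central differences;
    # a Sobel or other derivative filter could be substituted.
    gy, gx = np.gradient(image)
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)          # gradient angle (line normal)

    # Accumulator over (rho, theta); theta parameterizes the line normal.
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(-np.pi / 2, np.pi / 2, num_angle_bins, endpoint=False)
    accumulator = np.zeros((2 * diag + 1, num_angle_bins))

    window = np.deg2rad(angle_window_deg)
    ys, xs = np.nonzero(magnitude > magnitude.mean())   # "tested" pixels
    for y, x in zip(ys, xs):
        weight = magnitude[y, x]
        for i, theta in enumerate(thetas):
            # Angular distance modulo pi: the gradient direction constrains
            # the line orientation, so only nearby angle bins receive votes.
            diff = abs(theta - direction[y, x]) % np.pi
            diff = min(diff, np.pi - diff)
            if diff > window:
                continue
            rho = int(round(x * np.cos(theta) + y * np.sin(theta))) + diag
            accumulator[rho, i] += weight                # weighted vote
    return accumulator, thetas
```

Peaks in the returned accumulator identify candidate lines: a peak at bin (r, i) corresponds to the line x·cos(thetas[i]) + y·sin(thetas[i]) = r − diag.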

Career Highlights

Xinju Li is currently employed at Adobe, Inc., where he continues to develop innovative solutions in image processing. His expertise in this area has positioned him as a valuable asset to the company.

Collaborations

Xinju Li has collaborated with co-inventors including Simon Chen and Jen-Chan Jeff Chien, contributing to projects that advance Adobe's image-processing technology.

Conclusion

Xinju Li's contributions to image processing through his patents and work at Adobe, Inc. highlight his role as a leading inventor in the field. His innovative methods continue to push the boundaries of technology and improve image detection capabilities.
