Kui Liu
Irvine, CA, United States of America

Average Co-Inventor Count: 4.0
ph-index: 1
Years Active: 2025
Patents: 1 (USPTO)

Innovations of Kui Liu in Image-Based Barcode Detection

Introduction

Kui Liu is an inventor based in Irvine, California, working in the field of image processing, particularly barcode detection technology. His work includes a patented method designed to improve the efficiency and accuracy of image-based barcode detection.

Latest Patents

Kui Liu holds a patent for an image-based barcode detection method. The method captures an image and partitions it into sub-images. Each sub-image is provided to a detection model, which identifies one or more sub-image regions of interest (SROIs); each SROI is defined by its position and by a symbology category covering one or more barcode symbologies. The method then generates regions of interest (ROIs) from the SROIs and provides those ROIs to a decoder for processing.
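The pipeline described above can be sketched in a few lines of Python. This is an illustrative outline only, not the patented implementation: all names (`SROI`, `partition`, `detect_barcodes`) and the tiling scheme are assumptions, and the detection model and decoder are supplied as stand-in callables.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class SROI:
    """Hypothetical sub-image region of interest: position plus symbology category."""
    x: int          # top-left corner in full-image coordinates
    y: int
    w: int
    h: int
    symbology: str  # symbology category, e.g. "1D" or "2D"

def partition(width: int, height: int, tile: int) -> List[Tuple[int, int, int, int]]:
    """Partition the image frame into tile-sized sub-image rectangles (x, y, w, h)."""
    tiles = []
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            tiles.append((x, y, min(tile, width - x), min(tile, height - y)))
    return tiles

def detect_barcodes(
    width: int,
    height: int,
    tile: int,
    model: Callable[[Tuple[int, int, int, int]], List[SROI]],
    decoder: Callable[[SROI], str],
) -> List[str]:
    """Run the detection model on each sub-image, collect the reported SROIs
    as ROIs, and hand each ROI to the decoder."""
    rois: List[SROI] = []
    for sub in partition(width, height, tile):
        rois.extend(model(sub))  # model reports SROIs for this sub-image
    return [decoder(roi) for roi in rois]
```

For example, a stub model that reports a single 2D code in one tile of a 200×100 frame would yield one decoded result from `detect_barcodes(200, 100, 100, stub_model, decoder)`.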

Career Highlights

Kui Liu is currently employed at Zebra Technologies Corporation, a company known for its innovative solutions in barcode and RFID technology. His work at Zebra Technologies has allowed him to apply his expertise in image processing to real-world applications, contributing to the advancement of barcode detection systems.

Collaborations

Kui Liu's co-inventors include Dongqing Chen and Neeharika Nelaturu. Their collaborative research and development work has further enhanced the capabilities of the technologies they work on.

Conclusion

Kui Liu's contributions to image-based barcode detection exemplify the impact of innovative thinking in technology. His patent reflects a significant advancement in the field, showcasing his dedication to improving barcode detection methods.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com