San Francisco, CA, United States of America

Xiaoyang Li


Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:


Years Active: 2024-2025

2 patents (USPTO)

Innovations of Xiaoyang Li

Introduction

Xiaoyang Li is an inventor based in San Francisco, CA, who has contributed to the fields of image processing and visual fingerprinting. His 2 USPTO patents take innovative approaches to digital image manipulation and identification.

Latest Patents

Xiaoyang Li's latest patents include "Controllable Makeup Transfer via Scribble Inputs" and "Adversarially Robust Visual Fingerprinting and Image Provenance Models." The first patent describes systems and methods for image processing that allow users to add makeup to a face in an original image based on a scribble input. This technology utilizes a machine learning model to generate a target image that incorporates the specified makeup. The second patent focuses on deep visual fingerprinting models that identify matching digital images and their provenance. It employs robust contrastive learning to enhance the accuracy of image identification, even in the presence of adversarial examples.
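The patent text itself is not reproduced here, but the general technique the second patent builds on, contrastive learning for image fingerprinting, can be illustrated with a minimal sketch. This is not code from the patent: the function names, temperature, and matching threshold below are illustrative assumptions. The idea is that an encoder maps an image and a perturbed copy of it to nearby fingerprint vectors, while pushing apart fingerprints of different images, so that matching reduces to a cosine-similarity lookup.

```python
import numpy as np

def l2_normalize(x, eps=1e-9):
    """Project vectors onto the unit sphere so dot products are cosine similarities."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def nt_xent_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss (illustrative, not the patented method).

    Row i of `positives` is a perturbed copy of row i of `anchors`;
    every other row in the batch serves as a negative example.
    """
    a = l2_normalize(anchors)
    p = l2_normalize(positives)
    logits = a @ p.T / temperature                    # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))               # pull matched pairs together

def match(query_fp, database_fps, threshold=0.9):
    """Return the index of the best-matching stored fingerprint, or None."""
    sims = l2_normalize(database_fps) @ l2_normalize(query_fp)
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None
```

In a trained system the fingerprints would come from a deep encoder (with adversarial perturbations mixed into the positives to harden the model); here random vectors stand in for embeddings only to show the loss and lookup mechanics.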

Career Highlights

Xiaoyang Li is currently employed at Adobe, Inc., where he continues to push the boundaries of image processing technology. His work has garnered attention for its practical applications in various digital media fields.

Collaborations

Some of Xiaoyang Li's notable coworkers include Maksym Andriushchenko and John Philip Collomosse. Their collaborative efforts contribute to the innovative environment at Adobe, fostering advancements in technology.

Conclusion

Xiaoyang Li's contributions to image processing and visual fingerprinting exemplify the spirit of innovation in technology. His patents reflect a commitment to enhancing digital experiences through advanced methodologies.

This text is generated by artificial intelligence and may not be accurate.