Matthieu Guillaumin
Berlin, Germany

Average Co-Inventor Count: 3.7
ph-index: 1
Years Active: 2021-2025
Patents: 2 (USPTO)

Matthieu Guillaumin: Innovator in 3D Object Retrieval and Image Selection Techniques

Introduction

Matthieu Guillaumin is an inventor based in Berlin, Germany. He has contributed to the fields of three-dimensional object retrieval and image selection techniques, and holds two USPTO patents in these areas.

Latest Patents

Guillaumin holds two recent patents. The first, "Three Dimensional Object Part Retrieval," describes devices and techniques for retrieving three-dimensional part models: two-dimensional image data is used to generate shape embeddings for the different parts of an object. The second, "Search Result Image Selection Techniques," prioritizes the images associated with an item based on query attributes, using attention scores to rank the images and improve the image-retrieval experience.
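As a rough illustration of the second idea (attention-style scoring of an item's images against query attributes), the sketch below scores each image by attribute overlap, normalizes the scores with a softmax, and ranks the images. All names, the data layout, and the scoring scheme here are assumptions for illustration only; this is not the patented method.

```python
# Illustrative sketch only, NOT the patented technique: rank an item's
# images by a softmax "attention" score over matched query attributes.
import math

def attention_rank(images, query_attrs):
    """Score each image by how many query attributes it matches,
    normalize the scores with a softmax, and return (id, score)
    pairs sorted from most to least relevant."""
    raw = [sum(attr in img["attrs"] for attr in query_attrs)
           for img in images]
    exp = [math.exp(s) for s in raw]          # softmax numerator
    total = sum(exp)                          # softmax denominator
    scores = [e / total for e in exp]
    ranked = sorted(zip(images, scores), key=lambda p: p[1], reverse=True)
    return [(img["id"], round(score, 3)) for img, score in ranked]

# Hypothetical product images for a query like "red shoe".
images = [
    {"id": "front", "attrs": {"red", "shoe"}},
    {"id": "side",  "attrs": {"red"}},
    {"id": "box",   "attrs": set()},
]
print(attention_rank(images, {"red", "shoe"}))
```

The softmax turns raw match counts into a probability-like distribution, so the image matching both attributes ("front") is ranked above the partial and non-matching ones.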

Career Highlights

Matthieu Guillaumin is currently employed at Amazon Technologies, Inc. His patented work has practical applications across industries, particularly in improving the efficiency of image and object retrieval systems.

Collaborations

Guillaumin collaborates with co-inventors including Nikhil Garg and Bojan Pepik on his patented work.

Conclusion

Matthieu Guillaumin's patents in 3D object retrieval and image selection techniques contribute to how digital images and objects are searched, ranked, and retrieved.

This text is generated by artificial intelligence and may not be accurate.