Berlin, Germany

Michael Donoser

USPTO Granted Patents = 10 

Average Co-Inventor Count = 4.9

ph-index = 3

Forward Citations = 89 (Granted Patents)


Location History:

  • Berlin, DE (2022)
  • Berlin, DE (2016 - 2023)

Company Filing History:

  • Amazon Technologies, Inc.
Years Active: 2016-2023


Title: Biography of Inventor Michael Donoser

Introduction: Michael Donoser is a prominent inventor based in Berlin, Germany. He has made significant contributions to the field of technology, particularly in content selection and image data analysis. With a total of 10 patents to his name, Donoser continues to innovate and push the boundaries of his field.

Latest Patents: Among his latest patents is "Attribute-based content selection and search," which describes systems and techniques for selecting and searching content based on visual attributes. The invention includes a graphical user interface (GUI) that displays products with various visual attributes, allowing users to make selections based on visual similarities and dissimilarities. Another notable patent is "Relevant text identification based on image feature selection," which focuses on predicting text relevant to image data by establishing a correspondence between image portions and related text.
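
To make the idea of attribute-based content selection more concrete, the sketch below ranks catalog items by their closeness to a query item on the visual attributes a user has selected. This is a minimal, generic illustration under assumed representations (attribute scores in [0, 1]); the Item class, the attribute names, and the rank_by_attributes function are hypothetical and do not reflect the actual patented system.

from dataclasses import dataclass
from math import sqrt

@dataclass
class Item:
    name: str
    # Visual attribute scores (e.g. "sleeve_length", "collar_shape"), each in [0, 1].
    # These names and the scoring scheme are illustrative assumptions.
    attributes: dict[str, float]

def rank_by_attributes(query: Item, catalog: list[Item], selected: list[str]) -> list[Item]:
    """Order catalog items by closeness to the query on the user-selected attributes."""
    def distance(other: Item) -> float:
        # Euclidean distance restricted to the attributes the user cares about.
        return sqrt(sum(
            (query.attributes.get(a, 0.0) - other.attributes.get(a, 0.0)) ** 2
            for a in selected
        ))
    return sorted(catalog, key=distance)

if __name__ == "__main__":
    query = Item("query shirt", {"sleeve_length": 0.9, "collar_shape": 0.2})
    catalog = [
        Item("shirt A", {"sleeve_length": 0.85, "collar_shape": 0.25}),
        Item("shirt B", {"sleeve_length": 0.10, "collar_shape": 0.90}),
    ]
    # The user has selected "sleeve_length" as the attribute that matters;
    # items most similar to the query on that attribute are listed first.
    for item in rank_by_attributes(query, catalog, ["sleeve_length"]):
        print(item.name)

In a real system the attribute scores would come from learned visual models rather than hand-set values, but the ranking step shown here captures the selection-by-visual-similarity idea described in the patent summary.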

Career Highlights: Michael Donoser currently works at Amazon Technologies, Inc., where he applies his expertise in developing innovative solutions. His work has been instrumental in enhancing user experience through advanced content selection and image analysis techniques.

Collaborations: Throughout his career, Donoser has collaborated with notable colleagues, including Pradeep Krishna Yarlagadda and Cédric Philippe Charles Jean Ghislain Archambeau. These collaborations have further enriched his work and contributed to the success of various projects.

Conclusion: Michael Donoser is a distinguished inventor whose work continues to advance content selection and image analysis technologies.
