Austin, TX, United States of America

Mikhail Kotov


Average Co-Inventor Count = 9.0

ph-index = 1


Company Filing History:


Years Active: 2024

1 patent (USPTO):

Mikhail Kotov: Innovator in Visual Search Technology

Introduction

Mikhail Kotov is a notable inventor based in Austin, Texas, recognized for his contributions to visual search technology. With a focus on enhancing the way users interact with images, Kotov has developed innovative solutions that bridge the gap between visual and textual information.

Latest Patents

Kotov holds a patent for a technology titled "Text Adjusted Visual Search." This invention improves visual search results by combining the visual similarity and the textual similarity between images. The visual similarity is quantified as a visual similarity score, while the textual similarity is determined from associated text, such as a title. The overall similarity of two images is then a weighted combination of these two scores, and the weighting is user-configurable through a control on the search interface. This technology aims to improve the accuracy and relevance of visual search results.
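The weighted combination described above can be illustrated with a short sketch. Everything here is hypothetical: the function name, the score inputs, and the example data are not taken from the patent, which does not publish an implementation; the sketch only shows how a single slider weight could blend two similarity scores.

```python
def combined_similarity(visual_score, textual_score, text_weight=0.5):
    """Blend visual and textual similarity into one overall score.

    text_weight models the user-configurable control on the search
    interface, a value in [0, 1]: 0 ranks purely by visual similarity,
    1 purely by textual similarity. Names are illustrative only.
    """
    if not 0.0 <= text_weight <= 1.0:
        raise ValueError("text_weight must be in [0, 1]")
    return (1.0 - text_weight) * visual_score + text_weight * textual_score


# Hypothetical search results: (image, visual score, textual score).
results = [("sunset.jpg", 0.9, 0.3), ("beach.jpg", 0.6, 0.8)]

# With the control weighted toward text (0.7), the textually closer
# image ranks first: 0.3*0.6 + 0.7*0.8 = 0.74 beats 0.3*0.9 + 0.7*0.3 = 0.48.
ranked = sorted(results,
                key=lambda r: combined_similarity(r[1], r[2], 0.7),
                reverse=True)
```

At `text_weight=0.7`, `beach.jpg` ranks above `sunset.jpg`; sliding the weight toward 0 would reverse the order, which is the configurability the patent describes.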

Career Highlights

Mikhail Kotov is currently employed at Adobe, Inc., where he continues to innovate and develop cutting-edge technologies. His work at Adobe has positioned him as a key player in the field of visual search, contributing to the company's reputation for excellence in digital media solutions.

Collaborations

Kotov has collaborated with talented individuals such as Roland Geisler and Saeid Motiian, further enriching his work and expanding the impact of his innovations.

Conclusion

Mikhail Kotov's contributions to visual search technology exemplify the intersection of creativity and technical expertise. His patent for "Text Adjusted Visual Search" represents a significant advancement in how users can interact with and retrieve visual information.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com