Thomas W Colthurst
Somerville, MA, United States of America

Average Co-Inventor Count = 5.3
ph-index = 1
Forward Citations = 1 (Granted Patents)
Patents (USPTO) = 3

Company Filing History: Years Active 2014-2018

Innovations of Thomas W. Colthurst

Introduction

Thomas W. Colthurst is an accomplished inventor based in Somerville, MA (US). He has made contributions to technology, particularly in search result presentation and image identification. With a total of 3 patents, his work addresses how users interact with digital information.

Latest Patents

Colthurst's latest patents include "Method and apparatus for animating transitions between search results," which covers animating the transition between consecutive sets of search engine results to improve the user experience. Another significant patent is "Identifying an image for an entity," which describes methods, systems, and apparatus for selecting images to associate with specific entities: a set of resources is identified, each containing images and references to entities, and images are assigned to entities based on their relevance.
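The patent summary describes this assignment only at a high level. The snippet below is a minimal, hypothetical Python sketch of that general idea, assuming each resource record carries a simple relevance score and plain dictionaries stand in for resources; it is an illustration of the described flow, not the patented implementation.

def assign_images(resources):
    """Assign each entity the image from its most relevant resource."""
    best = {}  # entity -> (relevance score, image)
    for resource in resources:
        for entity in resource["entities"]:
            for image in resource["images"]:
                score = resource["relevance"]
                if entity not in best or score > best[entity][0]:
                    best[entity] = (score, image)
    return {entity: image for entity, (_, image) in best.items()}

# Hypothetical resource records: each has candidate images, the entities it
# references, and an assumed relevance score (not taken from the patent text).
resources = [
    {"images": ["eiffel_tower.jpg"], "entities": ["Eiffel Tower"], "relevance": 0.9},
    {"images": ["paris_skyline.jpg"], "entities": ["Eiffel Tower", "Paris"], "relevance": 0.6},
    {"images": ["louvre.jpg"], "entities": ["Louvre", "Paris"], "relevance": 0.8},
]

print(assign_images(resources))
# {'Eiffel Tower': 'eiffel_tower.jpg', 'Paris': 'louvre.jpg', 'Louvre': 'louvre.jpg'}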

Career Highlights

Colthurst is employed at Google Inc., where he continues to develop new technologies; his work there has kept him at the forefront of advances in search engine technology and image processing.

Collaborations

His notable collaborators include Matthew K. Gray and Alison Cichowlas, who have worked with him on various projects within the company.

Conclusion

Thomas W. Colthurst's contributions to technology through his patents and work at Google Inc. highlight his role as a significant inventor in the field. His approaches to search result presentation and image identification continue to shape the way users interact with digital content.
