Vincent Michael Casser

Cambridge, MA, United States of America

USPTO Granted Patents = 4

Average Co-Inventor Count = 4.8

ph-index = 1


Company Filing History:


Years Active: 2023-2025


Vincent Michael Casser: Innovator in Neural Networks

Introduction

Vincent Michael Casser is an inventor based in Cambridge, MA (US). He has made significant contributions to the field of neural networks, holding four granted USPTO patents. His work focuses on technologies that advance image processing and depth prediction.

Latest Patents

Casser's latest patents include "Training instance segmentation neural networks through contrastive learning" and "Unsupervised depth prediction neural networks." The first describes a system of one or more computers that processes training images and generates embedding outputs for instance segmentation, an approach aimed at improving the accuracy of object detection in images. The second outlines a system that generates depth outputs from input images depicting the same scene, characterizing the camera motion between them to improve depth prediction.
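To give a sense of the contrastive idea behind instance-segmentation embeddings, here is a minimal NumPy sketch of a pull/push loss over per-pixel embeddings: pixels of the same instance are pulled toward their instance mean, and distinct instance means are pushed at least a margin apart. This is an illustrative toy, not the patented method; the function name, array shapes, and margin value are assumptions for this example.

```python
import numpy as np

def instance_contrastive_loss(embeddings, instance_ids, margin=1.0):
    """Toy contrastive loss over per-pixel embeddings (illustrative only).

    embeddings:   (H, W, D) float array of per-pixel embedding vectors.
    instance_ids: (H, W) int array labeling each pixel's instance.
    """
    ids = np.unique(instance_ids)
    # Mean embedding of each instance.
    means = {i: embeddings[instance_ids == i].mean(axis=0) for i in ids}

    # Pull term: average distance of each pixel to its own instance mean.
    pull = np.mean([
        np.linalg.norm(embeddings[instance_ids == i] - means[i], axis=1).mean()
        for i in ids
    ])

    # Push term: hinge penalty when two instance means lie closer than margin.
    push_terms = [
        max(0.0, margin - np.linalg.norm(means[a] - means[b]))
        for k, a in enumerate(ids) for b in ids[k + 1:]
    ]
    push = float(np.mean(push_terms)) if push_terms else 0.0
    return float(pull + push)
```

With well-separated instances the loss goes to zero: if all pixels of one instance share one embedding and the other instance's embedding sits farther than the margin away, both the pull and push terms vanish.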

Career Highlights

Throughout his career, Casser has worked with notable companies such as Google Inc. and Waymo LLC, where he developed and refined his expertise in neural networks and image processing.

Collaborations

Casser has collaborated with talented individuals in the field, including Soeren Pirk and Reza Mahjourian. These partnerships have contributed to the advancement of his research and innovations.

Conclusion

Vincent Michael Casser is a distinguished inventor whose work in neural networks has the potential to revolutionize image processing and depth prediction technologies. His contributions continue to shape the future of these fields.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com