Vienna, Austria

Martin Kampel



Average Co-Inventor Count = 5.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2023

1 patent (USPTO)

Martin Kampel - Innovator in Object Recognition Technology

Introduction

Martin Kampel is a notable inventor based in Vienna, Austria, who has made significant contributions to the field of object recognition technology. His innovative approach led to the development of a method for determining the type and state of an object of interest.

Latest Patents

Kampel holds a patent for a "Method for determining a type and a state of an object of interest." The method generates a depth map of a scene using a depth sensor, capturing the object of interest along with any occluding objects. From the depth map, three 2D occupancy views are computed from different viewing angles and fed into a trained convolutional neural network, which outputs both a class label and a bounding box for the object, enabling accurate identification and assessment of its state.
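The occupancy-view step described above can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the pinhole intrinsics, grid resolution, choice of projection planes, and the function names are all assumptions made for the example.

```python
import numpy as np

def depth_to_points(depth, fx=500.0, fy=500.0, cx=None, cy=None):
    """Back-project a depth map (H x W, metres) into 3D camera-space points.
    Assumes simple pinhole intrinsics; zero-depth pixels are discarded."""
    h, w = depth.shape
    cx = w / 2 if cx is None else cx
    cy = h / 2 if cy is None else cy
    v, u = np.nonzero(depth > 0)          # pixel coordinates with valid depth
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)    # (N, 3) point cloud

def occupancy_views(points, grid=64):
    """Project a point cloud onto three orthogonal planes (front: x-y,
    top: x-z, side: y-z) as binary occupancy grids of size grid x grid."""
    mins = points.min(axis=0)
    spans = np.maximum(points.max(axis=0) - mins, 1e-6)  # avoid divide-by-zero
    idx = ((points - mins) / spans * (grid - 1)).astype(int)
    views = np.zeros((3, grid, grid), dtype=np.uint8)
    for view, (a, b) in enumerate([(0, 1), (0, 2), (1, 2)]):
        views[view, idx[:, a], idx[:, b]] = 1            # mark occupied cells
    return views

# Example: a synthetic depth map with one rectangular object region.
depth = np.zeros((120, 160))
depth[40:80, 60:100] = 1.5  # a flat patch 1.5 m from the sensor
views = occupancy_views(depth_to_points(depth))
print(views.shape)  # (3, 64, 64)
```

The three resulting grids would then be stacked as input channels for the trained convolutional network, which predicts the class label and bounding box.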

Career Highlights

Throughout his career, Martin Kampel has worked with several prominent companies, including Cogvis Software and Consulting GmbH and Toyota Motor Europe NV/SA, where his experience allowed him to refine his skills and contribute to advances in object recognition technology.

Collaborations

Kampel has collaborated with notable professionals in his field, including Christopher Pramerdorfer and Rainer Planinc. These partnerships have further enhanced his work and innovation in object recognition.

Conclusion

Martin Kampel's contributions to the field of object recognition technology demonstrate his expertise and innovative spirit. His patented method showcases the potential for advancements in understanding and interacting with objects in various environments.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com