Vladimir Cooperman

Haifa, Israel

USPTO Granted Patents = 2 

Average Co-Inventor Count = 5.0

ph-index = 1

Forward Citations = 12 (Granted Patents)


Company Filing History:


Years Active: 2017


Vladimir Cooperman: Innovator in Dynamic Augmentation and Sensor-Based Interactions

Introduction

Vladimir Cooperman is an inventor based in Haifa, Israel, who has contributed to the fields of dynamic augmentation and sensor-based interactions. With two granted USPTO patents, his work reflects an innovative approach to technology and user interaction.

Granted Patents

Cooperman's two granted patents are "Dynamic augmentation of a physical scene" and "Multi-user sensor-based interactions." The first covers computer-readable storage media and methods for dynamically modifying the rendering of a physical scene, applying virtual articles in real time to enhance the user's experience. The second covers observing a user's actions or state through sensors on a different computing device, so that a device can communicate and interact with the user without sensing the user directly.
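The multi-user, sensor-based idea described above can be pictured as a simple relay: one device owns the sensors and publishes the user's observed state to peer devices that have no sensors of their own. The sketch below is purely illustrative; all class and method names are hypothetical and are not taken from the patent itself.

```python
# Toy sketch (assumed design, not the patented method): a sensing device
# observes the user and relays that state to subscribed peer devices,
# which can then react without performing any direct sensing.
from dataclasses import dataclass, field


@dataclass
class SensingDevice:
    """Device that owns the sensors and publishes user-state readings."""
    subscribers: list = field(default_factory=list)

    def subscribe(self, device: "PeerDevice") -> None:
        self.subscribers.append(device)

    def report_user_state(self, state: str) -> None:
        # Forward the observed state to every subscribed peer device.
        for device in self.subscribers:
            device.on_peer_state(state)


@dataclass
class PeerDevice:
    """Device with no sensors of its own; reacts to relayed state."""
    last_seen_state: str = "unknown"

    def on_peer_state(self, state: str) -> None:
        self.last_seen_state = state


sensor_device = SensingDevice()
peer = PeerDevice()
sensor_device.subscribe(peer)
sensor_device.report_user_state("user_waving")
print(peer.last_seen_state)  # → user_waving
```

In this toy version the "communication without direct sensing" is just a publish/subscribe hand-off in one process; a real system would carry the same state over a network link between devices.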

Career Highlights

Vladimir Cooperman is currently employed at Intel Corporation, where he continues to develop innovative technologies. His work at Intel has positioned him as a key player in advancing user interaction and augmented reality.

Collaborations

Cooperman has collaborated with colleagues including Kobi Nistel and Barak Hurwitz, and these partnerships contributed to the development of his patents.

Conclusion

Vladimir Cooperman's contributions to technology through his patents and work at Intel Corporation highlight his role as an influential inventor. His innovative approaches to dynamic augmentation and sensor-based interactions continue to shape the future of user experience.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com