Rafael Huber
Zurich, Switzerland

Average Co-Inventor Count = 5.0
ph-index = 1
Forward Citations = 1 (Granted Patents)

Company Filing History:
Years Active: 2018
Patents: 1 (USPTO)

Rafael Huber: Innovator in Video Salience Technology

Introduction

Rafael Huber is a notable inventor based in Zurich, Switzerland. He has made significant contributions to the field of video technology, particularly in understanding how visual salience can predict the success of online videos. His innovative approach combines computer science with predictive algorithms to enhance video performance metrics.

Latest Patents

Rafael Huber holds a patent titled "Visual salience of online video as a predictor of success," which covers systems, methods, and computer program products for computing a saliency value for a video. The method analyzes the saliency values of a set of pixels in each frame of the video, applies a predictive algorithm to that saliency value to compute an expected value for a success metric, and outputs the expected value as an indication of the outcome the video is likely to achieve on that metric. This is his only granted patent to date.
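The pipeline described above can be sketched in a few lines of Python. This is a minimal illustration, not the patented algorithm: the per-pixel saliency inputs, the mean-based aggregation, and the linear predictor (with made-up `baseline` and `gain` parameters) are all assumptions chosen only to show the shape of the computation.

```python
# Illustrative sketch of the described pipeline: per-pixel saliency in each
# frame -> one saliency value per video -> predicted value of a success metric.
# The aggregation (simple means) and the linear model are hypothetical.

def frame_saliency(frame):
    """Mean per-pixel saliency for one frame (a list of floats in [0, 1])."""
    return sum(frame) / len(frame)

def video_saliency(frames):
    """Aggregate frame-level saliencies into a single video-level value."""
    return sum(frame_saliency(f) for f in frames) / len(frames)

def predict_metric(saliency, baseline=1000.0, gain=5000.0):
    """Toy linear predictor: expected metric (e.g. views) from saliency."""
    return baseline + gain * saliency

frames = [
    [0.1, 0.9, 0.4, 0.6],  # per-pixel saliency values for frame 1
    [0.2, 0.8, 0.5, 0.5],  # per-pixel saliency values for frame 2
]
s = video_saliency(frames)          # 0.5 for this toy input
expected_views = predict_metric(s)  # 1000 + 5000 * 0.5 = 3500.0
print(s, expected_views)
```

In a real system the per-pixel saliency maps would come from a saliency-detection model and the predictor would be trained on historical video-performance data; here both are stubbed out to keep the example self-contained.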

Career Highlights

Rafael Huber is currently employed at Disney Enterprises, Inc., where he continues to innovate in the realm of video technology. His work focuses on enhancing the effectiveness of online video content through advanced analytical methods.

Collaborations

Rafael collaborates with talented individuals such as Seth Frey and Anthony M Accardo, contributing to a dynamic team that pushes the boundaries of video technology.

Conclusion

Rafael Huber's work in video salience technology exemplifies the intersection of creativity and technical expertise. His contributions are paving the way for more effective online video content, showcasing the importance of innovation in the digital age.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com