Kusterdingen, Germany

Patrick Wieschollek


Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:


Years Active: 2020-2022

Patents (USPTO): 2

Innovations by Patrick Wieschollek

Introduction

Patrick Wieschollek is an accomplished inventor based in Kusterdingen, Germany. He has made significant contributions to computer imaging, particularly through deep-learning methods. With two patents to his name, Wieschollek is recognized for his innovative approaches to complex imaging challenges.

Latest Patents

Wieschollek's latest patents cover a deep-learning method for separating the reflection and transmission images that overlap at a semi-reflective surface in a computer image of a real-world scene. Semi-reflective surfaces such as windows both reflect the scene in front of them and transmit the scene behind them, which complicates image processing. By separating these two components with a deep network, the invention produces clearer, more usable images and improves the performance of downstream computer applications.
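To illustrate the general idea of deep-learning layer separation, the sketch below shows a tiny encoder-decoder that takes one mixed image and predicts a transmission layer and a reflection layer. This is not the patented Nvidia method; the architecture, layer sizes, class name ReflectionSeparator, and the reconstruction loss are illustrative assumptions only.

```python
# Conceptual sketch of single-image reflection/transmission separation.
# NOT the patented method; all names and sizes here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReflectionSeparator(nn.Module):
    """Tiny encoder-decoder mapping one mixed image to two layers:
    the transmitted scene behind the glass and the reflected scene."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        # One head with 6 output channels: 3 for transmission, 3 for reflection.
        self.head = nn.Conv2d(channels, 6, 3, padding=1)

    def forward(self, mixed: torch.Tensor):
        features = self.encoder(mixed)
        out = self.head(features)
        transmission, reflection = out[:, :3], out[:, 3:]
        return transmission, reflection


if __name__ == "__main__":
    model = ReflectionSeparator()
    mixed = torch.rand(1, 3, 128, 128)  # stand-in for a "through-a-window" photo
    transmission, reflection = model(mixed)
    # One commonly assumed training signal: the predicted layers should
    # re-compose into the observed mixture.
    recon_loss = F.l1_loss(transmission + reflection, mixed)
    print(transmission.shape, reflection.shape, recon_loss.item())
```

In practice, methods of this kind are trained on pairs of mixed images and ground-truth layers (or on synthetic composites), with additional perceptual or gradient losses; the simple L1 reconstruction term above is only a placeholder for such a training objective.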

Career Highlights

Wieschollek is currently employed at Nvidia Corporation, a leading company in the field of graphics processing and artificial intelligence. His work at Nvidia allows him to collaborate with some of the brightest minds in technology, contributing to advancements in imaging and deep learning.

Collaborations

His notable collaborators include Orazio Gallo and Jinwei Gu, who share his commitment to pushing the boundaries of imaging and deep learning.

Conclusion

Patrick Wieschollek's contributions to deep learning and computer imaging exemplify the impact of innovative thinking in technology. His work continues to influence advancements in the industry, showcasing the importance of separating reflection and transmission in imaging applications.
