Christos Sakaridis
Zurich, Switzerland

Patents: 2 (USPTO)
Years Active: 2024-2025
Average Co-Inventor Count: 4.0
ph-index: 1

Christos Sakaridis: Innovator in Semantic Segmentation

Introduction

Christos Sakaridis is a prominent inventor based in Zurich, Switzerland. He has made significant contributions to the field of image processing, particularly semantic segmentation. With 2 patents on record, Sakaridis is recognized for his innovative approaches to image analysis under adverse visual conditions such as fog and low visibility.

Latest Patents

Sakaridis's latest patents include a system and method for training a model to perform semantic segmentation on low-visibility images by leveraging high-visibility images captured from a close camera view. The method involves obtaining multiple sets of images, iteratively training the model, and refining preliminary semantic segmentation labels between rounds to improve accuracy. Another notable patent covers training a model to perform semantic segmentation on foggy images, a capability crucial to applications such as autonomous driving and surveillance.
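The patent text above gives only the high-level steps, but the iterative scheme it describes resembles self-training: learn from labeled high-visibility images, predict preliminary labels on low-visibility images, filter them, and retrain. The following is a minimal sketch of that idea, assuming PyTorch; the TinySegNet model, the confidence-thresholding rule, and all names here are illustrative stand-ins, not the patented method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny segmentation network; any per-pixel classifier works here.
class TinySegNet(nn.Module):
    def __init__(self, num_classes=19):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, num_classes, 1),
        )

    def forward(self, x):
        return self.net(x)

def pseudo_labels(model, images, conf_thresh=0.9):
    """Predict preliminary labels for unlabeled low-visibility images and
    keep only confident pixels, marking the rest with the ignore index."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(images), dim=1)
        conf, labels = probs.max(dim=1)
        labels[conf < conf_thresh] = -100  # ignored by the loss below
    return labels

def train_round(model, optimizer, images, labels):
    """One supervised update; pixels labeled -100 are ignored."""
    if (labels != -100).sum() == 0:
        return 0.0  # no usable labels in this batch (e.g., untrained model)
    model.train()
    optimizer.zero_grad()
    loss = F.cross_entropy(model(images), labels, ignore_index=-100)
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy data standing in for the two image sets the patent text describes:
# labeled high-visibility images and unlabeled low-visibility counterparts.
clear_imgs = torch.randn(4, 3, 64, 64)
clear_lbls = torch.randint(0, 19, (4, 64, 64))
foggy_imgs = torch.randn(4, 3, 64, 64)

model = TinySegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Round 1: learn from the clearly visible, labeled images.
train_round(model, opt, clear_imgs, clear_lbls)

# Round 2: self-train on low-visibility images via filtered pseudo-labels,
# with confidence thresholding as a simple stand-in for whatever refinement
# the patented method applies to the preliminary labels.
train_round(model, opt, foggy_imgs, pseudo_labels(model, foggy_imgs))
```

In practice the refinement of the preliminary labels is where such methods differ most; confidence thresholding is only one simple choice.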

Career Highlights

Throughout his career, Sakaridis has worked with esteemed organizations such as ETH Zurich and Toyota Jidosha Kabushiki Kaisha. His experience in these institutions has allowed him to refine his skills and contribute to groundbreaking research in image processing technologies.

Collaborations

Sakaridis has collaborated with notable professionals in his field, including Luc Van Gool and Dengxin Dai. These partnerships have further enriched his work and expanded the impact of his innovations.

Conclusion

Christos Sakaridis stands out as a key figure in the realm of semantic segmentation, with his patents paving the way for advancements in image processing. His work continues to influence the industry and inspire future innovations.
