London, United Kingdom

Daniele Quercia

USPTO Granted Patents = 2

Average Co-Inventor Count = 3.4

ph-index = 1


Company Filing History (Years Active: 2021-2023)


Daniele Quercia: Innovator in Virtual Meeting Technologies

Introduction

Daniele Quercia is a prominent inventor based in London, United Kingdom. He has made significant contributions to technology, particularly in virtual meetings and image processing. With two granted USPTO patents, Quercia continues to push the boundaries of innovation.

Latest Patents

Quercia's latest patents include an apparatus and method for virtual meetings. The invention receives multiple interaction inputs from participants during a virtual meeting, determines significant time slices based on those inputs, and produces a summary of the meeting that includes audio data from the key moments.

Another notable patent focuses on providing anonymized content within images. The method analyzes an image to identify its scene category and generates an anonymized version by applying a morphing model that balances the original image against a generic representation of the identified scene.
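As a rough illustration only, and not the patented implementations, the sketch below shows one simple way the two ideas could be expressed: selecting "significant" time slices by interaction activity, and blending an image value with a generic scene representation. All names, the threshold, and the alpha parameter are hypothetical assumptions for illustration.

```python
# Illustrative sketch; the helpers and parameters here are hypothetical,
# not the methods described in the patents.
from dataclasses import dataclass
from typing import List


@dataclass
class TimeSlice:
    start: float            # seconds from meeting start
    end: float               # seconds from meeting start
    interaction_count: int   # e.g. chat messages, reactions, speaker changes


def significant_slices(slices: List[TimeSlice], threshold: int) -> List[TimeSlice]:
    """Keep the time slices whose interaction activity meets a threshold;
    their audio could then be stitched into a meeting summary."""
    return [s for s in slices if s.interaction_count >= threshold]


def anonymize_value(original: float, generic: float, alpha: float) -> float:
    """Blend a value from the original image with the corresponding value of a
    generic representation of the detected scene category; alpha trades off
    fidelity to the original against anonymity."""
    return alpha * original + (1.0 - alpha) * generic
```

A plain linear blend like this is only one way to picture the "balancing" between the original image and a generic scene representation; the patented morphing model and the criteria for significant time slices are defined in the patents themselves.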

Career Highlights

Daniele Quercia is currently employed at Nokia Technologies Oy. His work has garnered attention for its practical applications in enhancing virtual communication and protecting privacy in image sharing.

Collaborations

Quercia collaborates with co-inventors such as Marios Constantinides and Ke Zhou.

Conclusion

Daniele Quercia stands out as an influential inventor in the technology sector, particularly with his advancements in virtual meeting technologies and image processing. His contributions continue to shape the future of communication and privacy in digital interactions.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com