San Diego, CA, United States of America

Matthew Wnuk

USPTO Granted Patents = 1

Average Co-Inventor Count = 4.0

ph-index = 1


Company Filing History:

Years Active: 2025

The Innovative Contributions of Matthew Wnuk

Introduction

Matthew Wnuk is an inventor based in San Diego, California, working in visual speech recognition technology. His patented work concerns electronic apparatuses that interpret human speech from video using deep learning.

Latest Patents

Wnuk holds a patent titled "Visual speech recognition based on connectionist temporal classification loss." The electronic apparatus and method it describes use a Deep Neural Network (DNN), trained with a connectionist temporal classification (CTC) loss function, to analyze video of a human speaker and map lip movements to a sequence of characters. The apparatus detects word boundaries in that character sequence, segments the video into clips corresponding to individual words, and generates a sequence of word predictions that forms the sentence or phrase spoken in the video.
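CTC loss is a standard objective for sequence models when frame-level alignments between the video and the transcript are unavailable: a blank token lets the network emit "no character" on a frame, and the loss marginalizes over all alignments consistent with the target text. Below is a minimal, illustrative PyTorch sketch of that general technique; the model architecture, vocabulary, feature dimensions, and the greedy decoding step are assumptions made for demonstration, not details taken from the patent itself.

```python
import torch
import torch.nn as nn

# Hypothetical character vocabulary; index 0 is the CTC blank token.
VOCAB = ["<blank>"] + list("abcdefghijklmnopqrstuvwxyz' ")
NUM_CLASSES = len(VOCAB)

class LipReader(nn.Module):
    """Toy model: per-frame visual features -> BiGRU -> character log-probs."""
    def __init__(self, feat_dim=512, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * hidden, NUM_CLASSES)

    def forward(self, frame_feats):              # (batch, time, feat_dim)
        out, _ = self.rnn(frame_feats)
        return self.fc(out).log_softmax(dim=-1)  # (batch, time, classes)

def greedy_decode(log_probs_1):                  # (time, classes), one video
    ids = log_probs_1.argmax(dim=-1).tolist()
    chars, prev = [], 0
    for i in ids:
        if i != 0 and i != prev:                 # drop blanks, collapse repeats
            chars.append(VOCAB[i])
        prev = i
    text = "".join(chars)
    return text.split(" ")                       # spaces mark word boundaries

model = LipReader()
ctc_loss = nn.CTCLoss(blank=0)

# Dummy batch: 2 videos, 75 frames each, with precomputed 512-dim features.
feats = torch.randn(2, 75, 512)
log_probs = model(feats)

# Unaligned target transcripts as character indices; CTC learns the alignment.
targets = torch.randint(1, NUM_CLASSES, (2, 20))
input_lengths = torch.full((2,), 75, dtype=torch.long)
target_lengths = torch.full((2,), 20, dtype=torch.long)

# nn.CTCLoss expects log-probs shaped (time, batch, classes).
loss = ctc_loss(log_probs.transpose(0, 1), targets, input_lengths, target_lengths)
loss.backward()

words = greedy_decode(log_probs[0].detach())     # word list for the first video
```

The word-boundary step in the sketch simply splits the decoded character string on spaces; the patented apparatus describes using such boundaries to segment the video itself into per-word clips.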

Career Highlights

Matthew Wnuk is currently employed at Sony Group Corporation, where he continues to work on speech recognition technology. His research addresses how machines can interpret human speech from visual cues alone, without relying on audio.

Collaborations

Wnuk collaborates with Shiwei Jin and Jong Hwa Lee, co-inventors on his patented work in visual speech recognition.

Conclusion

Matthew Wnuk's work in visual speech recognition sits at the intersection of machine learning and human communication. His patent and his position at Sony Group Corporation mark him as an active inventor in this field.
