Chelmsford, MA, United States of America

Yeunung Chen


Average Co-Inventor Count = 3.0

ph-index = 1

Forward Citations = 53 (Granted Patents)


Company Filing History:


Years Active: 1992

1 patent (USPTO)

Yeunung Chen: Innovator in Speech Recognition Technology

Introduction

Yeunung Chen is an inventor based in Chelmsford, MA (US) whose contributions are in speech recognition technology. His patented approach uses formant frequencies to improve the accuracy of speech recognition systems.

Latest Patents

Yeunung Chen holds a patent for a "Method for utilizing formant frequencies in speech recognition." The patent describes a speech recognizer that uses hypothesis testing on formant frequencies. A pre-processor receives frames of the speech signal and applies linear predictive coding to generate candidate formant frequencies. An optimum formant selector, working with a comparator, then selects the formants that best match stored reference formants. Finally, a dynamic time warper and high-level recognition logic decide whether to declare a recognized word.
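To make that pipeline concrete, the sketch below illustrates only the LPC-based formant-candidate stage and a simple candidate-versus-reference selection step, written in Python with NumPy. It is a minimal illustration under assumed parameters (8 kHz sample rate, order-10 LPC, a synthetic test frame), not the patent's implementation; the dynamic time warping and high-level recognition stages are omitted, and every function name here is hypothetical.

```python
# Minimal sketch of the formant-candidate and selection stages, using only NumPy.
# All names, the LPC order, the sample rate, and the synthetic test frame are
# illustrative assumptions -- this is not the patent's implementation.
import numpy as np


def lpc_coefficients(frame, order=10):
    """Autocorrelation-method LPC via the Levinson-Durbin recursion."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        err *= (1.0 - k * k)
    return a


def formant_candidates(frame, sample_rate=8000.0, order=10):
    """Candidate formant frequencies from the angles of the LPC polynomial roots."""
    a = lpc_coefficients(frame, order)
    roots = np.roots(a)
    roots = roots[np.imag(roots) > 0]          # one root per complex-conjugate pair
    freqs = np.angle(roots) * sample_rate / (2.0 * np.pi)
    return np.sort(freqs[freqs > 50.0])        # discard near-DC roots


def select_best_formants(candidates, reference_formants):
    """Stand-in for the optimum-formant-selector/comparator stage: for each
    stored reference formant, pick the closest candidate frequency."""
    return np.array([candidates[np.argmin(np.abs(candidates - ref))]
                     for ref in reference_formants])


if __name__ == "__main__":
    sr = 8000.0
    t = np.arange(int(0.03 * sr)) / sr         # one 30 ms frame
    # Synthetic vowel-like frame with resonances near 700 Hz and 1200 Hz.
    frame = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
    frame *= np.hamming(len(frame))
    cands = formant_candidates(frame, sample_rate=sr)
    picked = select_best_formants(cands, np.array([700.0, 1200.0]))
    print("candidate formants (Hz):", np.round(cands, 1))
    print("selected formants (Hz): ", np.round(picked, 1))
```

Running the script prints the candidate formant frequencies extracted from the synthetic frame and the two candidates that lie closest to the assumed reference formants at 700 Hz and 1200 Hz.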

Career Highlights

Yeunung Chen is affiliated with Texas Instruments, where he has worked on speech recognition. His work has contributed to the speech recognition technology used in a range of applications.

Collaborations

Yeunung Chen has collaborated with colleagues including George R Doddington and R Gary Leonard, and their combined expertise contributed to speech recognition projects within the company.

Conclusion

Yeunung Chen's contributions to speech recognition technology illustrate the impact of his work in the field. His patent and his work at Texas Instruments reflect a commitment to advancing technology that improves spoken communication.
