Foster City, CA, United States of America

Volodymyr Kondratenko


Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 1 (Granted Patents)


Company Filing History:


Years Active: 2022

1 patent (USPTO): Predicting encoding parameters for convex hull video encoding

Volodymyr Kondratenko: Innovator in Video Encoding Technology

Introduction

Volodymyr Kondratenko is an inventor based in Foster City, CA (US). He has contributed to the field of video encoding, in particular through methods that improve the efficiency of video processing.

Latest Patents

Kondratenko holds a patent for a method titled "Predicting encoding parameters for convex hull video encoding." This computer-implemented method downsamples and encodes video segments into multiple encoded segments using an analysis encoder, then decodes and upsamples those segments back to their original resolution. From the decoded segments it determines an analysis encoding parameter value set, which is used to predict a target encoding parameter value set for a target encoder. The aim is to find good per-segment encoding settings without exhaustively encoding every combination of resolution and quality with the (typically more expensive) target encoder, making convex hull video encoding more efficient.
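
To make the workflow concrete, here is a minimal Python sketch of the general idea, not the patented implementation: a cheap "analysis" pass encodes downsampled copies of a segment at several resolutions and quantization settings, the results are decoded and upsampled so rate and distortion can be measured at the original resolution, the convex hull of those operating points is kept, and a simple mapping predicts settings for the target encoder. All function names, the toy quantizer used as the analysis encoder, the entropy-based rate proxy, and the linear QP mapping are illustrative assumptions.

```python
import numpy as np


def downsample(segment: np.ndarray, factor: int) -> np.ndarray:
    """Downsample a 2D 'frame' by block averaging (toy stand-in for a scaler)."""
    h, w = segment.shape
    h, w = h - h % factor, w - w % factor
    return segment[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))


def upsample(segment: np.ndarray, factor: int) -> np.ndarray:
    """Upsample back to the original grid by nearest-neighbour repetition."""
    return np.repeat(np.repeat(segment, factor, axis=0), factor, axis=1)


def analysis_encode(segment: np.ndarray, qp: int):
    """Toy 'analysis encoder': uniform quantization with step size = qp.
    Returns the quantized symbols and a crude rate estimate (empirical entropy in bits)."""
    step = float(max(qp, 1))
    symbols = np.round(segment / step).astype(np.int64)
    _, counts = np.unique(symbols, return_counts=True)
    probs = counts / counts.sum()
    bits_per_sample = float(-(probs * np.log2(probs)).sum())
    return symbols, bits_per_sample * symbols.size


def analysis_decode(symbols: np.ndarray, qp: int) -> np.ndarray:
    """Inverse of the toy quantizer."""
    return symbols.astype(np.float64) * float(max(qp, 1))


def lower_convex_hull(points):
    """Lower convex hull of (rate, distortion, params) operating points
    (Andrew's monotone chain with x = rate, y = distortion)."""
    pts = sorted(points, key=lambda p: (p[0], p[1]))
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (ox, oy, _), (ax, ay, _) = hull[-2], hull[-1]
            cross = (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox)
            if cross <= 0:      # middle point lies on or above the new edge: drop it
                hull.pop()
            else:
                break
        hull.append(p)
    return hull


def predict_target_qp(analysis_qp: int) -> int:
    """Hypothetical mapping from analysis-encoder QP to target-encoder QP.
    A real system could fit this from data; here it is a fixed linear guess."""
    return int(round(1.2 * analysis_qp + 2))


# Example run over one synthetic segment (dimensions divisible by every factor).
rng = np.random.default_rng(0)
segment = rng.normal(128.0, 40.0, size=(64, 64))    # stand-in for one video segment

points = []
for factor in (1, 2, 4):                             # candidate downsampling factors
    low_res = downsample(segment, factor)
    for qp in (4, 8, 16, 32):                        # candidate analysis QPs
        symbols, rate = analysis_encode(low_res, qp)
        decoded = upsample(analysis_decode(symbols, qp), factor)
        decoded = decoded[:segment.shape[0], :segment.shape[1]]
        distortion = float(np.mean((segment - decoded) ** 2))   # MSE at full resolution
        points.append((rate, distortion, {"factor": factor, "qp": qp}))

for rate, distortion, params in lower_convex_hull(points):
    print(f"1/{params['factor']} resolution, analysis QP {params['qp']}: "
          f"rate ~ {rate:.0f} bits, MSE ~ {distortion:.1f} -> "
          f"predicted target QP {predict_target_qp(params['qp'])}")
```

In a production setting the analysis encoder would be a real, fast video encoder and the parameter mapping would be fitted to the behavior of the actual target encoder; the sketch only illustrates the shape of the workflow described in the patent title.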

Career Highlights

Volodymyr Kondratenko is currently employed at Meta Platforms, Inc., where he continues to develop and refine his work in video technology, with a focus on advancing video encoding methods.

Collaborations

Kondratenko has collaborated with colleagues including Ping-Hao Wu and Gaurang Chaudhari. These collaborations have contributed to his work on video encoding technology.

Conclusion

Volodymyr Kondratenko's contributions to video encoding technology exemplify the impact of innovative thinking in the tech industry. His patent and ongoing work at Meta Platforms, Inc. highlight his role as a forward-thinking inventor in this rapidly evolving field.
