San Ramon, CA, United States of America

Xiaomeng Dong

USPTO Granted Patents = 1 

Average Co-Inventor Count = 8.0

ph-index = 1


Company Filing History:


Years Active: 2025

1 patent (USPTO)

Innovations of Xiaomeng Dong in Deep Learning Image Processing

Introduction

Xiaomeng Dong is an inventor based in San Ramon, California. His work centers on deep learning image processing, particularly for medical imaging, where he has developed a system intended to improve the accuracy and efficiency of image analysis.

Latest Patents

Xiaomeng Dong holds a patent titled "Deep learning image processing via region-wise parameter maps," which describes systems and techniques for improved deep learning image processing. The system accesses a medical image whose pixels or voxels are allocated among various regions, executes a deep learning neural network to generate region-wise parameter maps, and uses those maps to render a transformed version of the medical image on an electronic display.
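The patent summary above is high level; the sketch below is only an illustration of what a region-wise parameter-map pipeline might look like in PyTorch. The class and function names (RegionParamNet, region_average) and the gain/bias parameterization are assumptions for illustration, not details taken from the patent.

```python
# Hypothetical sketch: a network predicts a per-pixel parameter map, the map is
# averaged within each region to obtain region-wise parameters, and those
# parameters transform the input image. Names and parameterization are illustrative.
import torch
import torch.nn as nn

class RegionParamNet(nn.Module):
    """Tiny CNN that predicts a per-pixel (gain, bias) parameter map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 2, kernel_size=3, padding=1),  # 2 channels: gain, bias
        )

    def forward(self, image):
        return self.net(image)

def region_average(param_map, region_mask, num_regions):
    """Average the parameter map inside each region so every pixel in a
    region shares the same parameters (a region-wise parameter map)."""
    out = torch.zeros_like(param_map)
    for r in range(num_regions):
        mask = (region_mask == r)  # (H, W) boolean mask for region r
        if mask.any():
            # Per-channel mean over the spatial positions belonging to this region.
            region_mean = param_map[:, :, mask].mean(dim=-1, keepdim=True)
            out[:, :, mask] = region_mean
    return out

# Example: a 1-channel "medical image" split into 3 horizontal regions.
image = torch.rand(1, 1, 64, 64)
region_mask = torch.zeros(64, 64, dtype=torch.long)
region_mask[21:42] = 1
region_mask[42:] = 2

model = RegionParamNet()
params = model(image)                              # (1, 2, 64, 64) per-pixel map
params = region_average(params, region_mask, 3)    # region-wise parameter map
gain, bias = params[:, 0:1], params[:, 1:2]
transformed = gain * image + bias                  # transformed image for display
```

In this sketch the transformation is a simple per-region affine adjustment; the actual patented technique may use a different parameterization and network architecture.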

Career Highlights

Xiaomeng Dong is currently employed at GE Precision Healthcare LLC, where he applies his expertise in deep learning and image processing to develop technologies that improve medical imaging capabilities. With one granted USPTO patent to his name, he continues to work on innovation in healthcare technology.

Collaborations

Xiaomeng has collaborated with notable colleagues, including Hongxu Yang and Gopal Biligeri Avinash. Their combined efforts contribute to the advancement of deep learning applications in medical imaging.

Conclusion

Xiaomeng Dong's work on deep learning image processing, including his patented region-wise parameter-map technique and his role at GE Precision Healthcare LLC, reflects his commitment to improving healthcare through technology.
