Shanghai, China

Xiaorong Ye


Average Co-Inventor Count = 4.0

h-index = 1

Forward Citations = 6 (Granted Patents)


Company Filing History:


Years Active: 2011

1 patent (USPTO)

Title: The Innovative Contributions of Xiaorong Ye

Introduction

Xiaorong Ye is a prominent inventor based in Shanghai, China. She has made significant contributions to the field of memory access technology. Her innovative work has led to the development of advanced methods and apparatuses that enhance the efficiency of read/write memory operations.

Latest Patents

Xiaorong Ye holds a patent titled "Calibration of read/write memory access via advanced memory buffer," which describes methods and apparatuses for calibrating read/write memory accesses over data buses of different lengths by means of an advanced memory buffer. In one embodiment, the advanced memory buffer (AMB) includes a plurality of ports to interface with the data buses, a port to connect to a common clock bus, and an adjustable circuit to level the delays on the data buses. This technology is particularly useful when the data buses have different wire lengths between the dynamic random access memory (DRAM) chips and the advanced memory buffer.
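The delay-leveling idea can be sketched in simplified form: each data bus has a propagation delay that grows with its wire length, and calibration adds an adjustable delay to each faster bus so that all buses align with the slowest one. The following is a hypothetical model of that principle, not the patented circuit; the function name and the delay values are illustrative assumptions.

```python
# Hypothetical sketch of leveling delays across data buses of unequal
# wire length. Illustrative only; not derived from the patent text.

def level_delays(propagation_delays_ps):
    """Return the extra delay (ps) to add to each bus so every bus
    aligns with the slowest (longest-wire) bus."""
    target = max(propagation_delays_ps)
    return [target - d for d in propagation_delays_ps]

# Example: four data buses with different wire lengths between the
# DRAM chips and the advanced memory buffer (values are made up).
delays = [120, 150, 135, 180]  # propagation delays in picoseconds
adjustments = level_delays(delays)
print(adjustments)  # [60, 30, 45, 0]
```

In this toy model the bus with the 180 ps path needs no added delay, while the shortest path receives the largest adjustment, so reads and writes on all buses arrive aligned to the common clock.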

Career Highlights

Xiaorong Ye is currently employed at Montage Technology Group Limited, where she continues to push the boundaries of memory technology. Her work has been instrumental in advancing the capabilities of memory systems, making them more efficient and reliable.

Collaborations

Throughout her career, Xiaorong has collaborated with colleagues including Zhendong Guo and Larry Wu. These partnerships have supported her work, contributing to notable advances in memory access technology.

Conclusion

Xiaorong Ye's contributions to memory access technology exemplify her dedication to innovation. Her patent and work at Montage Technology Group Limited highlight her role as a leading inventor in the industry. Her efforts continue to shape the future of memory technology.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com