Years Active: 2024
Title: Innovations of Wei Pang in Deep Neural Network Hardware Acceleration
Introduction
Wei Pang is an accomplished inventor based in Jiangsu, China. He has made significant contributions to the field of deep learning through his patented work, which focuses on improving the efficiency of deep neural network hardware accelerators.
Latest Patents
Wei Pang holds a patent for a "Deep Neural Network Hardware Accelerator Based on Power Exponential Quantization." The invention comprises an AXI-4 bus interface, input and output cache areas, a weight cache area, a weight index cache area, an encoding module, a configurable state controller module, and a processing element (PE) array. The input and output cache areas are organized as line caches, which allows efficient data handling. The encoding module encodes each weight against an ordered quantization set that stores the possible absolute values of the quantized weights, which under power exponential quantization are powers of two, so a weight can be represented by its sign and its index into this set. Because the quantized weights are powers of two, each PE replaces multiplication with a shift operation on data read from the input and weight index cache areas, significantly reducing the demand on computing resources and increasing calculation efficiency.
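To make the quantize-then-shift idea concrete, the sketch below shows in Python how weights might be encoded against an ordered set of power-of-two values and how a processing element could then replace multiplication with bit shifts. This is a minimal illustration under assumed parameters: the quantization-set size, the use of integer activations, and all function names are hypothetical, not details taken from the patent.

```python
import numpy as np

def build_quantization_set(num_levels=8, max_exponent=0):
    """Return the ordered exponents whose powers of two form the quantization set."""
    return np.arange(max_exponent, max_exponent - num_levels, -1)

def encode_weights(weights, exponents):
    """Encode each weight as a sign plus an index into the ordered quantization set."""
    signs = np.sign(weights).astype(np.int8)
    # Clamp tiny magnitudes so log2 is defined, then pick the closest exponent.
    log_w = np.log2(np.maximum(np.abs(weights), 2.0 ** float(exponents[-1])))
    idx = np.argmin(np.abs(log_w[:, None] - exponents), axis=1)
    return signs, idx.astype(np.int8)

def pe_shift_mac(activations, signs, idx, exponents):
    """Dot product of integer activations with quantized weights using shifts only.
    Multiplying by 2^e is a left shift for e >= 0 and a (truncating) right shift otherwise."""
    acc = 0
    for a, s, i in zip(activations, signs, idx):
        e = int(exponents[i])
        shifted = a << e if e >= 0 else a >> -e  # shift replaces the multiply
        acc += int(s) * shifted
    return acc

# Toy usage: four weights and four integer activations (real designs use fixed-point data).
weights = np.array([0.8, -0.3, 0.05, -0.6])
activations = [3, 1, 4, 2]
exponents = build_quantization_set(num_levels=6)
signs, idx = encode_weights(weights, exponents)
print(pe_shift_mac(activations, signs, idx, exponents))
```

In the accelerator described above, values analogous to idx would plausibly reside in the weight index cache area, while the PE array carries out the shift-and-accumulate step in parallel across many such dot products.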
Career Highlights
Wei Pang is affiliated with Southeast University, where he continues to advance research in hardware acceleration for deep learning applications. His innovative approach has positioned him as a key figure in the development of efficient neural network architectures.
Collaborations
Wei Pang collaborates with notable colleagues, including Shengli Lu and Ruili Wu. Their combined expertise advances research in deep learning and hardware acceleration.
Conclusion
Wei Pang's contributions to deep neural network hardware acceleration exemplify the innovative spirit of modern inventors. His patent marks a significant advance in computational efficiency, paving the way for future developments in artificial intelligence.