Toronto, Canada

Meghan Lele


 

Average Co-Inventor Count = 3.0

ph-index = 1

Forward Citations = 2 (Granted Patents)


Company Filing History:


Years Active: 2021

1 patent (USPTO)

Title: Innovations by Meghan Lele

Introduction

Meghan Lele is an inventor based in Toronto, Canada, working in the field of convolutional neural networks (CNNs). Her work has led to a granted patent aimed at improving the flexibility and efficiency of CNN accelerators.

Latest Patents

Meghan Lele holds one granted patent, titled "Method and apparatus for performing different types of convolution operations with the same processing elements." The patent describes a CNN accelerator in which one or more processing elements perform convolution, and in which the accelerator's configuration, specifically its filters and the formatting of its output data, can be modified. This flexibility allows the same processing elements to also perform deconvolution and backpropagation convolution simply by changing the filters and the output data formatting.
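The patent text itself is not reproduced here, so the following is only an illustrative sketch of the general idea described above: a single multiply-accumulate "processing element" primitive reused for both standard convolution and transposed convolution (deconvolution), with only the filter arrangement and data formatting changing between the two. The function names (pe_dot, conv2d, deconv2d) and the specific padding/dilation scheme are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def pe_dot(window, kernel):
    """A single 'processing element': one multiply-accumulate over a window.
    This shared primitive is reused for every convolution variant below."""
    return float(np.sum(window * kernel))

def conv2d(x, k):
    """Standard (valid) 2-D convolution built from the PE primitive."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = pe_dot(x[i:i + kh, j:j + kw], k)
    return out

def deconv2d(x, k, stride=1):
    """Transposed convolution ('deconvolution') expressed with the same PEs:
    the input is dilated and zero-padded and the kernel is flipped, so only
    the data formatting changes, not the compute element."""
    kh, kw = k.shape
    # Dilate the input by the stride, then pad by (kernel size - 1) per side.
    xd = np.zeros(((x.shape[0] - 1) * stride + 1,
                   (x.shape[1] - 1) * stride + 1))
    xd[::stride, ::stride] = x
    xp = np.pad(xd, ((kh - 1, kh - 1), (kw - 1, kw - 1)))
    return conv2d(xp, np.flip(k))  # flipped kernel, same PE-based convolution

if __name__ == "__main__":
    x = np.arange(16, dtype=float).reshape(4, 4)
    k = np.array([[1.0, 0.0], [0.0, -1.0]])
    print(conv2d(x, k))    # 3x3 output of a standard convolution
    print(deconv2d(x, k))  # 5x5 output of a transposed convolution
```

In this sketch, switching from convolution to deconvolution requires no change to pe_dot at all; only the input formatting and kernel orientation differ, which mirrors the reconfigurability the patent summary attributes to the accelerator.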

Career Highlights

Meghan Lele is employed at Altera Corporation, where her work focuses on improving the performance and adaptability of CNN accelerators, which are used in applications such as image and speech recognition.

Collaborations

Throughout her career, Meghan has collaborated with talented individuals such as Davor Capalija and Andrew Ling. These collaborations have fostered an environment of innovation and creativity, leading to advancements in their respective fields.

Conclusion

Meghan Lele's contributions to the field of convolutional neural networks exemplify her dedication to innovation and technology. Her patent and work at Altera Corporation highlight her role as a leading inventor in her field.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com