Niskayuna, NY, United States of America

Peter Lamb


Average Co-Inventor Count = 4.0

ph-index = 1

Forward Citations = 7 (Granted Patents)


Company Filing History:


Years Active: 2019

1 patent (USPTO)

The Innovative Contributions of Peter Lamb

Introduction

Peter Lamb is an inventor based in Niskayuna, NY (US), working in image processing, particularly material segmentation in image volumes. His approach combines traditional classification methods with deep learning techniques.

Latest Patents

Peter Lamb holds a patent for "Material segmentation in image volumes." The patent describes a multi-level, multi-channel segmentation framework in which model-based or 'shallow' classification methods, such as linear regression and support vector machines, are followed by a deep learning stage. The framework first processes a low-resolution version of the multi-channel data to generate a coarse tissue mask, which is then used to crop patches from the high-resolution volume. The cropped volume is then passed through a trained convolutional network to perform deep learning-based segmentation within the slices.
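
To make the coarse-to-fine idea concrete, here is a minimal sketch of such a two-stage pipeline, assuming NumPy, SciPy, scikit-learn, and PyTorch. The synthetic volume, the LinearSVC used as the 'shallow' classifier, the TinyCNN placeholder (untrained, standing in for the patent's trained convolutional network), and all shapes and thresholds are illustrative assumptions, not the patented implementation.

```python
# Sketch of a coarse-to-fine, multi-channel segmentation pipeline.
# All data, model choices, and thresholds below are illustrative assumptions.
import numpy as np
from scipy.ndimage import zoom
from sklearn.svm import LinearSVC
import torch
import torch.nn as nn

# Synthetic multi-channel volume: (channels, depth, height, width).
rng = np.random.default_rng(0)
volume = rng.normal(size=(2, 16, 64, 64)).astype(np.float32)

# --- Stage 1: shallow classification on a low-resolution copy ----------
scale = 0.25
low_res = zoom(volume, (1, scale, scale, scale), order=1)

# Per-voxel features are just the channel values; labels here are synthetic.
features = low_res.reshape(low_res.shape[0], -1).T      # (n_voxels, channels)
labels = (low_res[0].ravel() > 0).astype(int)           # synthetic ground truth
shallow_clf = LinearSVC().fit(features, labels)         # "shallow" model (SVM-style)
coarse_mask_low = shallow_clf.predict(features).reshape(low_res.shape[1:])

# Upsample the coarse tissue mask back to the high-resolution grid.
coarse_mask = zoom(coarse_mask_low.astype(np.float32),
                   (1 / scale, 1 / scale, 1 / scale), order=0) > 0.5

# --- Stage 2: crop high-res patches and refine with a CNN --------------
class TinyCNN(nn.Module):
    """Placeholder for the trained convolutional network (untrained here)."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 1, 1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))

cnn = TinyCNN(in_ch=volume.shape[0]).eval()

fine_mask = np.zeros(volume.shape[1:], dtype=np.float32)
for z in range(volume.shape[1]):
    if not coarse_mask[z].any():
        continue  # skip slices the coarse mask rules out
    # Crop the bounding box of the coarse mask within this slice.
    ys, xs = np.where(coarse_mask[z])
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    patch = torch.from_numpy(volume[:, z, y0:y1, x0:x1]).unsqueeze(0)
    with torch.no_grad():
        fine_mask[z, y0:y1, x0:x1] = cnn(patch)[0, 0].numpy()

print("refined voxels:", int((fine_mask > 0.5).sum()))
```

The intent of such a pipeline is that the inexpensive low-resolution pass narrows down which high-resolution regions the convolutional network actually has to process.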

Career Highlights

Peter Lamb is employed at General Electric Company, where he applies his expertise in image processing and machine learning to imaging technologies.

Collaborations

Peter has collaborated with colleagues including Bhushan Dayaram Patil and Roshni Bhagalia on his patent work.

Conclusion

Peter Lamb's work on material segmentation in image volumes exemplifies the combination of traditional classification and deep learning techniques in image processing.
