Kirkland, WA, United States of America

James Steven Hegarty

USPTO Granted Patents = 1 

Average Co-Inventor Count = 5.0

ph-index = 1


Company Filing History:


Years Active: 2023


Title: Innovations of James Steven Hegarty

Introduction

James Steven Hegarty is an inventor based in Kirkland, WA (US), whose work contributes to the field of depth-sensing technology. His granted patent describes a method for improving the accuracy of depth perception in electronic devices.

Latest Patents

Hegarty holds a patent for "Methods for depth sensing using candidate images selected based on an epipolar line." In this method, a device receives an image of a projected illumination pattern and, for a portion of that image, selects one image from a plurality of candidate images based on an epipolar line. The depth for that portion of the image is then determined from the depth information associated with the selected candidate image. This patent reflects his expertise in image processing and depth-sensing technologies. He has 1 granted USPTO patent to his name.
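The general idea described above can be illustrated with a small sketch. This is not the patented implementation; the function names, the sum-of-squared-differences matching, and the toy data are all illustrative assumptions. It shows only the core pattern: each candidate image is associated with a known depth, the observed patch is compared against candidates sampled along the epipolar line, and the depth of the best match is returned.

```python
# Hypothetical sketch (not the patented method): pick the best-matching
# candidate pattern patch and return its associated depth.

def patch_distance(a, b):
    # Sum of squared differences between two equal-length intensity patches.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def depth_from_candidates(observed_patch, candidates):
    """candidates: list of (patch, depth) pairs.

    Each candidate patch is the pattern appearance expected at a known
    depth; because candidates lie along the epipolar line, the search
    for a match is effectively one-dimensional.
    """
    best = min(candidates, key=lambda c: patch_distance(observed_patch, c[0]))
    return best[1]

# Toy usage: three candidate patches, each tagged with an illustrative
# depth value (e.g. meters).
candidates = [
    ([0.1, 0.9, 0.2], 0.5),
    ([0.8, 0.1, 0.7], 1.0),
    ([0.4, 0.4, 0.4], 2.0),
]
observed = [0.75, 0.15, 0.65]
print(depth_from_candidates(observed, candidates))  # closest match has depth 1.0
```

Real systems would match larger patches from calibrated images and interpolate between candidate depths, but the candidate-selection structure is the same.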

Career Highlights

James Steven Hegarty is currently employed at Meta Platforms Technologies, LLC, where he continues to innovate and develop cutting-edge technologies. His work focuses on improving the capabilities of electronic devices through advanced imaging techniques. Hegarty's contributions have positioned him as a key player in the tech industry.

Collaborations

Hegarty has collaborated with notable colleagues, including Zijian Wang and Steven John Lovegrove. These partnerships have fostered a creative environment that encourages the exchange of ideas and the development of innovative solutions.

Conclusion

James Steven Hegarty's work in depth sensing technology exemplifies the impact of innovation in the tech industry. His patent and ongoing contributions continue to shape the future of electronic devices.

This text is generated by artificial intelligence and may not be accurate.