Cambridge, MA, United States of America

Sergey M Nikitin


Average Co-Inventor Count = 1.5

ph-index = 1


Years Active: 2025

2 patents (USPTO)

Sergey M. Nikitin: Innovator in Robotic Manipulation and Depth Estimation Technologies

Introduction

Sergey M. Nikitin is an inventor based in Cambridge, MA, known for his contributions to robotic manipulation and depth estimation technologies. He holds 2 USPTO patents, covering the optimization of camera systems for item identification and long-range depth estimation.

Latest Patents

Nikitin's latest patents include "Sensor Optimization for Robotic Manipulations" and "Long Range Depth Estimation Sensor." The first patent describes systems and techniques for optimizing the deployment of a camera scanning system in an environment for item identification. This involves obtaining parameters of both the camera system and the environment, and modifying these parameters to ensure effective scanning. The second patent focuses on capturing multiple images of a scene using three imaging sensors to generate point clouds, which are then used to derive depth information of the scene.
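To illustrate the principle behind the second patent, the sketch below shows how depth can be derived from images captured by paired sensors and how estimates from multiple sensor pairs can be fused. This is a minimal, hypothetical example of standard stereo geometry (depth = focal length × baseline / disparity), not the patented method itself; the function names, parameters, and median-fusion strategy are illustrative assumptions.

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (meters): Z = f * B / d.

    Pixels with zero or negative disparity are marked invalid (infinite depth).
    """
    depth = np.full(disparity.shape, np.inf, dtype=float)
    valid = disparity > 0
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

def fuse_depth_maps(depth_maps):
    """Fuse depth maps from multiple sensor pairs via a per-pixel median.

    With three imaging sensors there are multiple stereo baselines; a median
    is one simple, outlier-robust way to combine their estimates.
    """
    stacked = np.stack(depth_maps)
    return np.median(stacked, axis=0)

# Example: a 1x2 disparity map from a camera with a 1000 px focal length
# and a 0.1 m baseline.
depth = disparity_to_depth(np.array([[10.0, 20.0]]),
                           focal_length_px=1000.0, baseline_m=0.1)
# Fusing three single-pixel depth maps yields their median.
fused = fuse_depth_maps([np.array([[1.0]]), np.array([[2.0]]), np.array([[3.0]])])
```

In practice the fused depth map (or the point clouds it is built from) would be registered into a common coordinate frame before fusion; that calibration step is omitted here for brevity.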

Career Highlights

Currently, Sergey M. Nikitin is employed at Amazon Technologies, Inc., where he continues to innovate in the field of robotics and imaging technologies. His work has been instrumental in advancing the capabilities of robotic systems and enhancing their efficiency in various applications.

Collaborations

Nikitin's patents list co-inventors including Sara Jean Woo and Jing Ma.

Conclusion

Sergey M. Nikitin's work centers on robotic manipulation and depth estimation, and his patents continue to contribute to the development of these technologies.

This text is generated by artificial intelligence and may not be accurate.
Please report any incorrect information to support@idiyas.com