North Bethesda, MD, United States of America

Dong Yang

USPTO Granted Patents = 7 

Average Co-Inventor Count = 4.9

ph-index = 2

Forward Citations = 11 (Granted Patents)


Company Filing History:


Years Active: 2021-2025


The Innovations of Dong Yang

Introduction

Dong Yang is an inventor based in North Bethesda, MD (US). His work centers on technology for neural networks, and with 7 USPTO granted patents to his name, he continues to contribute to innovation in the field.

Latest Patents

Among his latest patents is "Object detection using one or more neural networks." This patent describes apparatuses, systems, and techniques to detect objects in images, including digital representations of those objects. In at least one embodiment, one or more objects are detected in an image based, at least in part, on points corresponding to a surface of one or more objects.

Another notable patent is "Pretraining framework for neural networks," which outlines apparatuses, systems, and techniques to indicate the extent to which text corresponds to one or more images. In at least one embodiment, this extent is indicated using one or more neural networks and is used to train those networks.
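The text-image correspondence idea described above is commonly realized by scoring embeddings against each other; the score can then serve as a training signal. The following is a minimal illustrative sketch (not the patented method) of one such correspondence score, assuming the text and images have already been mapped to fixed-size embedding vectors by some neural networks:

```python
import numpy as np

def correspondence_scores(text_emb, image_embs):
    """Cosine similarity between one text embedding and each image embedding.

    Higher scores indicate closer text-image correspondence; in contrastive
    pretraining, such scores are used to train the networks that produced
    the embeddings.
    """
    t = text_emb / np.linalg.norm(text_emb)
    imgs = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    return imgs @ t  # one score per image

# Toy 2-D embeddings: the first image aligns with the text, the second does not.
text = np.array([1.0, 0.0])
images = np.array([[0.9, 0.1],
                   [0.0, 1.0]])
scores = correspondence_scores(text, images)
best = int(np.argmax(scores))  # index of the best-matching image
```

Here `correspondence_scores`, the embedding dimensions, and the toy vectors are all hypothetical placeholders chosen for illustration; the patents themselves do not specify this formulation.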

Career Highlights

Dong Yang is currently employed at Nvidia Corporation, a leading company in the field of graphics processing and artificial intelligence. His work at Nvidia has allowed him to collaborate with some of the brightest minds in the industry.

Collaborations

Notable co-inventors include Daguang Xu and Ziyue Xu, whose collaborative efforts contribute to innovative projects at Nvidia Corporation.

Conclusion

Dong Yang's contributions to the field of neural networks and object detection exemplify the spirit of innovation. His work continues to influence advancements in technology and artificial intelligence.

This text is generated by artificial intelligence and may not be accurate.