The patent badge is an abbreviated version of the USPTO patent document. It covers the following: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in Adobe Acrobat (PDF) format.

Date of Patent: Apr. 26, 2022
Filed: Mar. 28, 2017
Applicant: SRI International, Menlo Park, CA (US)
Inventors: Han-Pang Chiu, Princeton, NJ (US); Supun Samarasekera, Skillman, NJ (US); Rakesh Kumar, West Windsor, NJ (US); Mikhail Sizintsev, Princeton, NJ (US); Xun Zhou, Pennington, NJ (US); Philip Miller, Yardley, PA (US); Glenn Murray, Jamison, PA (US)
Assignee: SRI International, Menlo Park, CA (US)
Attorney:
Primary Examiner:
Assistant Examiner:
Int. Cl.: G01C 21/32 (2006.01); G01C 21/34 (2006.01); G06V 10/75 (2022.01); G06V 20/10 (2022.01); G06K 9/62 (2022.01); H04W 4/021 (2018.01)
U.S. Cl. (CPC): G01C 21/32 (2013.01); G01C 21/3476 (2013.01); G06K 9/6232 (2013.01); G06V 10/76 (2022.01); G06V 20/10 (2022.01); H04W 4/021 (2013.01)
Abstract

During GPS-denied/restricted navigation, images proximate a platform device are captured using a camera, and corresponding motion measurements of the platform device are captured using an IMU device. Features of a current frame of the images captured are extracted. Extracted features are matched and feature information between consecutive frames is tracked. The extracted features are compared to previously stored, geo-referenced visual features from a plurality of platform devices. If one of the extracted features does not match a geo-referenced visual feature, a pose is determined for the platform device using IMU measurements propagated from a previous pose and relative motion information between consecutive frames, which is determined using the tracked feature information. If at least one of the extracted features matches a geo-referenced visual feature, a pose is determined for the platform device using location information associated with the matched, geo-referenced visual feature and relative motion information between consecutive frames.
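The core of the abstract is a branch on whether any extracted feature matches the stored geo-referenced database. The Python sketch below illustrates that decision flow only, under assumptions not taken from the patent: OpenCV ORB features, brute-force matching, and essential-matrix recovery stand in for the unspecified feature and motion machinery, and geo_db_descriptors, geo_db_locations, the matching threshold, the camera intrinsics K, and the update_pose helper are all hypothetical placeholders.

```python
import numpy as np
import cv2  # OpenCV: ORB features, brute-force matching, essential-matrix pose recovery

# --- Assumed placeholders (not from the patent) ----------------------------
# Geo-referenced visual-feature database previously collected from a
# plurality of platform devices: one ORB descriptor row per landmark,
# paired with its known world location.
geo_db_descriptors = np.empty((0, 32), dtype=np.uint8)
geo_db_locations = np.empty((0, 3), dtype=np.float64)

# Assumed camera intrinsics for the essential-matrix step.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)


def extract_features(frame_gray):
    """Extract keypoints and descriptors from the current camera frame."""
    return orb.detectAndCompute(frame_gray, None)


def relative_motion(prev_kp, prev_desc, kp, desc):
    """Match features between consecutive frames and recover relative motion
    (rotation R and unit-scale translation t) from the essential matrix."""
    if prev_desc is None or desc is None:
        return np.eye(3), np.zeros((3, 1))
    matches = matcher.match(prev_desc, desc)
    if len(matches) < 5:  # not enough correspondences to estimate motion
        return np.eye(3), np.zeros((3, 1))
    pts_prev = np.float32([prev_kp[m.queryIdx].pt for m in matches])
    pts_curr = np.float32([kp[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts_prev, pts_curr, K)
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K)
    return R, t


def match_geo_referenced(desc):
    """Compare extracted descriptors with the stored geo-referenced features.
    Returns the world location of a matched landmark, or None if no match."""
    if desc is None or geo_db_descriptors.shape[0] == 0:
        return None
    matches = matcher.match(desc, geo_db_descriptors)
    good = [m for m in matches if m.distance < 40]  # assumed acceptance threshold
    if not good:
        return None
    return geo_db_locations[min(good, key=lambda m: m.distance).trainIdx]


def update_pose(prev_pose, imu_delta, frame_gray, prev_kp, prev_desc):
    """One GPS-denied pose update: IMU propagation when no geo-referenced
    feature matches, landmark-anchored pose when one does."""
    kp, desc = extract_features(frame_gray)
    _, t_rel = relative_motion(prev_kp, prev_desc, kp, desc)
    landmark = match_geo_referenced(desc)
    if landmark is None:
        # No match: propagate the previous pose with the IMU measurement and
        # the frame-to-frame relative motion (simple addition stands in for
        # the real pose composition and filtering).
        pose = prev_pose + imu_delta + t_rel.ravel()
    else:
        # Match found: anchor the pose to the landmark's known location,
        # adjusted by the relative motion since the previous frame.
        pose = landmark + t_rel.ravel()
    return pose, kp, desc
```

The sketch captures the abstract's two cases in update_pose: with no geo-referenced match, the pose is propagated from the previous pose, the IMU measurement, and frame-to-frame relative motion; with a match, the pose is re-anchored to the matched landmark's known location plus the relative motion.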

