The patent badge is an abbreviated version of the USPTO patent document. It covers the following fields: patent number, date the patent was issued, date the patent was filed, title of the patent, applicant, inventor, assignee, attorney firm, primary examiner, assistant examiner, CPCs, and abstract. The patent badge also contains a link to the full patent document in PDF (Adobe Acrobat) format.
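
As a rough sketch of what those fields amount to in code, a badge record might look like the following; the class name, field names, and types are hypothetical illustrations, not part of any USPTO schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PatentBadge:
    """Hypothetical container for the fields a patent badge summarizes."""
    patent_number: str
    date_issued: str                       # e.g. "Dec. 04, 2018"
    date_filed: str                        # e.g. "Mar. 30, 2016"
    title: str
    applicant: str
    inventor: str
    assignee: str
    attorney_firm: Optional[str] = None
    primary_examiner: Optional[str] = None
    assistant_examiner: Optional[str] = None
    cpcs: List[str] = field(default_factory=list)
    abstract: str = ""
    pdf_link: str = ""                     # link to the full patent document (PDF)
```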

Date of Patent: Dec. 04, 2018
Filed: Mar. 30, 2016
Applicant: Thales, Courbevoie, FR
Inventor: Alain Simon, Les Mesnuls, FR
Assignee: THALES, Courbevoie, FR
Attorney:
Primary Examiner:
Int. Cl.: G06T 7/73 (2017.01); G06T 7/80 (2017.01); H04N 5/225 (2006.01); H04N 5/232 (2006.01)
U.S. Cl. CPC: G06T 7/74 (2017.01); G06T 7/80 (2017.01); G06T 2207/10016 (2013.01); G06T 2207/10032 (2013.01); G06T 2207/10048 (2013.01); G06T 2207/20081 (2013.01); G06T 2207/30244 (2013.01); G06T 2207/30252 (2013.01); H04N 5/2258 (2013.01); H04N 5/23238 (2013.01)
Abstract

The invention relates to a method of determining the absolute direction of an object of a scene, with a predetermined desired performance. It comprises a learning phase and an online operation phase.

The learning phase comprises the following steps:
- acquisition, by circular scanning by means of a first optronic imaging device of determined fixed position, of a series of partially overlapping optronic images, including one or several images of the scene (step A);
- automatic extraction from the images of descriptors defined by their image coordinates and their radiometric characteristics, with at least one descriptor of unknown direction in each overlap of images (step B);
- from the descriptors extracted from the overlaps between images, automatic estimation of the mutual relative rotation of the images and mapping of the descriptors extracted from the overlaps (step C);
- identification in the images of at least one known reference geographic direction of precision compatible with the desired performance, and determination of the image coordinates of each reference (step D);
- from the descriptors extracted from the overlaps and mapped, the direction and the image coordinates of each reference, automatic estimation of the attitude of each image, called the fine registration step (step E);
- from the attitude of each image, the position and internal parameters of the first imaging device, and the image coordinates of each descriptor, computation of the absolute directions of the descriptors according to a predetermined model of image capture of the imaging device (step F).

The online operation phase comprises the following steps:
- acquisition of at least one image of the object, called the current image, from a second imaging device of determined fixed position (step A);
- extraction of descriptors from each current image (step B);
- mapping of the descriptors of each current image with the descriptors whose absolute direction was calculated in the learning phase, to determine the absolute direction of the descriptors of each current image (step C);
- from the absolute directions of the descriptors of each current image, estimation of the attitude of each current image (step D);
- from the image coordinates of the object in each current image, the attitude of each current image, and the position and predetermined internal parameters of the second imaging device, computation of the absolute direction of the object according to a predetermined model of image capture of each current image (step E).
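
To make the geometry of the direction-computation steps concrete, here is a minimal numerical sketch, assuming a simple pinhole model of image capture: a descriptor's pixel coordinates are back-projected through the imaging device's intrinsic matrix and rotated by the estimated image attitude to obtain a unit absolute direction. The intrinsic values, the yaw/pitch/roll convention, and the angles below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def intrinsic_matrix(focal_px: float, cx: float, cy: float) -> np.ndarray:
    """Pinhole intrinsic matrix K (square pixels, no skew assumed)."""
    return np.array([[focal_px, 0.0, cx],
                     [0.0, focal_px, cy],
                     [0.0, 0.0, 1.0]])

def attitude_matrix(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Rotation from the camera frame to an absolute (e.g. local geographic)
    frame, built from yaw/pitch/roll in radians (one common convention;
    the convention and angle values used here are illustrative)."""
    Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
    Ry = np.array([[ np.cos(pitch), 0.0, np.sin(pitch)],
                   [ 0.0,           1.0, 0.0],
                   [-np.sin(pitch), 0.0, np.cos(pitch)]])
    Rx = np.array([[1.0, 0.0,           0.0],
                   [0.0, np.cos(roll), -np.sin(roll)],
                   [0.0, np.sin(roll),  np.cos(roll)]])
    return Rz @ Ry @ Rx

def pixel_to_absolute_direction(u: float, v: float,
                                K: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Back-project pixel (u, v) through K, rotate by the image attitude R,
    and normalize to a unit absolute direction vector."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_abs = R @ ray_cam
    return ray_abs / np.linalg.norm(ray_abs)

# Illustrative values only (not from the patent):
K = intrinsic_matrix(focal_px=1200.0, cx=960.0, cy=540.0)
R = attitude_matrix(yaw=np.radians(35.0), pitch=np.radians(2.0), roll=np.radians(0.5))
direction = pixel_to_absolute_direction(u=1024.0, v=500.0, K=K, R=R)
print("absolute direction (unit vector):", direction)
```

In the online phase, the same relation is used the other way around: descriptors matched against those whose absolute directions were computed in the learning phase constrain the attitude of the current image, after which the object's pixel coordinates are converted to its absolute direction by the same back-projection.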

