Journal of System Simulation

Abstract

Scene-matching visual navigation normally relies on hardware to measure the camera's distance and attitude. A position and attitude estimation method based on combining feature points in a calibration area is proposed. The software selects the best set of Scale-Invariant Feature Transform (SIFT) feature matching points between the real-time image and the calibration area of the reference image, computes the local ground coordinates of the matched points by linear interpolation inside triangles, and then obtains the camera position and attitude of the real-time image by space resection. This avoids the need for hardware measurement of the camera's distance and attitude and broadens the applicability of scene-matching visual navigation. Experimental results show that the position and attitude computed for the real-time image are close to the true values.
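The triangle-interior linear interpolation step mentioned in the abstract can be sketched with standard barycentric coordinates: given three matched feature points with known image positions and ground coordinates, the ground coordinate of any image point inside that triangle is the barycentric-weighted combination of the vertices. This is a minimal illustrative sketch, not the paper's exact implementation; all function and variable names are assumptions.

```python
def interpolate_ground_coord(p, tri_img, tri_ground):
    """Linearly interpolate the ground coordinate of image point p
    inside a triangle of matched SIFT points (hypothetical helper).

    p          -- (x, y) image coordinate to interpolate
    tri_img    -- three (x, y) image coordinates of the matched points
    tri_ground -- their three (X, Y, Z) local ground coordinates
    """
    (x1, y1), (x2, y2), (x3, y3) = tri_img
    px, py = p
    # Denominator: twice the signed area of the image-space triangle
    d = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    # Barycentric weights of p with respect to the three vertices
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / d
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / d
    w3 = 1.0 - w1 - w2
    # Same weights applied to the ground coordinates give the interpolant
    g1, g2, g3 = tri_ground
    return tuple(w1 * a + w2 * b + w3 * c for a, b, c in zip(g1, g2, g3))
```

With the ground coordinates of enough real-time-image feature points recovered this way, space resection can then solve for the camera's position and attitude without external range or attitude hardware.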

First Page

1638

Revised Date

2020-06-03

Last Page

1646

CLC

TP391

Recommended Citation

Cai Peng, Shen Chaoping, Li Hongyan. Position and Attitude Estimation Based on Combination Matching in Calibration Area[J]. Journal of System Simulation, 2021, 33(7): 1638-1646.

DOI

10.16182/j.issn1004731x.joss.20-0218
