employed in the system. The DGPS measurements are used to update the error states in both Kalman filters: the 24-state filter KF1 estimates the INS errors and provides the navigation solutions, while the 4-state filter KF2 estimates the LRF and optic flow modelling errors.
The INS drift error is corrected by the DGPS through the Kalman filter measurement updates, yielding high-accuracy hybrid navigation solutions. The CCD camera acquires texture information for the optical flow measurements. The LRF measures the altitude relative to the ground, which, combined with the optical flow and gyro angular rate measurements, is used to derive the relative horizontal movement. The data fusion algorithms are implemented in real-time processing mode.
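As a rough illustration of how these measurements combine (the exact formulation is the one given in Section 2; the function and variable names below are hypothetical, and a downward-looking camera over locally flat terrain is assumed), the relative horizontal velocity can be sketched as the LRF height multiplied by the rotation-compensated optical flow rate:

import numpy as np

def horizontal_velocity(flow_rate, gyro_rate, lrf_height):
    # flow_rate  : observed angular flow rates about the camera axes [rad/s]
    # gyro_rate  : body angular rates about the same axes [rad/s]
    # lrf_height : height above ground from the laser range finder [m]
    # The gyro rates remove the rotation-induced part of the optical flow;
    # the remaining translational flow scales with the height above ground.
    translational_flow = np.asarray(flow_rate, dtype=float) - np.asarray(gyro_rate, dtype=float)
    return lrf_height * translational_flow

# Example: 0.05 rad/s residual flow at 20 m altitude gives about 1 m/s.
v_xy = horizontal_velocity([0.06, -0.02], [0.01, 0.00], 20.0)
print(v_xy)   # -> [ 1.0  -0.4 ] m/s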
As shown in Table 1, the sensors have different data rates. Proper data rates must therefore be selected for the two Kalman filters, considering the data availability and the required output rate (50 Hz) of the navigation solutions, horizontal velocity and height above the ground. The data rates of the sensors used for prediction were all set to 50 Hz. The 25 Hz LRF data were extrapolated to 50 Hz, which is justified by their slow variation between samples. The data rate of the DGPS data used for the Kalman filter measurement updates was set to 5 Hz.
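A minimal sketch of this multi-rate scheduling is given below (rates taken from Table 1; the commented filter calls stand in for KF1 and KF2, whose implementation is not reproduced here, and the sensor value is a placeholder):

# Prediction at 50 Hz, 25 Hz LRF held/extrapolated to 50 Hz,
# DGPS measurement updates at 5 Hz.
PRED_HZ, LRF_HZ, DGPS_HZ = 50, 25, 5

last_lrf_height = 0.0
for k in range(PRED_HZ):                       # one second of 50 Hz steps
    if k % (PRED_HZ // LRF_HZ) == 0:           # fresh LRF sample (25 Hz)
        last_lrf_height = 20.0                 # placeholder for a real LRF read
    lrf_height = last_lrf_height               # held/extrapolated to 50 Hz

    # kf1.predict(imu_sample)                  # 24-state INS error filter
    # kf2.predict(flow_sample, lrf_height)     # 4-state LRF/optic-flow filter

    if k % (PRED_HZ // DGPS_HZ) == 0:          # 5 Hz DGPS measurement update
        pass                                   # kf1.update(dgps); kf2.update(dgps)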
5. TEST RESULTS 
The field test data from the proposed GPS/INS/Vision navigation system were processed in two scenarios: 1) with GPS signals available during the entire mission, and 2) with simulated GPS signal outages.
5.1 Integrated GPS/INS/Vision navigation 
GPS measurements are used to update the error states in both KF1 and KF2, in order to estimate the INS, LRF and optic flow modelling errors and to provide navigation solutions. However, the accelerometer used in the system produced very poor results in this experiment due to the strong UAV vibrations. The advantage of the proposed system design is that a functional navigation backup based on the vision sensors remains available, even if some of the sensors become faulty during operation. The corrected measurements from the LRF and optic flow are processed by the integrated INS/Vision navigation algorithm introduced in Section 2 to estimate the horizontal velocity and the height above the ground, which are crucial for UAV automatic navigation and landing.
The following figures show the field test results of the proposed GPS/INS/Vision navigation system. Figures 5 and 6 plot the positioning results in the horizontal and vertical components, respectively. The vision-based subsystem enables the estimation of the horizontal position, derived from the estimated velocity, and of the height above the ground, derived from the LRF.
As shown in Figures 5 and 6, the positioning results from the vision-based system closely follow the DGPS positioning results. The horizontal positions are obtained by accumulating the vision-estimated velocity, so any bias in the velocity causes positioning drift. The vertical positioning result from the vision subsystem is the height above the ground, which differs fundamentally from the DGPS-measured relative height; the changing altitude of the terrain under the UAV accounts for the difference. For UAV landing, measuring the height above the terrain is more important than measuring the GPS height.
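To illustrate the drift mechanism with purely hypothetical numbers (not the test data), integrating a small constant velocity bias at the 50 Hz solution rate accumulates a position error that grows linearly with time:

# Illustrative only: a constant 0.1 m/s velocity bias integrated at 50 Hz
# accumulates into roughly 6 m of horizontal position error after 60 s.
dt, bias = 1.0 / 50.0, 0.1
drift = sum(bias * dt for _ in range(50 * 60))
print(drift)   # -> about 6.0 m after one minute of dead reckoning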
Figure 5. Horizontal positioning results
Figure 6. Vertical positioning results
Figure 7. Velocities in three directions
Thank you.