3 DIRECT GEOREFERENCING 
3.1 General 
Direct georeferencing is the direct estimation of position and orientation of the camera with sensors on board the aircraft (i.e., without using control points). The position is defined by the three coordinates of the projection center (X0, Y0, Z0) in a navigation frame. The orientation of the camera in the navigation frame can be described by the three rotation angles roll, pitch and yaw (r0, p0, y0).
Fig. 3 gives an overview of the available sensors on the UAV and 
which sensors can be integrated to obtain position and orientation 
of the camera. The interpolation at the exposure times gives the position and the orientation of each image.
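Because the navigation sensors and the camera sample at different instants, each image needs its own interpolated pose. A minimal sketch of this step is shown below; the variable names, timestamps, and values are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Hypothetical navigation samples (timestamps in seconds); the values and
# sample rate are made up for illustration, not taken from the paper.
nav_t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
nav_x = np.array([10.0, 10.2, 10.5, 10.9, 11.4])    # X(t) in the navigation frame [m]
nav_yaw = np.array([45.0, 46.0, 48.0, 47.0, 46.5])  # y(t) [deg]

# Exposure times of the images, assumed already in the same time base.
expo_t = np.array([0.25, 1.25])

# Linear interpolation of each navigation channel at the exposure times
# yields one position/orientation sample per image.
img_x = np.interp(expo_t, nav_t, nav_x)
img_yaw = np.interp(expo_t, nav_t, nav_yaw)

print(img_x)    # X0 of each image
print(img_yaw)  # yaw of each image
```

Note that for yaw a wrap-around at 360° would need special handling; the sketch ignores this.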
To improve the height measurements of the GNSS receiver, the air pressure sensor was utilised. The WGS84 coordinates were transformed into the navigation frame with a seven-parameter transformation. More demanding than the estimation of the position was the derivation of the orientation from the measurements of the IMU and the magnetometer, which is described in the following section.
Figure 3: Direct georeferencing of the images by integration of all available sensors on board the UAV.
3.2 Estimation of orientation 
The rotation angles r, p, y [°] are usually found by integrating the measured rotation rates ω_x, ω_y, ω_z [°/s] of the gyroscopes over
time. This works well for high-grade inertial sensors. However, 
the inertial sensors on the UAV are based on MEMS technology. 
They are small, lightweight, inexpensive, and consume very lit- 
tle power. Due to their fabrication process MEMS-sensors have 
large bias instabilities and high noise (El-Sheimy, 2009). Thus, 
the integration of the angular rates leads to large errors in the rotation angles after only a few seconds. To reduce these errors,
absolute angle measurements are needed. They can be obtained 
for roll and pitch from the accelerometers and for yaw from the 
magnetometer. 
While the UAV is not moving (e.g., when it is hovering above a defined waypoint) or is moving only very slowly, the three accelerometers can be used to estimate roll and pitch. Under this condition, the accelerometers measure the three orthogonal components g_x, g_y, g_z of the gravitational acceleration g. For the sake of simplicity, this is shown in Fig. 4 only for the 2D case. In effect, the accelerometers are used in this approach as a tilt meter. Due
to the vibrations on the UAV, the measured accelerations must be low-pass filtered with a cutoff frequency of 100 Hz or less before roll and pitch are computed.
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XXXIX-B7, 2012
XXII ISPRS Congress, 25 August – 01 September 2012, Melbourne, Australia
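The text specifies only the cutoff (100 Hz or less), not the filter type. As one possible realisation, a simple first-order IIR low-pass could look like the sketch below; the sample rate, vibration frequency, and amplitudes are illustrative assumptions:

```python
import numpy as np

def lowpass(signal, dt, fc):
    """First-order IIR low-pass filter. The paper only gives a cutoff of
    100 Hz or less; the filter type here is an assumption."""
    # Smoothing factor derived from the cutoff frequency fc [Hz]
    # and the sample spacing dt [s].
    alpha = (2 * np.pi * fc * dt) / (2 * np.pi * fc * dt + 1.0)
    out = np.empty(len(signal))
    out[0] = signal[0]
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

# Illustrative input: a constant gravity component plus a 300 Hz vibration,
# sampled at an assumed 1 kHz accelerometer rate.
dt = 1.0 / 1000.0
t = np.arange(0.0, 1.0, dt)
accel = 9.81 + 0.5 * np.sin(2 * np.pi * 300.0 * t)
filtered = lowpass(accel, dt, fc=100.0)
```

The vibration component above the cutoff is strongly attenuated, while the slowly varying gravity component passes through to the tilt computation.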
The time frames in which the UAV can be assumed to be static or in uniform motion were determined from the actually observed acceleration and rotation values. If both values show only low variations, the platform can be assumed to be in a fixed or uniformly changing position and orientation. Within all such time frames, the accelerometers can be utilised for the drift compensation.
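A variance-based detection of such time frames could be sketched as follows; the window length and thresholds are illustrative assumptions, not values from the paper:

```python
import numpy as np

def static_windows(accel_norm, gyro_norm, win, acc_thr, gyro_thr):
    """Flag samples whose preceding window shows low variance in both the
    acceleration and the rotation-rate magnitudes, i.e. the platform can be
    assumed static or in uniform motion. Window length and thresholds are
    illustrative assumptions, not values from the paper."""
    flags = np.zeros(len(accel_norm), dtype=bool)
    for i in range(win, len(accel_norm)):
        low_acc = np.var(accel_norm[i - win:i]) < acc_thr
        low_gyro = np.var(gyro_norm[i - win:i]) < gyro_thr
        flags[i] = low_acc and low_gyro
    return flags

# Illustrative data: a quiet first half and a manoeuvre in the second half.
rng = np.random.default_rng(0)
acc = np.concatenate([9.81 + 0.01 * rng.standard_normal(500),
                      9.81 + 2.0 * rng.standard_normal(500)])
gyr = np.concatenate([0.02 * rng.standard_normal(500),
                      5.0 * rng.standard_normal(500)])
flags = static_windows(acc, gyr, win=50, acc_thr=0.01, gyro_thr=0.05)
```

Only the samples flagged in this way would then feed the accelerometer-based drift compensation.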
Figure 4: By measuring the components of the gravitational acceleration g with the accelerometers, the tilt angles roll and pitch can be estimated (annotated in the figure for the 2D case as p = arcsin(g_x/|g|)).
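One common way to compute the tilt angles from the measured gravity components is sketched below; the axis conventions and signs are assumptions, since the text does not spell them out:

```python
import math

def tilt_from_gravity(gx, gy, gz):
    """Roll and pitch from the measured gravity components. The axis
    conventions and signs are one common choice, assumed here because the
    paper does not spell them out."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    pitch = math.asin(-gx / g)   # rotation about the lateral axis
    roll = math.atan2(gy, gz)    # rotation about the longitudinal axis
    return roll, pitch

# Level platform: gravity acts purely along the body z-axis.
r_level, p_level = tilt_from_gravity(0.0, 0.0, 9.81)

# Platform pitched by 10 deg: part of g appears on the body x-axis.
r_tilt, p_tilt = tilt_from_gravity(-9.81 * math.sin(math.radians(10.0)), 0.0,
                                   9.81 * math.cos(math.radians(10.0)))
print(math.degrees(p_tilt))  # ~10 deg
```

Note that yaw cannot be observed this way, because a rotation about the gravity vector leaves the measured components unchanged; this is why the magnetometer is needed.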
The integration of the gyroscopes, the accelerometers, and the magnetometer for the estimation of the three rotation angles r, p and y is shown in Fig. 5. First, starting from the initial values r(0), p(0) and y(0), the rotation rates ω_x, ω_y, ω_z are integrated. This gives a first realization of the rotation angles r1, p1, y1. As described above, the measured accelerations can be used to derive r2 and p2, whereas the magnetometer directly measures y2. At this point, the rotation angles from the gyroscopes can be compared to the rotation angles from the accelerometers and the magnetometer. These differences are the error signals, which can be used to correct the rotation angles derived from the gyroscopes. The value of the gain factor k defines how strongly the integral is stabilized. If k is set to 1, the rotation angles are completely derived from the accelerometers and the magnetometer. On the other hand, if k is set to 0, the rotation angles are completely derived from the gyroscopes. However, for 0 < k < 1 the advantages of all sensors can be combined. For the UAV used in this study, the gain factor k was set to 2 %. Note that k need not be the same for r and p on the one hand, and y on the other hand.
In the presented approach, the calculated rotation angles are thus dominated on short time scales by the measurements of the gyroscopes, whereas the accelerometers and the magnetometer correct the rotation angles on long time scales.
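The described scheme corresponds to a complementary filter. A minimal one-axis sketch, using the 2 % gain given in the text (all other values are illustrative assumptions), could look like this:

```python
import numpy as np

def complementary_filter(rates, abs_angles, dt, k=0.02):
    """One-axis complementary filter as described in the text: integrate the
    gyroscope rates and pull the result towards the absolute angle (from the
    accelerometers or the magnetometer) with gain k. k = 0.02 matches the
    2 % gain used in this study; all other values below are illustrative."""
    angle = abs_angles[0]
    out = np.empty(len(rates))
    for i, (w, a) in enumerate(zip(rates, abs_angles)):
        angle = angle + w * dt           # gyroscope integration
        angle = angle + k * (a - angle)  # correction by the error signal
        out[i] = angle
    return out

# Illustrative case: a gyroscope with a constant bias of 0.5 deg/s (true
# rotation rate is zero) against a noise-free absolute angle of 0 deg.
n = 1000
rates = np.full(n, 0.5)     # [deg/s]
abs_angles = np.zeros(n)    # [deg]
est = complementary_filter(rates, abs_angles, dt=0.01)
# Pure integration would drift to 5 deg after 10 s; the filter instead
# settles at the bias-induced steady state of (1 - k) * w * dt / k deg.
print(est[-1])
```

The remaining steady-state offset illustrates why, for long flights, the gyroscope biases would additionally have to be estimated and removed; the filter alone only bounds the drift.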
3.3 Data Streams 
The images taken are directly stored on the SD card of the camera. The navigation sensor data is downlinked via a Wi.232 connection and stored on a laptop hard disk. In professional kinematic multi-sensor systems, the GPS PPS signal is used for the synchronization of all data streams; this is, however, not available for the described components.
The two streams of navigation data, on the one hand, and im- 
ages, on the other hand, are synchronized via a signal generated 