where:
$[X\;\;Y\;\;Z]$: point coordinates in the ground system;
$[X_{GPS}\;\;Y_{GPS}\;\;Z_{GPS}]$: observed GPS antenna position;
$[\Delta X_{GPS}\;\;\Delta Y_{GPS}\;\;\Delta Z_{GPS}]$: offset vector between the GPS and image frames;
$[x\;\;y\;\;-f]$: point coordinates in the image system;
$f$: focal length;
$R_{image}^{INS}(\Delta\omega,\Delta\varphi,\Delta\kappa)$: rotation from the image to the INS system;
$R_{GPS}^{INS}(\Delta\omega_{GPS},\Delta\varphi_{GPS},\Delta\kappa_{GPS})$: rotation from the GPS to the INS system;
$R_{INS}^{ground}(\omega_{INS},\varphi_{INS},\kappa_{INS})$: rotation from the INS to the ground system, using the angles observed by the INS;
$k$: scale factor.
Figure 2. Reference systems in the direct georeferencing algorithm (GPS, INS and image systems).
In the same way, the collinearity equations for multi-lens 
sensors are extended in order to take into account the GPS/INS 
misalignments and drifts. 
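As an illustration of how the quantities listed above combine, the sketch below composes the rotations and vectors for a single image point. This is a minimal Python sketch using one common arrangement of the terms; the exact form, in particular the sign and frame of the offset vector and the angle convention, is fixed by Equation 5 and is not reproduced in this excerpt, so the names and conventions here are ours.

```python
import numpy as np

def rotation(omega, phi, kappa):
    """Rotation matrix from omega-phi-kappa angles (Rx*Ry*Rz here;
    the paper's actual angle convention may differ)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def direct_georeference(xy, f, k, gps_pos, gps_offset,
                        R_img_to_ins, R_gps_to_ins, R_ins_to_ground):
    """Ground coordinates of an image point (x, y):
    GPS antenna position + lever-arm offset + scaled, rotated image ray.
    The rotation matrices can be built with rotation() from the angles
    listed above; the sign of the offset vector is an assumption here."""
    ray_img = np.array([xy[0], xy[1], -f])               # [x  y  -f]
    ray_ins = R_img_to_ins @ ray_img                     # image -> INS
    offset_ins = R_gps_to_ins @ np.asarray(gps_offset)   # GPS -> INS
    return np.asarray(gps_pos) + R_ins_to_ground @ (offset_ins + k * ray_ins)
```

For the multi-lens case mentioned above, an analogous composition would be repeated per lens, each lens having its own image-to-INS rotation.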
3. INDIRECT GEOREFERENCING 
If the misalignments between the GPS, INS and image systems are known, the ground coordinates of the points observed in the stereo images can be calculated by forward intersection based on Equation 5. In most cases, however, the GPS offset and the INS drift angles are not available, so the direct georeferencing algorithm cannot be applied. The GPS/INS misalignments have to be estimated with post-flight calibration procedures, together with any additional measurement errors that may be contained in the GPS/INS observations. The solution is achieved by including in the standard photogrammetric triangulation suitable functions that model the sensor external orientation and take the GPS and INS measurements into account. The complete procedure is called the indirect georeferencing model. The proposed trajectory model is based on piecewise polynomial functions depending on time.
3.1 Trajectory modelling 
The aircraft trajectory is divided into segments, according to the 
number and distribution of GCPs and TPs. For each segment i, 
delimited by the time extremes $t^i_{ini}$ and $t^i_{fin}$, the variable

$$\bar{t} = \frac{t - t^i_{ini}}{t^i_{fin} - t^i_{ini}} \in [0,1] \qquad (6)$$
is defined, where $t$ is the acquisition time of the processed line. Then in each segment the sensor attitude and position $(X_C,\,Y_C,\,Z_C,\,\omega_C,\,\varphi_C,\,\kappa_C)$ are modelled with a second order polynomial function depending on $\bar{t}$:
$$\begin{aligned}
X_C(\bar{t}) &= X_{instr} + X_0 + X_1\,\bar{t} + X_2\,\bar{t}^2\\
Y_C(\bar{t}) &= Y_{instr} + Y_0 + Y_1\,\bar{t} + Y_2\,\bar{t}^2\\
Z_C(\bar{t}) &= Z_{instr} + Z_0 + Z_1\,\bar{t} + Z_2\,\bar{t}^2\\
\omega_C(\bar{t}) &= \omega_{instr} + \omega_0 + \omega_1\,\bar{t} + \omega_2\,\bar{t}^2\\
\varphi_C(\bar{t}) &= \varphi_{instr} + \varphi_0 + \varphi_1\,\bar{t} + \varphi_2\,\bar{t}^2\\
\kappa_C(\bar{t}) &= \kappa_{instr} + \kappa_0 + \kappa_1\,\bar{t} + \kappa_2\,\bar{t}^2
\end{aligned} \qquad (7)$$
where:
$[X_{instr}\;\;Y_{instr}\;\;Z_{instr}]$: PC position observed with GPS;
$[\omega_{instr}\;\;\varphi_{instr}\;\;\kappa_{instr}]$: PC attitude observed with INS;
$[X_0\;\;X_1\;\;X_2\;\ldots\;\kappa_0\;\kappa_1\;\kappa_2]$: 18 unknown parameters modelling the external orientation in segment $i$.
The constant terms describe the shifts and angular drifts 
between the image system and, respectively, the GPS and INS 
systems, while the linear and quadratic terms model the 
additional errors in GPS/INS measurements. 
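As a concrete reading of Equations 6 and 7, the sketch below evaluates the modelled external orientation of one segment from the GPS/INS observations and the 18 segment parameters. It is a minimal sketch; the function and variable names are ours, not the paper's.

```python
import numpy as np

def normalized_time(t, t_ini, t_fin):
    """Equation 6: map the acquisition time t of an image line
    into [0, 1] within the segment [t_ini, t_fin]."""
    return (t - t_ini) / (t_fin - t_ini)

def external_orientation(t_bar, observed, params):
    """Equation 7 (sketch): GPS/INS-observed values plus a second order
    polynomial correction for one segment.

    observed : the six values (X, Y, Z, omega, phi, kappa)_instr
    params   : 6 x 3 array holding the 18 unknowns [p0, p1, p2] per component
    """
    observed = np.asarray(observed, dtype=float)
    p = np.asarray(params, dtype=float)                   # shape (6, 3)
    correction = p[:, 0] + p[:, 1] * t_bar + p[:, 2] * t_bar**2
    return observed + correction                          # Xc, Yc, Zc, wc, pc, kc
```

In the indirect georeferencing adjustment, these 18 parameters per segment are the unknowns estimated within the photogrammetric triangulation.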
If the time interval between the exposure of adjacent image lines is constant, the image line number $l$ can be used in place of the time. In this case, calling $n^i_{ini}$ and $n^i_{fin}$ the first and last lines of segment $i$, $\bar{t}$ is defined as:

$$\bar{t} = \frac{l - n^i_{ini}}{n^i_{fin} - n^i_{ini}} \in [0,1] \qquad (8)$$
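Under this constant line-rate assumption, the normalized variable can be computed directly from line numbers; a minimal sketch (names are ours), whose result can be fed to the polynomial model of Equation 7:

```python
def normalized_time_from_line(l, n_ini, n_fin):
    """Equation 8: normalized variable from the image line number l,
    valid when adjacent lines are exposed at a constant interval."""
    return (l - n_ini) / (n_fin - n_ini)
```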
At the points of conjunction between adjacent segments, constraints of zero, first and second order continuity are imposed on the trajectory functions: the values of the functions and of their first and second derivatives computed in two neighbouring segments are forced to be equal at the segment boundaries. As the point on the border between segments $i$ and $i+1$ has $\bar{t}=1$ in segment $i$ and $\bar{t}=0$ in segment $i+1$, imposing the zero order continuity for the $X$ function we obtain:
$$X_C^{\,i}(\bar{t})\Big|_{\bar{t}=1} = X_C^{\,i+1}(\bar{t})\Big|_{\bar{t}=0} \qquad (9)$$

This yields:

$$X_0^{\,i} + X_1^{\,i} + X_2^{\,i} = X_0^{\,i+1} \qquad (10)$$
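Applying the same reasoning to the first and second derivatives of the polynomials in Equation 7 (our derivation, taken with respect to the normalized variable $\bar{t}$; if the derivatives are instead taken with respect to the time $t$, each side must also be scaled by the corresponding segment duration), the remaining continuity constraints for the $X$ function become:

$$\frac{dX_C^{\,i}}{d\bar{t}}\bigg|_{\bar{t}=1} = \frac{dX_C^{\,i+1}}{d\bar{t}}\bigg|_{\bar{t}=0} \;\Rightarrow\; X_1^{\,i} + 2X_2^{\,i} = X_1^{\,i+1}$$

$$\frac{d^2X_C^{\,i}}{d\bar{t}^2}\bigg|_{\bar{t}=1} = \frac{d^2X_C^{\,i+1}}{d\bar{t}^2}\bigg|_{\bar{t}=0} \;\Rightarrow\; X_2^{\,i} = X_2^{\,i+1}$$

Analogous constraints hold for the other five trajectory functions.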
	        