constraint that indicates the points {(X_1, Y_1, Z_1), (X_2, Y_2, Z_2), (X_O, Y_O, Z_O) and (x_i, y_i, 0)} are coplanar, is introduced and mathematically described by Equation 1.

$$(\vec{V_1} \times \vec{V_2}) \cdot \vec{V_3} = 0 \qquad (1)$$
In the above equation, $\vec{V_1}$ is the vector connecting the perspective centre to the first end point along the object space line, $\vec{V_2}$ is the vector connecting the perspective centre to the second end point along the object space line, and $\vec{V_3}$ is the
vector connecting the perspective centre to an intermediate 
point along the corresponding image line. It is important to note 
that the three vectors should be represented relative to a 
common coordinate system (e.g. the ground coordinate 
system). The constraint in Equation 1 incorporates the image 
coordinates of the intermediate point, the Exterior Orientation 
Parameters (EOP), the Interior Orientation Parameters (IOP) 
including distortion parameters, as well as the ground 
coordinates of the points defining the object space line. Such a 
constraint does not introduce any new parameters and can be 
written for all intermediate points along the line in the imagery. 
The number of constraints is equal to the number of 
intermediate points measured along the image line. 
$$\vec{V_1} = \begin{bmatrix} X_1 - X_O \\ Y_1 - Y_O \\ Z_1 - Z_O \end{bmatrix}, \qquad \vec{V_2} = \begin{bmatrix} X_2 - X_O \\ Y_2 - Y_O \\ Z_2 - Z_O \end{bmatrix}, \qquad \vec{V_3} = R(\omega, \varphi, \kappa) \begin{bmatrix} x_i - x_p - \text{distortions}_x \\ y_i - y_p - \text{distortions}_y \\ -c \end{bmatrix}$$
Figure 2. Perspective transformation between image and object 
space straight lines and the coplanarity constraint for 
intermediate points along the line. 
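To make the constraint concrete, the following sketch evaluates the left-hand side of Equation 1 for a single intermediate point. It is an illustration only: the function and variable names are not from the paper, and the distortion corrections mentioned above are omitted for brevity.

import numpy as np

def coplanarity_residual(P1, P2, PC, xy, R, pp, c):
    # P1, P2: ground coordinates of the object-space line end points
    # PC:     perspective centre (X_O, Y_O, Z_O)
    # xy:     measured image coordinates (x_i, y_i) of an intermediate point
    # R:      3x3 rotation matrix (EOP) from the image to the ground system
    # pp, c:  principal point (x_p, y_p) and principal distance (IOP);
    #         lens distortion corrections are omitted in this sketch
    V1 = np.asarray(P1, float) - np.asarray(PC, float)    # centre -> first end point
    V2 = np.asarray(P2, float) - np.asarray(PC, float)    # centre -> second end point
    v_img = np.array([xy[0] - pp[0], xy[1] - pp[1], -c])  # image-space vector
    V3 = R @ v_img                                        # expressed in the ground system
    return float(np.dot(np.cross(V1, V2), V3))            # zero when the four points are coplanar

In the adjustment, one such equation is written for every intermediate point measured along the image line and driven towards zero.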
In some applications, photogrammetric lines are used as control 
lines instead of being regular tie lines. In this situation, the
object coordinates of line end points are known, hence, these 
points need not be measured in any of the images. 
Consequently, image space linear features are represented only 
by a group of intermediate points measured in all images. 
After the identification and extraction of straight lines from 
imagery, a photogrammetric model is generated through a 
photogrammetric triangulation using an arbitrary datum without 
any control information. This arbitrary datum is defined by fixing seven coordinates of any three well-distributed points (i.e., the seven datum degrees of freedom: three translations, three rotations, and scale).
Laser straight line features: The increasing recognition of laser scanning as a favourable data acquisition tool by the
photogrammetric community led to a number of studies aiming 
at pre-processing laser data. The goals of such studies range from simple primitive detection and extraction to more complicated tasks such as segmentation and perceptual
organization (Csathó et al., 1999; Lee and Schenk, 2001; Filin, 
2002). 
In this paper, laser straight line features will be used as a source 
of control to align the photogrammetric model. To extract such 
lines, suspected planar patches in the laser dataset are manually 
identified with the help of corresponding optical imagery, 
Figure 3. The selected patches are then checked using a 
least-squares adjustment to determine whether they are planar
or not, and to remove blunders. Finally, neighbouring planar 
patches with different orientation are intersected to determine 
the end points along object space discontinuities between the 
patches under consideration. 
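The plane fitting and intersection steps just described could be sketched as follows; this is not the authors' implementation, and the SVD-based fit and the helper names are assumptions made for the example.

import numpy as np

def fit_plane(points):
    # Least-squares plane through a candidate laser patch (points: N x 3 array).
    # Returns the unit normal, the centroid and the smallest singular value;
    # the latter can be tested against a threshold to accept the patch as
    # planar and to flag blunders.
    c = points.mean(axis=0)
    _, s, vt = np.linalg.svd(points - c, full_matrices=False)
    return vt[-1], c, s[-1]

def intersect_planes(n1, c1, n2, c2):
    # Intersection of two non-parallel planes n.(x - c) = 0: returns a point
    # on the common line and the unit direction of the line.
    d = np.cross(n1, n2)                     # line direction
    A = np.vstack([n1, n2, d])
    b = np.array([n1 @ c1, n2 @ c2, 0.0])    # third equation fixes a unique point
    p0 = np.linalg.solve(A, b)
    return p0, d / np.linalg.norm(d)

The end points along the discontinuity can then be obtained by limiting the infinite intersection line to the extent of the two patches.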
The datum for the laser lines is directly established by a 
combination of high-quality GPS/INS units installed onboard
the sensor platform. 
  
Figure 3. Manually identified planar patches in the laser data (a) 
guided by the corresponding optical image (b). 
2.2 Registration transformation function 
At this point, a photogrammetric model is generated from the 
photogrammetric triangulation using an arbitrary datum without 
knowledge of any control information. In addition, a set of 
conjugate photogrammetric-laser lines has been manually 
identified. These lines, in both datasets, are identified by their 
end points. It is important to reiterate that the end points of such 
conjugate lines are not required to be conjugate. 
An essential property of any registration technique is the type 
of transformation or mapping function adopted to properly 
overlay the two datasets. In this paper, a 3D similarity 
transformation is used as the registration transformation 
function, Equation 2. Such a transformation assumes the absence
of systematic biases in both photogrammetric and LIDAR 
surfaces (Filin, 2002). However, the quality of fit between 
conjugate primitives can be analyzed to investigate the presence 
of such behaviour. 
$$\begin{bmatrix} X_A \\ Y_A \\ Z_A \end{bmatrix} = \begin{bmatrix} X_T \\ Y_T \\ Z_T \end{bmatrix} + S \, R(\Omega, \Phi, K) \begin{bmatrix} X_p \\ Y_p \\ Z_p \end{bmatrix} \qquad (2)$$

where:
$S$ is the scale factor, $(X_T, Y_T, Z_T)^T$ is the translation vector between the origins of the photogrammetric and laser data coordinate systems, $R(\Omega, \Phi, K)$ is the 3D orthogonal rotation matrix between the two coordinate systems, $(X_p, Y_p, Z_p)^T$ are the photogrammetric point coordinates, and $(X_A, Y_A, Z_A)^T$ are the coordinates of the corresponding point relative to the laser data reference frame.
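As a sketch of how Equation 2 acts on a single photogrammetric point (the angle sequence used to build the rotation matrix is an assumed convention, not taken from the paper):

import numpy as np

def rot_x(a):
    ca, sa = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])

def rot_y(a):
    ca, sa = np.cos(a), np.sin(a)
    return np.array([[ca, 0, sa], [0, 1, 0], [-sa, 0, ca]])

def rot_z(a):
    ca, sa = np.cos(a), np.sin(a)
    return np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])

def similarity_transform(Xp, S, T, omega, phi, kappa):
    # Equation 2: X_A = T + S * R(Omega, Phi, Kappa) * X_p
    # Xp: photogrammetric point, T: translation vector, S: scale factor.
    # The R = Rx(omega) @ Ry(phi) @ Rz(kappa) sequence is an assumed convention.
    R = rot_x(omega) @ rot_y(phi) @ rot_z(kappa)
    return np.asarray(T, float) + S * (R @ np.asarray(Xp, float))

In the registration itself these seven parameters (the scale factor, the three translations and the three rotation angles) are the unknowns, estimated from the conjugate photogrammetric and laser lines rather than applied with known values as above.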
2.3 Similarity measure 
The role of the similarity measure is to mathematically express 
the relationship between the attributes of conjugate primitives 
in overlapping surfaces. The similarity measure formulation 
depends on the selected registration primitives and their respective attributes.