Like in close range photogrammetry, one could imagine that points can be reconstructed if the image co-ordinates of the points are measured from different positions. Of course, choosing points with an ARS while the image is moving continuously is difficult; it is, for example, not possible to visualise both views at the same time. In section 4 a method is developed that simplifies the creation of three-dimensional objects for ARS applications.
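To make the analogy concrete, the following minimal sketch shows how a point could be reconstructed from image co-ordinates measured in two views by standard linear (DLT) triangulation; it illustrates the principle only, not the method of section 4, and the two projection matrices are assumed to be known.

```python
import numpy as np

def triangulate_point(P1, P2, u1, u2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 projection matrices of the two observation positions.
    u1, u2 : (x, y) image co-ordinates of the same point in both views.
    Returns the 3D point in the common reference system.
    """
    # Each view contributes two linear equations:
    # x * (row 3 of P) - (row 1 of P) = 0  and
    # y * (row 3 of P) - (row 2 of P) = 0.
    A = np.vstack([
        u1[0] * P1[2] - P1[0],
        u1[1] * P1[2] - P1[1],
        u2[0] * P2[2] - P2[0],
        u2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector belonging
    # to the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise
```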
3.3 Simulated Data
To test the ARS, data were simulated that contain the geometry of damaged buildings according to the methodology described by Schweier et al. (2003). Simulation at the scale of building parts should not be misunderstood in this context as the prediction of damage to buildings after an earthquake. Such a simulation seems to be impossible, as there are various aspects that cannot be modelled easily. For instance, the failure of building parts can be caused by bad workmanship and poor material as well as by deficiencies in the structural design. Furthermore, furniture inside the buildings could alter a building's behaviour in the case of an earthquake: the location of cavities, the places where trapped persons are most likely to survive, can be influenced by furniture. This means that a realistic simulation of a real collapse is not possible, since too many factors are uncertain. However, simulated damage is well suited for SAR training purposes.
4. METHODS 
Besides hardware and data, software is needed to run an ARS. The tasks of the software are to perform the superposition and to enable the user to interact with the virtual world. To handle these tasks, the following photogrammetric methods have been developed: a method for computing the calibration parameters needed to calculate the superposition, and a method for simplifying the creation of shapes for new virtual objects.
4.1 Superposition 
The superposition is achieved by mixing a picture of reality with a synthetic image rendered by 3D computer graphics software. The picture of reality is either taken by a camera or observed directly by the retina of the user of the retinal display. If the retinal display option is used, the mixing is solved by hardware, since the user sees both pictures through the transparent display at the same time. If the camera option is used, the video stream of the camera is simply used as the background of the scene displayed by the 3D graphics software. The remaining problem is to render the 3D image geometrically correctly. For this, one has to know the correct mapping, defined by a combination of transformations and the referring transformation parameters. The process of determining these parameters can be interpreted as a calibration process.
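For the camera option, the compositing step itself reduces to drawing the rendered content over the current video frame. The following sketch (assuming OpenCV and a hypothetical `project` callable that stands in for the calibrated mapping derived below) illustrates the idea for a wireframe object:

```python
import cv2

def augment_frame(frame, edges_3d, project):
    """Draw a virtual wireframe object over one video frame.

    frame    : BGR image grabbed from the video stream.
    edges_3d : iterable of 3D line segments (pairs of world points).
    project  : callable mapping a world point to pixel (x, y); it
               encapsulates the calibrated transformation chain.
    """
    for p_world, q_world in edges_3d:
        p = tuple(int(round(c)) for c in project(p_world))
        q = tuple(int(round(c)) for c in project(q_world))
        cv2.line(frame, p, q, color=(0, 0, 255), thickness=2)
    return frame

# Typical loop: grab a frame, augment it, display it.
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# cv2.imshow("ARS", augment_frame(frame, edges, project))
```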
While indoor AR calibration is widely studied in the literature (a survey is given by Leebmann (2003)), no detailed description of outdoor AR calibration can be found. Outdoor AR calibration has analogies to airborne photogrammetry using INS and GPS for measuring the exterior orientation of the camera (Cramer et al., 2002). The functional model for an ARS can be described by a concatenation of several transformations.
The transformation of the point $X_{\text{reference-system}}$ in homogeneous co-ordinates is expressed by the product of several four-by-four matrices of the form

$$T^{\text{to}}_{\text{from}} = \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$

with $r_{ij}$ being the components of a rotation matrix and $t_i$ being the components of a translation. A three-by-four projection matrix

$$P_D = \begin{pmatrix} c_x & 0 & x_0 & 0 \\ d & c_y & y_0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}$$

is used to transform the point from the camera or eye-system into the display-system, where $c_x$ and $c_y$ represent the scales in the row and the column direction respectively (these scales are often expressed as focal length and aspect ratio), $x_0$ and $y_0$ are the principal point co-ordinates and $d$ is the skew of the image. The projection of the point $X_{\text{GaussKrüger}}$ is the point $u = (u \;\; v \;\; w)^T$:

$$u = P_D \, T^{\text{eye-system}}_{\text{GaussKrüger}} \, X_{\text{GaussKrüger}} \qquad (1)$$
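As an illustration, the matrices above translate directly into code. The following sketch (using numpy; the parameter names are generic placeholders, not the calibrated values of the system described here) builds $T^{\text{to}}_{\text{from}}$ and $P_D$ and evaluates equation (1):

```python
import numpy as np

def T(R, t):
    """4x4 homogeneous transformation T_from^to built from a 3x3
    rotation matrix R and a translation vector t."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

def P_D(c_x, c_y, x_0, y_0, d=0.0):
    """3x4 projection matrix with scales c_x, c_y, principal point
    (x_0, y_0) and skew d, as defined above."""
    return np.array([[c_x, 0.0, x_0, 0.0],
                     [d,   c_y, y_0, 0.0],
                     [0.0, 0.0, 1.0, 0.0]])

# Equation (1): project a Gauss-Krueger point into the display.
# X = np.array([x, y, z, 1.0])                    # homogeneous world point
# u = P_D(c_x, c_y, x_0, y_0) @ T_gk_to_eye @ X   # u = (u, v, w)
```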
The perspective projection is the non-linear function $p^D$ which transforms the display co-ordinates to image co-ordinates:

$$p^D \begin{pmatrix} x \\ y \\ z \\ w \end{pmatrix} = \begin{pmatrix} x/w \\ y/w \\ z/w \end{pmatrix} \qquad (2)$$
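A direct transcription of equation (2) as a one-function sketch in the same notation:

```python
def p_D(v):
    """Perspective division: maps homogeneous display co-ordinates
    (x, y, z, w) to image co-ordinates (x/w, y/w, z/w)."""
    x, y, z, w = v
    return (x / w, y / w, z / w)
```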
Since the three rotations between rover and eye-system are not commutative, they have to be kept separate and cannot be combined:
$$T^{\text{eye-system}}_{\text{GaussKrüger}} = T^{\text{eye-system}}_{\text{IMU}} \, T^{\text{IMU}}_{\text{rover}} \, T^{\text{rover}}_{\text{GaussKrüger}} \qquad (3)$$
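In code, keeping the transformations separate simply means multiplying the individual matrices anew whenever the sensor readings change; a sketch with illustrative names:

```python
# Equation (3): the three transformations are kept separate and the
# product is re-evaluated whenever the GPS/IMU readings are updated.
T_gk_to_eye = (T_imu_to_eye      # IMU -> eye-system
               @ T_rover_to_imu  # rover -> IMU
               @ T_gk_to_rover)  # Gauss-Krueger -> rover
```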
The combination of equations (1), (2) and (3) leads to an equation that can be used for a bundle adjustment. If the transformation parameters for $T^{\text{IMU}}_{\text{rover}}$, $T^{\text{eye-system}}_{\text{IMU}}$ and $P_D$ are introduced as unknowns in the bundle adjustment, they can be determined and used as calibration parameters.
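A least-squares formulation of this calibration could look like the following sketch. It reuses the `T` and `P_D` helpers defined after equation (1), assumes scipy is available, and parameterises the unknown exterior transformation by three Euler angles and a translation; this parameterisation and all names are illustrative choices, not necessarily those of the actual system.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, observations):
    """Reprojection residuals for the calibration unknowns.

    params       : [3 Euler angles, 3 translations, c_x, c_y, x_0, y_0]
                   parameterising T_imu_to_eye and P_D.
    observations : tuples (T_gk_to_imu, X, u), where T_gk_to_imu combines
                   the GPS/IMU readings for one image, X is a known
                   homogeneous world point and u its measured pixel.
    """
    R = Rotation.from_euler("xyz", params[:3]).as_matrix()
    T_imu_to_eye = T(R, params[3:6])
    P = P_D(*params[6:10])
    res = []
    for T_gk_to_imu, X, u in observations:
        v = P @ T_imu_to_eye @ T_gk_to_imu @ X   # equations (1) and (3)
        res.extend([v[0] / v[2] - u[0],          # equation (2): divide by w
                    v[1] / v[2] - u[1]])
    return np.asarray(res)

# calib = least_squares(residuals, x0, args=(observations,))
```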