Figure 10: The resulting textures with the fewest occluded pixels were chosen as the facade texture and placed on the building model.
…colour value, the pixel colour is always written. The hardware depth test ensures that the closest pixel is always taken, and the artificial depth value of 1.0 gives precedence to non-occluded pixels. The result can be seen in Figure 11.
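This fusion step can be expressed as a small pixel shader. The following HLSL code is a hedged sketch only; the sampler names, the input semantics and the small depth bias are assumptions rather than details from the paper. It writes the colour unconditionally and outputs the artificial depth value of 1.0 for occluded pixels, so that the depth test lets non-occluded contributions from the other input images win.

// Hedged sketch of the per-pixel fusion pass (all names and the depth bias assumed).
sampler2D PhotoColour : register(s0);   // rectified input photograph
sampler2D PhotoDepth  : register(s1);   // depth map rendered from the photograph's viewpoint

struct FusionOutput
{
    float4 colour : COLOR0;
    float  depth  : DEPTH;
};

FusionOutput FusePixel(float2 uv : TEXCOORD0, float fragmentDepth : TEXCOORD1)
{
    FusionOutput o;
    o.colour = tex2D(PhotoColour, uv);                     // pixel colour is always written
    float surfaceDepth = tex2D(PhotoDepth, uv).r;          // closest geometry seen by the camera
    bool occluded = fragmentDepth > surfaceDepth + 0.001;  // assumed small bias
    o.depth = occluded ? 1.0 : fragmentDepth;              // occluded pixels get depth 1.0
    return o;
}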
  
  
  
  
Figure 11: (a) Façade texture extracted from two input photographs using per-pixel fusion; the images were taken from two different positions. (b) The extracted façade textures placed on the building model.
3.5 Removal of Lens Distortion 
If a calibrated camera is used to capture the building facades, the lens distortion in the images is corrected on-the-fly in the pixel shader. The major benefit of this approach is that the extraction process works with the original images. Image pixels are therefore filtered only once during the whole process, as opposed to the alternative approach where the idealized image is computed beforehand. As a result, the extracted facade textures are of higher quality.
The lens distortion is described by the parameter set introduced by Brown (1971), which models the transition of pixels from the distorted to the idealized image. Here, the following subset is used, as in Fraser (1997):
$$\Delta x = \Delta x_i + \Delta x_r + \Delta x_d + \Delta x_a \qquad \Delta y = \Delta y_i + \Delta y_r + \Delta y_d + \Delta y_a \qquad (2)$$

Change of interior orientation:

$$\Delta x_i = \Delta x_0 - \frac{\bar{x}}{c}\,\Delta c \qquad \Delta y_i = \Delta y_0 - \frac{\bar{y}}{c}\,\Delta c \qquad (3)$$

Radial distortion:

$$\Delta x_r = \bar{x}\,(r^2 K_1 + r^4 K_2 + r^6 K_3) \qquad \Delta y_r = \bar{y}\,(r^2 K_1 + r^4 K_2 + r^6 K_3) \qquad (4)$$

Decentring distortion:

$$\Delta x_d = (r^2 + 2\bar{x}^2)\,P_1 + 2\bar{x}\bar{y}\,P_2 \qquad \Delta y_d = 2\bar{x}\bar{y}\,P_1 + (r^2 + 2\bar{y}^2)\,P_2 \qquad (5)$$

Affinity and shearing:

$$\Delta x_a = \bar{x} B_1 + \bar{y} B_2 \qquad \Delta y_a = 0 \qquad (6)$$

where $\bar{x}$ and $\bar{y}$ denote the image coordinates reduced to the principal point and $r^2 = \bar{x}^2 + \bar{y}^2$.
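Equations (2) to (6) translate directly into a few lines of shader code. The following HLSL function is a sketch under assumed parameter packing; the argument names (pp, dpp, K, P, B and so on) are illustrative and not taken from the paper.

// Hedged sketch: direct evaluation of equations (2)-(6) for one image point.
float2 Distortion(float2 p,    // measured (distorted) image point
                  float2 pp,   // principal point (x0, y0)
                  float2 dpp,  // change of principal point (dx0, dy0)
                  float  c,    // principal distance
                  float  dc,   // change of principal distance
                  float3 K,    // radial parameters K1, K2, K3
                  float2 P,    // decentring parameters P1, P2
                  float2 B)    // affinity/shearing parameters B1, B2
{
    float2 q  = p - pp;                                     // coordinates reduced to the principal point
    float  r2 = dot(q, q);                                  // r^2
    float  dr = r2*K.x + r2*r2*K.y + r2*r2*r2*K.z;          // radial polynomial
    float  dx = dpp.x - q.x/c*dc                            // interior orientation, Eq. (3)
               + q.x*dr                                     // radial distortion, Eq. (4)
               + (r2 + 2.0*q.x*q.x)*P.x + 2.0*q.x*q.y*P.y   // decentring distortion, Eq. (5)
               + q.x*B.x + q.y*B.y;                         // affinity and shearing, Eq. (6)
    float  dy = dpp.y - q.y/c*dc
               + q.y*dr
               + 2.0*q.x*q.y*P.x + (r2 + 2.0*q.y*q.y)*P.y;
    return float2(dx, dy);                                  // (delta x, delta y) of Eq. (2)
}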
Because the extraction process needs the transition of pixels from the idealized to the distorted image and the formula cannot be inverted analytically, an iterative method is used. Unfortunately, arbitrary iterations are not supported by current 3D graphics hardware. However, because the graphics API Direct3D 9.0 (Microsoft, 2003) already defines dynamic flow control in Pixel Shader 3.0, graphics cards are likely to include this feature in the near future. The removal of lens distortion can then be computed completely in hardware.
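For reference, the inversion such a shader would have to perform can be sketched as a simple fixed-point iteration; the Distortion() helper and the fixed iteration count below are assumptions carried over from the previous sketch, not details from the paper.

// Hedged sketch: invert the distortion model by fixed-point iteration.
// Starting from the idealized point, the distorted position is refined by
// re-evaluating the hypothetical Distortion() helper; a fixed iteration
// count stands in for the dynamic flow control of Pixel Shader 3.0.
float2 IdealizedToDistorted(float2 pIdeal, float2 pp, float2 dpp,
                            float c, float dc, float3 K, float2 P, float2 B)
{
    float2 pDist = pIdeal;                       // initial guess
    for (int i = 0; i < 5; ++i)                  // 5 iterations assumed sufficient
        pDist = pIdeal - Distortion(pDist, pp, dpp, c, dc, K, P, B);
    return pDist;
}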
Until then, an alternative approach must be used: the 2D transition vectors are pre-computed for all pixels of the image and stored in a two-channel 32-bit floating-point texture. In the pixel shader, a first texture look-up fetches the transition vector and adds the correction values to the texture coordinates. The new coordinates are then used for the depth and colour look-ups.
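A pixel shader for this look-up chain might look as follows; the sampler names and the texture layout (transition vector in the red and green channels) are assumptions.

// Hedged sketch of the correction look-up (all names assumed).
sampler2D CorrectionMap : register(s0);  // pre-computed transition vectors (2 x 32-bit float)
sampler2D Photograph    : register(s1);  // original (distorted) input image

float4 LookupCorrected(float2 uv : TEXCOORD0) : COLOR
{
    // first texture look-up: pre-computed transition vector (idealized -> distorted)
    float2 delta  = tex2D(CorrectionMap, uv).rg;
    float2 uvDist = uv + delta;              // corrected texture coordinates
    // the corrected coordinates are then used for the depth and colour look-ups;
    // only the colour fetch is shown here
    return tex2D(Photograph, uvDist);
}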
4. IMPLEMENTATION AND RESULTS 
The design goal of the implementation was a vendor-independent system, meaning that the algorithms should work with a broad variety of 3D graphics cards. The graphics API of choice was therefore Direct3D 9.0, which also includes the High Level Shading Language (HLSL).
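To make the vendor independence concrete, the shaders can be wrapped in an ordinary Direct3D 9.0 effect and compiled against the widely supported vs_2_0/ps_2_0 profiles. The following minimal effect skeleton is purely illustrative; none of its names are taken from the paper.

// Illustrative effect-file skeleton (all names assumed); compiling against
// the vs_2_0/ps_2_0 profiles keeps the system vendor independent.
float4x4 WorldViewProjection;
sampler2D Photograph : register(s0);

void PassThroughVS(float4 pos : POSITION, float2 uv : TEXCOORD0,
                   out float4 oPos : POSITION, out float2 oUv : TEXCOORD0)
{
    oPos = mul(pos, WorldViewProjection);
    oUv  = uv;
}

float4 TexturePS(float2 uv : TEXCOORD0) : COLOR
{
    return tex2D(Photograph, uv);
}

technique ExtractFacadeTexture
{
    pass P0
    {
        VertexShader = compile vs_2_0 PassThroughVS();
        PixelShader  = compile ps_2_0 TexturePS();
    }
}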
  
[The right-hand column of this page is truncated in the source scan. The surviving fragments cover the test setup and the generated texture resolutions (up to 2048), the performance measurements of Table 1 with timings for 1, 8 and 16 input images, and a concluding Section 5 on results, possible extensions and future work.]
	        