International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol XXXV, Part B4. Istanbul 2004 
can be computed. Each projected polygon then overlays all the pixels that, once unprojected, will make up the final façade texture (see Figure 3).
  
Figure 3. Projected 3D building model overlaid on the input 
photograph (Rosensteinmuseum). 
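The projection of the model's vertices into the photograph can be sketched as follows. This is a minimal Python/NumPy illustration, not the authors' implementation; the camera matrix P and the sample point are hypothetical stand-ins for the real calibration:

```python
import numpy as np

def project_to_image(P, points_3d):
    """Project 3D world points into the photograph with a 3x4 camera
    matrix P (intrinsics times extrinsics), then apply the homogeneous
    divide to obtain image coordinates."""
    pts = np.asarray(points_3d, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # (N, 4) homogeneous
    img = (P @ homog.T).T                             # (N, 3)
    return img[:, :2] / img[:, 2:3]                   # divide by depth

# Toy pinhole camera at the origin looking down +z (hypothetical values)
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
print(project_to_image(P, [(2.0, 4.0, 2.0)]))  # [[1. 2.]]
```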
Because the façade textures have to be represented by
quadrilaterals, the polygons are substituted during the 
extraction process by their bounding rectangles which are 
given in three-dimensional world-space coordinates. The 
texture extraction is basically performed by rendering a 
quadrilateral with the colour values of the unprojected image 
pixels of the bounding rectangle. 
Lens distortions are removed on the fly in the pixel shader, so the extraction works as if processing idealised images to which the calibration parameters have already been applied.
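The per-pixel undistortion can be illustrated with a simple radial model. The sketch below assumes Brown-style coefficients k1 and k2 (hypothetical; the actual shader uses the camera's calibration parameters), with coordinates given relative to the principal point:

```python
def undistort_lookup(u, v, k1=0.0, k2=0.0):
    """Map an idealised image coordinate (u, v) to the position in the
    raw photograph where its colour actually lies, using a radial
    distortion model with hypothetical coefficients k1 and k2."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# With all coefficients zero the lookup degenerates to the identity.
print(undistort_lookup(0.3, -0.4))  # (0.3, -0.4)
```

Performing this shift during the texture lookup is what makes the extraction behave as if it operated on idealised images.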
3.1 Texture Extraction 
The first step is to set up the rendering pipeline to fill the 
entire target buffer where the final facade texture will be 
rendered to. For this purpose, all transformation-related states 
(including the one for the projection) are initialised with the 
identity matrix. Drawing a three-dimensional unit square 
with vertices v1 = (-1, 1, 0), v2 = (1, 1, 0), v3 = (1, -1, 0) and v4 = (-1, -1, 0) will render all pixels in the target buffer as wanted.
wanted. The four vertices can incidentally be thought of as 
the projected vertices of the polygon's bounding box into the 
image plane. 
So far, however, the rasteriser would only render a blank 
facade image as no colour information is provided yet. 
Therefore, a photograph must be assigned to the pipeline as 
an input texture from where to take the colour information 
from. As mentioned above, the polygon's projected bounding 
box defines the pixels to be extracted from the input texture. 
So in addition to the above-mentioned vertices, the texture coordinates of the four vertices are specified as the four-element (homogeneous) world-space coordinates of the
bounding box. Setting the texture transformation matrix with 
the aforementioned transformation from world to image 
space concludes the initialisation. 
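This initialisation can be mimicked in NumPy: the quad's texture coordinates are the homogeneous world-space corners of the bounding rectangle, and the texture matrix holds the world-to-image transformation. The matrix M and the rectangle below are made-up values for illustration only:

```python
import numpy as np

# Hypothetical world-to-image texture matrix: the last row copies z into
# w, so the later perspective divide performs the pinhole projection.
M = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

# Bounding rectangle of the facade polygon in homogeneous world space,
# assigned as the texture coordinates of the unit square's four vertices.
bbox = np.array([[-2.0,  2.0, 4.0, 1.0],   # top left
                 [ 2.0,  2.0, 4.0, 1.0],   # top right
                 [ 2.0, -2.0, 4.0, 1.0],   # bottom right
                 [-2.0, -2.0, 4.0, 1.0]])  # bottom left

tex = (M @ bbox.T).T   # four-dimensional texture coordinates
print(tex[:, -1])      # the w components carry the depth: [4. 4. 4. 4.]
```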
During the rendering of the target facade image, the rasteriser 
linearly interpolates the four-dimensional texture coordinates 
across the quadrilateral. A perspective texture lookup results in the perspectively corrected façade texture (see Figure 4).
Some extra care has to be taken, however, as the results of 
this transformation are in the range -1 to 1, while the final texture coordinates are indexed in the range 0 to 1. A single scale and bias will map the coordinates accordingly.
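Per fragment, the lookup therefore amounts to a division by the interpolated homogeneous component (here q) followed by the scale and bias. A small sketch of this arithmetic:

```python
def to_texture_index(s, t, q):
    """Perspective texture lookup: divide by the homogeneous component,
    then scale and bias the result from [-1, 1] into the texture index
    range [0, 1]."""
    u, v = s / q, t / q          # perspective division per pixel
    return 0.5 * u + 0.5, 0.5 * v + 0.5

print(to_texture_index(-2.0, 2.0, 2.0))  # (0.0, 1.0)
```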
  
Figure 4. Extracted facade textures. 
The resulting façade texture can then be read from the frame 
buffer and saved to file. Most 3D APIs provide special
functions for this task. 
  
Figure 5. The 3D building model with the extracted
textures placed on the façade polygons. 
3.2 Texture Placement 
After the extraction, the textures need to be placed on the 
corresponding polygons (see Figure 5). In order to find the 
two-dimensional texture coordinates for the polygon vertices, 
a function identical to glTexGen (Shreiner, 2003) of OpenGL is used. The function automatically generates texture coordinates s and t by a linear combination of the vertex
coordinates: 
  
s = A_s x + B_s y + C_s z + D_s
t = A_t x + B_t y + C_t z + D_t
A, B, C and D can be thought of as the coefficients of a plane in parameter form. The normal vector components A, B and C of the two planes are given by the vector from the bottom left vertex to the bottom right vertex of the bounding box and by the vector from the bottom left vertex to the top left vertex, respectively. The values for D are simply computed by inserting the bottom left vertex into the respective equation. The result of the linear combination is the pair of texture coordinates (s, t) for each polygon vertex.
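This construction can be sketched in Python. The sketch additionally scales each plane normal by the squared edge length so that s and t run from 0 to 1 across the rectangle; that scaling is an assumption, as the text only gives the unscaled construction:

```python
import numpy as np

def texgen_planes(bottom_left, bottom_right, top_left):
    """Build the two glTexGen-style planes (A, B, C, D): the s plane
    normal follows the bottom edge of the bounding rectangle, the t
    plane normal follows the left edge, and each D places the zero of
    the coordinate at the bottom left vertex."""
    s_dir = np.subtract(bottom_right, bottom_left)
    t_dir = np.subtract(top_left, bottom_left)
    s_n = s_dir / np.dot(s_dir, s_dir)   # scale so s(bottom_right) = 1
    t_n = t_dir / np.dot(t_dir, t_dir)   # scale so t(top_left) = 1
    s_plane = np.append(s_n, -np.dot(s_n, bottom_left))
    t_plane = np.append(t_n, -np.dot(t_n, bottom_left))
    return s_plane, t_plane

def texcoord(plane, vertex):
    """s (or t) = A*x + B*y + C*z + D for the given vertex."""
    return float(plane[:3] @ np.asarray(vertex) + plane[3])

# Hypothetical axis-aligned bounding rectangle, 4 units wide, 2 high
bl, br, tl = (0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (0.0, 2.0, 0.0)
s_pl, t_pl = texgen_planes(bl, br, tl)
print(texcoord(s_pl, bl), texcoord(s_pl, br), texcoord(t_pl, tl))  # 0.0 1.0 1.0
```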
3.3 [Section text truncated in the source scan; the surviving fragments concern detecting occluded façade parts per pixel with a depth buffer in the pixel shader.]

3.4 [Section text truncated in the source scan; the surviving fragments concern combining several photographs to texture areas occluded in a single view.]