2. TEXTURE CREATION: SYSTEM OVERVIEW 
Texture processing aims at the provision of rectified images for all the visible faces of buildings. In the system proposed here, terrestrial imagery is used as the texture source. The image acquisition unit was developed at University College London (Chapman et al., 1994), the main characteristics of which are:
• Capturing images using an off-the-shelf CCD camera mounted on a servo-driven theodolite;
• Taking images automatically in a step-wise manner at preset locations to cover a complete field of view of 360° by 90° in the horizontal and vertical directions respectively;
• Indexing the images using the theodolite angles recorded for each image.
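To make the step-wise capture concrete, the sketch below enumerates the preset theodolite stops needed to cover the 360° by 90° field of view. The camera field of view, the overlap between neighbouring frames and the function name are illustrative assumptions; the actual presets of the UCL unit are not stated in the text.

```python
def capture_stops(h_fov_deg=30.0, v_fov_deg=20.0, overlap=0.2):
    """Yield (horizontal, vertical) theodolite angles for one station.

    The field-of-view and overlap values are assumed for illustration.
    """
    h_step = h_fov_deg * (1.0 - overlap)   # advance less than the FOV so frames overlap
    v_step = v_fov_deg * (1.0 - overlap)
    h = 0.0
    while h < 360.0:
        v = 0.0
        while v < 90.0:
            yield h, v                     # these angles are recorded and used for indexing
            v += v_step
        h += h_step

# Example: number of exposures one station would need with these settings
print(sum(1 for _ in capture_stops()))    # 90 images
```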
Indexing is an important feature of the system. Normally, in order to localise the coordinate system of each image, a number of control points have to be pre-marked in the image capture area. Here, however, as the images are related to each other through their recorded vertical and horizontal angles, localisation is done only once for each station rather than once for each image. The result is a great reduction in the localisation time and in the number of control points needed. As a result of indexing, the relative angular location of the images with respect to each other is known. Therefore, by knowing only the exterior orientation of the image capture stations, the orientation of the images at all camera positions can be determined.
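The practical consequence of indexing can be sketched as follows: once a station has been localised, the rotation of every image taken there follows from its recorded theodolite angles. The rotation conventions used below (horizontal angle about the vertical axis, vertical angle about the camera's horizontal axis) are assumptions for illustration, not the instrument's documented model.

```python
import numpy as np

def rot_z(a):
    """Rotation about the vertical axis (horizontal theodolite reading, radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    """Rotation about the horizontal axis (vertical theodolite reading, radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def image_rotation(R_station, h_deg, v_deg):
    """Compose the station's exterior orientation with the theodolite
    angles recorded for one image to obtain that image's rotation."""
    return R_station @ rot_z(np.radians(h_deg)) @ rot_x(np.radians(v_deg))

# Localising R_station (and the station position) once therefore fixes the
# orientation of every image captured at that station; no per-image control
# points are required.
```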
To process textures, the following components are required: 
  
  
• A Numerical Frame of Reference (NFR);
• A Texture Image Database (TID);
• An Automatic Texture Processing Tool (ATPT).
The NFR describes the geometric structure of buildings, while the TID is the database of terrestrial CCD images that contains all the data required to relate the images to each other in space. The heart of the TID is a reference file, which is a look-up table defining a number of entities for each image:

• Image ID;
• Name of the file containing camera information;
• Station ID at which the image was taken;
• Image dimensions;
• Angular theodolite readings;
• Name and location of the image on the computer hard disc.
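To make the structure of the reference file concrete, the sketch below models one of its records with the entities listed above. The class name, field names and types are illustrative assumptions; the paper specifies only which entities are stored, not how they are encoded.

```python
from dataclasses import dataclass

@dataclass
class ReferenceRecord:
    """One look-up entry of the Texture Image Database (TID)."""
    image_id: str                # Image ID
    camera_file: str             # file containing the camera information
    station_id: str              # station at which the image was taken
    width_px: int                # image dimensions
    height_px: int
    horizontal_angle_deg: float  # angular theodolite readings
    vertical_angle_deg: float
    image_path: str              # name and location of the image on disc
```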
As shown in Figure 1, to form the texture of a face, the 3D coordinates of its corners, extracted from the NFR, are passed to the ATPT, which uses the camera calibration, the localisation, and the data provided by the reference file to find the images covering the face of interest. Based on a given spacing interval, a 3D grid, the points of which define the texture points, is fitted to the face. The basic goal of the ATPT is to define the pixel values associated with the grid points, which are processed one at a time. For each point the reference file is searched and the image with the most orthogonal view over the point is selected. The coordinates of the point on this image are then computed using the collinearity equations. An image interpolation is finally carried out to estimate the pixel value of the point. The estimated value is then written to a new image, which is in fact the texture of the face. Figure 2 shows an example.

Figure 2. Individual images and the corresponding texture.
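The processing loop described above can be summarised in the following sketch. It is a minimal illustration rather than the ATPT implementation: the function names, the use of the face normal to pick the most orthogonal view, the collinearity sign conventions and the bilinear interpolation are assumptions, and the conversion from photo coordinates to pixel indices (normally taken from the camera file) is omitted.

```python
import numpy as np

def collinearity(X, Xc, M, f):
    """Collinearity equations: project object point X into an image with
    perspective centre Xc, world-to-image rotation M and principal distance f."""
    d = M @ (X - Xc)
    return -f * d[0] / d[2], -f * d[1] / d[2]   # photo coordinates (x, y)

def bilinear(img, x, y):
    """Interpolate the pixel value at a non-integer position (no bounds checks)."""
    i0, j0 = int(np.floor(y)), int(np.floor(x))
    di, dj = y - i0, x - j0
    p = img[i0:i0 + 2, j0:j0 + 2].astype(float)
    return ((1 - di) * (1 - dj) * p[0, 0] + (1 - di) * dj * p[0, 1]
            + di * (1 - dj) * p[1, 0] + di * dj * p[1, 1])

def texture_for_face(grid_points, face_normal, candidates):
    """Assign one pixel value to each 3D grid point of the face.

    candidates  : list of (image, Xc, M, f) tuples found via the reference file.
    face_normal : unit normal of the face.
    """
    values = []
    for X in grid_points:
        # choose the image whose viewing direction is most orthogonal to the face
        img, Xc, M, f = max(
            candidates,
            key=lambda c: abs(np.dot(face_normal, X - c[1]))
                          / np.linalg.norm(X - c[1]))
        x, y = collinearity(X, Xc, M, f)
        values.append(bilinear(img, x, y))   # x, y treated as pixel coordinates here
    return values
```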
3. MERGING TEXTURES

Merging refers to fusing images from different stations in order to get a complete or higher quality texture. There are several cases where parts of textures are missing or need improvement. They can be categorised into three groups: occlusions, missing images and perspective distortions.

It was mentioned that, to create a texture, all features between a camera and a building facade are projected onto the texture plane. Consequently, some parts may be occluded by details like cars and trees near to the camera. The existence of occluded objects in textures does not necessarily mean they have to be removed. On the contrary, features like a tree or a passing person can even improve the reality of a visual model (Figure 3). However, there are situations where occlusions may cause problems, for instance where an important point is behind an object or part of a building appears on the texture of another building facade (Figure 4).

The second case where merging may become necessary is when an image covering a certain part is missing. This can happen due to internal data capture errors, over-exposure resulting from bright sunlight, or buildings being too high. For example, if buildings are too high it may not be possible for the camera to cover their top parts. However, as the instrument is moved along the buildings, such parts are usually covered.
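A minimal sketch of the merging step, assuming each station has already produced a texture of the same face together with a validity mask (False where pixels are occluded or missing): the fused texture takes every pixel from the highest-priority station that actually observed it. The mask representation and the preference ordering are assumptions, not a scheme prescribed by the paper.

```python
import numpy as np

def merge_textures(textures, valid_masks):
    """Fuse per-station textures of one face.

    textures    : list of H x W arrays, ordered by preference
                  (e.g. the most orthogonal station first).
    valid_masks : matching boolean H x W arrays; False marks occluded
                  or missing pixels.
    """
    merged = np.zeros(textures[0].shape, dtype=float)
    filled = np.zeros(textures[0].shape, dtype=bool)
    for tex, mask in zip(textures, valid_masks):
        take = mask & ~filled        # pixels still empty that this station can supply
        merged[take] = tex[take]
        filled |= take
    return merged, filled            # 'filled' reveals any gaps that remain
```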