…atmospheric model (the effect of the atmosphere is eliminated) (section 4.2).
By inserting into (6), each sample contributes one equation

    L_oi = ρ_di (k_s E_si + k_a E_ai + k_h E_hi)                (9)

to a system of n linear equations which is redundant for n > 3. E_si, E_ai and E_hi are the irradiances due to the incident sunlight, the ambient light and the skylight. Since the scene geometry and the illumination geometry are known, they can be calculated according to (3), (4) and (5). For each wavelength within the visible spectrum, the system of equations (9) is solved for the unknowns, which are the weights k_s, k_a and k_h of the daylight components. Finally, the weights are averaged over the visible spectrum. Since ρ_di is known only up to a constant factor, k_s, k_a and k_h are the relative weights of the daylight components.
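Because (9) is linear in the three unknown weights, it can be solved per wavelength in the least-squares sense and the per-wavelength solutions averaged afterwards. The following Python sketch illustrates this step under the assumption that the per-sample irradiances E_si, E_ai, E_hi (from (3), (4) and (5)) and the reflectances ρ_di are already available as arrays; the function name and the array layout are illustrative, not taken from the paper.

import numpy as np

def estimate_daylight_weights(L_o, rho_d, E_s, E_a, E_h):
    """Relative weights k_s, k_a, k_h of the daylight components.

    All arguments are arrays of shape (n_samples, n_bands):
    L_o    -- true object colors L_oi (atmospheric effect already removed)
    rho_d  -- diffuse reflectances of the samples (known up to a scale factor)
    E_s, E_a, E_h -- irradiances due to sunlight, ambient light and skylight,
                     computed from the known scene and illumination geometry.
    """
    n_samples, n_bands = L_o.shape
    per_band = np.empty((n_bands, 3))
    for b in range(n_bands):
        # One equation (9) per sample: L_oi = rho_di (k_s E_si + k_a E_ai + k_h E_hi)
        A = rho_d[:, b, None] * np.column_stack((E_s[:, b], E_a[:, b], E_h[:, b]))
        # Redundant for n > 3: solve in the least-squares sense.
        per_band[b], *_ = np.linalg.lstsq(A, L_o[:, b], rcond=None)
    # Average the per-wavelength solutions over the visible spectrum.
    k_s, k_a, k_h = per_band.mean(axis=0)
    return k_s, k_a, k_h

Because ρ_di enters (9) only up to a constant factor, the recovered k_s, k_a and k_h are relative weights, exactly as stated above.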
Nakamae et al. (Nakamae, 1986) and Thirion (Thirion, 1992) determine the ratio of illumination by direct sunlight and ambient light. Skylight, wavelength dependency and the effect of the atmosphere are not considered.
The second inversion of the illumination model solves the equation of the illumination model (6) for the diffuse reflectance ρ_di of a material. This inversion will be used during rendering (section 6).
Given are the scene geometry (section 4.1) and the illumination geometry d_s, Ω_s, d_h and m_h (sections 3.3 and 4.1). The radiances k_s L_s, k_a L_a and k_h L_h of the daylight components are known from the first inversion of the illumination model (see above). p is a known location in object space. It is situated on the surface of an object of an opaque, diffusely reflecting material with material parameter k_d > 0. The known apparent object color L_i is the radiance of the light reaching the camera associated with one of the input images at the known distance d from p. L_i is derived from the RGB triplet of the reconstructed image function at p' (appendix A). p' are the image coordinates of p in the selected input image. The true object color L_oi is determined from L_i by the second inversion of the atmospheric model (the effect of the atmosphere is eliminated) (section 4.2).
Inserting into (6) and solving for ρ_di yields

    ρ_di = L_oi / (k_s E_si + k_a E_ai + k_h E_hi)                (10)
E_si, E_ai and E_hi are the irradiances due to the incident sunlight, the ambient light and the skylight. Since the scene geometry and the illumination geometry are known, they can be calculated according to (3), (4) and (5). Since the relative weights k_s, k_a and k_h were used, ρ_di is likewise known only up to a constant factor.
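Equation (10) translates directly into code. The sketch below reuses the illustrative names of the previous listing and is again an assumption about data layout, not the paper's implementation; the true object color L_oi is expected to have been corrected for the atmosphere beforehand.

import numpy as np

def diffuse_reflectance(L_o, E_s, E_a, E_h, k_s, k_a, k_h, eps=1e-12):
    """Second inversion of the illumination model, equation (10).

    L_o            -- true object color(s) L_oi per wavelength (atmosphere removed)
    E_s, E_a, E_h  -- irradiances at p due to sunlight, ambient light and skylight
    k_s, k_a, k_h  -- relative weights of the daylight components (first inversion)

    Because only relative weights are available, the returned reflectance is
    likewise known only up to a constant factor.
    """
    denom = k_s * np.asarray(E_s) + k_a * np.asarray(E_a) + k_h * np.asarray(E_h)
    return np.asarray(L_o) / np.maximum(denom, eps)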
Thirion (Thirion, 1992) determines the reflectance of a material without considering skylight, wavelength dependency and the effect of the atmosphere.
  
  
[Fig. 4 System overview: PREPARE (interactive preprocessing of the input images → natural environment description), CONVERT (DXF file → artificial environment description), LIBRARY (light sources, materials), RENDER (input images, environment descriptions, parameters → output images), VIEW (output images).]
5. ENVIRONMENT DESCRIPTIONS 
The natural environment description (Fig. 4) contains geometrical and non-geometrical data about the natural environment of the planned building: atmospheric parameters (…), illumination parameters (d_s, Ω_s, k_s, k_a, m_h, d_h, k_h, L_h) and polygons (vertices, material attribute, image attribute). Most of this information is retrieved from the input images during the interactive preprocessing step (section 4). Additionally, the natural environment description contains the interior and exterior orientation and the file name of each of the input images.
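As an illustration of how such a description might be held in memory, the following dataclasses mirror the listing above; the type names, field names and groupings are assumptions made for this sketch, not the paper's file format.

from dataclasses import dataclass, field

@dataclass
class Polygon:
    vertices: list[tuple[float, float, float]]   # object-space coordinates
    material: str                                 # material attribute
    image: str | None = None                      # image attribute (source image)

@dataclass
class InputImage:
    file_name: str
    interior_orientation: dict                    # camera constant, principal point, ...
    exterior_orientation: dict                    # projection centre, rotation

@dataclass
class NaturalEnvironment:
    atmospheric_params: dict                      # parameters of the atmospheric model
    illumination_params: dict                     # d_s, Omega_s, k_s, k_a, m_h, d_h, k_h, L_h
    polygons: list[Polygon] = field(default_factory=list)
    images: list[InputImage] = field(default_factory=list)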
The artificial environment description (Fig. 4) contains the 
CAD model of the planned building with additional artificial 
light sources, if required. The program CONVERT (Fig. 4) 
converts data from the widely used DXF format (Autodesk, 
1988) to the required data format. 
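CONVERT's task can be approximated in a few lines of standard Python, since DXF is a plain-text format of alternating group-code/value lines. The reader below handles only 3DFACE entities and is a deliberately minimal, hypothetical sketch rather than the actual CONVERT program; real DXF files contain many more entity types.

def read_3dfaces(path):
    """Extract 3DFACE entities (up to four corner points each) from a DXF file.

    Group codes 10/20/30 ... 13/23/33 hold the x/y/z coordinates of the corners.
    """
    with open(path) as f:
        lines = [line.strip() for line in f]
    pairs = zip(lines[0::2], lines[1::2])         # (group code, value) pairs

    faces, current = [], None
    coord_codes = {"10", "20", "30", "11", "21", "31",
                   "12", "22", "32", "13", "23", "33"}
    for code, value in pairs:
        if code == "0":                           # start of a new entity
            if current:
                faces.append(current)
            current = {} if value == "3DFACE" else None
        elif current is not None and code in coord_codes:
            current[code] = float(value)
    if current:
        faces.append(current)

    # Re-group the coordinates into corner tuples (x, y, z).
    polygons = []
    for face in faces:
        corners = []
        for i in "0123":
            keys = ("1" + i, "2" + i, "3" + i)
            if all(k in face for k in keys):
                corners.append(tuple(face[k] for k in keys))
        polygons.append(corners)
    return polygons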
6. RENDERING 
During the final rendering step (program RENDER) (Fig. 4), the input images and the natural and artificial environment descriptions (section 5) are used to generate the output images. These show the planned building embedded in the existing environment from the perspectives of the input images. The rendering algorithm is an extension to conventional …
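The forward use of the recovered quantities follows from (6) and (10): a surface point of the planned building is shaded with the same daylight components that were fitted to the existing environment. A minimal sketch, assuming scalar per-band inputs and leaving the atmospheric step of section 4.2 aside:

def shade_building_point(rho_d, E_s, E_a, E_h, k_s, k_a, k_h):
    """Forward evaluation of the illumination model (6) for a diffusely
    reflecting surface point of the planned building, per wavelength/band.
    The atmospheric model (section 4.2) would still have to be applied
    before the result is composited into an input image (not shown)."""
    return rho_d * (k_s * E_s + k_a * E_a + k_h * E_h)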
	        
Thank you.