XVIIth ISPRS Congress (Part B5)

  
… conventional ray-tracing. Ray-tracing was introduced by Whitted
(Whitted, 1980). It is assumed that the reader is familiar with 
the principle of recursive ray-tracing in object space. 
6.1 Setup 
One of the input images is selected as the current input 
image. The current camera is the camera associated with 
the current input image. It represents the observer in object 
space and remains unchanged during the generation of the 
current output image. 
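As a minimal sketch of this setup step (all names are hypothetical and not taken from the paper), the selection can be expressed as binding one image / camera pair that stays fixed for the whole output image:

from dataclasses import dataclass

@dataclass
class Camera:
    position: tuple       # center of projection in world coordinates
    orientation: tuple    # attitude of the image plane

@dataclass
class InputImage:
    pixels: object        # reconstructed image function (RGB samples)
    camera: Camera        # camera associated with this input image

def setup(input_images, index):
    # The selected image becomes the current input image; its camera is the
    # current camera, i.e. the observer in object space, and remains
    # unchanged while the current output image is generated.
    current_image = input_images[index]
    return current_image, current_image.camera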
6.2 Independent Pixel Processing 
  
A natural object or natural light source is part of the natu- 
ral environment description (section 5). Each facet of the dis- 
cretized sky hemisphere (section 3.3) is treated as an indi- 
vidual natural light source. An artificial object or artificial 
light source is part of the artificial environment description 
(section 5). A primary ray originates at the center of projec- 
tion of the current camera. A secondary ray is generated af- 
ter reflection or refraction at a material interface. 
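These distinctions can be summarized in a short sketch (hypothetical names; the paper states them only in prose):

from dataclasses import dataclass
from enum import Enum

class Kind(Enum):
    NATURAL = "natural"        # part of the natural environment description (section 5)
    ARTIFICIAL = "artificial"  # part of the artificial environment description (section 5)

@dataclass
class LightSource:
    kind: Kind   # each facet of the discretized sky hemisphere (section 3.3)
                 # is treated as one individual NATURAL light source

@dataclass
class SceneObject:
    kind: Kind

@dataclass
class Ray:
    origin: tuple
    direction: tuple
    is_primary: bool   # True: originates at the center of projection of the
                       # current camera; False: generated by reflection or
                       # refraction at a material interface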
Several cases are identified (a code sketch covering them follows the discussion of subcases D1 - D3):
Case A. A primary ray does not hit any object: The inter- 
section point p of the ray and the image plane of the current 
camera is mapped from world coordinates to image coordi- 
nates p' of the current input image. The RGB triplet of the re- 
constructed image function at p' is converted to a spectral 
distribution function L_λ (appendix A) and returned.
Case B. A secondary ray does not hit any object: The hori- 
zon color L_{h,λ} (sections 3.2 and 4.2) is returned if the ray di-
rection is above the horizon. 
Case C. Any ray hits an artificial object: The illumination 
model (6) is evaluated at the intersection point p of the ray 
and the artificial object under consideration of the illuminating 
natural and artificial light sources. Subsequently, the atmo- 
spheric model (2) is evaluated based on the known distance 
d along the ray and the result L_λ is returned. Depending on
the object material, a reflected and/or a refracted ray may
be recursively traced. All material properties come from the 
member of the material library (Fig. 4) which is referenced 
through the object's material attribute. 
Case D. Any ray hits a natural object: The intersection point
p of the ray and the natural object is mapped from world 
coordinates to image coordinates p' of one of the input 
images. This is the current input image if the ray is a primary 
ray; otherwise it is the input image referenced through the 
polygon's image attribute (section 4.1). The RGB triplet of 
the reconstructed image function at p' is converted to a 
spectral distribution function L_λ (appendix A) which repre-
sents the apparent color of the natural object. The true color
L_{o,λ} of the natural object is determined by the second inver-
sion of the atmospheric model (section 4.2) based on the
known distance d_p between p and the camera associated
with the selected input image (the effect of the atmosphere is
eliminated). The diffuse reflectance ρ_{d,λ} of the object material
is determined from L_{o,λ} by the second inversion of the illumi-
nation model (section 4.3). All other material properties, in- 
cluding k_s, come from the member of the material library
(Fig. 4) which is referenced through the polygon's material 
attribute. 
Three subcases of case D are identified: 
Case D1. The natural object is illuminated by an artificial 
light source: The illumination model (6) is evaluated at p un- 
der consideration of the illuminating artificial light source and 
the result is added to the true object color L_{o,λ}. Subsequent-
ly, the atmospheric model (2) is evaluated based on the
known distance d along the ray and the result L_λ is returned.
Case D2. The natural object is illuminated by a natural light 
source: The illumination model is not evaluated since the illu- 
mination effect at p is already represented by the true object 
color L_{o,λ}. The atmospheric model (2) is evaluated based on
the known distance d along the ray and the result L_λ is re-
turned. 
Case D3. The natural object is not illuminated by a natural
light source solely because an artificial object blocks it: The irradiance E_λ
due to the natural light source is calculated at p as if there
were no artificial objects in the scene. The illumination model
(6) is evaluated at p with the negative irradiance -E_λ and the
result is added to the true object color L_{o,λ}; this subtraction creates
the shadow cast by the artificial object, which is not present in the
input image. Subsequently, the atmospheric model (2) is evaluated based
on the known distance d along the ray and the result L_λ is returned.
Depending on the object material, a reflected and/or a re-
fracted ray may be recursively traced for each of the three 
subcases D1 - D3. 
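The following sketch summarizes the case analysis above. It is an illustration only: every function, attribute and parameter name is invented, the stand-in atmosphere(), invert_atmosphere() and illumination() functions are simple placeholders rather than the paper's models (2) and (6) and their inversions (section 4), and the recursive tracing of reflected and refracted rays is omitted.

import math

def rgb_to_spectral(rgb):
    # Placeholder for the RGB-to-spectral conversion of appendix A.
    return list(rgb)

def atmosphere(L, d, extinction=0.001, airlight=0.2):
    # Placeholder atmospheric model: attenuation over distance d plus airlight.
    t = math.exp(-extinction * d)
    return [t * c + (1.0 - t) * airlight for c in L]

def invert_atmosphere(L_apparent, d, extinction=0.001, airlight=0.2):
    # Placeholder for the second inversion of the atmospheric model (section 4.2).
    t = math.exp(-extinction * d)
    return [(c - (1.0 - t) * airlight) / t for c in L_apparent]

def illumination(rho_d, E, cos_theta):
    # Placeholder illumination model: a bare Lambertian term.
    return [r * e * max(cos_theta, 0.0) for r, e in zip(rho_d, E)]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def shade(ray, scene):
    hit = scene.intersect(ray)                       # closest intersection or None

    if hit is None:
        if ray.is_primary:                           # case A
            p_prime = scene.current_camera.world_to_image(ray.image_plane_point)
            return rgb_to_spectral(scene.current_image.sample(p_prime))
        # case B (only the above-horizon branch is described in the text)
        return scene.horizon_color if ray.above_horizon() else scene.black()

    if hit.object.kind == "artificial":              # case C
        L = scene.black()
        for src in scene.visible_light_sources(hit.point):
            L = add(L, illumination(hit.material.rho_d, src.E, src.cos_theta(hit)))
        return atmosphere(L, hit.distance)

    # case D: natural object, recover its true color from an input image
    image = scene.current_image if ray.is_primary else hit.polygon.image
    p_prime = image.camera.world_to_image(hit.point)
    L_true = invert_atmosphere(rgb_to_spectral(image.sample(p_prime)),
                               image.camera.distance_to(hit.point))
    rho_d = scene.invert_illumination(L_true, hit)   # section 4.3

    L = list(L_true)                                 # case D2 needs no extra term
    for src in scene.artificial_sources_illuminating(hit.point):             # case D1
        L = add(L, illumination(rho_d, src.E, src.cos_theta(hit)))
    for src in scene.natural_sources_blocked_only_by_artificial(hit.point):  # case D3
        E_free = src.irradiance_ignoring_artificial_objects(hit.point)
        L = add(L, illumination(rho_d, [-e for e in E_free], src.cos_theta(hit)))

    return atmosphere(L, hit.distance)

In the actual program such a function would be evaluated once per pixel of the current output image, recursing into reflected and refracted rays where the object material requires it.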
6.3 Change of Perspective 
Restarting at the setup step (section 6.1) with another cur- 
rent input image and the corresponding current camera makes
it possible to generate a different view of the scene without any
changes in the natural or artificial environment description. 
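As a sketch of this outer loop (scene.render() is an assumed stand-in for the per-pixel processing of section 6.2):

def render_all_views(input_images, scene):
    # One output image per input image: only the current input image and the
    # corresponding current camera change between runs; the natural and
    # artificial environment descriptions are left untouched.
    outputs = []
    for current_image in input_images:
        current_camera = current_image.camera        # setup step (section 6.1)
        outputs.append(scene.render(current_image, current_camera))
    return outputs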
6.4 Output and Viewing 
The program VIEW (Fig. 4) displays the finished part of an
image whose computation is still in progress. This makes it
possible to interrupt the calculation of unpromising images.
VIEW may run on a local workstation while RENDER runs 
concurrently on a remote host. Image data is transmitted in 
ASCII format through a UNIX pipe. 
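The paper does not specify the ASCII format; the following sketch merely illustrates the idea of streaming finished pixels as text so that a viewer reading the other end of the pipe can display a partial image while rendering continues:

import sys

def emit_scanline(y, scanline):
    # RENDER side: write each finished pixel as one ASCII line and flush, so
    # that VIEW, reading the pipe, can display the partial image immediately.
    for x, (r, g, b) in enumerate(scanline):
        sys.stdout.write("%d %d %f %f %f\n" % (x, y, r, g, b))
    sys.stdout.flush()

def read_pixels(stream):
    # VIEW side: consume pixels as they arrive and hand them to the display.
    for line in stream:
        x, y, r, g, b = line.split()
        yield int(x), int(y), float(r), float(g), float(b)

On the local workstation, read_pixels(sys.stdin) would then feed the display while RENDER is still writing on the remote host.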
7. EXAMPLES 
Three examples will be presented in this section. Additional 
information about the examples is given in Tab. 1. 
Example 1. Gallery: This is the test scene on the campus of 
ETH-Hoenggerberg, Zurich. Fig. 5, top left and bottom left, 
shows the natural environment on a sunny day. The gallery 
(supports, roof, floor), the horizontal lawn plane and the main 
buildings in the background were approximated by planar 
polygons during preprocessing (section 4.1). The horizon 