
  
Claudio Dupas 
  
The study area is covered by six 1:50,000-scale topographic map sheets produced by IBGE. The maps were edited in
1975 and are based on aerial photographs from 1966. They were the only cartographic source available at this level of
detail. The maps contain information on the road and rail networks, urban areas, municipalities, water bodies, land cover
(only a few classes) and topography (contour lines at 20 m intervals). The map projection is UTM zone 23K, based on
the Corrego Alegre datum. Digital maps corresponding to this set of map sheets were also obtained in order to build a
DEM from the layer containing the contour lines.
3 METHODS 
Following the stratified random sampling rules, four sets of sampling plots were measured during fieldwork: 26 plots
for the forest training set, another 26 for the forest test set, 14 for the Eucalyptus training set and 14 for the Eucalyptus
test set. For each plot, DBH, tree height, altitude, slope and aspect were measured. Only trees with height ≥ 3 m and
DBH ≥ 10 cm were measured.
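As a minimal illustrative sketch only (not from the study), the per-stratum plot allocation described above could be drawn as follows; the stratum names and the candidate-plot pool are assumptions introduced for the example.

```python
import random

# Illustrative sketch: allocating random sample plots per stratum, mirroring the
# plot counts reported in the text. The candidate-plot pool is hypothetical.
random.seed(0)
plot_counts = {
    "forest_training": 26,
    "forest_test": 26,
    "eucalyptus_training": 14,
    "eucalyptus_test": 14,
}

# Hypothetical pool of candidate plot IDs for each stratum.
candidates = {name: list(range(1000)) for name in plot_counts}

samples = {
    name: random.sample(candidates[name], k)
    for name, k in plot_counts.items()
}
for name, plots in samples.items():
    print(name, len(plots), plots[:3])
```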
Two speckle filters were selected to be tested on JERS-1 and ERS-2 image subsets: the Gamma MAP filter (Lopes et
al., 1993) and the Lee-Sigma filter (Lee, 1980). The filter algorithms are described in the above-mentioned references.
Three window sizes (3x3, 5x5 and 7x7) were applied for each filter. A subjective visual assessment of the results
indicated which of the six filter-window combinations best smoothed the images while preserving edges; the 5x5
Gamma MAP filter was selected.
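The sketch below illustrates the general idea of window-based adaptive speckle smoothing using the basic (non-sigma) Lee filter; it is not the Gamma MAP or Lee-Sigma implementation of the cited references, and the simulated scene and number of looks are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, window=5, looks=3):
    """Basic Lee speckle filter (illustrative only)."""
    img = img.astype(np.float64)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img * img, size=window)
    local_var = local_sq_mean - local_mean ** 2

    # Multiplicative speckle noise variance for an L-look intensity image.
    noise_var = 1.0 / looks
    signal_var = np.maximum(local_var - (local_mean ** 2) * noise_var, 0.0)

    # Adaptive weight: near 0 in homogeneous areas (strong smoothing),
    # near 1 at edges and point targets (little smoothing).
    weight = signal_var / (signal_var + (local_mean ** 2) * noise_var + 1e-12)
    return local_mean + weight * (img - local_mean)

# Example: a 5x5 window applied to a simulated speckled scene.
rng = np.random.default_rng(0)
scene = rng.gamma(shape=3.0, scale=1.0 / 3.0, size=(256, 256)) * 100.0
filtered = lee_filter(scene, window=5)
```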
A raster DEM was produced from the 1:50,000-scale elevation vector layer (contour lines at 20 m intervals). The six
topographic map sheets covering the study area were scanned and rectified, and a mosaic was created. The mosaic was
used as the 'master' image for rectification of all images.
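A possible way to rasterize a DEM from digitized contour vertices is scattered-data interpolation, sketched below with made-up coordinates; the actual DEM was built from the digitized 20 m contours of the IBGE sheets, and the extent and elevation values here are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

# Illustrative sketch: interpolate contour-line vertices to a raster grid.
rng = np.random.default_rng(1)
n_pts = 2000
x = rng.uniform(0, 10_000, n_pts)          # easting (m), hypothetical extent
y = rng.uniform(0, 10_000, n_pts)          # northing (m)
z = 700 + 20 * np.round((x + y) / 1500)    # elevations quantized to 20 m "contours"

# Target raster grid at 15 m resolution (the common pixel size used later).
gx, gy = np.meshgrid(np.arange(0, 10_000, 15), np.arange(0, 10_000, 15))
dem = griddata((x, y), z, (gx, gy), method="linear")
```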
Image fusion can only be performed if the pixels corresponding to the same location on the different images to be 
combined have the same size, orientation and center location. In other words, pixels registered to each other must refer 
to the same feature on the ground. Geometric correction, or geocoding, is therefore one of the most important steps in 
image fusion. Ortho-rectification is a geocoding method usually applied to correct remote sensing imagery (especially
radar images) covering mountainous areas. It uses a geometric model to correct terrain-induced distortions. The
following parameters are used in the model: platform (position, velocity, altitude), sensor (viewing geometry), map
projection (coordinate system, datum, etc.), general Earth parameters, and the set of GCPs, defined by X, Y and Z
(from a DEM) coordinates. The images were ortho-rectified using the OrthoEngine module of the EASI/PACE software.
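For illustration only, the sketch below fits a first-order polynomial (affine) mapping from image to map coordinates using GCPs; a true ortho-rectification, as performed here with OrthoEngine, additionally uses the sensor/orbit model and the DEM heights (Z). All GCP coordinates below are hypothetical.

```python
import numpy as np

# Hypothetical GCPs: image (row, col) and corresponding map (X, Y) coordinates.
gcps_img = np.array([[120.0, 340.0], [800.5, 150.2], [455.3, 910.8],
                     [60.1, 870.4], [700.0, 700.0]])
gcps_map = np.array([[452_300.0, 7_478_900.0], [462_550.0, 7_481_700.0],
                     [457_100.0, 7_470_400.0], [451_000.0, 7_471_200.0],
                     [460_800.0, 7_473_500.0]])

# Design matrix [row, col, 1] and a least-squares fit for X and Y separately.
A = np.column_stack([gcps_img, np.ones(len(gcps_img))])
coef_x, *_ = np.linalg.lstsq(A, gcps_map[:, 0], rcond=None)
coef_y, *_ = np.linalg.lstsq(A, gcps_map[:, 1], rcond=None)

# Residuals at the GCPs give an RMS error estimate for the fit.
pred = np.column_stack([A @ coef_x, A @ coef_y])
rmse = np.sqrt(np.mean(np.sum((pred - gcps_map) ** 2, axis=1)))
print("RMSE (m):", rmse)
```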
Since the imagery was going to be classified, the rectification method was chosen so as to preserve the original pixel
values as much as possible. Nearest neighbour was therefore the resampling algorithm employed in image rectification.
Moreover, in order to avoid multiple resampling (and therefore multiple changes in image geometry), image
rectification, SAR speckle filtering and the change in pixel size were all carried out in a single step. A common pixel
size of 15 m was used.
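The following minimal sketch shows why nearest-neighbour resampling preserves original pixel values: output pixels simply index the nearest source pixel, with no interpolation. In the study this was done in one step inside EASI/PACE together with rectification and filtering; the 12.5 m source pixel size in the example is an assumption.

```python
import numpy as np

def nearest_neighbour_resample(img, src_pixel_size, dst_pixel_size):
    """Illustrative nearest-neighbour resampling to a new pixel size."""
    rows, cols = img.shape
    scale = src_pixel_size / dst_pixel_size
    new_rows = int(round(rows * scale))
    new_cols = int(round(cols * scale))
    # Index of the nearest source pixel for each output pixel.
    r_idx = np.minimum((np.arange(new_rows) / scale).astype(int), rows - 1)
    c_idx = np.minimum((np.arange(new_cols) / scale).astype(int), cols - 1)
    return img[np.ix_(r_idx, c_idx)]

# Example: bring a hypothetical 12.5 m SAR grid to the common 15 m pixel size.
sar = np.arange(100, dtype=np.float32).reshape(10, 10)
resampled = nearest_neighbour_resample(sar, src_pixel_size=12.5, dst_pixel_size=15.0)
```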
The previously rectified images were fused on a pixel-by-pixel basis. Each type of SAR image was fused with the
Landsat TM 95 image using two different fusion techniques. The image combinations and transformation types are
summarized in Table 2. Several techniques for combining images have been described in the literature. For the purpose
of this study, the Intensity, Hue and Saturation (IHS) cylindrical transformation seemed the most appropriate. The
equations for the IHS colour transformation can be found in Harrison and Jupp (1990). The Brovey transformation was
used as a second technique; its equation is given in Vrabel (1996).
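The sketch below illustrates both fusion principles on simulated arrays. An HSV transform is used as a stand-in for the cylindrical IHS transform of Harrison and Jupp (1990), and the band scaling is an assumption; it is not the exact procedure applied in the study.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

# Simulated inputs scaled to [0, 1]: three TM bands and a speckle-filtered SAR band.
rng = np.random.default_rng(2)
tm_rgb = rng.uniform(0, 1, (256, 256, 3))
sar = rng.uniform(0, 1, (256, 256))

# IHS-style fusion: transform the TM RGB composite, substitute the intensity
# component with the SAR band, then transform back.
hsv = rgb_to_hsv(tm_rgb)
hsv[..., 2] = sar
fused_ihs = hsv_to_rgb(hsv)

# Brovey fusion: each TM band is ratioed to the band sum and scaled by the SAR band.
band_sum = tm_rgb.sum(axis=2, keepdims=True) + 1e-12
fused_brovey = tm_rgb / band_sum * sar[..., None] * 3.0
```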
Table 2: The fused images

Fused image   Imagery used              Transformation used
TMJC          Landsat TM 95 / JERS-1    IHS cylindrical transformation
TMJB          Landsat TM 95 / JERS-1    Brovey transformation
TMEC          Landsat TM 95 / ERS-2     IHS cylindrical transformation
TMEB          Landsat TM 95 / ERS-2     Brovey transformation

Classification of SAR and SAR-fused data is a research field that is receiving increasing attention (van der Sanden,
1997; Michelson et al., 2000). Studies involving classifications based on textural analysis have obtained good results
(van der Sanden, 1997), but operationally they are still time-consuming and difficult to implement. Recently, new
contextual classifiers based on Bayesian image segmentation algorithms have been developed and are starting to be
evaluated in practical applications. Michelson et al. (2000) compared an image segmentation algorithm (SMAP), a
neural network (based on the back-propagation algorithm) and the maximum likelihood algorithm for land cover
classification of Landsat TM, ERS-1 and fused ERS-1 / Landsat TM imagery. The comparison of overall classification
accuracy indicated that SMAP (57.1%) outperformed maximum likelihood (52.4%), which, in turn, outperformed the
neural network.
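As a minimal sketch of one of the compared algorithms, a Gaussian maximum likelihood classifier is shown below on simulated training pixels; the class names, band count and statistics are assumptions, and the SMAP and neural-network classifiers are not reproduced here.

```python
import numpy as np

# Simulated training samples: class name -> (n_samples, n_bands) pixel vectors.
rng = np.random.default_rng(3)
n_bands = 4
train = {
    "forest":     rng.normal(0.3, 0.05, (200, n_bands)),
    "eucalyptus": rng.normal(0.5, 0.05, (200, n_bands)),
    "other":      rng.normal(0.8, 0.10, (200, n_bands)),
}

# Per-class mean vector, inverse covariance and log-determinant.
stats = {}
for name, x in train.items():
    mean = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    stats[name] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])

def classify(pixels):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    scores = []
    for mean, inv, logdet in stats.values():
        d = pixels - mean
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d)))
    return np.array(list(stats))[np.argmax(scores, axis=0)]

print(classify(rng.normal(0.5, 0.05, (5, n_bands))))
```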
  