alternate, application-oriented color space to represent multispectral data more objectively (Nasr et al., 2001). It uses three positional parameters in lieu of Red, Green and Blue (RGB): Intensity, Hue and Saturation. Intensity relates to the overall brightness of a color, or the energy level of the light, and is devoid of any color content; it indicates how close the color is to black or white. Hue refers to the dominant or average wavelength of light contributing to a color, i.e. the actual perceived color such as red, blue, yellow or orange. Saturation specifies the degree to which the color is free of dilution by white light (grayscale); it runs from neutral gray through pastel to fully saturated colors. The transformation from the RGB color space to the IHS space is nonlinear, lossless and reversible, and each of the IHS components can be varied without affecting the others. It is performed by a rotation of axes from the original orthogonal RGB system to a new orthogonal IHS system. The equations describing the transformation to IHS are as follows (Pellemans et al., 1993):
\[
\begin{pmatrix} x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix}
1 & 0 & 0 \\
0 & \tfrac{1}{\sqrt{3}} & -\sqrt{\tfrac{2}{3}} \\
0 & \sqrt{\tfrac{2}{3}} & \tfrac{1}{\sqrt{3}}
\end{pmatrix}
\begin{pmatrix}
\tfrac{1}{\sqrt{2}} & -\tfrac{1}{\sqrt{2}} & 0 \\
\tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{2}} & 0 \\
0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} R \\ G \\ B \end{pmatrix}
\]
The values of H, S and I can then be computed as:
\[
H = \tan^{-1}\!\left(\frac{y}{x}\right)
\]
\[
S = \frac{1}{\varphi(H)}\,\cos^{-1}\!\left(\frac{z}{\sqrt{x^{2}+y^{2}+z^{2}}}\right)
\]
\[
I = \frac{\sqrt{x^{2}+y^{2}+z^{2}}}{I_{m}(H,S)}
\]
where \(\varphi(H)\) is the maximum co-latitude permitted at a given hue and \(I_{m}(H,S)\) is the maximum intensity permitted at a given hue and co-latitude.
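For illustration, a minimal NumPy sketch of the forward transform as reconstructed above is given below. Since this excerpt does not give closed forms for the normalizers \(\varphi(H)\) and \(I_{m}(H,S)\), they are passed in as placeholder callables (hypothetical helpers, not part of the original text).

```python
import numpy as np

# Rotation taking the orthogonal RGB axes to new axes whose z direction lies
# along the achromatic (gray) diagonal, as in the reconstruction above.
ROT = np.array([
    [1/np.sqrt(2), -1/np.sqrt(2),  0.0],
    [1/np.sqrt(6),  1/np.sqrt(6), -2/np.sqrt(6)],
    [1/np.sqrt(3),  1/np.sqrt(3),  1/np.sqrt(3)],
])

def rgb_to_ihs(rgb, phi_max, i_max):
    """Forward IHS transform for an (..., 3) RGB array.

    phi_max(h) and i_max(h, s) are hypothetical placeholders for the maximum
    co-latitude and maximum intensity normalizers described in the text.
    """
    x, y, z = np.moveaxis(rgb @ ROT.T, -1, 0)       # rotate into (x, y, z)
    r = np.sqrt(x**2 + y**2 + z**2)                 # radius of the rotated vector
    h = np.arctan2(y, x)                            # hue: azimuth about the gray axis
    colat = np.arccos(np.clip(z / np.where(r == 0, 1, r), -1.0, 1.0))
    s = colat / phi_max(h)                          # saturation: normalized co-latitude
    i = r / i_max(h, s)                             # intensity: normalized radius
    return i, h, s
```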
3. DATA ACQUISITION AND METHODOLOGY 
To aid the extraction of information by visual interpretation, data fusion is used to provide a new, refined image and to contribute to a better understanding of the objects observed within that image. Fusion of data from different imaging sensors involves two major steps. First, the digital images from both sensors are geometrically registered with respect to one another. Next, their information contents (spatial and spectral) are mixed to generate a single data set that contains the best of both (Eldougdoug and Nasr, 1994). The geometric registration plays an essential role, because misregistration creates artificial features in the multisensor data set that falsify the subsequent interpretation. Registration includes resampling the image data to a common pixel spacing and map projection (Onsi, 2002).
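The resampling step can be sketched as follows with the open-source rasterio library rather than the ERDAS Imagine workflow used by the authors; the file names are hypothetical placeholders, and any SAR-to-optical co-registration beyond map reprojection is assumed to have been done already.

```python
import numpy as np
import rasterio
from rasterio.enums import Resampling
from rasterio.warp import reproject

with rasterio.open("landsat_tm.tif") as tm, rasterio.open("radarsat1.tif") as sar:
    # Allocate an output array on the TM grid, then warp the SAR band onto it
    # so both data sets share a common pixel spacing and map projection.
    sar_on_tm_grid = np.zeros((tm.height, tm.width), dtype=np.float32)
    reproject(
        source=rasterio.band(sar, 1),
        destination=sar_on_tm_grid,
        dst_transform=tm.transform,
        dst_crs=tm.crs,
        resampling=Resampling.bilinear,  # bilinear resampling to the common grid
    )
```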
The study area covers approximately 30.8 km by 45.4 km, as shown in the location map (Figure 1). Two sets of multisensor data are used in this study: part of a Landsat TM scene (Path: 177, Row: 45) with 28.5 m resolution, acquired in October 1984 (Figure 2), and part of a RADARSAT-1 scene with 12.5 m resolution, acquired in September 1998 (Figure 3). These images were processed using ERDAS Imagine version 8.7 software. The IHS method is applied to three bands at a time, and the fusion output is displayed in either true or false color. Therefore, three bands of the Landsat scene were selected, 7 (2.08 - 2.35 µm), 4 (0.76 - 0.90 µm) and 2 (0.52 - 0.60 µm), since they contain most of the information about the surface geological features of the study area.
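The fusion step itself is not spelled out in this excerpt; the sketch below shows one common way to carry out IHS substitution with the rotation given earlier: the TM 7-4-2 composite is rotated, its radius (intensity) is replaced by the co-registered RADARSAT band while the direction (hue and saturation) is retained, and the result is rotated back. Array names are hypothetical, and the authors' ERDAS Imagine workflow may differ in detail.

```python
import numpy as np

# Same rotation as in the earlier sketch (z axis along the gray diagonal).
ROT = np.array([
    [1/np.sqrt(2), -1/np.sqrt(2),  0.0],
    [1/np.sqrt(6),  1/np.sqrt(6), -2/np.sqrt(6)],
    [1/np.sqrt(3),  1/np.sqrt(3),  1/np.sqrt(3)],
])

def ihs_substitution_fuse(tm_742, sar):
    """Replace the intensity of a (rows, cols, 3) TM 7-4-2 composite with a
    co-registered SAR band of shape (rows, cols), keeping hue and saturation."""
    xyz = tm_742 @ ROT.T
    radius = np.linalg.norm(xyz, axis=-1, keepdims=True)
    # Stretch the SAR band to the radius range of the composite so the
    # substituted intensity stays within the original dynamic range.
    sar_scaled = np.interp(sar, (sar.min(), sar.max()),
                           (radius.min(), radius.max()))[..., None]
    unit = xyz / np.where(radius == 0, 1, radius)  # direction = hue and saturation
    fused_xyz = unit * sar_scaled                  # substitute the intensity (radius)
    return fused_xyz @ ROT                         # inverse rotation (ROT is orthogonal)
```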
Figure 1. Location map of the study area, indicated by a hatched rectangle
Figure 2. Landsat TM scene, October 1984