Firstly, we have human operators assess NIIRS values for various high-resolution images and compare these values with the NIIRS provided in the metadata of the satellite images. Secondly, we use the GIQE to estimate NIIRS values through image analysis and compare the NIIRS values obtained in this way with the values from the human operators. Our ultimate goal is to develop a technique for automated estimation of NIIRS values, which should be feasible once the validity of image-based NIIRS estimation is proven.
2. DATASET AND MANUAL ESTIMATION OF NIIRS 
For the experiments we used two IKONOS images and four Quickbird images. The following table summarizes the properties of the images used. For the Quickbird images, predicted NIIRS (PNIIRS) values were provided within the metadata. For the IKONOS images, NIIRS values were not included in the metadata explicitly; instead, we used the published NIIRS values. Note that the GSDs of images from the same satellite differ from each other because of their different viewing angles.
Image Type  | Acquisition Date | GSD (m) | PNIIRS
Quickbird 1 | 24 Sept. 2002    | 0.6994  | 4.3
Quickbird 2 | 2 Nov. 2002      | 0.6797  | 4.4
Quickbird 3 | 15 Jan. 2005     | 0.7509  | 4.5
Quickbird 4 | 15 Jan. 2005     | 0.7661  | 4.5
IKONOS 1    | 7 Feb. 2002      | 0.9295  | (4.5)
IKONOS 2    | 7 Feb. 2002      | 0.9099  | (4-5)

Table 1. Characteristics of images used for experiments.
These six images were used for the estimation of NIIRS levels by human operators. From each image, seven sub-images containing geographic or man-made features were extracted. Four human operators assigned a NIIRS level to each sub-image by examining the features within the sub-image against the NIIRS visibility tables provided by IRARS (1996). The final NIIRS level for each image was determined by averaging the NIIRS levels estimated by the four operators over the seven sub-images. Table 2 shows the NIIRS values so estimated. In this paper we regard these as the “true” NIIRS (hence referred to as TNIIRS hereafter).
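As a simple illustration of this averaging step, the sketch below computes a per-image TNIIRS from a 4 x 7 array of operator ratings; the numbers used are placeholders for illustration only, not the scores recorded in this experiment.

```python
import numpy as np

# Hypothetical ratings for one image: 4 operators x 7 sub-images.
# Placeholder values only; the actual scores are not listed in the paper.
ratings = np.array([
    [3.7, 3.8, 3.6, 3.9, 3.7, 3.8, 3.6],  # operator 1
    [3.6, 3.7, 3.7, 3.8, 3.6, 3.7, 3.8],  # operator 2
    [3.8, 3.9, 3.7, 3.8, 3.9, 3.7, 3.8],  # operator 3
    [3.7, 3.6, 3.8, 3.7, 3.8, 3.6, 3.7],  # operator 4
])

# TNIIRS for the image: the mean over all operators and sub-images.
tniirs = ratings.mean()
print(f"TNIIRS = {tniirs:.2f}")
```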
Image Type  | PNIIRS | TNIIRS
Quickbird 1 | 4.3    | 3.71
Quickbird 2 | 4.4    | 3.75
Quickbird 3 | 4.5    | 3.93
Quickbird 4 | 4.5    | 3.75
IKONOS 1    | (4.5)  | 3.53
IKONOS 2    | (4-5)  | 3.52

Table 2. NIIRS provided in the metadata (PNIIRS) and estimated by human operators (TNIIRS).
There is a significant difference between PNIIRS and TNIIRS. Whereas the values published within the metadata were close to the nominal values, the values estimated by the human operators were much smaller. This could be because inexperienced operators estimated the values; experienced operators should identify features better and hence assign higher NIIRS levels. On the other hand, the satellite image providers may have assumed ideal situations when predicting the NIIRS levels of their images.

The exact cause of the difference between PNIIRS and TNIIRS requires further investigation. We took the TNIIRS as the reference and proceeded with the next experiments.

3. NIIRS ESTIMATION THROUGH IMAGE ANALYSIS

While NIIRS values are meant to be estimated by human operators, research has been carried out to relate NIIRS to other image quality measures, such as GSD, MTF and SNR. As a result, Leachtenauer et al. proposed the following relationship between NIIRS and other image quality measures:

NIIRS = 10.251 - a·log10(GSD_GM) + b·log10(RER_GM) - 0.656·H - 0.344·(G / SNR)

where GSD_GM is the geometric-mean ground sample distance, RER_GM the geometric-mean regularized edge response, H the overshoot, G the sum of the MTF correction kernels, and SNR the signal-to-noise ratio.
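As a minimal sketch of how this equation could be evaluated, the snippet below plugs measured quantities into the GIQE. The coefficients a and b, their switch at RER_GM = 0.9, and the conversion of GSD to inches are taken from the published GIQE 4.0 (Leachtenauer et al., 1997) as assumptions; this section does not state them explicitly, and the input values in the example are illustrative only.

```python
import math

def giqe_niirs(gsd_m: float, rer_gm: float, h: float, g: float, snr: float) -> float:
    """Predicted NIIRS from the GIQE.

    Assumptions not stated in this section: GSD is converted from metres to
    inches, and the coefficients a, b switch at RER_GM = 0.9 as in the
    published GIQE 4.0 (Leachtenauer et al., 1997).
    """
    gsd_in = gsd_m * 39.37  # metres -> inches
    a, b = (3.32, 1.559) if rer_gm >= 0.9 else (3.16, 2.817)
    return (10.251
            - a * math.log10(gsd_in)
            + b * math.log10(rer_gm)
            - 0.656 * h
            - 0.344 * g / snr)

# Illustrative values only, not measurements from this paper.
print(round(giqe_niirs(gsd_m=0.70, rer_gm=0.45, h=1.1, g=10.0, snr=50.0), 2))
```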
RER can be measured by analysing the slopes of edge profiles within the image, and this value represents the MTF characteristics of the image (Blonski et al., 2006). For calculating RER, we normalized the magnitude of the edge responses to the range 0 to 1 and produced nominal edge responses by averaging the individual edge responses (see Figure 1). We then took the position at which the normalized edge response equals 0.5 as the edge center and calculated the differences of the edge responses at +0.5 and -0.5 pixels from the edge center in the X direction (ERx) and Y direction (ERy). RER is then calculated as the geometric mean of ERx and ERy (Blonski et al., 2006) as below.
RER_GM = sqrt{ [ERx(0.5) - ERx(-0.5)] × [ERy(0.5) - ERy(-0.5)] }
Figure 1. Calculation of RER (Blonski et al., 2006) 
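A minimal sketch of this RER computation follows, assuming the normalized nominal edge responses are already available as samples over pixel offsets from the edge center; the helper function and the synthetic profile in the example are illustrative only.

```python
import numpy as np

def edge_response_at(offset_px, offsets, response):
    """Linearly interpolate the normalized edge response at a given
    pixel offset from the edge center (where the response equals 0.5)."""
    return float(np.interp(offset_px, offsets, response))

def rer_gm(offsets_x, er_x, offsets_y, er_y):
    """Geometric-mean RER from the nominal edge responses in X and Y,
    following RER_GM = sqrt{[ERx(0.5)-ERx(-0.5)] * [ERy(0.5)-ERy(-0.5)]}."""
    dx = edge_response_at(0.5, offsets_x, er_x) - edge_response_at(-0.5, offsets_x, er_x)
    dy = edge_response_at(0.5, offsets_y, er_y) - edge_response_at(-0.5, offsets_y, er_y)
    return float(np.sqrt(dx * dy))

# Synthetic, sigmoid-shaped edge profile used purely for illustration.
offsets = np.linspace(-3.0, 3.0, 61)
er = 1.0 / (1.0 + np.exp(-2.5 * offsets))  # normalized to [0, 1], 0.5 at the center
print(round(rer_gm(offsets, er, offsets, er), 3))
```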
H and G are included in the GIQE to take the side effects of MTF correction into account. In general, MTF correction increases the overshoot within the edge profile. For calculating H, we first find the maximum values at +1 to +3 pixels from the edge center within the edge responses in the X and Y direction