Full text: XVth ISPRS Congress (Part A2)

ANNEX 
  
Information content for uncorrelated image samples 
  
The average information content per sample, in the case of uncorrelated
samples, is given by the mutual information measure:
I(X,Y) = H(Y) - H(Y/X)    (A1)

where:

H(Y) = - ∫ f_Y(y) log₂ f_Y(y) dy    (A2)

H(Y/X) = - ∫∫ f_S(y/x) f_X(x) log₂ f_S(y/x) dx dy    (A3)

f_Y(y) = ∫ f_S(y/x) f_X(x) dx
where f_X, f_S and f_Y are respectively the probability density functions of the
unspeckled image, the speckle and the speckled image. It can be shown
that, for the conditional pdf of the speckle component derived from (2),
H(Y/X) = log₂(L-1)! - log₂ L - (L-1)[ Σ_{j=1}^{L-1} 1/j - γ ] log₂e + L log₂e + ∫₀^∞ f_X(x) log₂ x dx    (A4)

where γ = 0.5772... is Euler's constant.
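The conditional entropy (A4) can be cross-checked numerically. The sketch below assumes the usual L-look intensity-speckle conditional pdf, a gamma density of shape L and mean x (eq. (2) itself is not reproduced in this annex, so this form is an assumption), and compares the closed-form gamma entropy against direct quadrature; all function names are illustrative.

```python
import numpy as np
from math import lgamma, log
from scipy.integrate import quad
from scipy.special import digamma

def h_speckle_bits(L, x):
    """Closed-form differential entropy (bits) of an L-look gamma speckle
    sample with conditional mean x, i.e. Gamma(shape=L, scale=x/L):
    h = L + ln(x/L) + ln(L-1)! + (1-L)*psi(L)  in nats."""
    nats = L + log(x / L) + lgamma(L) + (1 - L) * digamma(L)
    return nats / log(2)

def h_numeric_bits(L, x):
    """The same entropy by direct quadrature of -f log2 f."""
    scale = x / L
    def pdf(y):
        return y ** (L - 1) * np.exp(-y / scale) / (np.exp(lgamma(L)) * scale ** L)
    integrand = lambda y: -pdf(y) * np.log2(pdf(y)) if pdf(y) > 0 else 0.0
    val, _ = quad(integrand, 0, 50 * x)
    return val

# For integer L, psi(L) = -gamma + sum_{j=1}^{L-1} 1/j  (gamma = 0.5772...):
L = 3
assert abs(digamma(L) - (sum(1 / j for j in range(1, L)) - 0.5772156649)) < 1e-9
print(h_speckle_bits(L, 1.0), h_numeric_bits(L, 1.0))  # the two values agree
```

The closed form is the standard differential entropy of a gamma density, converted from nats to bits; it reproduces the structure of (A4) once the expectation over f_X(x) is taken.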
The entropy values (A2) and (A4) have been numerically evaluated for the
source intensity distribution given by

f_X(x) = f_G(x) + f_G(-x)    for x > 0

where f_G(x) is the Gaussian density with mean μ and variance σ².
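The numerical evaluation of (A2) for this source distribution can be sketched as follows: form f_Y by marginalising the speckle conditional pdf over the folded-Gaussian source density, then integrate -f_Y log₂ f_Y. The speckle pdf is again assumed to be an L-look gamma density, and the parameter values MU, SIGMA and LOOKS are illustrative, not those of the paper.

```python
import numpy as np
from math import lgamma
from scipy.integrate import quad

MU, SIGMA, LOOKS = 1.0, 0.5, 3  # illustrative values, not the paper's

def f_X(x):
    """Folded-Gaussian source density f_G(x) + f_G(-x), x > 0."""
    g = lambda t: np.exp(-(t - MU) ** 2 / (2 * SIGMA ** 2)) / (SIGMA * np.sqrt(2 * np.pi))
    return g(x) + g(-x)

def f_S(y, x):
    """Assumed L-look gamma speckle conditional pdf with mean x."""
    scale = x / LOOKS
    return y ** (LOOKS - 1) * np.exp(-y / scale) / (np.exp(lgamma(LOOKS)) * scale ** LOOKS)

def f_Y(y):
    """Speckled-image pdf: f_Y(y) = int f_S(y/x) f_X(x) dx."""
    val, _ = quad(lambda x: f_S(y, x) * f_X(x), 1e-6, MU + 8 * SIGMA)
    return val

def H_Y_bits():
    """Eq. (A2): H(Y) = -int f_Y(y) log2 f_Y(y) dy, by quadrature."""
    integrand = lambda y: -f_Y(y) * np.log2(f_Y(y)) if f_Y(y) > 0 else 0.0
    val, _ = quad(integrand, 1e-6, 15.0, limit=200)
    return val

print(H_Y_bits())
```

The nested quadrature is crude but adequate here, since both densities are smooth and concentrated on a bounded interval for these parameter values.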
The comparison with the computed lower bound of I(X,Y) is performed with
the same value of

r = m² / σ_X²    where m = E{X} and σ_X² = E{X²} - E{X}²
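The parameter r can be obtained from the folded-Gaussian moments by quadrature; a minimal sketch (function names illustrative):

```python
import numpy as np
from scipy.integrate import quad

def folded_gaussian_pdf(x, mu, sigma):
    """Source density f_X(x) = f_G(x) + f_G(-x), x > 0."""
    g = lambda t: np.exp(-(t - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return g(x) + g(-x)

def r_parameter(mu, sigma):
    """r = m^2 / sigma_X^2, with m = E{X} and sigma_X^2 = E{X^2} - m^2
    taken under the folded-Gaussian density."""
    m, _ = quad(lambda x: x * folded_gaussian_pdf(x, mu, sigma), 0, np.inf)
    m2, _ = quad(lambda x: x * x * folded_gaussian_pdf(x, mu, sigma), 0, np.inf)
    return m ** 2 / (m2 - m ** 2)

# For mu = 0 the folded mean is sigma*sqrt(2/pi) and E{X^2} = sigma^2,
# so r = (2/pi) / (1 - 2/pi), independently of sigma.
print(r_parameter(0.0, 1.0))
```

For mu = 0 the result reduces to the half-normal case, which gives a convenient closed-form check on the quadrature.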
The results are given in the table below. 
  
[Table: entropy-based values of I(X,Y) compared with the computed lower bound for several values of L and r; legible entries include 0.161, 0.585, 0.904 and 0.069, 0.292, 0.500.]