
C_k(Ex_k, En_k, He_k),  k = 1, 2, ..., m        (10)

In this expression, C_k(Ex_k, En_k, He_k) are the three digital characters of the object-cloud generated in band k. Thus each dimension of the multi-dimensional cloud forms a one-dimensional cloud-space through the one-dimensional cloud-space mapping model.
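As a rough illustration of this mapping from image bands to per-band digital characters, the sketch below assumes a standard backward cloud generator (sample mean for Ex, first-order absolute central moment for En, residual deviation for He); the function names and the (n_pixels, m_bands) array layout are assumptions introduced for illustration, not part of the original description.

import numpy as np

def band_cloud_characters(gray_values):
    """Estimate (Ex, En, He) of an object-cloud in one band.

    Minimal backward-cloud-generator sketch: Ex is the mean gray level,
    En is derived from the mean absolute deviation, and He is the
    residual spread not explained by En.
    """
    x = np.asarray(gray_values, dtype=float)
    ex = x.mean()                                       # expected value Ex_k
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()   # entropy En_k
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))     # hyper-entropy He_k
    return ex, en, he

def multidimensional_cloud(object_pixels):
    """object_pixels: array of shape (n_pixels, m_bands).

    Returns the m-dimensional cloud as a list of per-band tuples
    C_k(Ex_k, En_k, He_k), k = 1..m, as in formula (10).
    """
    return [band_cloud_characters(object_pixels[:, k])
            for k in range(object_pixels.shape[1])]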
3. Extraction of multi-dimensional edge cloud and transitional region
3.1 Extraction of multi-dimensional edge cloud 
The objects in the image turn into multi-dimensional clouds in cloud-space through the mapping model, and contiguous clouds present an intersecting state because of the uncertainty of edge pixels and the influence of hyper-entropy [11]. The edge cloud is a special cloud whose expected value is the average gray level of the edge pixels; the membership of a cloud drop to this cloud is the degree to which each pixel of the transitional region is close to this average gray level. Figure 1 shows two edge clouds with different digital characters.
Figure 1. Edge clouds with different digital characters (b: Ex = 24.6, En = 30.7, He = 5.3)
The method to extract the one-dimensional edge cloud is discussed in literature [11]. Because a multispectral image corresponds to a multi-dimensional space, the extraction of the edge cloud in multi-dimensional space must be performed in every sub-space. Suppose a multispectral image has m bands; a multi-dimensional cloud-space R is created through the mapping model. Two contiguous objects in image space correspond to two multi-dimensional clouds A = (C_A1(Ex_A1, En_A1, He_A1), ..., C_Am(Ex_Am, En_Am, He_Am)) and B = (C_B1(Ex_B1, En_B1, He_B1), ..., C_Bm(Ex_Bm, En_Bm, He_Bm)).
A Boolean calculation can then be implemented between the object-clouds of each corresponding dimension in A and B:

Ex_Ck = (1/2) |(Ex_Ak - 3En_Ak - He_Ak) + (Ex_Bk + 3En_Bk + He_Bk)|
En_Ck = (1/6) |(Ex_Bk + 3En_Bk + He_Bk) - (Ex_Ak - 3En_Ak - He_Ak)|
He_Ck = max(He_Ak, He_Bk),    k = 1, 2, ..., m        (11)

In this expression, Ex_Ck, En_Ck and He_Ck are the digital characters of the edge cloud in dimension k.
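A minimal sketch of how formula (11) might be evaluated dimension by dimension is given below; it assumes that cloud A is the brighter of the two objects, so that the intersecting region runs from Ex_Ak - 3En_Ak - He_Ak up to Ex_Bk + 3En_Bk + He_Bk, and the helper name edge_cloud_characters is introduced here for illustration only.

def edge_cloud_characters(cloud_a, cloud_b):
    """Per-dimension digital characters of the edge cloud C.

    cloud_a, cloud_b: lists of (Ex, En, He) tuples for the adjacent
    object-clouds A and B, one tuple per band.  Follows formula (11):
    the edge cloud sits at the midpoint of the overlap of the two
    clouds' extents.
    """
    edge = []
    for (ex_a, en_a, he_a), (ex_b, en_b, he_b) in zip(cloud_a, cloud_b):
        lower = ex_a - 3.0 * en_a - he_a   # lower extent of cloud A
        upper = ex_b + 3.0 * en_b + he_b   # upper extent of cloud B
        ex_c = 0.5 * abs(lower + upper)          # Ex_Ck: midpoint of the overlap
        en_c = (1.0 / 6.0) * abs(upper - lower)  # En_Ck: overlap width scaled by 1/6
        he_c = max(he_a, he_b)                   # He_Ck
        edge.append((ex_c, en_c, he_c))
    return edge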
The edge cloud between the left and right intersecting clouds can be obtained through the above algorithm. First, the expected value is obtained from a calculation over the adjacent region, so the correlation among the image pixels is fully considered. Second, the calculation process acts like a smoothing process, so the influence of noise is weakened to some extent. Third, the entropy and hyper-entropy obtained by the calculation are closely related to the entropy and hyper-entropy of the left and right object-clouds, and they represent the influence of random factors in the image on the expectation and standard deviation of the edge cloud. Therefore, a reasonable range for the transitional region can be deduced from the result.
3.2 Extraction algorithm of edge transitional region 
The transitional region is formed by part of the pixels between the object and the background of the image. These pixels are located between the object and the background, and their gray levels are also distributed between the object gray level and the background gray level [13]. So the transition region is expressed as the region covered by the cloud drops of the intersecting clouds, excluding their cloud cores. Suppose A and B are adjacent objects in image I; two intersecting clouds A = (P_A(i, j), Ex_A, En_A, He_A) and B = (P_B(i, j), Ex_B, En_B, He_B) in cloud space can be obtained by the mapping model. By formula (11), the edge cloud C = (P_C(i, j), Ex_C, En_C, He_C) and its three digital characters can be obtained at the same time. Ex_C is the expected gray level of the core of the edge cloud, and En_C is the entropy, which expresses the gray level scope of the edge cloud.
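As one possible way to turn the edge cloud into a pixel mask for the transitional region, the sketch below marks a pixel as transitional when its gray level falls, in every band, inside an assumed Ex +/- 3(En + He) window around the edge cloud's core; the window and the helper name are assumptions made for illustration, not the paper's stated rule.

import numpy as np

def transitional_region_mask(image, edge_cloud):
    """Binary mask of the transitional region.

    image: array of shape (rows, cols, m_bands).
    edge_cloud: list of (Ex_Ck, En_Ck, He_Ck) per band, e.g. from
    edge_cloud_characters() above.

    A pixel is marked as transitional when its gray level lies within
    the assumed Ex +/- 3*(En + He) extent of the edge cloud in every band.
    """
    mask = np.ones(image.shape[:2], dtype=bool)
    for k, (ex_c, en_c, he_c) in enumerate(edge_cloud):
        half_width = 3.0 * (en_c + he_c)
        band = image[:, :, k]
        mask &= (band >= ex_c - half_width) & (band <= ex_c + half_width)
    return mask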