International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol XXXV, Part B5. Istanbul 2004

Projected patterns can be squeezed or extended in order to include the regions where the
system can efficiently differentiate between color levels.
The colors and texture of chicken filets are relatively constant 
among filets, and are also uniform across the filet itself. The 
system presented in this work addresses the problem of 
matching points in stereoscopic images for objects with 
uniform color and texture. The main idea is to project a non-uniformly varying color pattern. In some wavelength ranges the spectrum of the reflected color pattern changes rapidly, while in other ranges it changes slowly. This behavior depends on the object, on the imaging system and on the projection system.
Projection of a monotonically increasing linear hue pattern on the object does not result in the same linearly increasing pattern in the captured image. This is due to the object's chromatic characteristics and the features of the capturing / projection systems.
projection systems. Analysis of the captured pattern showed that hue values that are absorbed by the object look similar on the captured pattern, whereas wavelengths that are reflected by the object body are well detected as changing on the captured pattern. The ability to detect small amounts of change depends on the projection / capturing systems, whose sensitivity is not constant and may change from one wavelength to another.
Assuming that the best-fit projection will be the one that is
captured as linearly changing across all the hue values, the 
specific objective of this work was then narrowed down to 
develop the methodology for constructing a projection pattern 
that will be captured as linearly changing across all the hue 
values. For a given projection system, image capturing system and chromatic / textural object characteristics, a projected pattern that leads to a linear captured pattern can be found.
To find the best projection pattern an iterative method was 
implemented. Let i be the iteration number, let L be a pattern with linearly varying hue values, and let Pi be the projected hue pattern at iteration i. Set i = 1 and P1 = L.
The stages of the iterative method:
1. Project Pi.
2. Capture the pattern of the hue values as reflected by the object (Ci).
3. Compute the differences between the captured pattern, Ci, and the linear pattern, L. Denote the difference pattern as the delta pattern Di, where Di = Ci - L.
4. If all elements of Di are smaller than a small value ε, then stop; else set Pi+1 = Pi - αDi, where α is the learning rate, increment i, and go to step 1.
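The iterative method above can be sketched in code. The `capture` callable below is a hypothetical stand-in for the physical project-and-capture step (in the paper, an LCD projector and cameras); the names `calibrate_pattern`, `alpha`, `eps` and `max_iter` are illustrative, not from the paper. The rest follows the four stages directly.

```python
import numpy as np

def calibrate_pattern(capture, n=160, alpha=0.7, eps=1e-3, max_iter=50):
    """Iteratively adjust a projected hue pattern until the captured
    pattern varies linearly.

    `capture` stands in for the physical project-and-capture step:
    it takes the projected hue pattern and returns the hue values
    measured by the camera.
    """
    L = np.linspace(0.0, 1.0, n)  # target: linearly varying hue values
    P = L.copy()                  # initialization: P1 = L
    for _ in range(max_iter):
        C = capture(P)            # stages 1-2: project Pi, capture Ci
        D = C - L                 # stage 3: delta pattern Di = Ci - L
        if np.max(np.abs(D)) < eps:
            break                 # stage 4: all |Di| below epsilon
        P = P - alpha * D         # update: P(i+1) = Pi - alpha * Di
    return P
```

For example, with an assumed gamma-like distortion `capture = lambda P: P ** 1.2`, the returned pattern is pre-distorted so that what the simulated camera sees is (nearly) linear.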
7. EXPERIMENTAL RESULTS
The method was initially tested on a white board plane. A 
linearly varying hue pattern was projected on the surface of the 
board and two cameras captured the scene. Figure 2 shows the 
result of the reflected color pattern after the first iteration. The 
X-axis is the pixel number across one cycle of the varying hue pattern (from red to violet) and the Y-axis is the hue value. In the first iteration L = P1, which means that the projected pattern is the desired linearly varying pattern. C1 is the resultant pattern captured by the imaging system. P2 is calculated according to P2 = P1 - αD1. The learning rate α was set to 0.7. In the next iteration the projection pattern was P2. Figure 3
shows the results at iteration 11 after the process converged. 
One can see that the captured pattern almost coincides with L, 
which means that the hue of the captured pattern varies almost 
linearly. 
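The convergence reported for the white board can be illustrated with a small simulation. The response function `simulated_capture` below is an assumed smooth monotonic distortion, not the real projector/camera response, and the variable names are illustrative; with the paper's learning rate of 0.7 the maximum deviation from the linear pattern shrinks at every iteration.

```python
import numpy as np

# Assumed stand-in for the projector/camera response on the white
# board; the real response is not given in the paper.
def simulated_capture(P):
    return 0.7 * P + 0.3 * P ** 2  # monotonic, maps [0, 1] onto [0, 1]

L = np.linspace(0.0, 1.0, 160)  # one cycle of linearly varying hue
P = L.copy()                    # iteration 1 projects P1 = L
alpha = 0.7                     # learning rate, as in the experiment
max_dev = []                    # max |Di| recorded at each iteration
for i in range(1, 12):          # iterations 1 through 11
    C = simulated_capture(P)
    D = C - L
    max_dev.append(np.max(np.abs(D)))
    P = P - alpha * D

print(max_dev[0], max_dev[-1])  # deviation shrinks across iterations
```

Under this assumed response the deviation decreases monotonically, mirroring the behavior seen between Figure 2 (iteration 1) and Figure 3 (iteration 11).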
[Figure 2 plot: hue value (Y-axis) against pixel number (X-axis), showing the projected pattern P1 (coinciding with L), the captured pattern C1, and the corrected pattern P2.]
Figure 2. Iteration 1 of the iterative process
  
[Figure 3 plot: the captured pattern (C11) nearly coinciding with the projected pattern at iteration 11.]
[Figure 4 plots: three panels, (A) iteration no. 1, (B) iteration no. 5, (C) iteration no. 11.]
Figure 4. Error rate at iterations 1, 5 and 11 of the iterative process
Figures 4 A, B and C show the DEM profiles generated at iterations 1, 5 and 11, respectively. The X-axis is the position along the DEM profile and the Y-axis is the height in millimeters. The improvement in the DEM accuracy of the flat white board over the iterations is clearly seen: as the iterative process