3. 3D MODEL RECONSTRUCTION BY OMNIDIRECTIONAL STEREO VISION
Figure 5. Process flow for reconstructing a 3D model from omnidirectional stereo: (a) estimate the FOE and calibrate the omnidirectional image; (b) track feature points and calculate 3D points using the GPS data and the offset parameter; (c) reshape the surface model, re-project it, extract texture, and reconstruct the 3D model.
3.1 Calculating offset between camera position and GPS 
system position 
To obtain the shooting position of an omnidirectional camera, 
the offset between the camera's position and the GPS system position given by the GPS antenna must be determined beforehand. This offset is calculated by using points whose positions in space are known (SCPs¹) as reference points. Here, the position at which an SCP appears in a captured image is manually plotted, and the GPS system position is then optimized so that the difference between the plotted position and the SCP's projected position is minimized. The difference between this optimal GPS system position and its initial value is taken to be the offset. If the
installation positions of the camera and GPS antenna on the roof 
of the vehicle are not changed, an offset parameter determined 
in the above way can be applied to all images. In practice, nearly the same offset value was obtained when it was calculated from multiple images, demonstrating that, once the offset has been calculated, it can be applied to all omnidirectional images as long as the camera's position is not altered.

¹ SCP (Spatial Control Point): an extension of the Ground Control Point (GCP) concept, an SCP is a point whose 3D position in space has been surveyed. Specifically, it refers to the 3D coordinates of points on a building's corners, sides, etc. (Ishikawa, 2001).
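To make this calibration step concrete, the following Python sketch (an illustration under stated assumptions, not the authors' implementation) estimates the offset for one image by nonlinear least squares. It assumes an equidistant fisheye model (r = f·theta) with the optical axis pointing straight up, a known camera rotation for the image, and hypothetical input arrays holding the surveyed SCP coordinates and their manually plotted pixel positions; F_PIX, CX and CY are placeholder calibration values.

import numpy as np
from scipy.optimize import least_squares

F_PIX = 400.0              # assumed focal length in pixels (placeholder)
CX, CY = 512.0, 512.0      # assumed image centre (placeholder)

def project(point_world, cam_pos, R_cam):
    """Project a 3D point into the omnidirectional image (assumed equidistant model)."""
    d = R_cam @ (point_world - cam_pos)          # point in camera coordinates
    d = d / np.linalg.norm(d)
    theta = np.arccos(np.clip(d[2], -1.0, 1.0))  # angle from the optical (z) axis
    phi = np.arctan2(d[1], d[0])                 # azimuth around the optical axis
    r = F_PIX * theta                            # equidistant projection r = f*theta
    return np.array([CX + r * np.cos(phi), CY + r * np.sin(phi)])

def residuals(offset, gps_pos, R_cam, scps, plotted):
    """Reprojection error of all SCPs when the camera sits at gps_pos + offset."""
    cam_pos = gps_pos + offset
    return np.concatenate([project(p, cam_pos, R_cam) - uv
                           for p, uv in zip(scps, plotted)])

def estimate_offset(gps_pos, R_cam, scps, plotted):
    """Offset = optimised camera position minus the initial GPS system position."""
    sol = least_squares(residuals, x0=np.zeros(3),
                        args=(gps_pos, R_cam, scps, plotted))
    return sol.x

Computing the offset from several images and comparing the returned vectors mirrors the check described above: the values should agree as long as the camera and antenna mounting is unchanged.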
3.2 Estimating camera attitude 
Figure 5(a) shows the part of the process flow that determines the camera's attitude parameters at the time of shooting. Depending on the system, it is not uncommon for such attitude information to be obtained from a gyro sensor; nevertheless, determining the attitude at the exact camera position is troublesome even with a gyro. For this reason, determining the camera attitude at the time of shooting from the captured image itself is the ideal approach. In this regard, a useful property of an omnidirectional image is that the camera attitude can be approximated by the difference between the FOE in the image
and the center of the image. Accordingly, FOE can be estimated 
by using, for example, a vertical segment of a building in the 
image. Estimating FOE in this way makes it easy to calculate 
camera attitude at the time of shooting from the captured image 
itself. This, in turn, makes it possible to carry out high-accuracy 
3D measurements that take camera attitude into account. 
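As a rough illustration of this idea (a sketch under the same assumed equidistant fisheye model as above, not the paper's procedure), the displacement of the estimated FOE from the image center can be converted into a tilt angle and a corresponding rotation matrix; F_PIX, CX and CY are again placeholder calibration values.

import numpy as np

F_PIX = 400.0
CX, CY = 512.0, 512.0

def attitude_from_foe(foe_u, foe_v):
    """Approximate the camera tilt from the displacement of the FOE from the image centre."""
    du, dv = foe_u - CX, foe_v - CY
    r = np.hypot(du, dv)
    theta = r / F_PIX                  # tilt magnitude (equidistant model: r = f*theta)
    phi = np.arctan2(dv, du)           # direction of the tilt in the image plane
    # Rodrigues rotation that tilts the optical axis by theta towards direction phi.
    axis = np.array([-np.sin(phi), np.cos(phi), 0.0])
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    return theta, phi, R

Applying the resulting rotation to the viewing rays before triangulation is one way to take the camera attitude into account in the 3D measurements that follow.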
3.3 3D measurements by omnidirectional stereo vision 
Figure 5(b) shows the part of the process flow that performs 3D measurements by omnidirectional stereo vision. Here we describe the optical projection formula of an omnidirectional camera and the equations for calculating depth and height through omnidirectional stereo. Figures 6 and 7 show the relationships between the omnidirectional cameras and points on target buildings. In these figures, the x-y plane is taken as the ground plane at height 0, and the z-axis is taken in the height direction.
Figure 6. Horizontal position of a building and omnidirectional cameras
Figure 7. Vertical position of a building and an omnidirectional camera
Now, from the projection formula of a fisheye lens, the distance v from the image center to the position where the target point is projected on the image is given by Equation (1).
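The sketch below only illustrates how the depth and height follow from the geometry of Figures 6 and 7; it assumes an equidistant projection (r = f·theta), which is not necessarily the formula of Equation (1), and the camera height, focal length, image center, and the alignment of the image axes with the world axes are all placeholder assumptions.

import numpy as np

F_PIX = 400.0
CX, CY = 512.0, 512.0

def pixel_to_angles(u, v):
    """Back-project a pixel to (azimuth, elevation), assuming an equidistant model."""
    du, dv = u - CX, v - CY
    r = np.hypot(du, dv)
    theta = r / F_PIX                 # angle from the vertical optical axis
    azimuth = np.arctan2(dv, du)      # assumes image axes aligned with world x-y axes
    elevation = np.pi / 2.0 - theta   # angle above the horizontal (ground) plane
    return azimuth, elevation

def triangulate(c1, uv1, c2, uv2, cam_height=2.0):
    """Intersect the horizontal azimuth rays (Figure 6), then recover height (Figure 7)."""
    a1, e1 = pixel_to_angles(*uv1)
    a2, _ = pixel_to_angles(*uv2)
    d1 = np.array([np.cos(a1), np.sin(a1)])     # horizontal ray direction from camera 1
    d2 = np.array([np.cos(a2), np.sin(a2)])     # horizontal ray direction from camera 2
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, np.asarray(c2, float) - np.asarray(c1, float))
    xy = np.asarray(c1, float) + t * d1         # horizontal position of the target point
    z = cam_height + t * np.tan(e1)             # height above the ground plane
    return np.array([xy[0], xy[1], z])

The intersection of the two azimuth rays gives the horizontal position (and hence the depth) of the target point as in Figure 6, and the elevation angle gives its height as in Figure 7.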