XVIIIth ISPRS Congress (Part B3)

   
[Left column of the scanned page is illegible; the surviving line fragments refer to inertial navigation systems (INS), data rates, direct determination of exterior orientation from a block of images, georeferencing of images, cost and operational considerations, and Schwarz (1995). The readable text resumes mid-sentence:]

... It is viewed
as one of several possible auxiliary data which are used to
support block adjustment and thus the indirect method of
georeferencing. The direct method, in contrast, does not require
connectivity information within a block of images to solve the
georeferencing problem and thus offers much greater flexibility.
It is especially intriguing to consider its use for close-range
imaging applications which use digital frame cameras,
pushbroom scanners, or lasers as imaging components.
In the following, common features in the design and analysis of
mobile close-range imaging systems will be discussed and
illustrated by examples. Many of these features are also
important for general multi-sensor systems; however, the
discussion of a narrower field simplifies the presentation.
System design and analysis comprises the following steps as a
minimum:
• Data acquisition
• Synchronization and georeferencing
• Integration and data fusion
• Quality control
• Data flow optimization and automation.
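The synchronization and georeferencing step amounts to interpolating the GPS/INS trajectory to each camera exposure epoch. A minimal sketch (the data layout, sampling rate, and function name are illustrative assumptions, not taken from the paper):

```python
from bisect import bisect_left

# Toy sketch: interpolate a time-tagged GPS/INS trajectory to a camera
# exposure epoch -- the core of "synchronization and georeferencing".

def interpolate_pose(trajectory, t):
    """Linear interpolation of (time, x, y, z) samples at time t."""
    times = [sample[0] for sample in trajectory]
    i = bisect_left(times, t)
    if i == 0:                       # before first sample: clamp
        return trajectory[0][1:]
    if i == len(trajectory):         # after last sample: clamp
        return trajectory[-1][1:]
    t0, *p0 = trajectory[i - 1]
    t1, *p1 = trajectory[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# Usage: trajectory sampled at 50 Hz, exposure halfway between samples.
traj = [(0.00, 0.0, 0.0, 0.0), (0.02, 0.4, 0.0, 0.0)]  # ~72 km/h
print(interpolate_pose(traj, 0.01))  # -> (0.2, 0.0, 0.0)
```

In practice an attitude interpolation (e.g. of quaternions) is needed as well; the positional case above only illustrates the time alignment.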
These processes will be briefly discussed in the following
chapters. To illustrate the major steps, the development of the
VISAT system will be taken as an example. The design
objectives for this system were as follows (Schwarz et al.,
1993b):
"A multi-sensor system is required that positions all 
visible objects of interest for an urban GIS with an RMS 
accuracy of 0.3 m while moving through a road corridor 
with a maximum speed of 60 km/h and a maximum 
distance to the desired objects of 30 m. Data acquisition 
must be automatic and should contain real-time quality 
control features. Data processing, except for quality 
control, will be done in post mission and should have 
separate modules for georeferencing, image data base 
management, imaging, and quality assessment." 
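A quick plausibility check of these numbers: 0.3 m at a 30 m range corresponds to an angular budget of roughly 10 mrad (about 0.6°) for the georeferencing component. The sketch below assumes a simple quadratic combination of position and attitude error contributions, a common simplification that is not taken from the paper; the 0.1 m position RMS is an illustrative assumption.

```python
import math

# Back-of-the-envelope budget for the VISAT specification quoted above.
# Assumed error model (not from the paper): object-space RMS is the
# quadratic sum of camera position error and range * attitude error.
rms_target = 0.3     # required object RMS accuracy [m]
rng = 30.0           # maximum distance to objects [m]
pos_rms = 0.1        # assumed GPS/INS position RMS [m]

# Attitude error that exhausts the remaining budget:
att_budget = math.sqrt(rms_target**2 - pos_rms**2) / rng   # [rad]
print(round(math.degrees(att_budget), 3))  # -> 0.54
```

An attitude accuracy of about half a degree is well within reach of an integrated GPS/INS system, which supports the sensor choice discussed in the next chapter.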
2. CONCEPT OF A MOBILE MULTI-SENSOR SYSTEM 
USING CLOSE-RANGE IMAGING SENSORS 
The conceptual layout and data flow of a multi-sensor system for
close-range mapping applications is shown in Figure (1). The
selection of sensors for such a system obviously depends on
system requirements, such as accuracy, reliability, operational
flexibility, and range of applications. The data acquisition
module therefore has to be designed with both the carrier
vehicle and the intended applications in mind. The data
acquisition module contains navigation sensors and imaging 
sensors. Navigation sensors are used to solve the georeferencing 
problem. Although a number of different systems are used in
general navigation, the rather stringent requirements in terms of
accuracy and environment make the integration of an inertial
navigation system (INS) with receivers of the Global Positioning
System (GPS) the core of any sensor combination for an accurate
mobile mapping system for short-range applications. This
combination also offers considerable redundancy and usually
makes the use of additional sensors for reliability purposes
unnecessary. However, the addition of an odometer, such as the
ABS wheel sensor, may be useful for operational reasons, for
instance to keep a fixed distance between camera exposures.

International Archives of Photogrammetry and Remote Sensing. Vol. XXXI, Part B3. Vienna 1996
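Such fixed-distance triggering from cumulative odometer counts can be sketched as follows (the pulse scale factor, spacing, and interface are illustrative assumptions, not the actual VISAT logic):

```python
# Sketch: fire a camera trigger every `spacing` metres of travelled
# distance, derived from cumulative odometer pulse counts. The
# pulses-per-metre scale factor would come from odometer calibration.

def exposure_events(odometer_counts, pulses_per_metre=100.0, spacing=2.0):
    """Return indices of odometer samples at which to expose."""
    events = []
    next_trigger = 0.0
    for i, counts in enumerate(odometer_counts):
        distance = counts / pulses_per_metre   # metres travelled so far
        if distance >= next_trigger:
            events.append(i)
            next_trigger += spacing
    return events

# Usage: cumulative counts sampled along the road corridor.
counts = [0, 120, 260, 410, 560]   # -> 0.0, 1.2, 2.6, 4.1, 5.6 m
print(exposure_events(counts))     # -> [0, 2, 3]
```

Triggering on distance rather than time keeps the image spacing constant regardless of vehicle speed, which simplifies the later stereo processing.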
Imaging sensors can be subdivided by the way they contribute to 
the information about the object space. They may provide 
descriptive information, as for instance grey scales, or geometric 
information, as for instance directions or ranges from the camera
to the object. Table 2 summarizes the contribution of sensors 
typically used in close-range mapping applications. 
In close-range mapping, photogrammetric methods have been 
increasing in importance, due to the use of CCD cameras. These 
sensors have overcome two major disadvantages of film-based 
photographic cameras: single-frame, slow-rate photography and 
highly specialized processing equipment. Recent trends in CCD 
technology are characterized by increased resolution, color image 
acquisition and improved radiometric quality (anti-blooming, 
reduced cross-talk). Another important development which
supports the use of CCD cameras in photogrammetric
applications is the advancement of fast analog-to-digital
conversion (ADC). Frame grabbers integrated with high-speed
computer buses and processing hardware have become a 
standard commodity. Compared to analog/analytical plotters 
used in conventional photogrammetry, the use of state-of-the-art 
computer image boards greatly simplifies measurements. 
The selected sensor configuration requires a certain data 
processing sequence. Part of the processing will have to be done 
in real time, such as data compression for the imaging data and 
initial quality control processing for the navigation data. Most of 
the data, however, will immediately be stored away for post- 
mission use. In post-mission, the data processing hierarchy is 
determined by the fact that all images have to be georeferenced
first before they can be used in the integration process. The first 
step is therefore the georeferencing of all recorded images and 
their storage in a multimedia data base. To determine 3-D 
coordinates of objects visible in CCD camera images, the 
following information is needed for a pair of cameras: 
• Position of the camera perspective center at exposure time
(3 parameters per image).
• Camera orientation at exposure time (3 parameters per
image).
• Interior geometry of the camera sensor.
• The lens distortion parameters.
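Once these parameters are known for both cameras, the 3-D object coordinates follow from intersecting the two image rays. Below is a minimal least-squares two-ray intersection ("midpoint method") as a geometric illustration, not the VISAT algorithm; interior orientation and lens distortion are assumed to have been applied to the ray directions already:

```python
# Least-squares intersection of two image rays  X = c_i + t_i * d_i,
# where c_i is the perspective centre at exposure time (from GPS/INS)
# and d_i the ray direction in the mapping frame (image measurement
# rotated by the exterior-orientation attitude).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def intersect_rays(c1, d1, c2, d2):
    """Midpoint of closest approach of two 3-D rays."""
    # Normal equations for the two unknown ray parameters t1, t2.
    a11, a12 = dot(d1, d1), -dot(d1, d2)
    a21, a22 = dot(d1, d2), -dot(d2, d2)
    r = [q - p for p, q in zip(c1, c2)]
    b1, b2 = dot(d1, r), dot(d2, r)
    det = a11 * a22 - a12 * a21
    t1 = (b1 * a22 - a12 * b2) / det
    t2 = (a11 * b2 - a21 * b1) / det
    p1 = [p + t1 * d for p, d in zip(c1, d1)]
    p2 = [p + t2 * d for p, d in zip(c2, d2)]
    return [(u + v) / 2 for u, v in zip(p1, p2)]

# Usage: stereo base of 1 m, object point 30 m ahead of camera 1.
point = intersect_rays((0, 0, 0), (0, 1, 0), (1, 0, 0), (-1, 30, 0))
print(point)  # -> [0.0, 30.0, 0.0]
```

With noisy measurements the two rays do not intersect exactly; the distance between the closest points is then a useful quality-control indicator.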
The first two sets of parameters are known as exterior orientation
parameters, while the other two sets are known as interior
orientation parameters. The general problem in photogrammetry, 
aerial and terrestrial, can be seen as the determination of the 
camera's interior and exterior orientation parameters. The 
exterior orientation parameters are determined by a combination 
of GPS and INS, the interior orientation parameters by 
calibration. This means that exterior orientation is tied to a real-
time measurement process and its parameters change quickly. In
contrast, interior orientation is obtained by using a static field
calibration procedure and can be considered as more or less 
constant for a period of time. Thus, it can be done before or after 
the mission and is of no concern in the data acquisition process.
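How the two parameter sets enter the computation can be illustrated with the standard direct-georeferencing relation, combining the GPS/INS position and attitude (exterior orientation) with the calibrated camera-to-body boresight rotation and lever arm. The identity attitudes and numbers below are illustrative only:

```python
# Sketch of the direct-georeferencing relation
#   r_m = r_INS + R_b_m * (s * R_c_b * x_c + a_b)
# r_INS : GPS/INS position of the body frame in the mapping frame
# R_b_m : body-to-mapping rotation from the INS attitude (exterior)
# R_c_b, a_b : boresight rotation and lever arm (from calibration)
# x_c   : image ray in the camera frame, s : scale (range to object)

def mat_vec(R, v):
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def georeference(r_ins, R_b_m, R_c_b, lever_arm, x_c, scale):
    xc_b = mat_vec(R_c_b, x_c)                       # ray into body frame
    v_b = [scale * x + a for x, a in zip(xc_b, lever_arm)]
    v_m = mat_vec(R_b_m, v_b)                        # into mapping frame
    return [r + v for r, v in zip(r_ins, v_m)]

# Usage: identity attitudes, 0.5 m lever arm along x, 30 m range.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
p = georeference([100.0, 200.0, 50.0], I, I, [0.5, 0, 0], [0, 1, 0], 30.0)
print(p)  # -> [100.5, 230.0, 50.0]
```

The split is visible in the code: r_ins and R_b_m change with every exposure, while R_c_b and the lever arm stay fixed between calibrations.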
	        