assembled into strings, whereby some elements (e.g. date and
parts of coordinates) are unchanged for the whole video and
therefore fixed. The following table gives an overview of the
initial values and a typical readout (Table 2).
  
            Initial values        Example output string

latitude    "0006 ..,...."        latitude   0006 54.900
longitude   "52 ..,..."           longitude  52 13.000
angleH      "........"            angleH     -14.23
angleV      "........"            angleV     -33.97
time        "l.h. m.s"            time       17h18m23s
date        "2000:05:1."          date       2000:05:13

Table 2. Example result of frame information decoding
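
As an illustration of how such templates can be exploited, the following Python sketch copies the fixed characters straight from the template and fills in only the variable positions (marked with dots) from the character recognition. The decode_field helper, the TEMPLATES names and the recognise callback are ours and are not part of the original processing chain.

    # Minimal sketch of template-based decoding of the on-screen readouts
    # (Table 2): '.' marks a character that varies per frame and must be
    # recognised (e.g. by correlation with reference glyphs); all other
    # characters are fixed for the whole video and copied from the template.
    TEMPLATES = {
        "latitude":  "0006 ..,....",
        "longitude": "52 ..,...",
        "date":      "2000:05:1.",
    }

    def decode_field(template, recognise):
        """Assemble one readout string; recognise(i) returns the character
        recognised at position i (called only for variable positions)."""
        return "".join(recognise(i) if c == "." else c
                       for i, c in enumerate(template))

    # Toy example: suppose the correlation recognised the character row
    # "2000:05:13" for the date field; only the last position is variable.
    row = "2000:05:13"
    print(decode_field(TEMPLATES["date"], lambda i: row[i]))   # 2000:05:13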
    
  
Figure 4. (a) Pre-disaster Ikonos image overlaid with post-event 
aerial photographs, and with helicopter flightpath of 13 May 
plotted (yellow dots) based on information extracted from the 
video data. The immediate disaster area is indicated in yellow. 
(b) Camera viewing directions calculated from the GPS 
auxiliary data. These data can also be used to plot the footprints 
of individual frames, as illustrated in (c). The solid red box 
shows the approximate location of the frame shown in (d). The 
box in hatched red is the result of the automatic calculation, 
based on absolute azimuth, camera inclination, and helicopter 
location. Focal length and flying height were simulated. The 
apparent positional difference between the two footprints results 
from uncertainty in the absolute camera azimuth, as the 
helicopter orientation is not necessarily identical to the flight 
direction, further illustrating the need for IMU information. 
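
Purely to illustrate the footprint geometry sketched in the caption (helicopter position, absolute azimuth and camera inclination projected onto flat ground, with an assumed field of view and flying height), the following Python sketch computes approximate ground corners for one frame. It deliberately ignores helicopter roll, pitch and yaw, which is exactly the limitation noted above; all numerical values are assumptions.

    from math import radians, tan, sin, cos

    def footprint(easting, northing, height,
                  azimuth_deg, depression_deg, hfov_deg, vfov_deg):
        """Approximate ground corners of an oblique frame over flat terrain.
        Each image corner is treated as a ray offset by half the field of
        view in azimuth and depression and intersected with level ground."""
        corners = []
        for dv, dh in [(-1, -1), (-1, 1), (1, 1), (1, -1)]:
            dep = radians(depression_deg + dv * vfov_deg / 2.0)
            az = radians(azimuth_deg + dh * hfov_deg / 2.0)
            if dep <= 0:          # ray at or above the horizon: no ground hit
                corners.append(None)
                continue
            ground_range = height / tan(dep)
            corners.append((easting + ground_range * sin(az),
                            northing + ground_range * cos(az)))
        return corners

    # Assumed values: flying height 300 m, camera pointing due east and
    # depressed 34 degrees, with a 40 x 30 degree field of view.
    for corner in footprint(0.0, 0.0, 300.0, 90.0, 34.0, 40.0, 30.0):
        print(corner)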
  
A problem for the correlation was the low video quality, in
particular the horizontal instability between lines, a result of
interlacing, whereby two fields (odd and even lines) each carry
half the information. We therefore first deinterlaced the frames
(combining the two fields into one) and sharpened the result.
The subsequent correlation and string processing then took
approximately 4 seconds per frame.
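
The deinterlacing and sharpening can be approximated with standard image-processing tools; the Python sketch below, which uses OpenCV and NumPy as stand-ins (the paper does not name the software used for this step), blends the two fields and applies an unsharp mask with illustrative parameters.

    import cv2
    import numpy as np

    def deinterlace_blend(frame):
        """Combine the two fields by averaging them, then restore the
        original frame height by linear interpolation."""
        even = frame[0::2].astype(np.float32)   # even lines (one field)
        odd = frame[1::2].astype(np.float32)    # odd lines (other field)
        n = min(len(even), len(odd))
        blended = (even[:n] + odd[:n]) / 2.0
        return cv2.resize(blended, (frame.shape[1], frame.shape[0]),
                          interpolation=cv2.INTER_LINEAR).astype(np.uint8)

    def unsharp_mask(img, amount=1.0, sigma=1.5):
        blurred = cv2.GaussianBlur(img, (0, 0), sigma)
        return cv2.addWeighted(img, 1.0 + amount, blurred, -amount, 0)

    frame = cv2.imread("frame_0001.png")        # hypothetical file name
    clean = unsharp_mask(deinterlace_blend(frame))
    cv2.imwrite("frame_0001_clean.png", clean)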
The resulting table was then further processed to convert the
geographic coordinates into UTM, so that they fit our reference
imagery, and to calculate the absolute flight vector between
frames as well as the absolute camera azimuth. We consider the
inclination angle to be absolute, although it depends on the roll,
yaw and pitch of the helicopter. The effect on the IFOV of the
camera can be substantial and should be corrected for with
Inertial Measurement Unit (IMU) information.
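
As a minimal sketch of this post-processing step, the snippet below assumes pyproj for the UTM conversion (zone 32N covers Enschede) and derives the flight vector and its azimuth from successive frame positions; the paper does not specify the tools used, and the function names are ours.

    from math import atan2, degrees, hypot
    from pyproj import Transformer

    # WGS84 geographic coordinates to UTM zone 32N (assumed for Enschede).
    to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)

    def flight_vectors(lonlat_per_frame):
        """Return (distance, azimuth) between successive frame positions,
        with azimuth measured clockwise from north."""
        utm = [to_utm.transform(lon, lat) for lon, lat in lonlat_per_frame]
        vectors = []
        for (e0, n0), (e1, n1) in zip(utm, utm[1:]):
            de, dn = e1 - e0, n1 - n0
            vectors.append((hypot(de, dn), degrees(atan2(de, dn)) % 360.0))
        return vectors

    # Illustrative positions near Enschede, given as (lon, lat) pairs.
    print(flight_vectors([(6.915, 52.217), (6.917, 52.218)]))

Adding the decoded horizontal camera angle to the flight azimuth would then approximate the absolute camera azimuth, under the assumption that the helicopter heading coincides with the flight direction, which is precisely the uncertainty discussed in Figure 4.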
3.4 Video mosaicing
The erratic nature of video imagery detailed above complicates
its use. Unlike with vertical aerial photographs and satellite
imagery, a simple geocoding is not possible. However, given the
value of such a mosaic for overview and orientation purposes,
we used RavenView (www.observera.com) to assemble a
mosaic of the disaster site from the police video data
(Figure 5). The software also allows a geocoding, although this
requires a sensor model of the camera used.
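
RavenView's sensor model and mosaicing procedure are not described here; purely to illustrate the kind of frame-to-frame registration such a mosaic relies on, the sketch below estimates a homography between consecutive frames from matched ORB features (OpenCV). It is not the method used in the project, and the file names are hypothetical.

    import cv2
    import numpy as np

    def pairwise_homography(img1, img2, max_features=2000):
        """Estimate the homography mapping img1 coordinates into img2."""
        orb = cv2.ORB_create(max_features)
        k1, d1 = orb.detectAndCompute(img1, None)
        k2, d2 = orb.detectAndCompute(img2, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H

    # Warp one deinterlaced frame into the geometry of the next one.
    f1 = cv2.imread("frame_0001_clean.png", cv2.IMREAD_GRAYSCALE)
    f2 = cv2.imread("frame_0002_clean.png", cv2.IMREAD_GRAYSCALE)
    warped = cv2.warpPerspective(f1, pairwise_homography(f1, f2),
                                 (f2.shape[1], f2.shape[0]))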
  
Figure 5. Mosaic of the Enschede disaster site, comprising 227 
video frames (red line approximates outline in Figure 1 b) 
4. CONCLUSIONS AND DISCUSSIONS 
In this project we investigated the utility of oblique airborne
video data for urban post-disaster damage assessment. The main
objectives were to enhance the overall quality of the imagery,
which suffers from a low signal-to-noise ratio, and to detect
damage based on hue, intensity and saturation (HIS) as well as
edge and variance characteristics, in a specially created
processing environment. We furthermore explored possibilities
to register the video data spatially based on the encoded GPS
information.
The data quality enhancement carried out in AstroStack, based
on aligning and stacking, led to a measurable improvement. The
SA approach described by Gorny and Latypov (2002) was also
verified in a theoretical experiment, but the method did not
improve the quality of the video data. This was likely a result of
several data conversion steps that led to severe image
degradation and colour bleeding, but also of low contrast.
	        