However, there is considerable potential for feature extraction that combines 3D shape and texture information. A wider variety of features should be attempted. Applications that make use of such detailed extraction, such as earthquake disaster assessment, volcanic eruption monitoring, and urban mapping, should also be considered.
  
Detected Feature                 Distinctive Feature
Concrete Ground Surface          - ground surface
                                 - gray color (concrete color)
Grass Ground Surface             - ground surface
                                 - green color (vegetation color)
Manmade object (complex shape)   - scatter point
                                 - not green color (not vegetation)
Manmade object (box shape)       - artificial shape
                                 - not green color (not vegetation)
Natural vegetation               - scatter point
                                 - green color (vegetation)
Wall                             - vertical plane
                                 - gray color

Table 5. Extracted Features
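The criteria in Table 5 amount to simple rules over the shape class and color of each segmented point group. The following is a minimal sketch of such a rule set, assuming hypothetical attribute names and color thresholds; it is not the classifier used in this work.

# Illustrative sketch only: a rule-based classifier mirroring the criteria of
# Table 5. The shape label and RGB color of each point group are assumed to
# come from earlier segmentation steps; names and thresholds are hypothetical.

def is_green(rgb, margin=20):
    """Crude vegetation test: green channel dominates red and blue."""
    r, g, b = rgb
    return g > r + margin and g > b + margin

def is_gray(rgb, tolerance=25):
    """Crude concrete test: channels are nearly equal (low saturation)."""
    r, g, b = rgb
    return max(r, g, b) - min(r, g, b) < tolerance

def classify_segment(shape, rgb):
    """shape is one of 'ground', 'vertical_plane', 'scatter', 'box'."""
    if shape == "ground":
        if is_gray(rgb):
            return "concrete ground surface"
        if is_green(rgb):
            return "grass ground surface"
        return "ground (other)"
    if shape == "vertical_plane" and is_gray(rgb):
        return "wall"
    if shape == "box" and not is_green(rgb):
        return "manmade object (box shape)"
    if shape == "scatter":
        return "natural vegetation" if is_green(rgb) else "manmade object (complex shape)"
    return "unclassified"

# Example: a scattered, green point group is labelled as natural vegetation.
print(classify_segment("scatter", (60, 140, 55)))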
5. CONCLUSION 
In conclusion, all of the sensors (laser scanner, digital camera, IMU, and GPS) are integrated to construct a digital surface model. Calibration of the laser scanner and the digital camera is conducted to determine the relative position and attitude of each sensor with respect to the IMU. This rigorous geometric relationship is used for constructing the DSM and for integrating the digital camera images. In this paper, we propose a new method of direct geo-referencing based on the combination of bundle block adjustment and a Kalman filter. Because the Kalman filter is aided by the bundle block adjustment, the geo-referenced range data and CCD images overlap correctly. Feature extraction from combined range and image data is more effective than feature extraction from image data alone.
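To make this geometric relationship concrete, the following is a minimal sketch, not the authors' implementation, of how a single laser range point could be placed in the world frame using (a) the calibrated offset and rotation of the scanner relative to the IMU and (b) the filtered platform position and attitude; all variable names, the rotation convention, and the numeric values are assumptions for illustration.

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Body-to-world rotation from roll, pitch, yaw (radians), Z-Y-X order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def georeference(p_scanner, R_boresight, t_leverarm, R_body, t_body):
    """
    p_scanner   : 3D point in the laser scanner frame
    R_boresight : rotation of scanner frame into IMU (body) frame (calibration)
    t_leverarm  : scanner origin expressed in the IMU frame (calibration)
    R_body      : IMU attitude in the world frame (filtered solution)
    t_body      : IMU position in the world frame (GPS/IMU solution)
    """
    p_body = R_boresight @ p_scanner + t_leverarm
    return R_body @ p_body + t_body

# Toy example with made-up calibration and pose values.
p = georeference(np.array([1.0, 0.0, -5.0]),
                 rotation_matrix(0.0, 0.0, 0.01), np.array([0.10, 0.02, -0.30]),
                 rotation_matrix(0.02, -0.01, 1.2), np.array([381000.0, 3945000.0, 55.0]))
print(p)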
In this paper, all of the sensors and equipment are mounted on an unmanned helicopter. The paper focuses on how to integrate these sensors on a mobile platform.
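As a companion to the geo-referencing sketch above, the following illustrates, under assumed state layout, matrices, and noise values, how a pose estimated by bundle block adjustment (BBA) from the overlapping CCD images could aid a Kalman filter as an additional position measurement; it is a sketch of the idea, not the filter used in this work.

import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x with covariance P."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# State: [x, y, z, vx, vy, vz] predicted from the GPS/IMU solution (assumed).
x_pred = np.array([100.0, 200.0, 50.0, 1.0, 0.0, 0.0])
P_pred = np.diag([4.0, 4.0, 9.0, 0.5, 0.5, 0.5])

# BBA-derived camera position acts as a position-only measurement.
H_bba = np.hstack([np.eye(3), np.zeros((3, 3))])
R_bba = np.diag([0.05, 0.05, 0.10])   # assumed to be tighter than GPS alone
z_bba = np.array([100.4, 199.8, 50.2])

x_upd, P_upd = kalman_update(x_pred, P_pred, z_bba, H_bba, R_bba)
print(x_upd)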
6. REFERENCES 
Nagai, M., Shibasaki, R., Zhao, H., Manandhar, D., 2003. Development of Digital Surface Model and Feature Extraction by Integrating Laser Scanner and CCD Sensor. Proceedings of the 24th Asian Conference on Remote Sensing, 3-7, Busan, Korea.
Kumagai, H., Kubo, Y., Kihara, M., Sugimoto, S., 2002. DGPS/INS/VMS Integration for High Accuracy Land-Vehicle Positioning. Journal of the Japan Society of Photogrammetry and Remote Sensing, Vol. 41, No. 4, pp. 77-84.
Kumagai, H., Kindo, T., Kubo, Y., Sugimoto, S., 2000. DGPS/INS/VMS Integration for High Accuracy Land-Vehicle Positioning. Proceedings of the Institute of Navigation ION GPS-2000, Salt Lake City.
Manandhar, D., Shibasaki, R., 2002. Auto-Extraction of Urban Features from Vehicle-Borne Laser Data. ISPRS "GeoSpatial Theory, Processing and Application", Ottawa.
Zhao, H., Shibasaki, R., 2000. Reconstruction of Textured Urban 3D Model by Ground-Based Laser Range and CCD Images. IEICE Trans. Inf. & Syst., Vol. E83-D, No. 7.
Zhao, H., Shibasaki, R., 2001. High Accurate Positioning and Mapping in Urban Area using Laser Range Scanner. Proceedings of IEEE Intelligent Vehicles Symposium, Tokyo.
7. ACKNOWLEDGEMENT 
We would like to express our sincere appreciation to Dr. Hideo Kumagai of Tamagawa Seiki Co., Ltd. He provided the IMU for this research, and his guidance on IMU data processing led this research to success.
  
	        