XVIIth ISPRS Congress (Part B4)

  
COMPUTER ASSISTED INFORMATION EXTRACTION FROM SATELLITE IMAGES 
FOR UPDATING NATIONAL LAND USE INFORMATION DATA BASE IN JAPAN 
K. Cho, M. Yoshimura & S. Takeuchi 
Remote Sensing Technology Center of Japan 
S. Murai 
University of Tokyo, Japan 
C. Otsuka 
Geographical Survey Institute of Japan 
K. Kamada*
National Land Agency of Japan 
Commission IV 
ABSTRACT: 
A personal-computer-based image interpretation system called CASYII (Computer Assisted
System for Image Interpretation) has been developed. In this system, human image
interpretation is strongly supported by various graphic functions, which allow users to
update land use information efficiently by comparing old land use data with the latest
satellite images on the display. Since FY 1991, the National Land Agency of Japan has
been using this system to update the national land use information data base from the
latest satellite images. This paper describes the outline of this project and explains
how human image interpretation techniques are combined with digital image processing.
KEY WORDS: SPOT, HSI, Image Interpretation, Land Use Data, CASYII 

1. INTRODUCTION

1.1 Digital National Land Information

Since 1974, in cooperation with the Geographical Survey Institute and other agencies,
the National Land Agency of Japan has been collecting and updating various kinds of land
information in digital form, called the Digital National Land Information. This
information includes, but is not limited to, topographic data, geological data, climate
data, land use data, and administrative division data. Most of the data are produced and
updated by in situ investigations, air photo interpretation, topographical map
interpretation, etc. As these conventional survey methods take a great deal of time and
cost, the National Land Agency had been investigating the possibility of using satellite
remote sensing technology for updating the data.

Among the Digital National Land Information, the land use data were expected to be among
the most suitable for updating from satellite images. The mesh size of the original land
use data was 10 m, and 100 m mesh data for public use were created from the 10 m data by
a majority vote method, as sketched below. The original land use data consisted of 15
items, which are shown in Table 1 as "old land use items". In order to reduce the time
and cost of updating the land use data, the Remote Sensing Technology Center of Japan
(RESTEC), under contract with the National Land Agency and the Geographical Survey
Institute, has been involved in studying the possibility of using satellite data for
this purpose.
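
As an illustration of the aggregation step mentioned above, the following sketch derives
a coarse mesh from a fine one by majority vote. It assumes the land use codes are held
in a NumPy array with one cell per 10 m mesh element; the function name, array layout,
and tie-breaking rule are illustrative assumptions, not part of the original system.

import numpy as np
from collections import Counter

def majority_vote_aggregate(mesh_10m: np.ndarray, factor: int = 10) -> np.ndarray:
    """Aggregate a fine land use mesh to a coarser mesh (e.g. 10 m -> 100 m)
    by assigning to each block the most frequent land use code."""
    rows, cols = mesh_10m.shape
    out = np.empty((rows // factor, cols // factor), dtype=mesh_10m.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            block = mesh_10m[i * factor:(i + 1) * factor,
                             j * factor:(j + 1) * factor]
            # Majority vote: the most common code in the block wins.
            out[i, j] = Counter(block.ravel().tolist()).most_common(1)[0][0]
    return out

# Example: a 100 x 100 grid of random codes drawn from 15 old land use items
mesh = np.random.randint(1, 16, size=(100, 100))
mesh_100m = majority_vote_aggregate(mesh)   # 10 x 10 result

Each 10 x 10 block of the fine mesh thus contributes exactly one coarse cell, labelled
with its most frequent land use code.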

1.2 Limitation of Point-wise Classification

Landsat TM and SPOT data have suggested the possibility of extracting detailed land
cover information from satellite data. At the same time, however, the limitations of
applying traditional point-wise classification methods, such as maximum likelihood
classification, to high spatial resolution data have become clear. One of the main
reasons is that the conventional land use items cannot always be represented by
particular spectral characteristics in the satellite data. Fig. 1 shows the spectral
characteristics of each land use item area, as updated by image interpretation of a
SPOT/TM composite image.

[Fig. 1: Spectral characteristics of each land use item area updated by image
interpretation of a SPOT/TM composite image (see Table 1 for item numbers; see Fig. 3).]

* Moved to the Geographical Survey Institute of Japan in April, 1992.
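
For reference, a point-wise classifier of the kind referred to above assigns each pixel
independently to the class whose class-conditional Gaussian gives the highest
likelihood, using only that pixel's spectral vector. The sketch below is a minimal
illustration under assumed class statistics (the band count, class count, and all
numbers are hypothetical); it is not the classifier actually evaluated in the study.
Because no spatial or contextual information enters the decision, land use items whose
spectral characteristics overlap, as in Fig. 1, cannot be separated reliably.

import numpy as np

def ml_classify(image: np.ndarray, means: np.ndarray, covs: np.ndarray) -> np.ndarray:
    """Point-wise Gaussian maximum likelihood classification.

    image: (H, W, B) array of pixel spectral vectors (B bands)
    means: (K, B) class mean vectors
    covs:  (K, B, B) class covariance matrices
    Returns an (H, W) array of class indices.
    """
    h, w, b = image.shape
    pixels = image.reshape(-1, b)                 # each pixel treated independently
    scores = np.empty((pixels.shape[0], means.shape[0]))
    for k in range(means.shape[0]):
        diff = pixels - means[k]
        inv_cov = np.linalg.inv(covs[k])
        _, logdet = np.linalg.slogdet(covs[k])
        # Gaussian log-likelihood per pixel (constant term dropped)
        scores[:, k] = -0.5 * (logdet + np.einsum('ij,jk,ik->i', diff, inv_cov, diff))
    return scores.argmax(axis=1).reshape(h, w)

# Example with hypothetical statistics for 3 classes and 4 spectral bands
rng = np.random.default_rng(0)
means = rng.uniform(60, 200, size=(3, 4))
covs = np.stack([np.eye(4) * 25.0 for _ in range(3)])
image = rng.uniform(60, 200, size=(64, 64, 4))
labels = ml_classify(image, means, covs)      # (64, 64) map of class indices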
	        