distributed data analysis centres, such a data volume seems unreasonable as a basis for
routine environmental monitoring. A sampling strategy is suggested, both in space and in
time, in order to reduce the sheer scale of the problem to tractable levels [Specter and
Raney, 1989]. However, for active global monitoring, there is no escaping the fact that a
large amount of data will be required, and with a variety of resolutions, scales, and spectral
properties, even for one thematic subject such as tropical forests.
Once the data are acquired, processing is needed to transform them into usable
information. Processing, interpretation, and information extraction must be accomplished
routinely for environmental data bases; otherwise the purpose of monitoring is defeated. Active
monitoring requires rapid and reliable transformation of data into usable information on
which environmental management decisions can be based.
For global applications, large area coverage is required, as noted above. As a
consequence, variations in pixel size on the order of 1000:1 are inevitable, stretching from
the AVHRR resolutions for the global vegetation index [Choudhury and Tucker, 1987] to the
finer resolutions available from SPOT or RADARSAT [Raney, 1990a]. Information is
gained as resolution is improved, but of course the nominal size of the data base grows with
the square of the linear resolution improvement [e.g. Townshend and Justice, 1988]. Both image data and
tabular files are required, and cross-referenced, in both pixel-specific and polygonal classes.
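The scaling argument above can be made concrete with a back-of-envelope calculation. The specific figures below (1.1 km AVHRR-class pixels, 10 m SPOT-class pixels, a 1000 km by 1000 km study area) are illustrative assumptions, not values taken from the text:

```python
# Back-of-envelope estimate of how archive size scales with pixel size.
# Assumed figures: ~1.1 km AVHRR-class pixels vs. ~10 m SPOT-class pixels,
# over a 1000 km x 1000 km study area.

def pixels_for_area(area_km2: float, pixel_m: float) -> float:
    """Number of pixels needed to cover area_km2 at pixel_m resolution."""
    pixels_per_km2 = (1000.0 / pixel_m) ** 2
    return area_km2 * pixels_per_km2

area = 1_000_000.0                      # 1000 km x 1000 km, in km^2
coarse = pixels_for_area(area, 1100.0)  # AVHRR-class resolution
fine = pixels_for_area(area, 10.0)      # SPOT-class resolution

# Halving the pixel size quadruples the pixel count: data volume grows
# with the square of the linear resolution improvement.
print(f"coarse: {coarse:.3g} pixels, fine: {fine:.3g} pixels")
print(f"ratio: {fine / coarse:.0f}x")   # (1100 / 10)^2 = 12100
```

A 110-fold improvement in linear resolution thus inflates the data base by a factor of roughly 12,000, which is the order-of-magnitude gap between global-index products and fine-resolution mapping.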
For SAR (synthetic aperture radar) data in particular, area norms rather than single pixel
classifiers are appropriate. SAR data, either locally analyzed (for spatial spectral
signature, for example) or as image files, will need to be merged with information from
other sensors, or with non-image GIS (geographic information system) records. Finally, for
many if not all environmental issues, the variation of localized geophysical parameters over
time is the most critical quantity to monitor. All of these issues are matters of current GIS
research, although not all are being pursued vigorously at the present time [e.g. Nagy and Swain, 1987].
But these problems are relatively tangible, and substantial progress may reasonably be
expected over the next decade.
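The preference for area norms over single-pixel classifiers in SAR can be sketched in a few lines. Single-pixel intensities are dominated by multiplicative speckle, whereas the mean over a small window converges on the underlying backscatter; the 3x3 window and the exponential noise model below are illustrative assumptions, not methods from the text:

```python
import random

# Illustrative sketch (assumed parameters): single-look SAR intensity over a
# homogeneous target is approximately exponentially distributed, so any one
# pixel is unreliable, while an areal average is far more stable.

def window_mean(img, row, col, half=1):
    """Mean intensity over a (2*half + 1)-square window centred on (row, col),
    clipped at the image borders."""
    rows, cols = len(img), len(img[0])
    vals = [
        img[r][c]
        for r in range(max(0, row - half), min(rows, row + half + 1))
        for c in range(max(0, col - half), min(cols, col + half + 1))
    ]
    return sum(vals) / len(vals)

# A homogeneous field of unit mean backscatter corrupted by speckle-like
# noise: individual pixels scatter widely; the 3x3 mean varies much less.
random.seed(0)
img = [[random.expovariate(1.0) for _ in range(5)] for _ in range(5)]
print(f"single pixel: {img[2][2]:.2f}, 3x3 mean: {window_mean(img, 2, 2):.2f}")
```

Any classifier driven by `window_mean` (a hypothetical helper, not from the text) is an area norm in the sense used above; in practice multi-look averaging or adaptive speckle filters play this role.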
Availability of the large amounts of data required for global monitoring, rather than
data transformation, emerges as the key issue. Thus we turn to considerations of remote
sensing data policy as it impacts data availability.
3. REMOTE SENSING IN THE MARKET PLACE
Current data policy is based on a market place philosophy, for which the watershed
development in North America was the Land Remote-Sensing Commercialization Act of 1984 (PL 98-
365). This Act set the stage for the transfer of operational (land) remote sensing out of NASA,
with the objective of reaching commercial viability for remote sensing within an aggressively
short period of time, thus removing the burden of support or subsidy from the Federal
Government. The arguments considered by the U.S. Congress at that time were almost
exclusively ones of domestic market assessment [e.g., Office of Technology Assessment 1984]