The performance of the estimator begins to deteriorate with the introduction of three more NIR bands. Although the previous section showed that adding Landsat ETM+ bands may contribute to better classification results, it also reduced the accuracy of the bathymetry estimates.
[Figure 7: left panel, estimated depth (m) versus actual depth (m) with the regressed line y = 0.9988x + 2.3185, R² = 0.7395; right panel, normalized rms error against actual depth (m) for the ratio method and for the fused data.]
Figure 7. Comparison of estimated and actual depth (left) and
normalized rms error on depth estimates against depth (right).
There is a tendency to slightly underestimate depths in shallow areas, while discrepancies are much more severe in deeper areas (Figure 7, left). This could be caused by an uneven spatial distribution of water quality parameters, which could not be incorporated in the RTM. We compared the performance of the model against a two-band log-ratio method (see Stumpf et al., 2003) and found that, although the error relative to depth increases (by 0.20 rms points), the bathymetry estimates are much more accurate than those of the previous method.
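For reference, the baseline used in this comparison is the two-band log-ratio estimator of Stumpf et al. (2003). The sketch below is only a minimal illustration, not the code used in this study; the band choice, the scaling constant n, the depth binning and the normalization by each bin's mean depth are assumptions.

```python
# Minimal sketch (assumed inputs) of the Stumpf et al. (2003) two-band
# log-ratio depth estimator and a normalized rms error per depth bin,
# the quantity plotted against depth in Figure 7 (right).
import numpy as np

def ratio_depth(r_blue, r_green, ref_depth, n=1000.0):
    """Fit Z = m1 * ln(n*R_blue)/ln(n*R_green) + m0 to reference depths."""
    x = np.log(n * r_blue) / np.log(n * r_green)   # band log-ratio
    A = np.vstack([x, np.ones_like(x)]).T          # design matrix [x, 1]
    m1, m0 = np.linalg.lstsq(A, ref_depth, rcond=None)[0]
    return m1 * x + m0, (m1, m0)

def normalized_rms_error(z_est, z_true, bin_edges):
    """RMS error per depth bin, divided by the bin's mean depth."""
    idx = np.digitize(z_true, bin_edges)
    out = []
    for k in range(1, len(bin_edges)):
        sel = idx == k
        if sel.any():
            rms = np.sqrt(np.mean((z_est[sel] - z_true[sel]) ** 2))
            out.append(rms / np.mean(z_true[sel]))
    return np.array(out)
```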
3.4 Discussion
Based on the findings above, we postulate that there is a threshold to the classification precision achievable with an increasing number of bands, implying that it is sometimes impractical to acquire and process additional data when the achievable accuracy is already limited. Optimum band placement and spatial resolution are better avenues for improved classification and depth estimates.
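One way such a threshold could be verified, sketched below under assumed inputs, is to classify the labelled reef pixels with progressively larger band subsets and watch the cross-validated accuracy level off. The Gaussian (maximum-likelihood style) classifier from scikit-learn and the cumulative band ordering are illustrative choices, not part of the original study.

```python
# Minimal sketch: classification accuracy as a function of the number of
# cumulative bands; accuracy typically saturates well before all bands
# are used, illustrating the threshold discussed above.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def accuracy_vs_bands(X, y, band_order=None, cv=5):
    """X: (pixels, bands) reflectances; y: benthic class labels.
    Returns cross-validated accuracy for 1..N cumulative bands."""
    order = list(band_order) if band_order is not None else list(range(X.shape[1]))
    scores = []
    for k in range(1, len(order) + 1):
        subset = X[:, order[:k]]
        clf = QuadraticDiscriminantAnalysis()
        scores.append(cross_val_score(clf, subset, y, cv=cv).mean())
    return np.array(scores)
```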
Despite the advances made in improving discrimination of reef habitats, some caveats are in order. The procedure relies heavily on the assumption that the spectral properties of benthic cover remain invariant throughout the acquisition period. Corals and seagrasses are highly dynamic environments, and their spectra may change accordingly. Other, less critical issues, such as variations in the solar spectrum and satellite orbital drift, could also alter reflectance estimates and should be given careful attention. Image registration is only as good as the GPS used and the surveying technique employed to locate the transects. Also, the method depends on a number of physical parameters that can only be obtained by field surveys. It will be difficult to implement the model for areas where prior data or in-situ field instruments are not available.
We recommend that errors due to image misregistration be evaluated in future studies. Other promising techniques, such as underwater photogrammetric methods, are also worthwhile pursuits for mapping reef morphology and bathymetry, given their refined spatial resolution and the precise positioning of the acquisition.
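One simple way the effect of misregistration could be bounded, sketched below under assumed inputs (the array shapes, shift range and helper name are hypothetical), is to shift the classified raster by a few pixels around the transect locations and record how the agreement with the field labels degrades.

```python
# Minimal sketch of a misregistration sensitivity check: shift the
# classified map by up to max_shift pixels and record the agreement
# with field transect labels at each offset.
import numpy as np

def shift_sensitivity(class_map, rows, cols, labels, max_shift=2):
    """class_map: 2-D classified raster; rows, cols, labels: pixel
    positions and observed classes of the field transect points."""
    h, w = class_map.shape
    agreement = {}
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            r = np.clip(rows + dy, 0, h - 1)
            c = np.clip(cols + dx, 0, w - 1)
            agreement[(dy, dx)] = np.mean(class_map[r, c] == labels)
    # The drop from the (0, 0) value to the worst offset bounds the
    # classification error attributable to misregistration.
    return agreement
```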
4. SUMMARY AND CONCLUSIONS
In this paper, we have presented an approach to spectrally reconcile imagery produced by different sensors and acquired on different dates. We have also presented the benefits and consequences of such a synergy of data sources and processing methods for discriminating shallow-water benthic habitats. This is therefore a deliberate attempt at synergy, not only of image-processing techniques, but also of various optical and physical in-situ measurements and their application to radiative transfer modelling to enhance information extraction.
By presenting typical results from activities in which multisource imagery is used, this paper provides specific tools and guidelines of practical use to planners and decision-makers involved in providing, producing and maintaining information resources on tropical marine habitats.
ACKNOWLEDGMENTS
We are grateful to Mr Akayoshi Nakayama of the National Research Institute for Fisheries Engineering for providing the IKONOS images. The ASTER images were obtained through the ARO (Announcement of Research Opportunity) Program of ERSDAC (Earth Remote Sensing Data Analysis Center) (No. ARO-23). The SPOT images were purchased with support from the Japan Ministry of the Environment for monitoring the Sekisei Lagoon National Park, Okinawa, while the Landsat images were furnished by H. Kadoya of NHK.
REFERENCES
Bird, R. E. and Riordan, C. J., 1986. Simple Solar Spectral Model for Direct and Diffuse Irradiance on Horizontal and Tilted Planes at the Earth's Surface for Cloudless Atmospheres. Journal of Climate and Applied Meteorology, 25(1), pp. 87-97.
Coppin, P., I. Jonckheere, K. Nackaerts and B. Muys, 2004. Digital change detection methods in ecosystem monitoring. International Journal of Remote Sensing, 42(1), pp. 47-56.
Edinger, E.M. and M.J. Risk, 2000. Reef classification by coral morphology predicts coral conservation value. Biological Conservation, 92(1), pp. 1-13.
Gross, H. N. and J. R. Schott, 1998. Application of Spectral
Mixture Analysis and Image Fusion Techniques for Image
Sharpening. Remote Sensing of Environment, 63(2), pp. 85-94.
Hedley, J. D. and P. J. Mumby, 2002. Biological and remote
sensing perspectives of pigmentation in coral reef organisms,
Advances in Marine Biology, 43, pp. 277-317.
Mumby, P.J. and A.J. Edwards, 2002. Mapping marine environments with IKONOS imagery: enhanced spatial resolution can deliver greater thematic accuracy. Remote Sensing of Environment, 82(2-3), pp. 248-257.
Stumpf, R., K. Holderied and M. Sinclair, 2003. Determination of water depth with high resolution satellite imagery over variable bottom types. Limnology and Oceanography, 48(1, part 2), pp. 547-556.
Paringit, E. C. and K. Nadaoka, 2003. Deriving relationships between reef sedimentation and inland erosion characteristics based on field observation data, hydrological modelling and remote sensing data analysis. Proceedings of the Asian and Pacific Coasts (APAC 2003), Tokyo, Japan, March 2004 (in CD-ROM).
Piella, G., 2003. A general framework for multiresolution image fusion: from pixels to regions. Information Fusion, 4(3), pp. 259-280.