
Technical Commission VIII (B8)

Access restriction

There is no access restriction for this record.

Copyright

CC BY: Attribution 4.0 International.

Bibliographic data

Technical Commission VIII (B8)

Multivolume work

Persistent identifier:
1663813779
Title:
XXII ISPRS Congress 2012
Sub title:
Melbourne, Australia, 25 August-1 September 2012
Year of publication:
2013
Place of publication:
Red Hook, NY
Publisher of the original:
Curran Associates, Inc.
Identifier (digital):
1663813779
Language:
English
Additional Notes:
Congress theme: Imaging a sustainable future
Corporations:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Adapter:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Founder of work:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Other corporate:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Document type:
Multivolume work

Volume

Persistent identifier:
1663822514
Title:
Technical Commission VIII
Scope:
590 pages
Year of publication:
2014
Place of publication:
Red Hook, NY
Publisher of the original:
Curran Associates, Inc.
Identifier (digital):
1663822514
Illustration:
Illustrations, diagrams
Signature of the source:
ZS 312(39,B8)
Language:
English
Additional Notes:
Publication date of the original has been determined.
Includes bibliographical references.
Usage licence:
Attribution 4.0 International (CC BY 4.0)
Editor:
Shortis, M.
Shimoda, H.
Cho, K.
Corporations:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Adapter:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Founder of work:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Other corporate:
International Society for Photogrammetry and Remote Sensing, Congress, 22., 2012, Melbourne
International Society for Photogrammetry and Remote Sensing
Publisher of the digital copy:
Technische Informationsbibliothek Hannover
Place of publication of the digital copy:
Hannover
Year of publication of the original:
2019
Document type:
Volume
Collection:
Earth sciences

Chapter

Title:
[VIII/9: Oceans]
Document type:
Multivolume work
Structure type:
Chapter

Chapter

Title:
EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS M. T. L. Estomata, A. C. Blanco, K. Nadaoka, E. C. M. Tomoling
Document type:
Multivolume work
Structure type:
Chapter

Table of contents

  • XXII ISPRS Congress 2012
  • Technical Commission VIII (B8)
  • Cover
  • Title page
  • [Table of contents]
  • [VIII/1:]
  • [VIII/2: Health]
  • [VIII/3: Atmosphere, Climate and Weather]
  • [VIII/4: Water]
  • [VIII/5: Energy and Solid Earth]
  • [VIII/6: Agriculture, Ecosystems and Bio-Diversity]
  • [VIII/7: Forestry]
  • [VIII/8: Land]
  • [VIII/9: Oceans]
  • EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS M. T. L. Estomata, A. C. Blanco, K. Nadaoka, E. C. M. Tomoling
  • THE USE OF MODIS DATA TO DEFINE NATURAL BOUNDARIES AND REGIONS IN THE MARINE WATER COLUMN. J. H. J. Leach and A. Kitchingman
  • SAC-D AQUARIUS A SATELLITE FOR OCEAN, CLIMATE AND ENVIRONMENT. ONE YEAR OF DATA. S. Torrusio, G. Lagerloef, M. Rabolli, D. LeVine
  • FLUORESCENT ANALYSIS OF PHOTOSYNTHETIC MICROBES AND POLYCYCLIC AROMATIC HYDROCARBONS LINKED TO OPTICAL REMOTE SENSING D. Zhang, J.-P. Muller, S. Lavender, D. Walton, L. R. Dartnell
  • [VIII/10: Cryosphere]
  • Cover

Full text

require such information (Marcos, et al., 2008). Since precise 
data will be readily available, reef processes can be studied 
more realistically, and the data can serve as a quick aid in 
developing improved management strategies (Scopélitis, et al., 
2010). Established Marine Protected Areas (MPAs) increase the 
importance of acquiring monitoring data to determine whether 
they achieve their management goals (Hill & Wilkinson, 2004). 
MPAs in the Philippines will then be able to develop better 
management schemes, especially in these times of climate 
change, when marine habitats are highly vulnerable. 
1.3 Scope and limitations 
The main scope and focus of this research is to use object-based 
classification on underwater photos to extract cover 
information. Hence, the study mainly investigates the 
performance of OBIA vis-a-vis typical pixel-based spectral 
classifiers. It deals with the creation of rule sets necessary to 
obtain high classification accuracy using the OBIA approach. 
The chosen 50-meter transect was approximately 10 meters 
deep and the video used for producing the benthic cover map 
was taken at a 6-m depth, about 3-4 meters from the reef 
surface. Due to this distance of the camera from the reef and the 
absence of rocks, sea grass and other benthos, only a general 
classification was produced: coral, sand and rubble. Since 
neither a motion reference unit (MRU) nor a small bubble level 
was available to ensure the vertical orientation of the underwater 
camera during fieldwork, georeferencing from one video 
snapshot to another led to a high root mean square error 
(RMSE); thus an uncontrolled mosaic (i.e., snapshots that were 
subjectively stitched together using Adobe Photoshop) was used 
as an alternative to create a panorama of the transect. An external 
motion sensor input for the MBES is the most important add-on 
because it compensates in real time for the roll, pitch and heave 
of the beams, but it was not available during the data gathering. 
Since these corrections could not be applied, the vertical 
accuracy of the bathymetric data was not reliable enough for the 
underwater video snapshots to be georeferenced to it. As a 
result, classification between elevated and non-elevated features 
could not be performed accurately. With the available MBES 
data, the uncontrolled mosaic was given only an approximate 
correction for scale and rotation, but its geographic location 
(horizontal measurements), through georeferencing to the 
relative depth of the MBES data, is accurate to the centimeter 
level. A water level logger, which would record the rise and fall 
of the tide, was also unavailable during the data gathering; thus 
a tide-prediction program (WXTide) was used to supply tide 
information during the fieldwork. The data supplied by the 
program were used to generate corrections for the bathymetric 
data gathered by the MBES. 
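As a rough illustration of the tide correction described above, measured MBES depths can be reduced to a common vertical datum by subtracting the predicted tide height at each sounding's time. This is only a minimal sketch under assumed conventions (the function name, the linear interpolation of predicted tide values to ping times, and the sign convention are illustrative, not the authors' exact procedure):

```python
import numpy as np

def tide_corrected_depths(ping_times, measured_depths, tide_times, tide_heights):
    """Reduce raw MBES soundings to a common vertical datum.

    ping_times      : time of each sounding (seconds since survey start)
    measured_depths : raw depths below the transducer (metres, positive down)
    tide_times      : times of the predicted tide values (e.g. from WXTide)
    tide_heights    : predicted tide height above the datum at those times

    The predicted tide is linearly interpolated to each ping time and
    subtracted, so soundings taken at different tide stages agree.
    """
    tide_at_ping = np.interp(ping_times, tide_times, tide_heights)
    return np.asarray(measured_depths) - tide_at_ping

# Hypothetical example: two soundings over the same spot, six hours
# apart (low vs. high tide), reduce to the same datum depth.
pings = np.array([0.0, 6 * 3600.0])
raw = np.array([10.2, 11.4])           # raw depths (m)
tide_t = np.array([0.0, 6 * 3600.0])   # predicted tide sample times
tide_h = np.array([0.2, 1.4])          # tide height above datum (m)
print(tide_corrected_depths(pings, raw, tide_t, tide_h))  # → [10. 10.]
```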
1.4 References 
Acquisition and classification of underwater videos and 
photographs. Many studies have used video surveys and 
photographs to capture images of coral reefs, which were then 
used for classification. One study evaluated the different survey 
techniques used to validate maps derived from remotely sensed 
images and concluded that, if resources were unlimited and 
expertise was available, the best choice among the techniques 
evaluated was the photographic transect, processed using the 
1024-point analysis (Roelfsema, et al., 2006). Based on another 
study of such remotely sensed images, manual delineation with 
field validation resulted in the highest accuracy (Scopélitis, et 
al., 2010). 
One study (Kaeli, et al., 2005) focused on deep-water corals 
found at depths of around 30 to 100 meters, which are not 
reachable by SCUBA diving. The images were therefore taken 
by the SeaBED Autonomous Underwater Vehicle (AUV) and 
analyzed using the existing random point method. The authors 
identified the Montastrea annularis complex, a dominant coral 
in their area with a smooth texture, and calculated the percent 
coverage of the coral upon classification. 
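The random point method referred to above amounts to dropping random points on a classified image and tallying the class under each point; the tallies divided by the number of points estimate percent cover. A minimal sketch, assuming an integer label map per pixel (the class codes and demo map are made up for illustration):

```python
import numpy as np

def percent_cover(label_map, n_points=1024, seed=0):
    """Estimate percent cover per class by random point sampling.

    label_map : 2-D array of integer class codes (one per pixel)
    n_points  : number of random sample points (e.g. 1024-point analysis)
    Returns a dict {class_code: percent of sampled points}.
    """
    rng = np.random.default_rng(seed)
    rows = rng.integers(0, label_map.shape[0], n_points)
    cols = rng.integers(0, label_map.shape[1], n_points)
    samples = label_map[rows, cols]
    classes, counts = np.unique(samples, return_counts=True)
    return {int(c): 100.0 * n / n_points for c, n in zip(classes, counts)}

# Hypothetical label map: 0 = sand, 1 = coral; the left half is coral,
# so each class should come out near 50% cover.
demo = np.zeros((100, 100), dtype=int)
demo[:, :50] = 1
cover = percent_cover(demo)
print(cover)
```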
Another study (Marcos, et al., 2008) recorded underwater 
videos at a near-reef distance, approximately 30 cm from the 
reef surface. The video was divided into 625 sub-images, which 
were then used to train and test a classifier distinguishing living 
cover (live coral and algae) from non-living cover (dead coral, 
sand and rubble). The classifier was linear discriminant analysis 
(LDA), which is based on Bayesian theory, and both color and 
texture features were inputs to the classifier system. The overall 
success rate for the near-reef videos was 79%, and at shallower 
depths, such as 3 meters, recognition rates of 85% for living and 
75% for non-living cover were achieved. 
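A two-class linear discriminant of the kind used above can be sketched with the Fisher criterion: project each feature vector onto w = Sw⁻¹(μ₁ − μ₀) and threshold at the midpoint of the projected class means. This is a generic LDA sketch on made-up feature clouds standing in for color/texture features, not the cited authors' implementation:

```python
import numpy as np

def fit_lda(X0, X1):
    """Fisher linear discriminant for two classes.

    X0, X1 : (n_samples, n_features) arrays of feature vectors
             (e.g. color and texture features per sub-image).
    Returns (w, threshold): classify x as class 1 if x @ w > threshold.
    """
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, mu1 - mu0)
    threshold = 0.5 * (mu0 @ w + mu1 @ w)
    return w, threshold

def predict(X, w, threshold):
    return (X @ w > threshold).astype(int)  # 0 = non-living, 1 = living

# Made-up, well-separated feature clouds for the two cover classes.
rng = np.random.default_rng(1)
nonliving = rng.normal([0.2, 0.3], 0.05, (200, 2))
living = rng.normal([0.6, 0.7], 0.05, (200, 2))
w, t = fit_lda(nonliving, living)
acc = (predict(living, w, t) == 1).mean()
print(f"recognition rate on living samples: {acc:.2f}")
```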
The neural network approach was also explored in another study 
(Marcos, et al., 2005). The authors captured video at a constant 
range of 15-30 cm from the reef surface, so the images they 
classified were close-ups of coral reefs. They used a feed- 
forward back-propagation neural network to classify the images 
into three benthic categories: live coral, dead coral and sand. 
Color and texture features were inputs to the network, and they 
obtained an 86.5% success rate for test images that were not 
included in the training set and a 79.7% recognition rate for the 
same set of images. 
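A feed-forward network trained by back-propagation, as in the study above, can be sketched in a few dozen lines: one hidden layer, a softmax output over the three benthic classes, and gradient descent on the cross-entropy loss. The feature vectors below are synthetic stand-ins (the layer sizes, learning rate, and class centers are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for color/texture feature vectors of three
# benthic classes (live coral, dead coral, sand); 4 features each.
centers = np.array([[0.8, 0.2, 0.5, 0.1],
                    [0.3, 0.7, 0.2, 0.6],
                    [0.1, 0.1, 0.9, 0.9]])
X = np.vstack([rng.normal(c, 0.08, (100, 4)) for c in centers])
y = np.repeat(np.arange(3), 100)
onehot = np.eye(3)[y]

# One hidden layer (tanh), softmax output.
W1 = rng.normal(0, 0.5, (4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)

lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                    # forward pass
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)           # softmax probabilities
    d2 = (p - onehot) / len(X)                  # back-propagate: output error
    d1 = (d2 @ W2.T) * (1 - h**2)               # hidden error (tanh derivative)
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

# Recompute predictions with the final weights.
pred = (np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```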
Another study (Scopélitis, et al., 2010) focused on satellite 
images and used three different classification methods: 
expertise-based, pixel-based and object-based. The expertise- 
based classification used ArcGIS 9.2® for visual analysis and 
manual delineation of polygons. These maps were the most 
accurate, based on the observation that all field-validation sites 
were classified correctly, so they were used as the reference 
against which a pseudo-error assessment of the pixel-based and 
object-based maps was made. All pixels of the reference map 
were cross-tabulated with the maps produced by the object- 
based and pixel-based methods. The pixel-based classification 
was accomplished using ENVI 4.4®'s maximum likelihood 
classification algorithm, trained on the sites known to be coral 
communities. The object-based classification used the software 
Definiens 7.0®, which divides the image into objects that are 
then aggregated according to rules made by the user. The results 
showed that the overall agreement of the object-based map with 
the expertise-based map was better than that of the pixel-based 
map. Exploring the object-based classification method's 
capability in mapping benthic cover may therefore lead to 
improved results, such as higher accuracies and faster result 
acquisition. 
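Cross-tabulating two classified maps, as described above, is a pixel-wise confusion matrix between the reference map and a candidate map; overall agreement is the fraction of pixels on the matrix diagonal. A minimal sketch, with made-up class codes and maps:

```python
import numpy as np

def cross_tabulate(reference, candidate, n_classes):
    """Pixel-wise confusion matrix between two classified maps.

    reference, candidate : 2-D integer arrays of the same shape,
                           class codes in [0, n_classes).
    Returns (matrix, overall_agreement) where matrix[i, j] counts
    pixels labeled i in the reference and j in the candidate.
    """
    # Encode each (reference, candidate) pair as a single integer,
    # then count occurrences of every pair with one bincount pass.
    pairs = n_classes * reference.ravel() + candidate.ravel()
    matrix = np.bincount(pairs, minlength=n_classes**2)
    matrix = matrix.reshape(n_classes, n_classes)
    agreement = np.trace(matrix) / matrix.sum()
    return matrix, agreement

# Hypothetical two-class example: the candidate map disagrees with
# the reference on one quadrant (25 of 100 pixels).
ref = np.zeros((10, 10), dtype=int); ref[5:, :] = 1
cand = ref.copy(); cand[5:, 5:] = 0
m, agree = cross_tabulate(ref, cand, 2)
print(m, agree)  # diagonal holds 50 + 25 agreeing pixels → agreement 0.75
```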
Precision of multi-beam echo sounder. The precision of high- 
resolution multibeam echo sounding, accompanied by high- 
accuracy positioning, was investigated in a paper (Ernstsen, et 
al., 2006), in which repetitive bathymetric data of a shipwreck 
was measured using a high-resolution MBES. 
