APPLICATIONS FOR MIXED REALITY
Sven Wursthorn, Alexandre Hering Coelho and Guido Staub
Institute for Photogrammetry and Remote Sensing
University of Karlsruhe (TH)
Englerstr. 7, D-76128 Karlsruhe, Germany
{wursthorn|coelho|staub}@ipf.uni-karlsruhe.de
KEY WORDS: Augmented Reality, GIS, Mobile, Application, Disaster, Floods.
ABSTRACT
Mixed or Augmented Reality (AR) systems align computer-generated virtual objects and real-world objects with each other
to enhance the vision of the physical reality with the virtual objects, in a manner that allows a user to interact with
spatial data in his natural environment and scale (1:1). Especially when dealing with natural disasters, the advantage of a
user's view augmented with additional supporting data is apparent. Activities like preventive protective measures, effective
reaction and reconstruction need to be assisted by a technology that improves the operator's efficiency and performance.
This approach deals with a system that serves particular needs regarding earth science and disaster management. We
will present our work developing an AR-System (ARS) and its components with examples of practical applications. The
crucial point of this implementation is the use of laser scanning models with adequate accuracy, reliability, currency
and completeness. VRML tiles produced from the laser scanning data are displayed in the operator's field of view
together with additional water surface models or simple features like points and lines. These geometrical objects are
visualised and controlled using the scene-graph parent-child relationships of Java3D. In this paper, flood disasters are used
to illustrate possible applications in a typical four-phased disaster management process. Additionally, several examples
representing the virtual water surface for flood damage prediction are discussed. The virtual water models are derived
by applying digital image processing techniques to laser scanning data in order to provide tactical information for disaster
management.
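The scene-graph mechanism mentioned above can be illustrated language-independently. The following is a conceptual sketch of how parent-child transforms accumulate in a scene graph; it does not use the actual Java3D API, and all class and node names are illustrative assumptions:

```python
class Node:
    """Minimal scene-graph node with a translation offset relative to its parent."""

    def __init__(self, name, offset=(0.0, 0.0, 0.0)):
        self.name = name
        self.offset = offset      # translation relative to the parent node
        self.children = []

    def add_child(self, child):
        self.children.append(child)

    def world_positions(self, origin=(0.0, 0.0, 0.0)):
        """Accumulate parent offsets down the tree, as a scene graph does."""
        pos = tuple(o + d for o, d in zip(origin, self.offset))
        result = {self.name: pos}
        for c in self.children:
            result.update(c.world_positions(pos))
        return result


# A terrain tile with a water surface attached 1.5 m above it:
terrain = Node("terrain_tile")                       # e.g. one VRML tile
water = Node("water_surface", (0.0, 0.0, 1.5))
terrain.add_child(water)
# Moving the terrain root moves the water surface along with it.
```

The point of the parent-child relationship is exactly this propagation: repositioning or transforming a parent node carries all attached children with it, so a water surface model stays registered with its terrain tile.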
1 INTRODUCTION AND RELATED WORK
The "Reality-Virtuality Continuum" (Milgram, 1994) (fig.
1) defines mixed reality as a generic term for everything between
the real world and completely virtual environments. Mixed
or augmented reality (AR) extends a user's vision of the
real world with computer-generated objects while the user
remains in the real world. This is in contrast to virtual
reality, where the physical world is totally replaced by a
computer-generated environment. Although vision is not
the only sense that can be augmented, it is the strongest
one (visual capture): what we see is "true" (Welch, 1978).
Figure 1: The Reality-Virtuality Continuum (real environment, augmented reality, augmented virtuality, virtual reality)
If augmented objects are derived from spatial data which
represent real features like buildings or streets, an exact
overlay of these objects with the real objects is a fundamental
requirement. Additionally, this overlay has to happen
in real time (Azuma et al., 2001) so that the user's vision
keeps the impression of the augmentation during movements.
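The core of this registration requirement is projecting world coordinates into the display for every frame. A minimal sketch under a pinhole camera model follows; the intrinsics, pose, and display resolution are illustrative assumptions, not the configuration of the system described here:

```python
import numpy as np

def project(point_w, R, t, K):
    """Project a world point (metres) to pixel coordinates with a pinhole model."""
    p_cam = R @ (np.asarray(point_w, float) - t)  # world -> camera frame
    if p_cam[2] <= 0:
        return None                               # point lies behind the viewer
    uv = K @ (p_cam / p_cam[2])                   # perspective division + intrinsics
    return uv[:2]

# Illustrative intrinsics for an 800x600 display (focal length in pixels):
K = np.array([[700.0,   0.0, 400.0],
              [  0.0, 700.0, 300.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)      # head orientation: looking straight along the z axis
t = np.zeros(3)    # head position at the world origin

uv = project((0.0, 0.0, 5.0), R, t, K)  # a point 5 m straight ahead
# uv lands at the display centre, (400, 300)
```

In a real ARS, R and t come from the tracking hardware each frame, which is why tracking latency and accuracy directly limit the quality of the overlay.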
With augmented reality systems, the user can interact with
spatial data in his own natural scale, 1:1. Spatial features
and meta data can be queried by position and view direc-
tion and can be interactively manipulated in the viewed
scene. In contrast to handheld computers with 2D map dis-
plays, the user is unburdened from the task of comparing
the results of a spatial query shown on the display with the
environment on site. Another possibility is to visualise objects
hidden beneath the surface, such as underground features
like pipes (Roberts et al., 2002).
The MARS project (Höllerer et al., 1999) experimented
with AR solutions for pedestrian navigation, both indoors
and outdoors. (Livingston et al., 2002) provide a mobile
outdoor system for military operations. The Australian
"Tinmith" project has developed an indoor and outdoor ca-
pable system with a solution for interactive data manipu-
lation e.g. with gloves which are typically utilized in VR
applications (Piekarski and Thomas, 2001).
All of these projects have some typical hardware components
in common that are needed for an ARS. These
configurations affected our hardware selection, which is
described in the following section.
2 HARDWARE
The head mounted display (HMD) system, Nomad from
Microvision, projects an SVGA (800x600) image with a
semi-transparent mirror directly on the retina, overlaying
see-through information on the user's vision. The advantage
of this solution is a bright "display" that can be used
outdoors; the disadvantage is a lack of colors. The monochrome
display supports 32 shades and covers a field of view
of 23° x 17°, which equals a 17" display at arm's length.
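As a quick sanity check of that comparison (the 85 cm viewing distance is our assumption for "arm's length"; it is not stated in the manufacturer's specification), the diagonal size subtended by a 23° x 17° field of view can be computed as:

```python
import math

def apparent_width(fov_deg, distance_m):
    """Linear size subtended by an angular field of view at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

d = 0.85                            # assumed viewing distance in metres
w = apparent_width(23.0, d)         # horizontal extent, ~0.35 m
h = apparent_width(17.0, d)         # vertical extent, ~0.25 m
diag_inches = math.hypot(w, h) / 0.0254
# diag_inches is about 17, matching the 17" comparison at roughly 85 cm
```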
Furthermore, our system is monocular. There is no need for
a stereo display, because human vision is able to combine
the three dimensional impression of the real world with the
augmented objects from the HMD.