Proceedings, XXth Congress (Part 7)

INTEGRATION OF LASER SCANNING AND CLOSE-RANGE PHOTOGRAMMETRY - THE LAST DECADE AND BEYOND
J.-Angelo Beraldin

National Research Council Canada, Ottawa, Ont., Canada, K1A 0R6 - angelo.beraldin@nrc-cnrc.gc.ca

Commission V, WG V/2
KEY WORDS: Laser Scanning, Close-range Photogrammetry, Sensor Fusion, 3D Sensors, Metric Performance, Heritage, Space

ABSTRACT:
In the last decade, we have witnessed an increased number of publications related to systems that combine laser scanning and close-range photogrammetry technologies in order to address the challenges posed by application fields as diverse as industrial, automotive, space exploration and cultural heritage, to name a few. The need to integrate those technologies is driven by resolution, accuracy, speed and operational requirements, which can be optimized using general techniques developed in the area of multi-sensor and information fusion theory. This paper addresses an aspect critical to multi-sensor and information fusion, i.e., the estimation of system uncertainties. An understanding of the basic theory and best practices associated with laser range scanners, digital photogrammetry, processing and modelling is in fact fundamental to fulfilling the requirements listed above in an optimal way. In particular, two categories of applications are covered, i.e., information augmentation and uncertainty management. Results from both space exploration and cultural heritage applications are shown.
1. INTRODUCTION

1.1 Why combine data from multiple sensors?

The topic of multi-sensor data fusion has received over the years a lot of attention from the scientific community for military and non-military applications. Multi-sensor data fusion techniques combine data from multiple sensors and related information from associated databases, to achieve improved accuracies and more specific inferences than could be achieved by the use of a single sensor alone (Hall, 1997). As noted by Hong, 1999, in applying those techniques, one would expect to achieve the following benefits:
• Robust operational performance
• Extended spatial/temporal coverage
• Reduced ambiguity
• Increased confidence
• Improved detection performance
• Enhanced resolution (spatial/temporal)
• Increased dimensionality

To discuss the integration of laser scanning and close-range photogrammetry from a multi-sensor and information fusion point of view, we present the key features of different laser scanner technologies and photogrammetry-based systems that should be considered in order to realize the benefits expected in a multi-sensor platform. Some examples are given to illustrate the techniques. In particular, two categories of applications are covered, i.e., information augmentation and uncertainty management. As defined by (Hong, 1999), information augmentation refers to a situation where each sensor provides a unique piece of information to an application and fusion extends, for example, the system's spatial/temporal coverage. Uncertainty management is a very important and critical part of multi-sensor data fusion techniques. It covers situations where different sensors measure the same object/site from different locations or times or even users. In order to deliver the best description of the object/site (lowest uncertainty), one must manage the uncertainties linked to the sensing devices, the environment and a priori information (e.g. a particular user). The objectives of the data fusion are to minimize the impact of those uncertainties and to get the most out of the multi-sensor platform. In other words, one must justify the increased cost and complexity of a multi-sensor solution.

1.2 Related work

A survey of the literature on multi-sensor data fusion can generate a long list of papers and books describing the theory and the different applications where data fusion is critical. Llinas et al., 1990 describe an application where a moving aircraft is observed by both a pulsed radar (based on radio frequencies) system and an infrared imaging sensor. The pulsed radar system provides an accurate estimate of the aircraft's range but with poor angular direction estimates (due to the longer wavelengths compared to optical light). Instead, the infrared imaging sensor determines only the aircraft's angular direction, but with a much higher accuracy when compared to the pulsed radar system. If these two observations are correctly associated, then the combination of the two sensors' data provides an improved determination of location than could be obtained by either of the two independent sensors. This case represents a good example of uncertainty management through an adequate understanding of the sensors' error and resolution characteristics.

To model complex environments, those composed of several objects with various characteristics, it is essential to combine data from different sensors and information from different sources. El-Hakim, 2001 discusses the fact that there is no single approach that works for all types of environment and at the same time is fully automated and satisfies the requirements of every application. His approach combines models created from multiple images, single images, and range sensors. He also uses known shapes, CAD drawings, existing maps, survey data, and GPS data.
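The radar/infrared pairing described by Llinas et al. is, at its core, minimum-variance (inverse-variance weighted) fusion: each sensor's bearing estimate is weighted by the inverse of its error variance, so the accurate infrared bearing dominates while the radar still contributes its accurate range. The sketch below illustrates the principle with hypothetical numbers; the variances, sensor values and function names are illustrative assumptions, not taken from the paper.

```python
import math

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance (minimum-variance) fusion of two independent
    scalar estimates of the same quantity. The fused variance is
    always smaller than either input variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical observations of the same aircraft:
# radar: accurate range, poor bearing (long radio wavelengths);
# infrared camera: accurate bearing, no range at all.
radar_bearing_deg, radar_bearing_var = 32.0, 25.0   # sigma = 5.0 deg
ir_bearing_deg, ir_bearing_var = 30.0, 0.25         # sigma = 0.5 deg
radar_range_m = 1200.0                              # used as-is here

bearing_deg, bearing_var = fuse(radar_bearing_deg, radar_bearing_var,
                                ir_bearing_deg, ir_bearing_var)

# The fused bearing is pulled strongly toward the infrared estimate,
# and its variance is below that of either sensor alone; combined
# with the radar range it yields an improved position fix.
x = radar_range_m * math.cos(math.radians(bearing_deg))
y = radar_range_m * math.sin(math.radians(bearing_deg))
```

The same weighting rule is what a Kalman filter update performs for Gaussian errors, which is why a correct model of each sensor's error characteristics, as stressed above, is a precondition for any benefit from the fusion.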