We conducted four integrated experiments using the sensors listed in Table 1 and then analyzed the data. The first experiment involved coarse-resolution indoor navigation using position data taken at 22 points, to investigate availability and continuity in an indoor navigation environment. The second experiment involved fine-resolution indoor navigation using position data taken at 254 points, together with electric field maps generated from each sensor, to investigate accuracy and continuity in an indoor navigation environment. The third experiment integrated navigation in both indoor and outdoor environments to investigate availability and continuity in an indoor-outdoor navigation environment. The fourth experiment involved outdoor navigation using multiple satellite systems to investigate accuracy, availability and integrity.
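The electric field maps used in the second experiment can be thought of as per-sensor signal-strength maps built from the surveyed points. The following is a minimal sketch of that idea, assuming point-wise RSSI measurements interpolated onto a regular grid with SciPy; the survey coordinates, values and grid spacing are illustrative and are not data from the experiment.

    # Minimal sketch (not the authors' implementation): building a signal-strength
    # ("electric field") map from point-wise measurements by interpolating RSSI
    # values onto a regular grid. Measurement positions and values are assumed.
    import numpy as np
    from scipy.interpolate import griddata

    # (x, y) survey points in metres and the RSSI [dBm] observed at each point;
    # in the experiment described above, 254 such points were surveyed.
    points = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0], [2.5, 2.5]])
    rssi = np.array([-45.0, -60.0, -58.0, -72.0, -50.0])

    # Regular grid covering the room (0.5 m spacing, assumed resolution).
    xs, ys = np.meshgrid(np.arange(0.0, 5.01, 0.5), np.arange(0.0, 5.01, 0.5))

    # Linear interpolation between survey points; cells outside the convex hull
    # of the survey points are filled with nearest-neighbour values.
    field_map = griddata(points, rssi, (xs, ys), method="linear")
    nearest = griddata(points, rssi, (xs, ys), method="nearest")
    field_map = np.where(np.isnan(field_map), nearest, field_map)

    print(field_map.shape)  # one RSSI estimate per grid cell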
2.1 Integrated sensor system 
The sensors listed in Table 1 were integrated to test seamless 
navigation. We prepared three integrations in our experiments, 
as follows. 
Indoor-outdoor navigation system 
Signals from GPS, GLONASS, QZSS and IMES were received simultaneously with a DELTA receiver and were synchronized with GPS time. The receiver was packed in a backpack with its antenna directed vertically, as shown in Figure 1. Position estimation was conducted in offline processing.
  
Figure 1. Position data acquisition in an indoor-outdoor 
environment 
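The offline position estimation itself is not detailed here; as a rough illustration of what such processing involves, the sketch below solves a single-epoch receiver position and clock bias from pseudoranges by iterated least squares. The satellite positions and pseudoranges are made-up values, and atmospheric and satellite-clock corrections are ignored, so this is a generic GNSS example rather than the processing chain used with the DELTA receiver.

    # Minimal sketch of single-epoch position estimation from pseudoranges by
    # iterated least squares. Satellite positions (ECEF, m) and pseudoranges (m)
    # are made-up values; corrections normally applied in practice are omitted.
    import numpy as np

    def estimate_position(sat_pos, pseudoranges, iterations=10):
        """Solve for receiver ECEF position and clock bias (both in metres)."""
        x = np.zeros(4)                                  # [X, Y, Z, c*dt]
        for _ in range(iterations):
            rho = np.linalg.norm(sat_pos - x[:3], axis=1)   # geometric ranges
            predicted = rho + x[3]
            H = np.hstack([(x[:3] - sat_pos) / rho[:, None],  # line-of-sight rows
                           np.ones((len(rho), 1))])           # clock-bias column
            dx, *_ = np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)
            x += dx
        return x

    sat_pos = np.array([[15600e3,  7540e3, 20140e3],
                        [18760e3,  2750e3, 18610e3],
                        [17610e3, 14630e3, 13480e3],
                        [19170e3,   610e3, 18390e3]])
    pr = np.array([21.10e6, 20.52e6, 21.60e6, 20.96e6])
    print(estimate_position(sat_pos, pr))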
Indoor navigation system 
A lighting tag receiver, an IMES receiver and an RFID receiver 
were integrated as an indoor navigation system. These receivers 
were connected to a mobile PC and were synchronized with the 
PC time. Two patterns were tested with this system, as shown in 
Figure 2. 
  
Figure 2. Position data acquisition in an indoor 
environment 
In the first pattern, the experimenter walked while holding the mobile PC to simulate navigation for pedestrians; this pattern focused on the simultaneous use of lighting tags and IMES. In the second pattern, the system was moved smoothly on a truck to simulate navigation for autonomous robots; this pattern focused on the simultaneous use of lighting tags, IMES and RFID tags.
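Because the lighting tag, IMES and RFID receivers were all stamped with the PC clock, their streams can be merged afterwards on a common time base. A minimal sketch of such logging is shown below, assuming serial-connected receivers read with pyserial; the port names, baud rates and line-based message format are assumptions, not the actual device interfaces used.

    # Minimal sketch (assumed setup, not the authors' software): each receiver is
    # read over a serial port and every message is stamped with the PC clock so
    # that the lighting-tag, IMES and RFID streams can be aligned afterwards.
    # Port names, baud rates and the line-based message format are assumptions.
    import time
    import serial  # pyserial

    PORTS = {
        "lighting_tag": ("/dev/ttyUSB0", 9600),
        "imes":         ("/dev/ttyUSB1", 115200),
        "rfid":         ("/dev/ttyUSB2", 9600),
    }

    def log_stream(name, port, baud, logfile):
        """Read one receiver and write '<pc_time> <sensor> <raw message>' lines."""
        with serial.Serial(port, baud, timeout=1.0) as dev, open(logfile, "a") as out:
            while True:
                line = dev.readline()
                if not line:
                    continue                      # timeout with no data
                t = time.time()                   # PC time used as the common clock
                out.write(f"{t:.3f} {name} {line.decode(errors='replace').strip()}\n")

    # Example: log the IMES receiver (blocking loop; run one thread per sensor).
    # log_stream("imes", *PORTS["imes"], logfile="imes.log")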
Pedestrian tracking sensor 
An omnidirectional camera and a laser scanner were combined to track pedestrians, as shown in Figure 3. The omnidirectional camera captured a panoramic video, which was used mainly to synchronize all position sensor data against observed pedestrian behavior in manual offline processing. The laser scanner was mounted 30 cm above the floor. Pedestrian positions were extracted from the temporal laser scanner data using a scene-subtraction method.
  
     
   
  
  
Figure 3. Position data acquisition and pedestrian tracking with 
an omnidirectional camera and laser scanner 
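A minimal sketch of the scene-subtraction idea for 2-D laser scans is given below: beams whose range is noticeably shorter than a background scan of the empty room are treated as foreground, converted to planar points and clustered, and each cluster centroid is taken as a pedestrian position estimate. The 0.3 m range threshold and 0.5 m cluster gap are assumed parameters, not values from the experiment.

    # Minimal sketch of scene subtraction for a 2-D laser scanner: ranges
    # noticeably shorter than a background scan of the empty room are treated
    # as foreground (a pedestrian), converted to x-y points and clustered;
    # each cluster centroid is one pedestrian position estimate.
    import numpy as np

    def extract_pedestrians(scan, background, angles, diff_thresh=0.3, gap=0.5):
        """scan, background: range arrays [m]; angles: beam angles [rad]."""
        fg = (background - scan) > diff_thresh          # beams blocked by a person
        xy = np.stack([scan * np.cos(angles), scan * np.sin(angles)], axis=1)[fg]
        if len(xy) == 0:
            return []
        # Greedy clustering: start a new cluster when consecutive foreground
        # points are farther apart than `gap`.
        clusters, current = [], [xy[0]]
        for p in xy[1:]:
            if np.linalg.norm(p - current[-1]) <= gap:
                current.append(p)
            else:
                clusters.append(np.mean(current, axis=0))
                current = [p]
        clusters.append(np.mean(current, axis=0))
        return clusters                                  # one (x, y) per cluster

    # Example with synthetic data: a flat 5 m background and one "person" at ~2 m.
    angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
    background = np.full_like(angles, 5.0)
    scan = background.copy()
    scan[85:95] = 2.0
    print(extract_pedestrians(scan, background, angles))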
2.2 Construction of test environment for indoor-outdoor 
seamless navigation experiments 
For the outdoor experiment, we selected an area around our campus, as shown in Figure 4. This area includes parks, high-rise buildings, low-rise buildings, stations, and both wide and narrow roads. For the indoor experiment, we selected a large room on our campus with an opening to the outdoors, as shown in Figure 5.
Figure 4. Study area (outdoor). Map data © OpenStreetMap contributors, CC-BY-SA
  
Figure 5. Study area (indoor) 
 
	        