Proceedings of the XXI International Congress for Photogrammetry and Remote Sensing (Part B1-3)
INTEGRATION OF GPS/INS/VISION SENSORS TO NAVIGATE 
UNMANNED AERIAL VEHICLES 
Jinling Wang a, Matthew Garratt b, Andrew Lambert c, Jack Jianguo Wang a, Songlai Han a, David Sinclair d
a School of Surveying & Spatial Information Systems, University of New South Wales, NSW 2052, Australia 
Jinling.Wang@unsw.edu.au 
b School of Aerospace, Civil and Mechanical Engineering, 
c School of Information Technology and Electrical Engineering, Australian Defence Force Academy, Canberra, Australia 
d QASCO Surveys Pty. Limited, 41 Boundary St. South Brisbane, Qld, 4101, Australia 
Commission I, ICWG I/V 
KEY WORDS: Aerial, Fusion, GPS/INS, Multisensor, Navigation, Vision 
ABSTRACT: 
This paper presents an integrated GPS/INS/Vision navigation system for Unmanned Aerial Vehicles (UAVs). A CCD (Charge-Coupled Device) video camera and laser rangefinder (LRF) based vision system, combined with inertial sensors, provides information on the vertical and horizontal movements of the UAV (helicopter) relative to the ground, which is critical for the safety of UAV operations. Two Kalman filters have been designed to operate separately to provide a reliable check on the navigation solutions. When GPS signals are available, the GPS measurements are used to update the error states in the two Kalman filters, in order to estimate the INS sensor, LRF and optic flow modelling errors, and to provide redundant navigation solutions. With the corrected measurements from the vision system, the UAV's movements relative to the ground are then estimated continuously, even during GPS signal blockages. The modelling strategies and the data fusion procedure for this sensor integration scenario are discussed with some numerical analysis results, demonstrating the potential performance of the proposed triple integration.
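
To make the update step described above concrete, the following is a minimal sketch, not the paper's actual implementation, of how GPS position measurements could be used to update INS error states in an error-state Kalman filter. The 9-element error state, noise levels and measurement model are illustrative assumptions only.

```python
import numpy as np

n = 9                      # assumed error state: 3 position, 3 velocity, 3 attitude errors
x = np.zeros(n)            # error-state estimate
P = np.eye(n) * 0.1        # error-state covariance

# Measurement model: GPS observes the three position error components directly.
H = np.hstack([np.eye(3), np.zeros((3, n - 3))])
R = np.eye(3) * 2.0**2     # assumed GPS position noise (m^2)

def gps_update(x, P, ins_pos, gps_pos):
    """Update the INS error states with one GPS position fix."""
    z = ins_pos - gps_pos                 # measured position error: INS minus GPS
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ (z - H @ x)           # corrected error-state estimate
    P_new = (np.eye(n) - K @ H) @ P       # updated covariance
    return x_new, P_new

x, P = gps_update(x, P,
                  ins_pos=np.array([100.2, 50.1, 30.4]),
                  gps_pos=np.array([100.0, 50.0, 30.0]))
```

In the architecture described above, the estimated error states would then be fed back to correct the INS, LRF and optic flow outputs between GPS updates.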
1. INTRODUCTION 
Over the past decades, UAVs have been increasingly used for a wide range of applications, such as reconnaissance, surveillance, surveying and mapping, spatial information acquisition, geophysical exploration, and so on. The key to operating UAVs safely is to develop reliable navigation and control technologies suitable for UAV applications.
Currently, the most widely used navigation technologies for UAVs are GPS receivers and INS devices, used alone or in combination. An INS is a self-contained device which operates independently of any external signals or inputs, providing a complete set of navigation parameters, including position, velocity and attitude, at a high data rate. However, one of the main drawbacks of an INS operated in stand-alone mode is the rapid growth of its systematic errors with time. In contrast to the short-term positioning accuracy of an INS, satellite-based GPS navigation techniques can offer relatively consistent accuracy if sufficient GPS signals can be tracked during the entire UAV mission; however, GPS itself does not provide attitude measurements.
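
As a rough illustration of this contrast (the numbers are assumed for illustration and are not results from this paper), a small constant accelerometer bias, doubly integrated by a stand-alone INS, produces a position error that grows roughly quadratically with time, whereas a stand-alone GPS position fix typically stays at the few-metre level:

```python
import numpy as np

bias = 0.01                      # assumed constant accelerometer bias, m/s^2
dt = 0.01                        # assumed 100 Hz INS update rate
t = np.arange(0.0, 60.0, dt)     # one minute of stand-alone operation

vel_err = np.cumsum(bias * dt * np.ones_like(t))   # first integration: velocity error
pos_err = np.cumsum(vel_err * dt)                  # second integration: position error

print(f"INS position error after 60 s: {pos_err[-1]:.1f} m "
      f"(analytic 0.5*b*t^2 = {0.5 * bias * 60.0**2:.1f} m)")
# A stand-alone GPS position error, by contrast, stays roughly bounded
# (typically a few metres) for as long as the signals can be tracked.
```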
Integrated GPS/INS navigation systems have been successfully implemented for many applications. However, their performance heavily depends on the availability and quality of GPS signals. Signal blockage can cause a significant deviation in the GPS/INS navigation solutions. As the low power of the ranging signals makes GPS exceptionally vulnerable, the received GPS signals can easily be overwhelmed by either intentional or unintentional interference. There is a variety of unintentional interference sources, such as broadcast television, personal electronic devices, mobile satellite services, ultra-wideband communications, and mobile phone signal transmitters.
For UAV navigation, integrated GPS/INS systems also frequently suffer from the absence of GPS signals when travelling around high buildings, trees, etc. In order to increase the reliability of UAV navigation, redundant sensors or measurements must be incorporated into the navigation system. Furthermore, the vertical distance and movement of a UAV relative to the ground are crucial for UAV automatic navigation and landing, but neither GPS nor INS can provide such information. In contrast, vision sensors can sense the surrounding area directly. As GPS, INS and vision sensors have quite different characteristics, they can complement each other in different situations.
Vision sensors (e.g., cameras, hyper-spectral sensors, laser scanners) are mainly used for mapping and environment detection, and are usually geo-referenced by other sensors. However, vision-based navigation has also been investigated intensively (Jun et al., 2002; Kim and Sukkarieh, 2004a). A Terrain Aided Navigation System (TANS) typically makes use of onboard sensors and a preloaded terrain database (Chatterji et al., 1997; Ogris et al., 2004). The Simultaneous Localization And Mapping (SLAM) algorithm can navigate vehicles or robots in an unknown environment (Smith and Cheeseman, 1987). As the onboard vision sensors detect landmarks in the environment, the SLAM estimator augments the landmark locations into a map and estimates the vehicle position with successive observations. SLAM has been applied to field robots and air vehicles (Dissanayake et al., 2001; Kim and Sukkarieh, 2004a).
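
As a purely illustrative sketch of this state-augmentation idea (a simplified 2D model with direct relative-position observations is assumed here, not the formulation used in the cited works), each newly detected landmark is appended to the filter state, and re-observations of stored landmarks refine both the map and the vehicle position:

```python
import numpy as np

x = np.array([0.0, 0.0])        # state: vehicle position (x, y)
P = np.eye(2) * 0.01            # state covariance
R = np.eye(2) * 0.5**2          # assumed landmark observation noise

def augment(x, P, z_rel):
    """Add a newly observed landmark (relative measurement) to the state."""
    lm = x[:2] + z_rel                           # landmark position in the map frame
    G = np.zeros((2, len(x))); G[:, :2] = np.eye(2)   # Jacobian w.r.t. vehicle position
    x_new = np.concatenate([x, lm])
    P_new = np.block([[P,      P @ G.T],
                      [G @ P,  G @ P @ G.T + R]])
    return x_new, P_new

def update(x, P, idx, z_rel):
    """Re-observe landmark idx: the innovation refines both vehicle and map."""
    j = 2 + 2 * idx                              # landmark slice in the state vector
    H = np.zeros((2, len(x)))
    H[:, :2] = -np.eye(2); H[:, j:j+2] = np.eye(2)    # h(x) = landmark - vehicle
    y = z_rel - (x[j:j+2] - x[:2])               # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

x, P = augment(x, P, z_rel=np.array([5.0, 2.0]))        # first sighting of landmark 0
x, P = update(x, P, idx=0, z_rel=np.array([4.9, 2.1]))  # later re-observation
```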