provides vital information on what the obstacle is doing. Third, and most importantly, the predicted path of the obstacle must be determined. All of these functions depend heavily on highly accurate pose estimation.
State estimation and feature extraction both depend on accurate pose estimation. As mentioned previously, errors in pitch and roll of only 0.5 degrees can result in false characterization of terrain and obstacles. This is more critical when sensing obstacles far from the vehicle than those close to it. Because the course presented obstacles in rapid succession, the robots required accurate pose estimation to avoid colliding with them, and errors in roll and pitch become more pronounced at longer distances and higher speeds. The absolute vertical error grows as the pitch error angle is projected over the range of the sensor.
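As a rough illustration (a minimal sketch assuming flat ground and a single line-of-sight geometry, not part of the original analysis), the vertical error produced by a fixed pitch error at a given range can be approximated as range × tan(pitch error):

```python
import math

def vertical_error(range_m: float, pitch_error_deg: float) -> float:
    """Approximate vertical error (m) caused by a pitch error at a given range.

    Assumes flat geometry: a pitch bias of e degrees tilts the sensor's line of
    sight, displacing the measured ground point vertically by range * tan(e).
    """
    return range_m * math.tan(math.radians(pitch_error_deg))

# A 0.5 degree pitch error is negligible near the vehicle but grows with range:
for r in (10.0, 50.0, 100.0):
    print(f"range {r:5.0f} m -> vertical error {vertical_error(r, 0.5):.2f} m")
# range    10 m -> vertical error 0.09 m
# range    50 m -> vertical error 0.44 m
# range   100 m -> vertical error 0.87 m
```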
One of the key elements which determined success in this Urban Challenge was real-time situational awareness and data fusion. Such a capability required two levels of characterization: that of the robotic vehicle in relation to the road, and that of the dynamic obstacles on it. The challenge is illustrated in Figure 10, where the robot is sensing the way to a waypoint but encounters traffic around it. The vehicle must not only track the obstacles around it, but do so while tracking within its lane and sensing the terrain (road radius of curvature, grade / cross fall) to ensure any maneuvers remain within its performance envelope, and it must predict where each obstacle will move. In the previous Grand Challenge, robots had a choice of path candidates (in the Red Team example given previously, an onboard computer generates 's' splines, or multiple path candidates immediately adjacent to the intended path of travel, all of which are viable alternate routes that take the vehicle's dynamic state into consideration). Here, the path candidates around an obstacle need to change rapidly, and the vehicle must do most of the thinking.
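The following is a minimal sketch of this path-candidate idea, not the Red Team's actual planner: it generates a handful of laterally offset candidates around an intended path and keeps only those whose curvature stays within an assumed lateral-acceleration limit at the current speed (the offsets and limit shown are illustrative assumptions).

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def offset_path(path: List[Point], lateral: float) -> List[Point]:
    """Shift a path sideways by `lateral` metres (left positive) using segment normals."""
    shifted = []
    for i, (x, y) in enumerate(path):
        # Heading from neighbouring points (forward/backward difference at the ends).
        x0, y0 = path[max(i - 1, 0)]
        x1, y1 = path[min(i + 1, len(path) - 1)]
        heading = math.atan2(y1 - y0, x1 - x0)
        shifted.append((x - lateral * math.sin(heading), y + lateral * math.cos(heading)))
    return shifted

def max_curvature(path: List[Point]) -> float:
    """Largest curvature (1/m) along a path, from the circumradius of point triples."""
    k = 0.0
    for (ax, ay), (bx, by), (cx, cy) in zip(path, path[1:], path[2:]):
        area2 = abs((bx - ax) * (cy - ay) - (by - ay) * (cx - ax))  # twice the triangle area
        a = math.dist((bx, by), (cx, cy))
        b = math.dist((ax, ay), (cx, cy))
        c = math.dist((ax, ay), (bx, by))
        if a * b * c > 1e-9:
            k = max(k, 2.0 * area2 / (a * b * c))  # curvature = 1 / circumradius
    return k

def feasible_candidates(centerline: List[Point], speed_mps: float,
                        offsets=(-1.5, -0.75, 0.0, 0.75, 1.5),
                        a_lat_max: float = 2.0) -> List[List[Point]]:
    """Keep offset candidates whose curvature holds v^2 * kappa within the assumed limit."""
    candidates = [offset_path(centerline, d) for d in offsets]
    return [c for c in candidates if speed_mps ** 2 * max_curvature(c) <= a_lat_max]
```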
6. URBAN GRAND CHALLENGE LESSONS 
APPLIED TO REAL WORLD SCENARIOS 
The goal of the Urban Grand Challenge is to apply the technologies used to successfully navigate the course to real-world problems. At its most fundamental level, this competition fielded autonomous vehicles that are mobile mapping platforms, and the advances made here have significant implications for how mobile mapping data is used. Consider the automotive industry, for example. Currently, GPS is used as a convenience feature, combining GPS, map matching and odometer data to route a driver (albeit not very accurately) through GPS outages. When position and orientation data are considered for driver assistance / active safety systems, the required accuracy changes dramatically. For this application, data needs to be thought of in a layered approach, much like the data fusion discussed above. Base maps used by onboard computers need to be very accurate so that sensors can relate the vehicle's dynamics to its current and predicted path, allowing the vehicle to determine whether a driver is taking turns at unsafe speeds or passing through an intersection without stopping. With detailed maps and accurate position and orientation data, vehicles will be able to actively ensure the safety of their passengers.
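A minimal sketch of how such a layered check could work, assuming a base map that supplies road radius of curvature and stop-line locations and a pose solution that supplies position and speed; the function names and thresholds below are illustrative assumptions, not a description of any production system:

```python
import math

COMFORT_LAT_ACCEL = 3.0    # m/s^2, assumed lateral-acceleration limit for a safe turn
STOP_APPROACH_DIST = 15.0  # m, assumed distance at which braking for a stop line is checked
STOP_APPROACH_SPEED = 3.0  # m/s, assumed speed considered "prepared to stop"

def safe_turn_speed(radius_m: float) -> float:
    """Maximum speed (m/s) for a curve of the given radius at the assumed lateral limit."""
    return math.sqrt(COMFORT_LAT_ACCEL * radius_m)

def unsafe_turn(speed_mps: float, map_radius_m: float) -> bool:
    """True if the driver is taking the mapped curve faster than the assumed safe speed."""
    return speed_mps > safe_turn_speed(map_radius_m)

def missed_stop(speed_mps: float, dist_to_stop_line_m: float) -> bool:
    """True if the vehicle is near a mapped stop line but still travelling too fast to stop."""
    return dist_to_stop_line_m < STOP_APPROACH_DIST and speed_mps > STOP_APPROACH_SPEED

# Example: 60 km/h (16.7 m/s) into a 50 m radius curve exceeds sqrt(3.0 * 50) ~ 12.2 m/s.
print(unsafe_turn(16.7, 50.0))   # True -> warn or intervene
print(missed_stop(12.0, 10.0))   # True -> driver is not slowing for the intersection
```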
Military applications present another example of how vehicle 
automation saves lives. The Pentagon is aiming to have one 
third of its forces automated by 2015. This applies to combat 
forces as well as re-supply elements. Mobile mapping in this field will become highly automated, employing several layers of data from different sources to achieve a particular mission. For example, UAVs carrying LIDAR and other sensors will provide up-to-date intelligence for automated
ground convoys traveling through hostile terrain. Ground 
vehicles utilizing their own LIDAR and optical sensors will 
map their way to an objective relying on accurate base maps 
and accurate position and orientation data. 
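As a rough sketch of this layered approach (the grid representation, weights and cell costs below are illustrative assumptions), a ground vehicle could fuse a traversability prior derived from UAV LIDAR with live observations from its own sensors, letting fresher onboard data dominate where the two overlap:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Cell = Tuple[int, int]  # grid indices at an assumed fixed resolution (e.g. 1 m cells)

@dataclass
class Layer:
    cost: Dict[Cell, float]  # traversal cost per cell, higher = harder to cross
    weight: float            # confidence given to this layer when fusing

def fuse(prior: Layer, onboard: Layer) -> Dict[Cell, float]:
    """Blend a UAV-derived prior with live onboard sensing.

    Cells the onboard sensors have actually observed become a weighted blend of
    the two layers; cells seen only from the air fall back to the prior.
    """
    fused = dict(prior.cost)
    for cell, cost in onboard.cost.items():
        if cell in prior.cost:
            total = prior.weight + onboard.weight
            fused[cell] = (prior.weight * prior.cost[cell] + onboard.weight * cost) / total
        else:
            fused[cell] = cost
    return fused

# Example: the UAV prior marks a cell as clear, but the ground LIDAR now sees an obstacle there.
prior = Layer(cost={(10, 4): 0.1, (11, 4): 0.1}, weight=1.0)
onboard = Layer(cost={(10, 4): 0.9}, weight=3.0)  # fresher data weighted more heavily
print(fuse(prior, onboard))  # {(10, 4): 0.7, (11, 4): 0.1}
```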
5. RESULTS OF THE RACE
The DARPA Urban Grand Challenge took place at George AFB in Victorville, California. The National Qualifying Event (NQE) saw thirty-six teams participate in a number of rounds designed to demonstrate the skills required to successfully complete the three DARPA missions. Of the thirty-six teams, eleven qualified to participate in the final race on November 3rd, 2007. Of the eleven teams, only six managed to finish all three missions. Applanix Corporation partnered with Tartan Racing, Stanford Racing and MIT to secure first-, second- and fourth-place finishes.
Team Name | ID# | Vehicle | Type | Time Taken (h:m:s) | Result
Tartan Racing | 19 | Boss | 2007 Chevy Tahoe | 4:10:20 | 1st place; averaged approximately 14 mph (22.53 km/h) throughout the course
Stanford Racing | 03 | Junior | 2006 Volkswagen Passat Wagon | 4:29:28 | 2nd place; averaged about 13.7 mph (22.05 km/h) throughout the course
VictorTango | 32 | Odin | 2005 Ford Escape Hybrid | 4:36:38 | 3rd place; averaged 13 mph (20.92 km/h) throughout the course
MIT | 79 | Talos | Land Rover LR3 | 6:00:00 | 4th place
The Ben Franklin Racing Team | 74 | Little Ben | 2006 Toyota Prius | No official time | Finished
Cornell | 26 | Skynet | 2007 Chevy Tahoe | No official time | Finished
Figure 11: DARPA Urban Grand Challenge Results
7. SUMMARY
Accurate and reliable position and orientation data are a fundamental part of autonomous vehicle guidance and control. We have shown that even small errors in pose estimation can lead to erroneous terrain characterization, which in turn impacts vehicle performance. The significance of accuracy was highlighted in the Urban Grand Challenge, where characterizing dynamic obstacles and terrain in adverse GPS environments was a key skill the robots had to demonstrate in order to successfully navigate the course and complete the three DARPA missions. Accurate position and orientation data were essential to winning the race, which required sensor fusion and precise vehicle dynamic control to interact with a constantly changing environment. These core elements will revolutionize how we think about mobile mapping in general. The precise location of roads, their geometry and roadside features will be essential elements for vehicle guidance and control, not just basic navigation. Accurate geospatial information and the real-time interpretation of that information are essential elements that autonomous vehicles must demonstrate before such technology becomes mainstream.