This decision is made by considering the lane occupancy and the trajectory of the leader. If the leader is detected in a
different lane, or if its trajectory points to a different lane, a lane-change advice is given. If no lane change can be
performed, advice for a safe and comfortable ride is given. If the leader has been detected in the same lane, the correct
security distance has to be kept or reached by acceleration or deceleration; in this case safety and comfort also have to
be considered.
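The following is a minimal Python sketch of this decision logic. All names, the time-gap rule, and the numeric thresholds are illustrative assumptions and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Leader:
    lane: int            # lane index the leader currently occupies
    heading_lane: int    # lane its trajectory points to
    distance_m: float    # longitudinal distance to the leader [m]

def plan_behavior(ego_lane: int, ego_speed_mps: float,
                  leader: Optional[Leader],
                  lane_change_possible: bool) -> str:
    """Return a driving advice derived from the leader's lane and trajectory."""
    if leader is None:
        return "keep lane, hold set speed"

    # Leader occupies or heads towards a different lane: advise a lane change.
    if leader.lane != ego_lane or leader.heading_lane != ego_lane:
        if lane_change_possible:
            return "advise lane change"
        # No lane change possible: fall back to safe, comfortable following.
        return "keep lane, adapt speed for safety and comfort"

    # Leader in the same lane: keep or reach the security distance by
    # acceleration or deceleration (the time-gap rule below is an assumption).
    security_distance = max(1.8 * ego_speed_mps, 10.0)
    if leader.distance_m < security_distance:
        return "decelerate to regain security distance"
    if leader.distance_m > 1.5 * security_distance:
        return "accelerate gently towards security distance"
    return "hold current distance"
```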
7 SIMULATION ENVIRONMENT 
To test different assistance systems with different sensor constellations in different traffic conditions, a simulation
environment was realized. Fig. 9 shows results of the test environment. The bird's-eye view of the sensor constellation on a
vehicle (black) with three cameras (white lines) and two radar sensors (black lines) is shown in fig. 9 (a). The corresponding
sensor output is shown in fig. 9 (b)-(f). For the ICC, the camera and the radar sensor, both mounted in the front of the car,
are used. Fig. 9 (b) shows the output of the camera (aperture = 28.0724°). Fig. 9 (e) illustrates the relative velocity
of the object hypotheses of the radar sensor (x-coordinate: distance [m], y-coordinate: angle [deg]).
Figure 9: Sensor simulation: (a) bird's-eye view with mounted sensors on the vehicle; (b)-(d) camera output of the three 
cameras; (e)-(f) plotted results of the radar sensor output. 
Other assistance systems, e.g., a blind-spot observer, can be realized with different combinations of the sensors mounted
on the vehicle (fig. 9 (a)-(f)).
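As an illustration of how such a sensor constellation might be described in software, the sketch below models the cameras, the radar mounts, and the plotted radar output. Only the front-camera aperture of 28.0724° is taken from the text; all other values, names, and the rear radar mount are assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Camera:
    mount: str           # e.g. "front", "rear-left", "rear-right" (assumed)
    aperture_deg: float  # horizontal opening angle

@dataclass
class Radar:
    mount: str

@dataclass
class RadarHypothesis:
    # Mirrors the plotted radar output of fig. 9 (e)-(f).
    distance_m: float       # x-coordinate
    angle_deg: float        # y-coordinate
    rel_velocity_mps: float

# Assumed constellation loosely following fig. 9 (a): three cameras, two radars.
constellation = {
    "cameras": [Camera("front", 28.0724),
                Camera("rear-left", 28.0724),   # aperture assumed
                Camera("rear-right", 28.0724)], # aperture assumed
    "radars": [Radar("front"), Radar("rear")],
}

def select_icc_sensors(c: dict) -> dict:
    """For the ICC, only the front-mounted camera and radar are used."""
    return {
        "camera": next(cam for cam in c["cameras"] if cam.mount == "front"),
        "radar": next(r for r in c["radars"] if r.mount == "front"),
    }
```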
8 RESULTS 
The presented results were obtained from a visual sensor mounted on the rear-view mirror of a car. The sensor data
were collected on a German highway. Fig. 10 shows the results of a scene with 1000 frames. Four frames showing
special situations have been chosen to present the main characteristics of the system.
The first row (a)-(d) contains the segmentation results of the object-related analysis; in this part of the system ROIs are
extracted. The second row (e)-(h) shows the tracking results of the object-related analysis. Image (e) makes the necessity of
the tracking module obvious: an object not found by the segmentation module (a) is detected by the tracking
module (e) because of its history. In row three (i)-(l) the lane information provided by the knowledge base is shown. The
three lanes of the highway are mapped from image coordinates to world coordinates. The bird's-eye view representations 
of the scene interpretation are shown in (m)-(p). The observing vehicle is symbolized by a dark triangle. The dimensions 
of the representation correspond to [x,y] = [32m, 110m] in world coordinates. The observer is located at the point 
[16.0m, 10.0m]. The observed objects are mapped to world coordinates according to the results of the object-related 
analysis. The results of the behavior planning module are presented in (q)-(t). The white dot in the images represents the
currently chosen leader. Once a leader has been chosen, the object is followed, and a lane change is advised if the leader
changes lanes (s).
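A minimal sketch of how object positions could be mapped into such a bird's-eye representation is given below. The extent of [32 m, 110 m] and the observer position at [16.0 m, 10.0 m] are taken from the text; the pixel resolution and function names are hypothetical.

```python
# Extent of the bird's-eye representation as stated in the text:
# [x, y] = [32 m, 110 m] in world coordinates, observer at [16.0 m, 10.0 m].
WORLD_X_M, WORLD_Y_M = 32.0, 110.0
OBSERVER_XY = (16.0, 10.0)

def world_to_birdseye(x_m: float, y_m: float,
                      image_size=(320, 1100)) -> tuple:
    """Map a world-coordinate point [m] to pixel coordinates of the
    bird's-eye map. The resolution (here 10 px per metre) is an
    illustrative assumption; the paper does not specify it."""
    width_px, height_px = image_size
    col = int(round(x_m / WORLD_X_M * (width_px - 1)))
    # y grows ahead of the observer; larger distances are drawn towards the top.
    row = int(round((1.0 - y_m / WORLD_Y_M) * (height_px - 1)))
    return row, col

# Objects from the object-related analysis would be placed the same way
# before drawing the representation shown in fig. 10 (m)-(p).
observer_px = world_to_birdseye(*OBSERVER_XY)
```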