2.1 Aerial Imaging System 
The aerial imaging system consisted of a high-resolution sensor, a 
GPS receiver, and a portable computer. A Kodak DCS 420 color 
infrared (CIR) multispectral camera (Eastman Kodak, Rochester, 
NY) was used as the sensor. The camera had a single CCD sensor array (KAF-1600) with a spatial resolution of 1524 by 1012 pixels that was sensitive to light radiation in the spectral range of 400 nm to 1000 nm. With a CIR filter (650BP300,
Eastman Kodak, Rochester, NY) mounted on the camera lens, the 
sensor divided the light spectrum into three broad bands of G 
(500 nm to 600 nm), R (600 nm to 710 nm), and NIR (710 nm to 
810 nm) forming a CIR image. The camera was fitted with a 35 
mm to 80 mm zoom lens. A Trimble Ensign XL GPS receiver (Trimble Navigation, Sunnyvale, CA) was integrated with the
camera to record the GPS location of the camera when the image 
was taken. A portable computer with a Pentium II 133 MHz 
processor controlled the camera. The imaging system was 
mounted in a Cessna 205 fixed wing airplane. The images were 
acquired from an altitude of approximately 200 m. 
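For a rough sense of the coverage these settings provide, the ground footprint and per-pixel resolution can be estimated from the pinhole relationship GSD = altitude x pixel pitch / focal length. The sketch below is only an estimate under stated assumptions: it assumes a 9 µm pixel pitch for the KAF-1600 (a value not given above) together with the image size, zoom range, and 200 m altitude stated in the text.

```python
# Rough ground-coverage estimate for the aerial CIR camera described above.
# The 9 µm pixel pitch of the KAF-1600 CCD is an assumed value; image size,
# zoom range, and flight altitude are taken from the text.

PIXEL_PITCH_M = 9e-6                  # assumed KAF-1600 pixel pitch (9 µm)
IMAGE_W_PX, IMAGE_H_PX = 1524, 1012   # sensor resolution
ALTITUDE_M = 200.0                    # flight altitude above ground (m)

def ground_sample_distance(focal_length_m: float) -> float:
    """Ground size of one pixel (m) for a nadir-looking pinhole camera."""
    return ALTITUDE_M * PIXEL_PITCH_M / focal_length_m

for f_mm in (35.0, 80.0):             # limits of the zoom lens
    gsd = ground_sample_distance(f_mm / 1000.0)
    print(f"f = {f_mm:.0f} mm: {gsd * 100:.1f} cm/pixel, "
          f"footprint {gsd * IMAGE_W_PX:.0f} m x {gsd * IMAGE_H_PX:.0f} m")
```

Under these assumptions, one frame covers on the order of 34 m by 23 m (80 mm focal length) to 78 m by 52 m (35 mm) of ground, at roughly 2 cm to 5 cm per pixel.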
2.2 Sprayer and Mapping System 
The smart sprayer, a machine-vision-controlled sprayer, is shown in Figure 1. The system included a multiple-camera vision system, a ground speed sensor, and a nozzle controller. The application rate of each nozzle on the spray boom was controlled separately based on local weed infestation conditions. The latest prototype was built on a Patriot XL sprayer (CASE-Tyler Industries Inc., Benson, MN). This self-propelled sprayer, ready for map-driven application, was equipped with an AIM control system and a differential GPS receiver with a position update rate of 10 positions per second (pps). Nozzle drops 0.381 m (15 in.) long were
connected to the nozzle bodies on the spray boom so that the 
nozzles (TeeJet 8006VS, Spraying Systems Co., Wheaton, IL) 
were 0.36 to 0.38 m (14 to 15 in.) above the ground. Video 
images were acquired from two color CCD cameras (Pulnix 
TMC-7EX) mounted in the nadir position over the crop on a 
camera boom 4 m (10 ft) above the ground (Figure 2). The
field-of-view (FOV) of each camera covered a 2.44-m by 3.05-m 
area with the longer side perpendicular to the crop rows. The 
machine vision system had a resolution of 640 by 480 pixels for each camera. A dual-processor (300 MHz Pentium CPUs) portable computer was used as the main image-processing computer, and a high-speed CX-100 frame grabber (ImageNation, Inc., Beaverton, OR) was used for field image acquisition.
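Because each nozzle is controlled individually, weed measurements taken inside a camera's field of view must be assigned to the nozzle whose swath they fall under. The sketch below is only an illustration of that bookkeeping under stated assumptions: the 640-pixel image width is taken to span the 3.05-m FOV side running along the boom, the nozzle spacing is left as a free parameter (it is not given above), and the function name is hypothetical.

```python
# Illustrative mapping from image columns to boom nozzles, assuming the
# 640-pixel image width spans the 3.05-m FOV side that runs along the boom.

IMAGE_W_PX = 640
FOV_WIDTH_M = 3.05                      # FOV side parallel to the spray boom

def nozzle_for_column(col: int, nozzle_spacing_m: float) -> int:
    """Index of the nozzle (within this camera's FOV) covering image column `col`."""
    x_m = (col + 0.5) / IMAGE_W_PX * FOV_WIDTH_M    # column centre, in metres
    return int(x_m // nozzle_spacing_m)

# Example: with a 0.5-m spacing the 3.05-m FOV covers six full nozzle swaths
# (plus a sliver of a seventh); count the pixel columns assigned to each.
columns_per_nozzle = {}
for c in range(IMAGE_W_PX):
    idx = nozzle_for_column(c, 0.5)
    columns_per_nozzle[idx] = columns_per_nozzle.get(idx, 0) + 1
print(columns_per_nozzle)
```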
The image processing software was developed using Microsoft Visual C and the Windows application program interface (API) to create a graphical user interface, which allowed graphical display of the image processing results and easy adjustment of the software settings. Each image was first segmented with an environmentally adaptive segmentation algorithm (EASA; Tian and Slaughter, 1997). The EASA specified the boundaries of a region in HSI color space corresponding to the color of the objects in the outdoor scene, determined through an interactive calibration window. Several variations of the EASA program were developed and tested with this machine vision system; the relatively reliable RDC-EASA was selected for the final system (Steward and Tian, 1999).
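As a rough illustration of this kind of color-space thresholding (not the RDC-EASA itself, whose plant-color region is learned through the interactive calibration step), the sketch below keeps pixels that fall inside a fixed hue/saturation/value box typical of green vegetation. It uses OpenCV's HSV space as a stand-in for HSI, and the threshold values are placeholders that a calibration window would normally supply.

```python
import cv2
import numpy as np

def segment_plants(bgr_image: np.ndarray,
                   lower=(35, 60, 40), upper=(90, 255, 255)) -> np.ndarray:
    """Binary mask of plant pixels from a fixed HSV box (placeholder thresholds).

    The real EASA learns the boundary of the plant-color region from a
    calibration image; the (lower, upper) box here merely stands in for
    "green vegetation under daylight".
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)    # HSV used in place of HSI
    mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
    kernel = np.ones((3, 3), np.uint8)                  # suppress isolated noise
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# Usage on a captured frame:
# mask = segment_plants(cv2.imread("field_frame.png"))
```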
To separate weeds from crop plants, additional information such as field location (different zones), crop row spacing, and crop plant size (age) was used in the image-processing algorithm. The crop rows were identified, and the inter-row area was used to measure the weed infestation condition. The hypothesis is that weed patches are typically distributed across both the inter-row and crop row areas and that weed density is similar within a relatively small neighborhood (say, within one meter), so the inter-row weed density can be used to estimate the weed infestation condition within the crop row between plants. In any case, herbicide application could only be controlled and directed at the level of uniform 0.5-m by 0.5-m grid cells. To increase image processing speed, several real-time weed density and weed leaf number extraction algorithms were employed.
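The grid-based density measurement can be pictured as follows. Assuming the plant mask has already been restricted to the inter-row area, and that the 640-pixel image width spans the 3.05-m FOV (about 4.8 mm per pixel), the sketch below counts weed pixels per 0.5-m by 0.5-m cell and reports each cell's coverage fraction, which a nozzle controller could map to an application rate. Function and variable names are illustrative, not the authors' implementation.

```python
import numpy as np

def weed_coverage_grid(weed_mask: np.ndarray,
                       metres_per_pixel: float = 3.05 / 640,
                       cell_size_m: float = 0.5) -> np.ndarray:
    """Weed coverage fraction in each 0.5-m by 0.5-m ground cell.

    `weed_mask` is a binary image (1 = weed pixel) already restricted to the
    inter-row area; `metres_per_pixel` assumes the 640-pixel width spans the
    3.05-m field of view.
    """
    cell_px = max(1, int(round(cell_size_m / metres_per_pixel)))
    rows, cols = weed_mask.shape[0] // cell_px, weed_mask.shape[1] // cell_px
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = weed_mask[r * cell_px:(r + 1) * cell_px,
                             c * cell_px:(c + 1) * cell_px]
            grid[r, c] = cell.mean()          # fraction of weed pixels, 0..1
    return grid

# A controller could then spray only the cells whose coverage exceeds a
# chosen infestation threshold, e.g.:
# spray_cells = weed_coverage_grid(mask) > 0.02
```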
Figure 1. Smart sprayer prototype system setup. Camera 1 is used for vision system calibration; cameras 2 and 3 are used for real-time applications.