control system for in-orbit manoeuvres. Two small, circuit-board CCD cameras were mounted on an independent fixture in
order to image the inside surface of the lens through two 
viewing slots machined into the acrylic base of the lens 
element. This configuration was adopted to test the feasibility 
of in-flight monitoring of a lens array element that would be 
manufactured without the solar cells. 
The two CCD cameras produced RS-170 monochrome, analog 
video which was captured by a pair of Epix frame grabbers. 
The cameras were once more synchronised as a master-slave 
pair and frames were correlated using injected VITC time code. 
Passive targets were used throughout to avoid reflections off the 
lens surface. The cameras were calibrated in a similar fashion to those used for the micro-flight vehicle, using a small step-block target array
within the fields of view to create a convergent multi-station 
network. In this case a network of 10 exposures of the 36 
targets was sufficient to calibrate the cameras and derive the 
relative orientation. The coordinates of the targets on the step- 
block had been previously determined with a precision of ten 
micrometres from a self-calibration network imaged with a 
Kodak DC4800 digital still camera. 
Image pairs of the static lens and a number of sequences of the 
lens under induced vibration were captured as TIFF format 
images. As all the vibration periods were approximately 0.5 
seconds or longer, full frames were captured at 30 Hz. Target 
coordinates were once more computed from simple 
intersections, in this instance with an estimated precision of 40 
micrometres. An example pair of images of the lens is shown 
in figure 6 and an example of the visualisation of the motion of 
the surface targets is shown in figure 7. 
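As an illustration of the intersection step, the sketch below computes a target coordinate by linear triangulation of a matched image point pair from the two cameras. The 3x4 projection matrices and the function name are assumptions made for illustration; they are not taken from the system described here.

import numpy as np

def intersect_target(P1, P2, x1, x2):
    # Linear (DLT) intersection of one target from two calibrated cameras.
    # P1, P2: 3x4 projection matrices, assumed known from the calibration.
    # x1, x2: (u, v) image coordinates of the same target in each camera.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # object-space coordinates of the target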
  
Figure 6. Top and bottom images of the Fresnel lens. 
Figure 7. Visualisation of the target movement 
of the Fresnel lens. 
4. TARGET TRACKING ISSUES 
The images shown in figures 2 and 6 demonstrate a number of 
typical issues associated with tracking targets on small objects 
or within constrained environments. The convergence of the 
cameras, used to enhance the object space accuracy or forced 
by the physical set-up, causes a significant fall-off in retro-reflector response or in the intensity of the passive targets.
Variations in background intensity are also present, due to 
reflections off the membrane surfaces and ambient light 
sources. The effects of variations in target and background 
intensity were minimised by using a local threshold within the 
target image window for the centroid computation. 
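A minimal sketch of such a locally thresholded, intensity-weighted centroid is given below. The window size and the particular threshold rule are illustrative assumptions, not the values used in the actual tracking software.

import numpy as np

def local_centroid(image, centre, half_size=8):
    # Intensity-weighted centroid inside a local window around a target image.
    # A threshold derived from the window itself (rather than a global value)
    # suppresses background variations caused by reflections and ambient light.
    r, c = int(round(centre[0])), int(round(centre[1]))
    window = image[r - half_size:r + half_size + 1,
                   c - half_size:c + half_size + 1].astype(float)
    # Local threshold: midway between window background and peak (assumed rule).
    threshold = 0.5 * (np.median(window) + window.max())
    weights = np.where(window > threshold, window - threshold, 0.0)
    if weights.sum() == 0:
        return None   # no acceptable centroid for this target in this frame
    rows, cols = np.mgrid[0:window.shape[0], 0:window.shape[1]]
    dr = (weights * rows).sum() / weights.sum() - half_size
    dc = (weights * cols).sum() / weights.sum() - half_size
    return (r + dr, c + dc)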
The convergence of the cameras also causes a fall-off in the 
size and spacing of the targets across the objects, with the 
Fresnel lens showing a particularly extreme size variation. This 
effect was partially ameliorated using a two-level adaptive window for the target image centroids. First, the initial size of
the window for each target was computed based on the relative 
depth of the target with respect to the imaging camera. Second, the window for each target image centroid was progressively shrunk if intrusions into its edges were detected. The targets were also processed in depth order from the imaging camera, on the assumption that nearer, foreground targets may obscure more distant targets behind them.
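A sketch of this two-level window strategy follows. The linear scaling of window size with inverse depth and the border-intrusion test are assumptions made for illustration; the original algorithm is not specified in that detail here.

import numpy as np

def adaptive_window_size(depth, reference_depth, reference_size, min_size=5):
    # First level: scale the window with the target's depth from the imaging
    # camera. Nearer targets image larger and receive a larger window
    # (illustrative linear scaling).
    return max(int(round(reference_size * reference_depth / depth)), min_size)

def shrink_window(image, centre, size, threshold, min_size=5):
    # Second level: shrink the window while bright intrusions touch its border,
    # taking any above-threshold border pixel to indicate a neighbouring target.
    r, c = int(round(centre[0])), int(round(centre[1]))
    while True:
        half = size // 2
        window = image[r - half:r + half + 1, c - half:c + half + 1]
        border = np.concatenate([window[0], window[-1],
                                 window[:, 0], window[:, -1]])
        if size <= min_size or not np.any(border > threshold):
            return window, size
        size -= 2

Processing the targets nearest-first then amounts to sorting them by the depth used in the first level before their centroids are measured.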
However, there are a number of issues that require additional 
sophistication in procedures or algorithms to minimise the 
effects on the accuracy and reliability of target tracking. For 
example, variations in intensity are also seen on the large 
passive targets in the foreground for the Fresnel lens. The 
movement of the lens introduced a cyclic bias in these 
variations, leading to systematic errors in the target locations. 
The only immediate remedy for this bias would be more careful attention to lighting and image quality on a case-by-case basis.
Perhaps the most challenging aspect of tracking applications is 
the “loss of lock” problem. Target images that are obscured, that merge with neighbouring targets, or that fail to produce an acceptable centroid due to reflections, low intensity or marginal size are not intersected and are therefore excluded from the tracking process. Despite
the use of the adaptive window and object space motion 
prediction, loss of lock on targets remained a regular problem, 
requiring operator intervention in an otherwise automated 
process. 
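As an indication of what the object-space motion prediction involves, the sketch below extrapolates a target's next position from its recent trajectory and re-projects it to seed the search window after a loss of lock. The constant-velocity model and the 30 Hz frame interval are assumptions for illustration.

import numpy as np

def predict_object_point(history, dt=1.0 / 30.0):
    # Constant-velocity prediction of the next object-space position of a target,
    # from the epochs in which it was successfully intersected (most recent last).
    if len(history) < 2:
        return np.asarray(history[-1])
    velocity = (np.asarray(history[-1]) - np.asarray(history[-2])) / dt
    return np.asarray(history[-1]) + velocity * dt

def reproject(P, X):
    # Project the predicted object point back into a camera (3x4 matrix P)
    # to centre the search window for re-acquisition of the lost target.
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]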
An enhancement to the tracking process that reduces the 
number of target losses is the use of a Delaunay triangulation 
(see figure 3). Such triangle meshes are often used as a surface 
descriptor or as a mechanism to densify surface points 
(Papadaki et al., 2001), whereas here the common object and
image space connectivity between points in the mesh is used as 
a reliability test. Established in the initial, static epoch of 
measurement, the mesh simply provides a consistent description 
of the spatial relationships between targets that is independent 
of the induced vibration modes. The triangulation can be used 
for a number of tracking assistance purposes. Given any loss of 
lock on, or the mis-identification of, an individual surface 
target, the connectivity within the mesh can be consulted in the form of a simple look-up table to resolve most ambiguities.
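A simple form of such a connectivity test is sketched below, assuming the mesh is built with scipy's Delaunay triangulation of the target coordinates in the static epoch. The neighbour table is built once; the distance-ratio rule used to accept or reject a candidate identification is an illustrative assumption, not the criterion of the original software.

from collections import defaultdict
import numpy as np
from scipy.spatial import Delaunay

def build_neighbour_table(static_points):
    # Neighbour look-up table from the Delaunay mesh of the static epoch.
    tri = Delaunay(np.asarray(static_points))
    neighbours = defaultdict(set)
    for simplex in tri.simplices:
        for i in simplex:
            neighbours[i].update(j for j in simplex if j != i)
    return neighbours

def identification_is_consistent(target_id, candidate, current, static,
                                 neighbours, max_ratio=1.5):
    # Accept a candidate location for a lost or ambiguous target only if its
    # distances to already-measured mesh neighbours remain close to the
    # distances recorded in the static epoch (illustrative ratio test).
    for n in neighbours[target_id]:
        if n not in current:
            continue
        d_now = np.linalg.norm(np.asarray(candidate) - np.asarray(current[n]))
        d_ref = np.linalg.norm(np.asarray(static[target_id]) - np.asarray(static[n]))
        if d_ref > 0 and not ((1.0 / max_ratio) <= d_now / d_ref <= max_ratio):
            return False
    return True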
The winding of the triangles forming the Delaunay mesh can be 
used to validate computed lines of sight to targets and detect if 
  
  
  
  
  
 
	        