3 ROBUST ESTIMATORS.
Robust estimators address the problem of obtaining accurate matches, especially in cases of occlusions, shadows or sudden slope
changes. It has been found that if more than one gross error exists in the data, the reliability of the LS solution decreases
significantly (Pilgrim, 1996). In cases such as shadows or occlusions a large number of pixels in the template may differ
considerably, and therefore the solution, if reached at all, would probably be wrong. In such cases robust estimators would be more
accurate, provided the centre of the template falls within the largest continuous region (Calitz & Rüther, 1996). This enhances the
matches near breaklines and can be used for their detection and localization, improving the surface reconstruction.
In order to avoid changing the main Least Squares algorithm, it is possible to simulate robust estimators by
recalculating the weights in each iteration, based on the residuals. The re-weighting scheme proposed by Calitz and
Rüther (1996) is based on the function
$$w_i = \frac{1}{|v_i|^p + \varepsilon} \qquad (1)$$

where $p = 1$,
$\varepsilon$ a small constant to avoid division by zero and
$v_i$ the residual of the observation from the current iteration.
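As a minimal sketch of how the re-weighting of equation (1) translates into code (Python is used here for illustration; the function name, the default exponent and the value of the constant are assumptions, not taken from the paper):

```python
import numpy as np

def reweight(residuals, p=1.0, eps=1e-6):
    """Re-weighting function of equation (1): w_i = 1 / (|v_i|^p + eps).

    residuals -- residuals v_i of the observations from the current iteration
    p         -- exponent of equation (1); p = 1 simulates an L1-norm estimator
    eps       -- small constant that avoids division by zero
    """
    return 1.0 / (np.abs(residuals) ** p + eps)
```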
This approach is favorable since other weighting schemes can be applied and tested easily. No further analysis of this
particular weighting scheme will be given here (see Calitz & Rüther, 1996).
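For illustration, the following hedged sketch shows how such a re-weighting scheme can be plugged into an iterative LS solution in the standard iteratively re-weighted least squares fashion; the solver, the convergence test and all names are assumptions rather than the implementation described here:

```python
import numpy as np

def irls(A, l, reweight, max_iter=20, tol=1e-8):
    """Solve the LS problem A x = l, recomputing the observation
    weights from the residuals after every iteration."""
    w = np.ones(len(l))                 # start from unit weights
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        W = np.diag(w)
        # weighted normal equations: (A^T W A) x = A^T W l
        x_new = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)
        v = A @ x_new - l               # residuals of this iteration
        w = reweight(v)                 # e.g. the function of equation (1)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new
```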
Re-weighting the observations in every iteration is quite an old concept and there are many proposals (Baltsavias
1991, Cross 1990, Pilgrim 1996). In most of them the new weights are a function of the residuals themselves or of their
standard deviation, which incurs a considerable computational cost, even on current computers. The cost depends heavily on the
template size: on a Pentium 200 MMX with an 11x11 template the matching speed was about double that of a human operator, without any
code or compiler optimization for speed. With a 21x21 template the average speed was 3 minutes per point. In
addition, none of the weighting schemes scored well; success rates were below 70% in all cases.
The results were surprisingly poor, but these and other weighting schemes will be tested in conjunction with the 6-parameter
LS model in order to avoid overparameterisation.
4 DOUBLE IMAGE CHECKING.
The basic concept of the method is that the matching function should be one-to-one. Therefore a successful
match from the left to the right image should be repeatable from right to left, ending on the initial pixel in the left image. If the
match is correct and the solution stable, the sub-pixel position on the left image should lie roughly within

$$3\sqrt{\sigma_{x,\mathrm{right}}^{2} + \sigma_{y,\mathrm{right}}^{2} + \sigma_{x,\mathrm{left}}^{2} + \sigma_{y,\mathrm{left}}^{2}} \quad \text{(for 99\% probability level)} \qquad (2)$$

of the initial left pixel, where $\sigma_{x,\mathrm{right}}$, $\sigma_{y,\mathrm{right}}$ are the standard deviations of the shifts from the first (left to right) match and
$\sigma_{x,\mathrm{left}}$, $\sigma_{y,\mathrm{left}}$ the standard deviations from the second (right to left) match.
The results were very encouraging, since many spurious matches were removed. In figure 4 an
example of the algorithm, with a fixed template size of 7x7 pixels, can be seen. Compared with the normal case of
single image matching, there was a reduction of 3% in matches; these matches did not pass the double image
check and therefore do not appear. They are most probably spurious matches.
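A minimal sketch of the double image check, assuming a generic match() routine that returns the matched sub-pixel position together with the standard deviations of the two shifts (the interface and all names are hypothetical):

```python
import math

def double_image_check(x_left, y_left, match, k=3.0):
    """Accept a match only if the right-to-left back-match lands
    within the tolerance of equation (2) around the starting pixel.

    match(x, y, direction) is assumed to return
    (x_matched, y_matched, sigma_x, sigma_y).
    """
    # first match: left image -> right image
    xr, yr, sx_r, sy_r = match(x_left, y_left, "left_to_right")
    # second match: right image -> left image
    xl, yl, sx_l, sy_l = match(xr, yr, "right_to_left")
    # tolerance of equation (2); k = 3 for the 99% probability level
    tol = k * math.sqrt(sx_r**2 + sy_r**2 + sx_l**2 + sy_l**2)
    return math.hypot(xl - x_left, yl - y_left) <= tol
```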
5 CONCLUSIONS - FURTHER RESEARCH
It should be noted that all proposed modifications were made upon the basic 8-parameter LS model. Until now no
combination of the methods has been tested, although by the time this paper is printed results of further
modifications and combinations should be available.
Another way to check the similarity of the two templates before the match is to compare their statistical measures.
If they are completely different, the match should not be attempted. On the other hand, the template size which
shows the best similarity in statistical measures between the left and right image should be used as the correct template size.
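As a rough sketch of such a pre-check (the choice of statistical measures and the rejection thresholds below are illustrative assumptions):

```python
import numpy as np

def templates_comparable(left_patch, right_patch,
                         max_mean_diff=30.0, max_std_ratio=2.0):
    """Compare simple statistics of the two templates and refuse the
    match when they are clearly dissimilar."""
    mean_diff = abs(left_patch.mean() - right_patch.mean())
    if mean_diff > max_mean_diff:       # grey-level offset too large
        return False
    s_l, s_r = left_patch.std(), right_patch.std()
    ratio = max(s_l, s_r) / max(min(s_l, s_r), 1e-6)
    return ratio <= max_std_ratio       # contrast roughly similar
```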