
• a surface referred to a plane can be analyzed by surface 
reconstruction. 
• the contour of a figure, referred to a suitable approximation 
of the contour itself, can be studied by using two one-dimensional 
form descriptors for the two components of the contour. 
• the surface of an object, referred to a suitable approximation 
of the surface itself, can be studied by using three two-dimensional 
form descriptors for the three components of the surface 
(a minimal parametric sketch follows this list). 
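For instance (the parametric notation below is ours, used only as a minimal 
sketch of what the components could be), a contour can be written as 
x = x(t), y = y(t), with t in [0, 1], so that the two components x(t) and 
y(t) play the role of the two one-dimensional form descriptors; analogously 
a surface x = x(u, v), y = y(u, v), z = z(u, v) is described by three 
two-dimensional form descriptors. 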
Notice that figures or objects, delimited respectively by a 
contour or a surface, are in this context simply geometric 
quantities, independent of their physical nature. Therefore 
contours and surfaces have a geometric meaning only, and they 
differ from the first two cases. 
There are many mathematical approaches useful to solve the 
problem: in particular, the tessellation method and finite 
elements (e.g. splines) represent the most interesting ones. 
Stochastic and/or mixed approaches (i.e. covariance estimation 
and optimal filtering, or their suitable combination with 
deterministic methods) could also be applied, offering more 
sophisticated strategies to model the behavior of the phenomena. 
However, whilst these methods present some advantages from 
the point of view of data understanding, they suffer from some 
disadvantages from the point of view of data compression. 
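As a hedged illustration of the stochastic approach, the following sketch 
shows a Wiener-type optimal filter with an empirically modelled covariance; 
the exponential covariance model and its parameters are assumptions made 
here for illustration, not quantities taken from the text. 

    import numpy as np

    def optimal_filter(l, C_s, C_n):
        # Estimate the signal s from noisy observations l = s + n, given
        # the signal covariance C_s and the noise covariance C_n
        # (Wiener-type optimal filtering / simple collocation).
        return C_s @ np.linalg.solve(C_s + C_n, l)

    def exp_covariance(x, variance, corr_length):
        # Assumed empirical model: exponential covariance function of the
        # distances between data locations x (variance and correlation
        # length are illustrative tuning values).
        d = np.abs(np.asarray(x)[:, None] - np.asarray(x)[None, :])
        return variance * np.exp(-d / corr_length)

The filtered signal retains the smooth, correlated part of the data, which 
helps data understanding but, as noted above, does not by itself compress 
the data. 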
Let us remember that 3D models overcome intrinsic constraints of 
2D models, in which height data are processed as attributes, 
although metrical ones. Indeed, this approach, typical of classical 
cartography, is quite limited when complex objects are to be 
shown, such as hollow, non-stellar, possibly multiply connected 
bodies. Voronoi's diagrams, triangulation/tetrahedralization by 
Delaunay, Thiessen's polygons and finite elements (splines on 
triangular patches) by Bezier prove to be well-suited tools for the 
said needs. All this is apt for the modelling of objects, previously 
detected by acquisition operations and recorded by restitution 
techniques. 
Discrete network description 
A set of points may be connected by a network of straight-line 
segments which assumes the character of Delaunay's 
triangulation in 2D (tetrahedron network in 3D), if the following 
conditions are satisfied: 
• two points are connected by an edge, if and only if there 
exists at least one circle through them which doesn't contain 
any other point inside; 
• three points are vertices of a triangle, if and only if there 
exists a circle through them which doesn't contain any other 
point inside; 
• an additional condition suggests building triangles as close to 
equilateral as possible, avoiding ill-conditioned configurations 
(a minimal in-circle test is sketched after this list). 
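The empty-circumcircle condition can be checked with the standard in-circle 
determinant test; the function names and the brute-force check below are 
ours, given only as a minimal sketch. 

    import numpy as np

    def in_circle(a, b, c, p):
        # Standard in-circle determinant test: True if p lies strictly
        # inside the circle through a, b, c (taken counter-clockwise).
        m = np.array([
            [a[0] - p[0], a[1] - p[1], (a[0] - p[0])**2 + (a[1] - p[1])**2],
            [b[0] - p[0], b[1] - p[1], (b[0] - p[0])**2 + (b[1] - p[1])**2],
            [c[0] - p[0], c[1] - p[1], (c[0] - p[0])**2 + (c[1] - p[1])**2],
        ])
        return np.linalg.det(m) > 0.0

    def is_delaunay_triangle(a, b, c, points):
        # Empty-circumcircle condition: no other point of the set falls
        # inside the circle through the three vertices.
        vertices = [a, b, c]
        return not any(in_circle(a, b, c, p) for p in points
                       if not any(np.array_equal(p, v) for v in vertices))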
The passage from 2D to 3D substitutes the sphere for the circle 
and the tetrahedron for the triangle, connecting four points 
instead of three, and the regularity of the solid elements is 
again highly appreciated. The set of triangles around a point 
in 2D (of the tetrahedrons around a point in 3D) is called 
Thiessen's polygon (polyhedron). The complementary region around 
a point, whose boundary passes through the mid point of each 
straight-line segment, is called Voronoi's or Dirichlet's 
diagram. 
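As a hedged practical illustration, both structures can be obtained with a 
general-purpose library; scipy.spatial is our choice here, not one 
prescribed by the text. 

    import numpy as np
    from scipy.spatial import Delaunay, Voronoi

    points = np.random.rand(20, 2)     # scattered 2D points (example data)
    tri = Delaunay(points)             # Delaunay triangulation
    vor = Voronoi(points)              # dual Voronoi (Dirichlet) diagram

    # triangles sharing point 0, i.e. the fan of triangles around it
    fan = [s for s in tri.simplices if 0 in s]
    # Voronoi cell of point 0 (vertex indices; -1 marks an unbounded cell)
    cell = vor.regions[vor.point_region[0]]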
Continuous field description 
A set of points may be approximated by a continuous 
interpolation, in the form of suitable lines in 2D and suitable 
lines and surfaces in 3D. As the points are generally irregularly 
distributed, particular kinds of interpolation methods should be 
used. 
The Catmull-Rom curves and the Bezier splines on 
triangular patches respond positively to this aim. 
Indeed the former gives a local interpolation (in 2D or in 3D), 
taking into account three points only; the curve passes through 
these points and has, at the mid point, a tangent vector parallel 
to the chord vector between the beginning point and the end one. 
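A minimal sketch of one such segment follows, assuming the common four-point 
Catmull-Rom formulation, whose tangent rule (the tangent at an interior 
point parallel to the chord joining its neighbours) corresponds to the 
property quoted above; the purely three-point local scheme of the text is 
not reproduced here. 

    import numpy as np

    def catmull_rom_point(p0, p1, p2, p3, t):
        # One cubic segment between p1 and p2; the tangent at each
        # interior point is taken parallel to the chord joining its
        # two neighbours (chord rule m_i = (p_{i+1} - p_{i-1}) / 2).
        m1 = 0.5 * (p2 - p0)
        m2 = 0.5 * (p3 - p1)
        h00 = 2*t**3 - 3*t**2 + 1          # cubic Hermite basis
        h10 = t**3 - 2*t**2 + t
        h01 = -2*t**3 + 3*t**2
        h11 = t**3 - t**2
        return h00*p1 + h10*m1 + h01*p2 + h11*m2

    # example: the point at mid parameter of a segment through 2D points
    # catmull_rom_point(np.array([0., 0.]), np.array([1., 1.]),
    #                   np.array([2., 0.]), np.array([3., 1.]), 0.5)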
The latter is defined on triangular patches and has the analytical 
form given by Bernstein's polynomials. The surface is obtained by 
multiplying three quadratic families of curves, each parallel to one 
edge of the triangle, and it is of sextic degree. 
In such a way, special types of piecewise polynomials are apt to 
model the phenomena, according to the finite element method. 
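As a hedged sketch of the Bernstein form on a triangular patch, the generic 
barycentric evaluation is given below; it does not reproduce the specific 
product of three quadratic families described above, and the control-net 
layout is an assumption of ours. 

    import numpy as np
    from math import factorial

    def bernstein_tri(n, i, j, k, u, v, w):
        # Bernstein polynomial of degree n in barycentric coordinates
        # (u, v, w), with i + j + k = n and u + v + w = 1.
        return (factorial(n) // (factorial(i) * factorial(j) * factorial(k))
                * u**i * v**j * w**k)

    def bezier_triangle(control, u, v, w):
        # Evaluate a triangular Bezier patch; `control` is an assumed
        # layout mapping index triples (i, j, k), i + j + k = n, to
        # 3D control points.
        n = sum(next(iter(control)))
        point = np.zeros(3)
        for (i, j, k), c in control.items():
            point += bernstein_tri(n, i, j, k, u, v, w) * np.asarray(c, float)
        return point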
6. Robust procedures (Bellone et al., 1996) 
The most promising robust estimators are, among the 
downweighting methods, the redescending estimators, 
especially when their breakdown point is very high. In fact, 
outliers of bigger size (e.g. leverage points) and/or in a large 
number may be considered; moreover, different explanations can 
be set up, when the anomalous data, after rejection, show a 
homogeneous behavior. 
The basic idea follows some suggestions of Hampel to 
introduce a rejection point in the loss function, so that the data 
outside the interval automatically get a weight equal to zero. On 
the contrary, the data inside get a weight equal to one, if they 
belong to the inner core of data, or ranging from one to zero, if 
they stay in the intermediate region of doubt. 
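A minimal sketch of such a weight function follows, assuming a simple linear 
descent in the doubt region; the thresholds a (inner-core bound) and c 
(rejection point) are illustrative tuning constants, not values given in 
the text. 

    import numpy as np

    def redescending_weight(r, a=2.0, c=4.0):
        # Weights for standardized residuals r: 1 in the inner core
        # (|r| <= a), a linear descent from 1 to 0 in the doubt region
        # (a < |r| <= c), and 0 beyond the rejection point c.
        u = np.abs(np.asarray(r, dtype=float))
        w = np.zeros_like(u)
        w[u <= a] = 1.0
        doubt = (u > a) & (u <= c)
        w[doubt] = (c - u[doubt]) / (c - a)
        return w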
There are many ways to concretize these suggestions. The 
easiest one is represented by the Generalized M-estimators, 
where some suitable weight functions correct the behavior of 
the M-estimators, as defined by Huber. Unfortunately this 
strategy (called by some authoritative authors: minimax), 
although it increases the breakdown point, is unable to raise it 
substantially. 
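By way of illustration only, a Mallows-type Generalized M-estimator can be 
sketched as iteratively reweighted least squares with Huber residual weights 
damped by leverage-based weights; the tuning constant k, the MAD-based scale 
and the leverage weighting are common assumed choices, not those of the 
authors. 

    import numpy as np

    def gm_estimate(A, y, k=1.345, iters=20):
        # Iteratively reweighted least squares: Huber residual weights
        # multiplied by leverage weights from the hat-matrix diagonal.
        x = np.linalg.lstsq(A, y, rcond=None)[0]
        H = A @ np.linalg.solve(A.T @ A, A.T)            # hat matrix
        w_lev = np.sqrt(1.0 - np.clip(np.diag(H), 0.0, 0.99))
        for _ in range(iters):
            r = y - A @ x
            s = 1.4826 * np.median(np.abs(r)) + 1e-12    # robust scale (MAD)
            u = np.abs(r) / s
            w_res = np.minimum(1.0, k / np.maximum(u, 1e-12))  # Huber weights
            w = w_res * w_lev
            x = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
        return x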
The most refined and conservative modality is represented by 
the least median of squares, where the median of the squares of 
the residuals is minimized in order to obtain the expected 
results. 
Unfortunately this strategy is, at the present time, computationally 
too expensive, because no efficient algorithm is known to solve 
an enormous number of systems, each one selecting a subset of 
observations among the whole set of observations and forming a 
sample whose dimension is exactly equal to the number of the 
unknown parameters. 
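A hedged sketch of the usual way around this cost is a randomized search 
over elemental subsets of size equal to the number of unknown parameters; 
the number of trials is an assumed tuning value, not taken from the text. 

    import numpy as np

    def lms_estimate(A, y, n_trials=500, seed=None):
        # Least median of squares via random elemental subsets: fit an
        # exactly determined system on p observations and keep the
        # candidate minimizing the median of the squared residuals.
        rng = np.random.default_rng(seed)
        n, p = A.shape
        best_x, best_med = None, np.inf
        for _ in range(n_trials):
            idx = rng.choice(n, size=p, replace=False)
            try:
                x = np.linalg.solve(A[idx], y[idx])      # p x p system
            except np.linalg.LinAlgError:
                continue
            med = np.median((y - A @ x) ** 2)
            if med < best_med:
                best_x, best_med = x, med
        return best_x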
An advantageous alternative is represented by the least 
trimmed squares, where the average of the squares of the residuals 
belonging to the inner core of data is minimized in order to 
obtain the expected results. 
In other words, only a part of the observations is processed in 
each step of linear adjustment. The use of a sequential updating 
of a preliminarily computed solution is a possible alternative to 
repeating the whole adjustment many times. 
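A minimal sketch of the least trimmed squares idea as a concentration-type 
iteration follows: the fit is repeatedly recomputed on the h observations 
with the smallest squared residuals, where h, the size of the inner core, 
is a user-chosen constant. 

    import numpy as np

    def lts_estimate(A, y, h, iters=20):
        # Start from an ordinary least-squares fit, then refit on the h
        # observations with the smallest squared residuals (inner core).
        x = np.linalg.lstsq(A, y, rcond=None)[0]
        for _ in range(iters):
            r2 = (y - A @ x) ** 2
            keep = np.argsort(r2)[:h]
            x = np.linalg.lstsq(A[keep], y[keep], rcond=None)[0]
        return x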
The weighted least trimmed squares could be minimized, 
avoiding a rough partition between inliers and outliers. The 
weighted average of the squares of the residuals takes into 
account the inner core of the data with weights equal to one and 
an intermediate doubt region with weights ranging from one to 
zero, whilst the data in the tails get weights equal to zero. 
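A hedged sketch of one weighted step of this kind is given below, with 
weights equal to one in the inner core, decreasing linearly in the doubt 
region and zero in the tails; the thresholds a and c and the MAD-based 
scale are assumed choices, not those of the authors. 

    import numpy as np

    def wlts_step(A, y, x, a=2.0, c=4.0):
        # Standardized residuals with |r|/s <= a get weight 1, the doubt
        # region a < |r|/s <= c gets weights from 1 down to 0, the tails
        # get weight 0; a weighted least-squares fit is then recomputed.
        r = y - A @ x
        s = 1.4826 * np.median(np.abs(r)) + 1e-12
        u = np.abs(r) / s
        w = np.clip((c - u) / (c - a), 0.0, 1.0)
        return np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))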
Least median of squares and least trimmed squares (or weighted 
least trimmed squares) have the same breakdown point, close to 
0.5, when the number of the observations actually processed is 
approximately half of the total number of observations. 