
  
of trajectories for airborne and spaceborne imaging lin- 
ear arrays, the calibration of inertial instruments (angular 
rate sensors and accelerometers) with “cross-over” type 
of observation equations and the modelling/estimation of 
geodetic networks for monitoring and prediction purposes. 
It has to be mentioned that a parallel research effort is be- 
ing conducted by A. Térmens for inertial strapdown kine- 
matic airborne gravimetry (Térmens and Colomina, 2003;
Térmens and Colomina, 2004) for an optimal calibration
of accelerometers. 
The key idea behind this investigation is that a stochas- 
tic dynamic model (a stochastic differential equation) and 
its stochastic processes can be transformed through dis- 
cretization into a family of stochastic difference equations 
and discrete time processes. Those, in turn, can be seen as 
a family of observation equations and parameters that can 
be processed under the network approach. 
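To make this transformation concrete, the following minimal sketch (in Python; the random-walk model, variable names and numerical values are illustrative assumptions, not taken from the paper) discretizes a scalar random-walk process into pseudo-observation equations and adjusts them together with a few direct observations in a single weighted least-squares "network".

```python
import numpy as np

# Minimal sketch (assumed example): a scalar random-walk process
# x_{k+1} = x_k + w_k (w_k white noise) is discretized into pseudo-observation
# equations x_{k+1} - x_k = 0, weighted by the inverse process-noise variance,
# and stacked with a few direct observations z_j = x_{k_j} + v_j into one
# ordinary weighted least-squares "network".
n, dt, q, r = 50, 1.0, 0.1, 0.5**2           # epochs, step, process/observation variances
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, np.sqrt(q * dt), n))
obs_idx = np.arange(0, n, 5)                 # epochs carrying a direct observation
z = truth[obs_idx] + rng.normal(0.0, np.sqrt(r), obs_idx.size)

m = (n - 1) + obs_idx.size                   # total number of (pseudo-)observations
A = np.zeros((m, n))
l = np.zeros(m)
w = np.zeros(m)
for k in range(n - 1):                       # dynamic model as pseudo-observations
    A[k, k + 1], A[k, k] = 1.0, -1.0
    w[k] = 1.0 / (q * dt)
for j, k in enumerate(obs_idx):              # direct observations of the state
    A[n - 1 + j, k] = 1.0
    l[n - 1 + j] = z[j]
    w[n - 1 + j] = 1.0 / r

N = A.T @ np.diag(w) @ A                     # normal matrix of the "network"
x_hat = np.linalg.solve(N, A.T @ np.diag(w) @ l)
print(np.abs(x_hat - truth).mean())          # adjusted states vs. simulated truth
```

In this linear toy case the network solution coincides with the smoothed state estimate, which illustrates the equivalence the paper builds on.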
The paper begins by reviewing some definitions and con- 
cepts from the theory of stochastic processes and stochas- 
tic differential equations. We take this approach because 
of the available sound theory that includes continuity the- 
orems and numerical solution methods consistent with the 
stochastic nature of the problem. Then, the state-space and 
the network approaches are defined and compared. Once 
this is done, in section 6 we define time dependent net- 
works in a way that generalizes the traditional least-squares
based networks. Here, the scope of the concept of a dy- 
namic or time dependent network is precisely defined. The 
algorithmic and software implementation implications of 
section 6 should be clear at that point. However, we un-
derline them in section 7 for readers not familiar with the 
development of network adjustment systems. 
2 STOCHASTIC PROCESSES 
A stochastic process is a parametrized collection of ran- 
dom variables defined on a probability space (Ω, F, P)
(Lawler, 1995). The parameter space T is usually the time
or a time interval. In other words, a stochastic process x is 
a set of random variables indexed by time 
x = \{\, x(t) \;:\; t \in T \subset \mathbb{R} \,\}
where R is the set of real numbers. In this paper, and in 
most applications, the parametrizing, indexing or tagging 
subset T is either N, the set of natural numbers, or R. If 
T = N, x is called a discrete time process and in the other 
case, T = R or T = [a, b] ⊂ R, it is called a continu-
ous time process. The set where the random variables take 
values, typically R", is called the state space. 
From the definition, it is clear that for each t ∈ T, we have
a random variable ω ↦ x(t)(ω) := x(t, ω) for ω ∈ Ω.
But the function x(t, ω), for a given fixed ω, can be seen
as a function of t, t ↦ x(t, ω) for t ∈ T. This function
is a path. We introduce the concept of a path because it is
close to our intuition in INS and GPS trajectories, satellite
orbits, etc. When we look at a trajectory, ω can be seen as a
point or one of our repetitive experiments and thus x(t, ω)
would represent the position of the point at time t or the
result of the particular experiment. 
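As an illustration, a discrete time process can be pictured as a two-dimensional array indexed by the experiment ω and the epoch t: fixing a row gives a path, fixing a column gives a random variable. The following Python fragment (a toy sketch with assumed names and values) makes this double reading explicit.

```python
import numpy as np

# Illustrative sketch (assumed example): a discrete time process stored as an
# array with one row per experiment w and one column per epoch t, here a noisy
# sinusoidal "trajectory" observed in repeated experiments.
rng = np.random.default_rng(1)
n_experiments, n_epochs = 200, 100
t = np.linspace(0.0, 10.0, n_epochs)
x = np.sin(t) + rng.normal(0.0, 0.1, size=(n_experiments, n_epochs))

path = x[0, :]        # fixed w: one path, a function of t
rv = x[:, 50]         # fixed t: a random variable over all experiments
print(path.shape, rv.mean(), rv.std())
```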
A fundamental stochastic process is the Brownian motion 
(or Wiener process or continuous random walk) named af- 
ter a 19th century botanist who observed that pollen grains 
on a liquid described an irregular trajectory. Its formal 
derivative is called white noise. White noise is formally 
considered a stochastic process to facilitate the visualiza- 
tion and interpretation of the continuous idealization of 
discrete time processes whose random variables are inde- 
pendent, normally distributed ones. (Sometimes, in the en- 
gineering literature, it is said that the white noise process is 
a helpful concept that does not exist in the world of math- 
ematics. In fact, this statement is wrong. White noise ex- 
ists as a generalized stochastic process (Oksendal, 1993), a 
slightly more complex concept than a stochastic process.) 
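The discrete time view mentioned above can be simulated directly: Brownian motion paths are obtained as cumulative sums of independent, normally distributed increments, the increments being the integrals of white noise over each step. The sketch below (Python, with assumed step sizes and sample counts) is meant only as an illustration.

```python
import numpy as np

# Illustrative sketch (assumed example): Brownian motion on [0, T] simulated as
# the cumulative sum of independent, normally distributed increments. Each row
# is one path t -> B(t, w); each column is the random variable B(t, .).
rng = np.random.default_rng(0)
n_paths, n_steps, T = 500, 1000, 1.0
dt = T / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.concatenate([np.zeros((n_paths, 1)), increments.cumsum(axis=1)], axis=1)

print(B[:, -1].var())   # sample variance of B(T); should be close to T
```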
The stochastic analogs of ordinary differential equations 
(ODE) are the stochastic differential equations (SDE). The 
theory for SDE can be found in (Oksendal, 1993). SDE 
arise naturally from real-life ODE whose coefficients are 
only approximately known because they are measured by 
instruments or deduced from other data subject to random 
errors. The initial or boundary conditions may also be
known only up to random errors. In these situations, we would
expect the solution p of the problem to be a stochastic pro-
cess. We will call p = p(t, ω) a prediction. Under certain
(non-restrictive) hypotheses, p has a number of properties
including that it is t-continuous (Oksendal, 1993, pp. 48- 
49). 
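As an illustration of how such an SDE can be integrated numerically in a manner consistent with its stochastic nature, the following sketch (Python; the scalar linear model dp = a p dt + b dW, its coefficients and the Euler-Maruyama scheme are assumptions made for the example) generates an ensemble of solution paths whose mean and dispersion approximate the prediction and its uncertainty.

```python
import numpy as np

# Illustrative sketch (assumed example): Euler-Maruyama integration of the
# scalar linear SDE  dp = a*p dt + b dW  with a randomly known initial
# condition. The numerical solution p(t, w) is itself a stochastic process.
rng = np.random.default_rng(2)
a, b = -0.5, 0.2                              # assumed coefficients
n_paths, n_steps, T = 2000, 500, 10.0
dt = T / n_steps

p = rng.normal(1.0, 0.1, n_paths)             # random initial condition p(t0)
paths = np.empty((n_paths, n_steps + 1))
paths[:, 0] = p
for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    p = p + a * p * dt + b * dW               # one Euler-Maruyama step
    paths[:, k + 1] = p

print(paths[:, -1].mean(), paths[:, -1].std())  # E(p(T)) and its dispersion
```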
Assume now that we have managed to predict the stochas- 
tic process p —the system— over a time interval [t_0, t_f].
In our application, determining p reduces to determining an
estimate of the path E(p(t)) and estimates of the process 
auto-covariance functions 
C(t_1, t_2) := E\big( (p(t_1) - E(p(t_1)))\,(p(t_2) - E(p(t_2)))^T \big).
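Given an ensemble of predicted paths, both E(p(t)) and C(t_1, t_2) can be estimated empirically; the self-contained Python fragment below (a toy ensemble with assumed values, shown only for illustration) computes such sample estimates.

```python
import numpy as np

# Illustrative sketch (assumed example): empirical estimates of the path mean
# E(p(t)) and of the auto-covariance C(t1, t2) from an ensemble of paths.
rng = np.random.default_rng(3)
n_paths, n_epochs = 5000, 200
paths = rng.normal(0.0, 0.1, size=(n_paths, n_epochs)).cumsum(axis=1)  # toy ensemble

mean_path = paths.mean(axis=0)                  # estimate of E(p(t))
centred = paths - mean_path                     # remove the mean path
C = centred.T @ centred / (n_paths - 1)         # C[t1, t2], sample auto-covariance
print(C[50, 50], C[50, 150])                    # variance at t1 and covariance (t1, t2)
```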
Assume further that we are able to relate p through some 
linear model —the observation equations— to another pro- 
cess z —the observations— so we have additional infor- 
mation on p. A natural question arises: can we improve
our estimates of p with the additional information z? The
answer, in general, is yes, and the tools are the well-known
filtering and smoothing techniques. Filtering at time s refers to
finding a best estimate of the system p(s), t_0 ≤ s ≤ t_f, given
the observations z in the interval [t_0, s]. Smoothing refers
to finding the best estimate for p(s) at any time s by using
the information of z over the whole interval [t_0, t_f]. Saying that p(s) is
best means that E(||p̂ − p||^2) is minimal over all solutions
of the system SDE that verify the observation equations 
(see (Oksendal, 1993, pp. 58-59) for a detailed description 
of the probability function associated with the SDE and with
the observation white noise processes).
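In the linear, discrete time, Gaussian case the filtering estimate is provided by the Kalman filter, while smoothing additionally propagates the later observations backwards in time. The following sketch (Python; the scalar random-walk system and its variances are assumptions made for the example, not the paper's model) shows the filtering step only.

```python
import numpy as np

# Illustrative sketch (assumed example): scalar Kalman filter for a random walk
# x_{k+1} = x_k + w_k observed as z_k = x_k + v_k. At each epoch the filtered
# estimate uses only the observations up to that epoch; a smoother would also
# use the later ones.
rng = np.random.default_rng(4)
n, q, r = 100, 0.01, 0.25                 # epochs, process and observation variances
truth = np.cumsum(rng.normal(0.0, np.sqrt(q), n))
z = truth + rng.normal(0.0, np.sqrt(r), n)

x_hat, P = 0.0, 1.0                       # initial estimate and its variance
filtered = np.empty(n)
for k in range(n):
    P = P + q                             # prediction with the dynamic model
    K = P / (P + r)                       # Kalman gain
    x_hat = x_hat + K * (z[k] - x_hat)    # update with the observation z[k]
    P = (1.0 - K) * P
    filtered[k] = x_hat

print(np.abs(filtered - truth).mean(), np.abs(z - truth).mean())
```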
  
3 THE STATE-SPACE APPROACH 
We will call the state-space approach (SSA) the methodology
and principles of solving the above problem of prediction,
 
	        