The aim of this work is to classify cloud and weather features from the brightness temperatures of the five AMSU-B channels over the study region (…-65E, 22-45N). Eight weather features are considered, among them thunderstorm, heavy rain, snowfall (Snf), cloudy and clear sky on the land, and clear or cloudy sky on the sea. They are based on the hourly synoptic reports of the Iran Meteorological Organization (IMO), collocated with the AMSU-B data. The 400 input patterns used for training and testing of the ANN were collected from 20 different locations over a period beginning in 2001. From the AMSU-B channel data, brightness temperatures (we call them T) are computed according to equation (1):

T = B_f^{-1}(R)    (1)

where R is the measured radiance in the channel of frequency f, B_f is the Planck function at that frequency, and T is the equivalent brightness temperature (in Kelvin). The T values of the areas where the weather features (table 1, col. 1) were reported by IMO are selected. The validation of the weather features of classes 1 through 5 is carried out with the hourly reports of IMO. The
clear sky on the land and the clear or cloudy sky cases on the sea were defined from collocated IR-Meteosat and AMSU-B images. Each T is normalized between 0 and 1
according to equation (2).
T_n = (T - T_min) / (T_max - T_min)    (2)
where T_n is the normalized brightness temperature, T is the observed brightness temperature calculated by eq. (1), and T_max and T_min are the common maximum and minimum brightness temperatures over all channel frequencies. Note that, because the same T_max and T_min are used for every channel, T_n still reflects which channel T comes from. For network training
and testing, the 400 input patterns are divided into two sets, 200
for training and 200 for testing.
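The normalization of equation (2) and the 200/200 split can be sketched as follows; the random values are a stand-in for the observed brightness temperatures, which are not reproduced here:

```python
import numpy as np

# Hypothetical stand-in for the 400 observed patterns of five AMSU-B
# channel brightness temperatures (K); the real values are not given here.
rng = np.random.default_rng(0)
T = rng.uniform(180.0, 290.0, size=(400, 5))

# Equation (2): common maximum and minimum over ALL channels, so
# inter-channel differences survive the normalization.
T_max, T_min = T.max(), T.min()
T_n = (T - T_min) / (T_max - T_min)   # each value now lies in [0, 1]

# The 400 input patterns are divided into two sets of 200.
train, test = T_n[:200], T_n[200:]
```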
2.2 Artificial Neural Networks (ANNs)
The most commonly used form of ANN is the feed-forward
neural network. A schematic diagram of the type of ANN that
was used for this study is presented in Fig. 1a.

IAPRS & SIS, Vol.34, Part 7, "Resource and Environmental Monitoring", Hyderabad, India, 2002

It consists of an input layer of n nodes (to which an input vector X_i = (x_1, x_2, ..., x_n) is applied), one or more hidden layers, and an output layer of k nodes with output vector Y_i = (y_1, y_2, ..., y_k). In our case, the number of input nodes is the five T_n corresponding to the five frequencies of AMSU-B (n = 5), and the number of nodes in the output layer is the eight weather features (k = 8). The number of hidden layers and
the number of nodes in each hidden layer must be determined
by trial and error. In this study, the best accuracy was achieved
using 2 hidden layers and eight nodes in each layer. Each of the
input nodes is connected to all nodes in the first hidden layer, and each node in any hidden layer is connected to all nodes in the neighboring layers.
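As a concrete sketch of this architecture (five input nodes, two hidden layers of eight nodes each, eight output nodes, fully connected between neighboring layers), a forward pass might look as follows; the logistic activation and the random initial weights are assumptions for illustration, since the text specifies neither:

```python
import numpy as np

def sigmoid(u):
    # Assumed nonlinear activation limiting each node output to (0, 1)
    return 1.0 / (1.0 + np.exp(-u))

rng = np.random.default_rng(1)
sizes = [5, 8, 8, 8]  # input, two hidden layers, output (n = 5, k = 8)

# One weight matrix and one bias vector per fully connected layer
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    # Propagate one input pattern through every layer in turn
    for W, b in zip(weights, biases):
        x = sigmoid(x @ W + b)
    return x

y = forward(rng.uniform(0.0, 1.0, size=5))  # one normalized 5-channel pattern
```

Each element of y can then be read as the network's score for one of the eight weather-feature classes.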
A node (neuron) is an information-processing unit that is
fundamental to the operation of an ANN. The three basic
elements of the neuronal model are identified in Fig. 1b: i) a set of synapses, or connecting links: a signal x_j at the input of synapse j, which is connected to node k, is multiplied by the synaptic weight w_kj; ii) an adder for summing the input signals, weighted by the respective synapses of the node; and iii) an
activation function for limiting the amplitude of the output of a
node.
Fig. 1: (a) Schematic of a feed-forward ANN architecture, showing the components of the input vector X and the output vector Y along with the weight and bias links; the circles symbolize the nodes (neurons). (b) A sample node with its three basic elements: the synaptic weights, the summing junction, and the activation function.
The node model of Fig. 1b also includes an externally applied bias, denoted by b_k. The bias b_k, depending on whether it is positive or negative, has the effect of increasing or lowering the net input of the activation function. In mathematical terms, a neuron may be described by the following equation:

u_k = Σ_{j=0}^{n} w_kj x_j    (3)

where w_k0 = b_k is the bias (applied to a fixed input x_0 = 1), x_j and w_kj are the input signals and the synaptic weights, and u_k is the linear combiner output due to the bias and the input signals of node k.
The weights w_kj are determined during the training process. In the present study, the weights are obtained using the back-propagation algorithm, which adjusts the weights iteratively to reduce the difference between the teaching outputs and the actual outputs calculated by the network from the input values. The
effective incoming signal u_k is passed through a nonlinear activation function to produce the outgoing signal y_k of the node:

y_k = f(u_k)    (4)
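Putting equations (3) and (4) together with the back-propagation training described above gives this minimal sketch; the logistic activation f, the mean-square error measure, the learning rate, and the random data are all assumptions for illustration, not the paper's actual settings:

```python
import numpy as np

def sigmoid(u):
    # Assumed nonlinear activation f limiting each node output to (0, 1)
    return 1.0 / (1.0 + np.exp(-u))

rng = np.random.default_rng(2)
sizes = [5, 8, 8, 8]  # 5 inputs, two hidden layers of 8 nodes, 8 outputs
W = [rng.standard_normal((m, n)) * 0.5 for m, n in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(n) for n in sizes[1:]]

def train_step(x, t, lr=0.5):
    # Forward pass (eqs 3 and 4), keeping each activation for the backward pass
    a = [x]
    for Wl, bl in zip(W, b):
        a.append(sigmoid(a[-1] @ Wl + bl))
    # Backward pass: propagate the output error toward the input layer
    delta = (a[-1] - t) * a[-1] * (1.0 - a[-1])
    for l in range(len(W) - 1, -1, -1):
        delta_prev = (delta @ W[l].T) * a[l] * (1.0 - a[l]) if l > 0 else None
        W[l] -= lr * np.outer(a[l], delta)   # adjust weights iteratively
        b[l] -= lr * delta
        delta = delta_prev
    return 0.5 * np.sum((a[-1] - t) ** 2)    # error before this update

x = rng.uniform(0.0, 1.0, 5)                 # one normalized 5-channel pattern
t = np.zeros(8); t[2] = 1.0                  # one-hot teaching output
errors = [train_step(x, t) for _ in range(200)]
```

Repeating train_step drives the actual outputs toward the teaching outputs, which is the iterative weight adjustment the text describes.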