The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Vol. XXXVII. Part B1. Beijing 2008
An eigenvector x of a matrix C, with associated eigenvalue λ, satisfies:
λ.x = C.x (2)
By replacing C with the sample covariance matrix
C = (1/n) Σ_{i=1}^{n} x_i.x_i^T
and using v = λ.x, we will get the nth eigenvector, i.e. v(n), for the n images of the database.
Then, this vector will be the initial direction in the FastICA
algorithm.
w = v(1) (3)
v(1) is the first principal component.
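The initialization above (eqs. 2-3) can be sketched as follows. The experiments in the paper use Matlab, but the same steps are shown here in Python/NumPy; the data matrix X (n images as rows) and its dimensions are illustrative assumptions.

```python
# Sketch: leading eigenvector of the sample covariance matrix,
# used as the initial FastICA direction w (eq. 3).
import numpy as np

rng = np.random.default_rng(0)
n, d = 10, 16                     # n images, each flattened to d pixels (illustrative)
X = rng.standard_normal((n, d))

Xc = X - X.mean(axis=0)           # center the image rows
C = (Xc.T @ Xc) / n               # sample covariance matrix (d x d)

# lambda . x = C . x  (eq. 2): eigen-decomposition of the symmetric matrix C
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, -1]               # first principal component (largest eigenvalue)

w = v1                            # eq. (3): initial direction for FastICA
```

`numpy.linalg.eigh` returns eigenvalues in ascending order, so the last column is the first principal component.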
The FastICA [16] algorithm repeats the following rule until convergence:
w_new = E[v(1).g(w^T.v(1))] - E[g'(w^T.v(1))].w (4)
where g'(x) is the derivative of the function g(x), given in (6). It should be
noted that this algorithm uses an approximation of negentropy in
order to ensure the non-Gaussianity of the independent vectors.
Before starting the calculation of negentropy, a non-quadratic
function G should be chosen, for example,
G(u) = -exp(-u^2/2) (5)
and its derivative:
g(u) = u.exp(-u^2/2) (6)
In general, the corresponding non-Gaussian vector w for the
estimated eigenvector v(1) is estimated using the following
iterated rule:
w_new = E[v(1).g(w^T.v(1))] - E[g'(w^T.v(1))].w (7)
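A minimal sketch of this one-unit FastICA update, using the contrast G of eq. (5), its derivative g of eq. (6), and g'(u) = (1 - u^2).exp(-u^2/2). The data layout (whitened observations as columns of V) and function names are illustrative assumptions, not the paper's code.

```python
# Sketch of the FastICA one-unit iteration (eqs. 4-7).
import numpy as np

def g(u):
    # g(u) = u.exp(-u^2/2), derivative of G(u) = -exp(-u^2/2)
    return u * np.exp(-u**2 / 2)

def g_prime(u):
    # g'(u) = (1 - u^2).exp(-u^2/2)
    return (1 - u**2) * np.exp(-u**2 / 2)

def fastica_one_unit(V, w, n_iter=200, tol=1e-8):
    """V: (d, m) whitened observations as columns; w: initial direction (eq. 3)."""
    w = w / np.linalg.norm(w)
    for _ in range(n_iter):
        wx = w @ V                                        # w^T . v for each sample
        # w_new = E[v.g(w^T.v)] - E[g'(w^T.v)].w  (eq. 7), expectations as sample means
        w_new = (V * g(wx)).mean(axis=1) - g_prime(wx).mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:               # converged (up to sign)
            return w_new
        w = w_new
    return w

# usage sketch on synthetic whitened data
rng = np.random.default_rng(0)
V = rng.standard_normal((3, 400))
w = fastica_one_unit(V, rng.standard_normal(3))
```

The vector is renormalized after each step, which is the standard FastICA stabilization and keeps w on the unit sphere.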
The previous discussion only estimates the first non-Gaussian
vector. One way to compute the other higher-order vectors is to
start with a set of orthonormalized vectors, update them using the
suggested iteration step, and restore their orthogonality. Further,
the non-Gaussian vectors should be orthogonal to each other in
order to ensure independence. It therefore helps to generate
"observations" only in a complementary space for the
computation of the higher-order eigenvectors. After convergence,
the non-Gaussian vectors will also be orthogonal, since they are
estimated in complementary spaces. As a result, all the estimated
vectors w_k will be: non-Gaussian, according to the learning rule
in the algorithm, and independent, according to the
complementary spaces introduced in the algorithm.
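The "restore the orthogonality" step above is a deflation: each newly estimated vector is projected onto the complement of the span of the previously found ones. A minimal sketch (Gram-Schmidt style; the function name is illustrative):

```python
# Sketch of the deflation step: keep each new vector w orthogonal
# to the previously estimated vectors w_k.
import numpy as np

def deflate(w, W_found):
    """Project w onto the orthogonal complement of the vectors in W_found,
    then renormalize. W_found: list of unit-norm vectors."""
    for wk in W_found:
        w = w - (w @ wk) * wk     # remove the component along w_k
    return w / np.linalg.norm(w)

# usage sketch: a vector with a component along w1 loses that component
w1 = np.array([1.0, 0.0, 0.0])
w2 = deflate(np.array([1.0, 1.0, 0.0]), [w1])   # becomes [0, 1, 0]
```

Applying this after every iteration of eq. (7) is what confines each higher-order estimate to the complementary space described above.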
The nearest neighbor algorithm is used to evaluate the object
recognition technique. Each object database is partitioned into
two sets: the training set, which contains the images used to calculate
the independent non-Gaussian vectors and derive the
appropriate basis, and the test set, which contains the images to be
tested by the object recognition algorithm in order to evaluate
the performance of the proposed method. The whole set of
training images (rows in the image data matrix) is projected onto
the basis found in order to calculate the coordinates of each
image with respect to the basis, v_train. Each new test image v_test
is compared to the whole set of training images v_train in order to
find the nearest one, whose index k is given by (8):
k = nearest_match(v_test, v_train) (8)
The kth image gives the index of the object that is recognized from
the database.
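The projection-and-matching step of eq. (8) can be sketched as follows. The basis W, the image sizes, and the Euclidean distance used for "nearest" are illustrative assumptions (the paper does not specify the metric here).

```python
# Sketch of eq. (8): project training and test images onto the learned
# basis and recognize each test image by its nearest training neighbor.
import numpy as np

def nearest_match(v_test, V_train):
    """Return the index k of the training coordinate vector closest to v_test
    (Euclidean distance in the projected space)."""
    dists = np.linalg.norm(V_train - v_test, axis=1)
    return int(np.argmin(dists))

# usage sketch with synthetic data
rng = np.random.default_rng(1)
W = rng.standard_normal((64 * 64, 8))        # assumed basis of 8 non-Gaussian vectors
train = rng.standard_normal((72, 64 * 64))   # 72 training views (rows), as in COIL-20
test = train[10] + 0.01 * rng.standard_normal(64 * 64)  # near-duplicate of view 10

V_train = train @ W                          # coordinates of the training images
k = nearest_match(test @ W, V_train)         # should recover index 10
```

The returned k indexes the recognized training image, and hence the recognized object and pose.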
3. RESULTS
Object recognition and pose estimation experiments were
performed using Matlab 7.1. The object set is the COIL-20
(Columbia Object Image Library) database [14]. The images are
stored at every 5° of pose angle, from 0° to 360°; hence there are 72
images of each object and 1440 images in total. The images are
rescaled to 64×64, with a maximum pixel value of 255. The 0°
pose-angle views are shown in Fig. 3.
Fig. 3. COIL-20 database of 20 objects
To construct the non-Gaussian space of each object, a few of the
images were chosen as the training images. The representations
of the images form a manifold with the pose angle as the parameter