
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Vol. XXXVII. Part B1. Beijing 2008
conditions and the features extracted from them are matched 
against the database to determine the identity of the object along 
with its pose angle. 
The traditional PCA algorithm [3] computes eigenvectors and
eigenvalues of a sample covariance matrix derived from a well-known
image data matrix by solving an eigenvalue problem. Incremental
principal component analysis (IPCA) is the incremental version of
principal component analysis. Independent component analysis (ICA)
[4] is used to separate independent components from a set of unknown
mixtures. It is known that there is correlation or dependency between
different objects; the set of objects is represented as a data matrix X.
The correlation between the rows of X is captured by the mixing matrix
A, and the independent basis objects are represented as the rows of the
source matrix S. The ICA algorithm extracts these independent objects
from the set of dependent ones using (1). ICA is closely related to
blind source separation (BSS), in which correlated sources are
separated into uncorrelated sources without prior knowledge of the
correlation between the elements of the source. When the dimension of
the images is high, both the computation and the storage complexity
grow dramatically. An incremental, real-time process therefore becomes
very efficient for computing the principal independent components of
the observations (objects).
Each eigenvector (principal component) is updated to a non-Gaussian
component using the FastICA algorithm; here a random vector is said to
be non-Gaussian if its distribution is not a Gaussian distribution.
In (1), if the source matrix S contains uncorrelated Gaussian elements,
the elements of the mixed matrix X will also be Gaussian, but
correlated. The most common ICA algorithm, FastICA, has no solution if
the random variables to be estimated are Gaussian, because the joint
distribution of the elements of X is then completely symmetric and
gives no special information about the columns of A. In this paper, S
is always a non-Gaussian vector.
X = AS (1)
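As a quick numerical illustration of non-Gaussianity (an addition for this discussion, not part of the paper), excess kurtosis is one simple contrast measure: it is zero for a Gaussian, negative for sub-Gaussian sources such as a uniform distribution, and positive for super-Gaussian sources such as a Laplacian.

```python
import numpy as np

def excess_kurtosis(x):
    """Excess kurtosis: ~0 for Gaussian, <0 sub-Gaussian, >0 super-Gaussian."""
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

rng = np.random.default_rng(0)
n = 200_000
k_gauss = excess_kurtosis(rng.normal(size=n))    # close to 0
k_unif = excess_kurtosis(rng.uniform(-1, 1, n))  # close to -1.2 (sub-Gaussian)
k_lap = excess_kurtosis(rng.laplace(size=n))     # close to +3 (super-Gaussian)
print(k_gauss, k_unif, k_lap)
```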
In this paper we apply IPCA-ICA, the combined method of IPCA and
ICA, to 3D object recognition. To the best of our knowledge, IPCA-ICA
has not previously been applied to appearance-based 3D object
recognition and pose estimation. The contributions of this paper are
the application of the IPCA-ICA representation to 3D object
recognition, and an investigation of whether IPCA-ICA always
outperforms PCA and ICA in the appearance-based 3D object recognition
task.
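The separation model of eq. (1) can be illustrated with a small, self-contained sketch. The function below is a minimal symmetric FastICA with a tanh nonlinearity, written for this illustration only (it is not the paper's implementation); two non-Gaussian sources are mixed by a hypothetical matrix A and then recovered up to sign, scale, and permutation.

```python
import numpy as np

def fastica(X, n_iter=200, tol=1e-6, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity (illustrative).
    X: (n_mixtures, n_samples) array of observed mixtures X = A S.
    Returns estimated sources, up to sign, scale and permutation."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=1, keepdims=True)
    # Whiten: rotate and rescale so the mixtures are uncorrelated, unit variance.
    d, E = np.linalg.eigh(np.cov(Xc))
    Z = E @ np.diag(d ** -0.5) @ E.T @ Xc
    m, n = Z.shape
    W = rng.standard_normal((m, m))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        # Fixed-point update: E{g(w.z) z} - E{g'(w.z)} w for every row w.
        W_new = (G @ Z.T) / n - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)  # symmetric decorrelation
        W_new = U @ Vt
        if np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1.0)) < tol:
            W = W_new
            break
        W = W_new
    return W @ Z

# Mix two non-Gaussian sources with a hypothetical mixing matrix A.
rng = np.random.default_rng(1)
n = 20000
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.5], [0.4, 1.0]])
X = A @ S                                  # observed mixtures, as in eq. (1)
S_est = fastica(X)
corr = np.abs(np.corrcoef(np.vstack([S, S_est]))[:2, 2:])
print(corr.max(axis=1))  # each true source is matched by some estimate
```

The symmetric (parallel) variant updates all rows of the unmixing matrix at once and re-orthonormalizes them; a deflation variant that extracts one component at a time is equally common.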
2. METHODOLOGY 
Object recognition is performed by projecting an input image onto the
learned basis, following the block diagram in Fig. 2, and comparing the
resulting coordinates with those of the training images in order to
find the nearest matching image. The database consists of n images, and
a set of k non-Gaussian vectors is maintained. The algorithm takes an
input image and finds the non-Gaussian vector (eigenvector), which is
passed as input to the ICA algorithm. The non-Gaussian components are
updated recursively from the previous component values using the
updating rule (3). While IPCA returns the estimated eigenvectors as a
matrix that represents subspaces of the data, and the corresponding
eigenvalues as a row vector, FastICA searches for the independent
directions, as in (3), along which the projections of the input data
vectors maximize the non-Gaussianity.
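The updating rule (3) itself is not reproduced in this excerpt. As an illustration of what such an incremental eigenvector update can look like, the sketch below uses the CCIPCA rule of Weng et al.; this is an assumption for exposition, and the paper's rule (3) may differ in its details.

```python
import numpy as np

def ccipca_update(v, x, n):
    """One CCIPCA-style incremental eigenvector update (Weng et al.);
    a sketch of an updating rule like (3), not the paper's exact rule.
    v: current estimate (its norm tracks the eigenvalue),
    x: new zero-mean observation, n: number of samples seen (n >= 1)."""
    if n == 1:
        return x.copy()            # initialise with the first sample
    u = v / np.linalg.norm(v)      # unit direction of the current estimate
    return (n - 1) / n * v + (1 / n) * (x @ u) * x

# Stream zero-mean samples whose dominant axis is the first coordinate.
rng = np.random.default_rng(0)
v = np.zeros(2)
for n in range(1, 3001):
    x = np.array([3.0, 1.0]) * rng.standard_normal(2)
    v = ccipca_update(v, x, n)
align = abs(v[0]) / np.linalg.norm(v)
print(align)  # close to 1: aligned with the dominant eigenvector
```

Lower-order components are handled by deflating each sample (subtracting its projection on the already-estimated vectors) before applying the same update.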
Fig. 2. Steps in IPCA-ICA for object recognition (block diagram: input
image, IPCA, update using (3), apply ICA algorithm, repeated until the
last estimated non-Gaussian vector is obtained).
Object recognition is then done by projecting the input test image
onto this basis and comparing the resulting coordinates with those of
the training images in order to find the nearest matching image. The
data consist of n images, and a set of k non-Gaussian vectors is given.
Initially, the non-Gaussian vectors are chosen to form an orthonormal
basis. In each step, all of these vectors are updated using the IPCA
updating rule (3). Then each estimated non-Gaussian vector is input to
the ICA function in order to extract the corresponding non-Gaussian
vector from it (Fig. 2).
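The projection-and-nearest-neighbour matching described above can be sketched as follows. The function and variable names (`recognize`, `basis`, and so on) are illustrative, and a random orthonormal basis stands in for the learned IPCA-ICA basis.

```python
import numpy as np

def recognize(test_image, train_images, labels, basis):
    """Nearest-neighbour matching in the learned subspace (a sketch;
    names are illustrative, not the paper's code).
    basis: (k, d) matrix whose rows are the learned basis vectors;
    images are flattened to length-d vectors."""
    train_coords = train_images @ basis.T  # (n, k) training projections
    test_coords = test_image @ basis.T     # (k,) projection of the query
    dists = np.linalg.norm(train_coords - test_coords, axis=1)
    return labels[np.argmin(dists)]        # label of the nearest image

# Toy usage: a random orthonormal basis stands in for the IPCA-ICA basis.
rng = np.random.default_rng(0)
d, k = 64, 4
basis = np.linalg.qr(rng.standard_normal((d, k)))[0].T
train = rng.standard_normal((10, d))
labels = np.arange(10)
query = train[3] + 0.01 * rng.standard_normal(d)  # perturbed copy of image 3
print(recognize(query, train, labels, basis))     # -> 3
```

In practice the distance comparison is done in the k-dimensional coordinate space rather than in pixel space, which is what makes the subspace representation efficient.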
Mathematically, by definition, an eigenvector x with a corresponding
eigenvalue λ of a covariance matrix C satisfies

λx = Cx (2)
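This defining relation can be verified numerically for a sample covariance matrix (an illustrative check, not code from the paper):

```python
import numpy as np

# Draw correlated data and form its sample covariance matrix C.
rng = np.random.default_rng(0)
data = rng.standard_normal((500, 4)) @ rng.standard_normal((4, 4))
C = np.cov(data, rowvar=False)      # 4x4 sample covariance
lams, vecs = np.linalg.eigh(C)      # eigenvalues (ascending) and eigenvectors
x, lam = vecs[:, -1], lams[-1]      # leading eigenpair
print(np.allclose(C @ x, lam * x))  # True: lambda x = C x holds
```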
Thank you.