IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, VOL. 10, NO. 2, MARCH 2013
SAR Target Configuration Recognition Using
Locality Preserving Property and
Gaussian Mixture Distribution
Ming Liu, Yan Wu, Peng Zhang, Qiang Zhang, Yanxin Li, and Ming Li
Abstract—Feature extraction is the key step of synthetic aperture radar (SAR) target configuration recognition. A statistical model embedding the locality preserving property is presented to extract the maximum amount of desired information from the data, which is of crucial help to recognition. The noise, or error, of the SAR image samples is described by a Gaussian mixture distribution, and the locality preserving property is embedded into the statistical model to focus on the problem of configuration recognition. In this way, the model extracts the information of interest while preserving the local structure of the data set. Parameter estimation is implemented through the expectation-maximization algorithm. Experimental results on the Moving and Stationary Target Acquisition and Recognition data set validate the effectiveness of the proposed method: SAR target configuration recognition is realized with satisfactory accuracy.

Index Terms—Configuration recognition, Gaussian mixture distribution, locality preserving property, synthetic aperture radar (SAR) image.
I. INTRODUCTION
THE AIM of synthetic aperture radar (SAR) target configuration recognition is to find the probable target in the SAR scene and then recognize the configuration of the found target. The target configuration indicates how the target is deployed, and targets of the same type with different configurations are called variants [1]. Traditional algorithms for SAR target recognition focus on the recognition of target types, which means that targets of the same type with different configurations are regarded as the same [2]-[5]. However, recognition of the target configuration is of significance to a number of application areas, such as capturing detailed information about targets of interest and battlefield perception.
Manuscript received January 17, 2012; revised April 6, 2012; accepted April 30, 2012. Date of publication July 6, 2012; date of current version October 22, 2012. This work was supported in part by the National Natural Science Foundation of China under Grant 60872137, by the National Defense Foundation of China under Grant 9140C0103071003, by the Aviation Science Foundation of China under Grant 2011018106, and by the Specialized Research Fund for the Doctoral Program of Higher Education under Grant 20110203110001.
M. Liu, Y. Wu, Q. Zhang, and Y. Li are with the School of Electronic Engineering, Xidian University, Xi'an 710071, China (e-mail: liuming0910@gmail.com; ywu@mail.xidian.edu.cn; qzhang@mail.xidian.edu.cn; liyanxin860@163.com).
P. Zhang and M. Li are with the National Key Laboratory of Radar Signal Processing, Xidian University, Xi'an 710071, China (e-mail: zhangpeng4415@hotmail.com; liming@xidian.edu.cn).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/LGRS.2012.2198610
The key step of a recognition algorithm lies in feature extraction, which not only realizes the dimensionality reduction of the data set but also preserves as much of the information useful for recognition as possible. An algorithm is proposed for target configuration recognition using the locality preserving property and a Gaussian mixture distribution, which extracts the maximum desired information through a statistical model. The Gaussian mixture distribution has good adaptability, and its parameter estimation is relatively easy. It can approximate any distribution smoothly in theory and has been widely applied to many SAR image processing areas such as segmentation and classification [6]-[8]. Considering the existence of speckle noise in SAR images and the residual error of the model, a mixture of Gaussian distributions is used to describe the statistical property of the unwanted component of SAR images. The locality preserving property ensures that samples of the same configuration that are close to each other in the high-dimensional space remain close in the low-dimensional space. To preserve the local structure of the samples of the same configuration, the locality preserving property is embedded into the statistical model, which is of importance to SAR target configuration recognition.

The main steps of the proposed recognition algorithm are shown in Fig. 1. In the first step, the images are preprocessed to enhance the recognition performance [3], [4]. In the second step, the projection matrix is obtained by using the Gaussian mixture distribution statistical model embedding the locality preserving property, the SAR image to be recognized is projected by the projection matrix, and parameter estimation is implemented through the expectation-maximization (EM) algorithm [9]-[11]. In the last step, a nearest neighbor classifier [12] is applied to identify the target configuration. The effectiveness of the proposed algorithm is verified with experimental results, and comparisons with other algorithms further prove its advantages.
II. SAR TARGET CONFIGURATION
RECOGNITION ALGORITHM
As mentioned earlier, effective feature extraction is the precondition of accurate recognition. For a given SAR image sample $y_i$ $(i = 1, 2, \ldots, N)$, where $N$ is the number of SAR images used as the training data, we have [13]

$$y_i = Wx_i + m + n_i \qquad (1)$$

1545-598X/$31.00 © 2012 IEEE
LIU et al.: SAR TARGET CONFIGURATION RECOGNITION 269
Fig. 1. Flow diagram of the proposed algorithm.
where $W$ is the projection matrix and $x_i$ $(i = 1, 2, \ldots, N)$ is the reduced-dimensionality representation of $y_i$. $m$ is the mean of $y$, $y = \{y_1, y_2, \ldots, y_N\}$ is the original data set, and $n_i$ is the corresponding noise, or error. As can be seen, the original sample $y_i$ consists of both the useful component $x_i$ and the unwanted component $n_i$. The essence of feature extraction is to preserve the useful information as much as possible, which is helpful for recognition. How to eliminate the influence of $n = \{n_1, n_2, \ldots, n_N\}$ while preserving the desired $x = \{x_1, x_2, \ldots, x_N\}$ is the very point that feature extraction focuses on.

From a statistical point of view, the objective function can be expressed as

$$\arg\max_W p(x|y). \qquad (2)$$

The marginal distribution $p(y)$ does not directly influence the objective function, and through the Bayesian equation $p(x|y) = p(x)p(y|x)/p(y)$, we get

$$p(x|y) \propto p(x)p(y|x). \qquad (3)$$

Thus, the objective function in (2) can be updated as

$$\arg\max_W \left[p(x)p(y|x)\right] = \arg\max_W p(y, x). \qquad (4)$$
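As a concrete illustration, the generative model in (1) can be simulated with synthetic data. This is a minimal sketch: the dimensions and parameter values below are arbitrary choices for illustration, not values from the paper, and the noise is drawn from a single Gaussian rather than the mixture introduced later.

```python
import numpy as np

rng = np.random.default_rng(42)
D, d, N = 16, 3, 100          # original dim, reduced dim, sample count (arbitrary)

W = rng.normal(size=(D, d))   # projection matrix
m = rng.normal(size=D)        # mean of the data set
X = rng.normal(size=(N, d))   # low-dimensional features x_i, stacked row-wise
noise = rng.normal(scale=0.1, size=(N, D))  # n_i (single Gaussian, for brevity)

Y = X @ W.T + m + noise       # y_i = W x_i + m + n_i, one sample per row
assert Y.shape == (N, D)
```

Feature extraction then amounts to recovering the rows of `X` (and the matrix `W`) from the observed rows of `Y`.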
A. Gaussian Mixture Distribution for the Likelihood Function

Taking the special statistical property of SAR images into consideration, we describe the error (consisting of the speckle noise caused by SAR imaging and the residual error of the model) by a Gaussian mixture distribution, which can approximate any distribution smoothly in theory [6]. Utilizing (1), the likelihood function is given as

$$p(y_i|x_i) = \sum_{c=1}^{C} p(c)\, p(y_i|x_i, c) \qquad (5)$$

where $C$ is the number of Gaussian distributions, $p(y_i|x_i, c) \sim N(Wx_i + m + \mu_c,\ \sigma_c^2)$, $\mu_c$ and $\sigma_c^2$ are the corresponding mean and variance of $n_i$, respectively, $p(c)$ is the weight of the $c$th component $(c = 1, 2, \ldots, C)$, and we have $\sum_{c=1}^{C} p(c) = 1$.

Substituting $p(y_i|x_i, c)$ into (5), the likelihood function becomes

$$p(y_i|x_i) = \sum_{c=1}^{C} p(c) \left(\frac{1}{2\pi\sigma_c^2}\right)^{D/2} \exp\left[-\frac{(y_i - Wx_i - m - \mu_c)^T (y_i - Wx_i - m - \mu_c)}{2\sigma_c^2}\right] \qquad (6)$$

where $D$ is the dimensionality of the original samples and the superscript $T$ denotes the matrix transpose.
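The likelihood in (6) can be evaluated directly. The sketch below is a straightforward transcription of the formula, assuming isotropic per-component covariance $\sigma_c^2 I$ as the equation states; the sanity check verifies that with a single component and zero component mean, (6) collapses to a plain Gaussian density.

```python
import numpy as np

def mixture_likelihood(y, x, W, m, pc, mu, sigma2):
    """Evaluate p(y|x) as in (6): sum_c p(c) * N(y; W x + m + mu_c, sigma_c^2 I)."""
    D = y.shape[0]
    total = 0.0
    for c in range(len(pc)):
        r = y - W @ x - m - mu[c]                    # residual under component c
        norm = (2 * np.pi * sigma2[c]) ** (-D / 2)   # isotropic Gaussian normalizer
        total += pc[c] * norm * np.exp(-(r @ r) / (2 * sigma2[c]))
    return total

# sanity check: with C = 1 and mu_1 = 0, (6) reduces to a single Gaussian
D = 2
W = np.array([[1.0], [0.5]]); m = np.zeros(D)
y = np.array([0.3, 0.2]); x = np.array([0.1])
val = mixture_likelihood(y, x, W, m, pc=[1.0], mu=[np.zeros(D)], sigma2=[0.4])
r = y - W @ x - m
ref = (2 * np.pi * 0.4) ** (-D / 2) * np.exp(-(r @ r) / (2 * 0.4))
assert np.isclose(val, ref)
```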
B. Locality Preserving Property Embedded Into the Prior

As to the prior $p(x)$ in (4), it is usually taken as $p(x_i) \sim N(0, I)$ [11], where $I$ is an identity matrix:

$$p(x) = p(x_1, x_2, \ldots, x_N) = \prod_{i=1}^{N} p(x_i) = \prod_{i=1}^{N} \left(\frac{1}{2\pi}\right)^{d/2} \exp\left(-\frac{1}{2}\, x_i^T x_i\right) = \left(\frac{1}{2\pi}\right)^{Nd/2} \exp\left[-\frac{1}{2}\,\mathrm{tr}(x^T x)\right] \qquad (7)$$

where $d$ is the dimensionality of $x$ and $\mathrm{tr}(\cdot)$ represents the trace of a matrix.
In order to preserve the local structure of the data that is useful for configuration recognition, $p(x)$ is modified as

$$p(x) = p(x_1, x_2, \ldots, x_N) = \prod_{i=1}^{N} p(x_i) = \left(\frac{1}{2\pi}\right)^{Nd/2} \exp\left[-\frac{1}{2}\sum_{i,j=1}^{N} (x_i - x_j)^T S_{ij}\, (x_i - x_j)\right] = \left(\frac{1}{2\pi}\right)^{Nd/2} \exp\left[-\mathrm{tr}(xLx^T)\right] \qquad (8)$$
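The equality in (8) between the pairwise sum and the trace form is the standard graph-Laplacian identity $\frac{1}{2}\sum_{i,j} S_{ij}\|x_i - x_j\|^2 = \mathrm{tr}(xLx^T)$, which can be checked numerically. This is only a verification sketch with a random symmetric affinity, not part of the algorithm itself; $x$ is taken as a $d \times N$ matrix whose columns are the $x_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 6, 3
X = rng.normal(size=(d, N))                # columns are low-dimensional features x_i
S = rng.random((N, N)); S = (S + S.T) / 2  # random symmetric affinity
np.fill_diagonal(S, 0.0)

H = np.diag(S.sum(axis=1))                 # diagonal degree matrix, H_ii = sum_j S_ij
Lap = H - S                                # graph Laplacian L = H - S

pairwise = 0.5 * sum(S[i, j] * np.sum((X[:, i] - X[:, j]) ** 2)
                     for i in range(N) for j in range(N))
trace_form = np.trace(X @ Lap @ X.T)
assert np.isclose(pairwise, trace_form)    # the two forms in (8) agree
```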
where $L = H - S$ is the Laplacian matrix, $H$ is a diagonal matrix whose entries $H_{ii} = \sum_j S_{ij}$ (or $H_{ii} = \sum_i S_{ij}$, since $S$ is symmetric) are column sums of $S$, and $S$ is the affinity matrix that reflects the similarity between any two samples. It is constructed as

$$S_{ij} = \begin{cases} \exp\left(-\dfrac{\|y_i - y_j\|^2}{t}\right), & y_i, y_j \text{ belong to the same class} \\ 0, & y_i, y_j \text{ belong to different classes} \end{cases} \qquad (9)$$

where $t$ is a constant.
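A minimal sketch of constructing $S$ per (9); the labels and kernel width `t` are illustrative inputs. Diagonal entries are harmless for the prior in (8), since $x_i - x_i = 0$ makes their contribution vanish.

```python
import numpy as np

def affinity_matrix(Y, labels, t=1.0):
    """Class-conditional heat-kernel affinity of (9).

    S_ij = exp(-||y_i - y_j||^2 / t) when y_i and y_j share a class label,
    and 0 otherwise.
    """
    N = len(Y)
    S = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            if labels[i] == labels[j]:
                S[i, j] = np.exp(-np.sum((Y[i] - Y[j]) ** 2) / t)
    return S

Y = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
S = affinity_matrix(Y, labels=[0, 0, 1])
assert S[0, 2] == 0.0          # different classes get zero weight
assert S[0, 1] > 0.0           # same class gets a heat-kernel weight
assert np.allclose(S, S.T)     # S is symmetric
```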
Here, we take a closer look at (8). Its physical implication can be seen in this way: provided that the samples $y_i$ and $y_j$ are close, (8) guarantees that the closer $x_i$ and $x_j$ are, the larger $p(x)$ will be, and vice versa. In other words, samples that are close in the high-dimensional space remain close in the low-dimensional space through the construction of the prior in (8). Hence, the local structure of the data set is preserved, much as in the locality preserving projection (LPP) of [14]; in effect, we have fused the objective function presented in [14] into the prior. In addition, as can be seen from (9), the closer $y_i$ and $y_j$ are, the larger $S_{ij}$ will be. Since (4) demands that $p(x)$ be as large as possible, when $S_{ij}$ is large, implying that $y_i$ and $y_j$ are close to each other, the value of $(x_i - x_j)^T S_{ij}(x_i - x_j)$ has to be small to be in accordance with the objective function. This means that $x_i$ and $x_j$ should be even closer after dimensionality reduction, which is just what we expect.

Note that the matrix $S$ used here is different from that in [14]. The $S$ constructed here establishes relationships among all the samples that belong to the same class, while the weights between samples that belong to different classes are all set to zero. This improvement preserves the global topological structure of the data. Owing to the full use of the given labels, both the local and global topological properties of the data are retained.
C. Parameter Estimation Using the EM Algorithm

The objective function in (4) is equivalent to

$$\arg\max_W p(y, x) \;\Leftrightarrow\; \arg\max_W \left[\ln p(y, x)\right] = \arg\max_W \sum_{i=1}^{N} \ln p(y_i, x_i). \qquad (10)$$

In this part, we come to the solution of the parameters using the EM algorithm [9]-[11]. Let $P_{ic}$ (the missing data in this situation) denote the indicator variable labeling which component is responsible for generating $n_i$; we can then derive an EM algorithm by considering the corresponding log-likelihood function of the complete data, which takes the form

$$\mathcal{L} = \sum_{c=1}^{C} \sum_{i=1}^{N} P_{ic} \ln\left[p(c)\, p(y_i, x_i|c)\right]. \qquad (11)$$
In the expectation step, taking the expectation of $\mathcal{L}$ with respect to the posterior distributions and neglecting the constant terms that do not influence the results, we have

$$\Phi = \sum_{c=1}^{C} \sum_{i=1}^{N} P_{ic} \left[\ln p(c) - \frac{D}{2}\ln\sigma_c^2 - \frac{1}{2\sigma_c^2}\|y_i - Wx_i - m - \mu_c\|^2 - \frac{1}{2}\sum_{j=1}^{N} (x_i - x_j)^T S_{ij}\, (x_i - x_j)\right] \qquad (12)$$
where $P_{ic} = p(n_i|c)p(c)\big/\sum_{c=1}^{C} p(n_i|c)p(c) = p(n_i|c)p(c)/p(n_i)$ is the posterior responsibility of component $c$ for generating $n_i$, with $p(n_i|c) \sim N(\mu_c, \sigma_c^2)$.

Then, we calculate the expectation of $x$ through the derivative of the function in (12)

$$x_i = \left[\sum_{c=1}^{C} P_{ic}\,\frac{1}{\sigma_c^2}\, W^T W + (H_{ii} - 1)I\right]^{-1} \left[\sum_{c=1}^{C} P_{ic}\,\frac{1}{\sigma_c^2}\, W^T (y_i - m - \mu_c) + \xi_0\right] \qquad (13)$$

where $\xi_0 = \sum_{j=1, j \neq i}^{N} x_j S_{ij}$.
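The expectation step can be sketched as follows: responsibilities $P_{ic}$ are computed in log space for numerical stability, and each $x_i$ is then updated by the closed form of (13) as reconstructed above. All sizes and parameter values here are arbitrary illustrative choices, and the affinity $S$ is random rather than built from (9).

```python
import numpy as np

rng = np.random.default_rng(1)
D, d, N, C = 8, 2, 6, 2                     # arbitrary illustrative sizes

W = rng.normal(size=(D, d))
m = rng.normal(size=D)
mu = rng.normal(scale=0.1, size=(C, D))     # component means mu_c
sigma2 = np.array([0.5, 1.5])               # component variances sigma_c^2
pc = np.array([0.4, 0.6])                   # component weights p(c)
Y = rng.normal(size=(N, D))
X = rng.normal(size=(N, d))                 # current estimates of the x_i

S = rng.uniform(0.5, 1.0, size=(N, N)); S = (S + S.T) / 2  # stand-in affinity
H = np.diag(S.sum(axis=1))

def log_gauss(n, mu_c, s2):
    # log N(n; mu_c, s2 * I)
    return -0.5 * (D * np.log(2 * np.pi * s2) + np.sum((n - mu_c) ** 2) / s2)

# responsibilities P_ic = p(n_i|c) p(c) / p(n_i)
P = np.zeros((N, C))
for i in range(N):
    n_i = Y[i] - W @ X[i] - m
    logp = np.array([np.log(pc[c]) + log_gauss(n_i, mu[c], sigma2[c])
                     for c in range(C)])
    logp -= logp.max()                       # stabilize before exponentiating
    P[i] = np.exp(logp) / np.exp(logp).sum()

# update each x_i by the closed form of (13)
for i in range(N):
    w = sum(P[i, c] / sigma2[c] for c in range(C))
    A = w * (W.T @ W) + (H[i, i] - 1) * np.eye(d)
    xi0 = sum(S[i, j] * X[j] for j in range(N) if j != i)
    b = sum(P[i, c] / sigma2[c] * (W.T @ (Y[i] - m - mu[c]))
            for c in range(C)) + xi0
    X[i] = np.linalg.solve(A, b)

assert np.allclose(P.sum(axis=1), 1.0)      # responsibilities sum to one
assert np.all(np.isfinite(X))
```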
TABLE I
CONFIGURATIONS AND SIZES OF THE TRAINING AND TESTING DATA SETS
In the maximization step, $\Phi$ is maximized with respect to $\mu_c$, $\sigma_c^2$, and $W$ to get the updated values with the newly obtained $x$. As to $p(c)$, we can take advantage of a Lagrange multiplier $\lambda$; the new value of $p(c)$ is obtained by maximizing

$$\Phi + \lambda\left(1 - \sum_{c=1}^{C} p(c)\right). \qquad (14)$$
The newly obtained parameters are

$$p(c) = \frac{1}{N}\sum_{i=1}^{N} P_{ic}$$

$$\mu_c = \frac{1}{\sum_{i=1}^{N} P_{ic}} \sum_{i=1}^{N} P_{ic}\,(y_i - Wx_i - m)$$

$$\sigma_c^2 = \frac{1}{D\sum_{i=1}^{N} P_{ic}} \sum_{i=1}^{N} P_{ic}\,\|y_i - Wx_i - m - \mu_c\|^2$$

$$W = \left[\sum_{c=1}^{C}\sum_{i=1}^{N} P_{ic}\,\frac{1}{\sigma_c^2}\,(y_i - m - \mu_c)\,x_i^T\right] \left[\sum_{c=1}^{C}\sum_{i=1}^{N} P_{ic}\,\frac{1}{\sigma_c^2}\,x_i x_i^T\right]^{-1}. \qquad (15)$$
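The maximization-step updates in (15) translate directly into weighted averages. The sketch below uses synthetic responsibilities and data with arbitrary sizes, purely to show the arithmetic of each update.

```python
import numpy as np

rng = np.random.default_rng(2)
D, d, N, C = 8, 2, 6, 2

W = rng.normal(size=(D, d))
m = rng.normal(size=D)
Y = rng.normal(size=(N, D))
X = rng.normal(size=(N, d))
P = rng.random((N, C)); P /= P.sum(axis=1, keepdims=True)  # responsibilities

# p(c): average responsibility of component c
pc = P.mean(axis=0)

# mu_c: responsibility-weighted mean of the residuals y_i - W x_i - m
R = Y - X @ W.T - m                          # rows are residuals
mu = (P.T @ R) / P.sum(axis=0)[:, None]

# sigma_c^2: responsibility-weighted mean squared residual, normalized by D
sigma2 = np.array([
    (P[:, c] * np.sum((R - mu[c]) ** 2, axis=1)).sum() / (D * P[:, c].sum())
    for c in range(C)
])

# W: ratio of the two responsibility-weighted moment matrices in (15)
num = sum(P[i, c] / sigma2[c] * np.outer(Y[i] - m - mu[c], X[i])
          for c in range(C) for i in range(N))
den = sum(P[i, c] / sigma2[c] * np.outer(X[i], X[i])
          for c in range(C) for i in range(N))
W_new = num @ np.linalg.inv(den)

assert np.isclose(pc.sum(), 1.0)
assert np.all(sigma2 > 0)
assert W_new.shape == (D, d)
```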
The algorithm iterates these two steps until convergence. The parameters thus obtained are ready for identification of the target configuration through the use of a nearest neighbor classifier [12].
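The final classification stage is a plain 1-nearest-neighbor rule in the projected feature space; a minimal sketch with toy features follows.

```python
import numpy as np

def nearest_neighbor_predict(train_feats, train_labels, test_feats):
    """Assign each projected test sample the label of its nearest training sample."""
    preds = []
    for z in test_feats:
        d2 = np.sum((train_feats - z) ** 2, axis=1)  # squared Euclidean distances
        preds.append(train_labels[int(np.argmin(d2))])
    return np.array(preds)

train = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
labels = np.array([0, 0, 1])
test = np.array([[0.2, 0.1], [4.8, 5.1]])
assert nearest_neighbor_predict(train, labels, test).tolist() == [0, 1]
```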
III. EXPERIMENTAL RESULTS AND ANALYSIS
Experimental results on the Moving and Stationary Target Acquisition and Recognition database, provided by the Defense Advanced Research Projects Agency/Air Force Research Laboratory of the U.S. [1], verify the effectiveness of the proposed algorithm. The armored personnel carrier BMP2 comprises three different configurations, which are sn-9563, sn-9566, and sn-c21; the configurations of the main battle tank T72 are sn-132, sn-812, and sn-s7; and the armored personnel carrier BTR70 has only one configuration, sn-c71. The SAR images obtained at a depression angle of 17° are
TABLE II
PERFORMANCE OF TARGET-TYPE RECOGNITION UNDER DIFFERENT ALGORITHMS
Fig. 2. Recognition rate of each configuration versus dimensionality.
TABLE III
PERFORMANCE OF TARGET CONFIGURATION RECOGNITION UNDER DIFFERENT ALGORITHMS
used as the training data set, and the SAR images obtained
at a depression angle of 15° and 360°