Presented at ASAP 99
Outline
- Autofocus algorithms
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks
Introduction to SAR

[Figure: SAR data-collection geometry, showing the aperture, range, and cross-range axes, with the collected data forming the SAR image.]

MIT Lincoln Laboratory
Error Sources
- Off-track motion
- Terrain height variation
- Index of refraction
- Antenna pattern

[Diagram: Ideal Image → Blurring Filter B(φ₁, φ₂, …, φ_p) → Blurred Image → Autofocus Filter H(φ̂₁, φ̂₂, …, φ̂_p) → Autofocused Image, with H ≈ B⁻¹.]
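The blur/autofocus model above can be sketched in a few lines: B applies a frequency-domain phase error, and H applies the conjugate phases estimated by autofocus. This is a minimal one-dimensional sketch (the quadratic error and its coefficient are illustrative, not from the slides):

```python
import numpy as np

def blur(x, phi):
    """Blurring filter B: apply a transform-domain phase error phi(k)."""
    return np.fft.ifft(np.exp(1j * phi) * np.fft.fft(x))

def autofocus(y, phi_hat):
    """Autofocus filter H: remove the estimated phase error (H = B^-1 when phi_hat = phi)."""
    return np.fft.ifft(np.exp(-1j * phi_hat) * np.fft.fft(y))

rng = np.random.default_rng(0)
N = 128
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # one line of the ideal image
k = np.arange(N) - N // 2
phi = 1e-3 * k**2                                         # illustrative quadratic phase error
y = blur(x, phi)                                          # blurred image
x_hat = autofocus(y, phi)                                 # perfect estimate recovers x exactly
```

With a perfect phase estimate, H inverts B exactly; autofocus is the problem of finding φ̂ from the blurred data alone.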
Outline
- Autofocus algorithms
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks
- Sensitive measure of image focus quality
- Smooth dependence on autofocus parameters facilitates numerical minimization
- No specific target or clutter model required
- Extensive literature on blind deconvolution using entropy:
  - Wiggins: minimum entropy deconvolution (MED) for seismic reflection data (1977)
  - Shalvi & Weinstein: entropy-based deconvolution converges to the correct solution under fairly general conditions (1990)
    - Data must be non-Gaussian
    - Transfer function of the blurring filter must not have zeros
  - Cafforio, Prati, and Rocca: minimum entropy for Seasat SAR autofocus (1991)
[Diagram: minimum-entropy autofocus loop. Input image → autofocus filter H(φ) → compute entropy S and gradients ∂S/∂φ_i → numerical minimizer updates the φ_i, testing for ∂S/∂φ_i = 0 → Output image.]
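The loop above can be sketched with the simplest possible minimizer, a grid search over a single phase-error parameter. The sparse test scene, the quadratic error model, and all parameter values here are illustrative assumptions, not from the slides:

```python
import numpy as np

def entropy(img):
    """Shannon entropy of pixel powers, the focus metric to minimize."""
    p = np.abs(img)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(1)
N = 256
k = np.arange(N) - N // 2

# Hypothetical sparse scene: a few bright point scatterers
x = np.zeros(N, dtype=complex)
x[rng.choice(N, 5, replace=False)] = rng.standard_normal(5) + 5.0

alpha_true = 2e-4                       # quadratic phase-error coefficient (illustrative)
blurred = np.fft.ifft(np.exp(1j * alpha_true * k**2) * np.fft.fft(x))

# "Numerical minimizer": evaluate S over a grid of trial corrections, keep the minimum
grid = np.linspace(-5e-4, 5e-4, 201)
scores = [entropy(np.fft.ifft(np.exp(-1j * a * k**2) * np.fft.fft(blurred)))
          for a in grid]
alpha_hat = grid[int(np.argmin(scores))]
```

A real implementation would use the gradient ∂S/∂φ_i with a quasi-Newton minimizer rather than a grid, but the entropy-versus-parameter curve being minimized is the same.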
The Shannon entropy of image X is

    S(X) = − Σ_{nm} p_nm log p_nm,

where

    p_nm = |x_nm|² / P    and    P = total power = Σ_{nm} |x_nm|².
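This definition translates directly into code; a minimal sketch:

```python
import numpy as np

def shannon_entropy(x):
    """S(X) = -sum_{nm} p_nm log p_nm, with p_nm = |x_nm|^2 / P (P = total power)."""
    power = np.abs(x)**2
    p = power / power.sum()
    p = p[p > 0]                      # drop empty pixels: 0 * log 0 = 0 by convention
    return -(p * np.log(p)).sum()
```

A uniform-intensity image of N pixels attains the maximum, log N, while an image with all power in one pixel attains the minimum, 0 — which is why low entropy corresponds to a well-focused (concentrated) image.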
Rényi entropy generalizes Shannon entropy to a family of entropy functions smoothly parameterized by r ≥ 0:

    S_r(X) = (1 / (1 − r)) log Σ_{nm} p_nm^r

As r → 1, Rényi → Shannon.
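The r → 1 limit is easy to check numerically; a short sketch (the test image is arbitrary random data):

```python
import numpy as np

def pixel_probs(x):
    """Normalized pixel powers p_nm, with empty pixels dropped."""
    p = np.abs(x)**2
    p = p / p.sum()
    return p[p > 0]

def renyi_entropy(x, r):
    """S_r(X) = (1/(1-r)) * log(sum p_nm^r), defined for r >= 0, r != 1."""
    p = pixel_probs(x)
    return np.log((p**r).sum()) / (1.0 - r)

def shannon_entropy(x):
    p = pixel_probs(x)
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
img = rng.standard_normal((32, 32))
gap = abs(renyi_entropy(img, 1.0 + 1e-5) - shannon_entropy(img))   # ~0 as r -> 1
```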
Gradient Computation via FFTs

    x_n → FFT → x̃_k → multiply by h̃_k = e^{i φ_p f_p(k)} → ỹ_k → FFT⁻¹ → y_n

Form z_n = y_n log p_n (so that S = −(1/P) Σ_n Re{ z_n y_n* }) and take its FFT to get z̃_k; then

    ∂S/∂φ_p = (2 / (N P)) Σ_k f_p(k) Im{ ỹ_k z̃_k* }

(P = 1 if the image is normalized to unit total power.)

ASAP99-2 AFY 4/2/99
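The FFT chain above can be verified against finite differences. This sketch follows NumPy's FFT sign and normalization conventions (an assumption — the slides' conventions may differ by a sign), with an illustrative quadratic basis function f_p(k):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # one image line
k = np.arange(N)
f = (k - N / 2)**2            # basis function f_p(k): quadratic phase (illustrative)

def entropy(y):
    p = np.abs(y)**2
    p = p / p.sum()
    return -(p * np.log(p)).sum()

def corrected(phi):
    """Apply the phase correction h~_k = exp(i*phi*f(k)) in the transform domain."""
    return np.fft.ifft(np.exp(1j * phi * f) * np.fft.fft(x))

phi0 = 1e-4
y = corrected(phi0)
P = (np.abs(y)**2).sum()
z = y * np.log(np.abs(y)**2 / P)   # z_n = y_n log p_n
y_t = np.fft.fft(y)                # y~_k
z_t = np.fft.fft(z)                # z~_k

# Analytic gradient: dS/dphi = (2/(N P)) * sum_k f(k) * Im{ y~_k conj(z~_k) }
grad = 2.0 / (N * P) * (f * np.imag(y_t * np.conj(z_t))).sum()

# Central finite-difference check of the same derivative
eps = 1e-7
fd = (entropy(corrected(phi0 + eps)) - entropy(corrected(phi0 - eps))) / (2 * eps)
```

The analytic gradient costs three FFTs per evaluation, which is what makes entropy practical as an autofocus objective for a numerical minimizer.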
PGA (eigenvector formulation):

[Diagram: data matrix Z → sample covariance R = Z Zᴴ → v = top eigenvector of R → unit-modulus phase correction h_i = v_i / |v_i| → apply via FFT.]
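The eigenvector step sketched above can be simulated directly. A minimal sketch, assuming (as the PGA model does) that every range bin sees the same aperture phase error; the matrix sizes and noise level are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
K, M = 128, 64                        # K aperture samples, M range bins
phi = rng.uniform(-np.pi, np.pi, K)   # common aperture phase error
a = rng.standard_normal(M) + 1j * rng.standard_normal(M)   # per-bin complex amplitudes

# Each column of Z is one range bin observed through the same phase error
Z = np.outer(np.exp(1j * phi), a)
Z += 0.05 * (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M)))

R = Z @ Z.conj().T                    # K x K sample covariance R = Z Z^H
vals, vecs = np.linalg.eigh(R)
v = vecs[:, -1]                       # top eigenvector (eigh sorts eigenvalues ascending)
h = v / np.abs(v)                     # unit-modulus phase estimate h_i = v_i / |v_i|

# h should align with exp(i*phi) up to an irrelevant constant phase
align = abs(np.vdot(h, np.exp(1j * phi))) / K
```

In the noise-free case R is exactly rank one with top eigenvector proportional to e^{iφ}, so the eigenvector recovers the phase error up to a constant offset.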
PGA
- Makes strong assumptions about clutter (point scatterers)
- Makes weak assumptions about filter (constant-modulus transfer function)

Minimum Entropy
- Makes weak assumptions about clutter (non-Gaussian)
- Makes strong assumptions about filter (parameterized filter based on phase-error model)
Outline
- Autofocus algorithms
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks
[Figure: three example images with entropies S = 10.585, S = 11.245, and S = 11.625.]

Scale invariance
Permutation invariance

[Figure: permuting image blocks (1 4 2 5 3 6 → 2 1 6 4 5 3) leaves the entropy unchanged.]
Subadditivity of Entropy

Partition the image into subimages k, each holding fraction q_k = P_k / P of the total power, with subimage entropy S_k. Then

    S = Σ_k q_k S_k − Σ_k q_k log q_k

- First term is a weighted average of the subimage entropies
- Second term is the entropy among subimages
- The Shannon entropy is the only image function with subadditivity, scale invariance, and permutation invariance
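The decomposition above is an exact identity, which a short sketch can confirm on arbitrary data (the 8×8 image and 2×2 block split are illustrative):

```python
import numpy as np

def shannon(p):
    """Entropy of a probability array (zero entries dropped)."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))**2          # pixel powers |x_nm|^2
p = img / img.sum()

# Partition into four 4x4 subimages
blocks = [p[i:i+4, j:j+4] for i in (0, 4) for j in (0, 4)]
q = np.array([b.sum() for b in blocks])                   # power fraction q_k per subimage
S_sub = np.array([shannon((b / b.sum()).ravel()) for b in blocks])   # subimage entropies S_k

lhs = shannon(p.ravel())                      # whole-image entropy S
rhs = (q * S_sub).sum() - (q * np.log(q)).sum()
```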
Entropy of Noise

- Assume pixels are I.I.D. random variables
- Expected entropy of a pure-noise image with N pixels is [formula shown in original figure]
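The slide's formula was not recovered, but under the further assumption of IID complex Gaussian pixels (so pixel powers are exponentially distributed), the large-N expected entropy works out to log N − (1 − γ), with γ the Euler–Mascheroni constant. A Monte Carlo sketch of that assumed result:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
w = rng.exponential(size=N)        # pixel powers |x_n|^2 of IID complex Gaussian noise
p = w / w.sum()
S = -(p * np.log(p)).sum()         # Shannon entropy of one noise image

gamma = 0.5772156649
predicted = np.log(N) - (1.0 - gamma)   # asymptotic mean under the IID-Gaussian assumption
```

The point of the comparison: a pure-noise image sits near the entropy maximum log N, so any structure that autofocus recovers pulls the entropy measurably below this level.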
Outline
- Autofocus algorithms
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks
[Figure: entropy vs. quadratic phase error, −3000° to 3000°; the curve is minimized at zero phase error.]

Assume a point scatterer centered at t₀, with quadratic phase error and a Gaussian spectral window. The image intensity is a Gaussian pulse whose width grows with the phase-error coefficient:

    |x(t)|² ∝ exp( −(t − t₀)² / (σ² + β²) )

The entropy grows with the pulse width, so it is minimized at zero quadratic phase error.
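A numerical version of this example: a point scatterer seen through a Gaussian spectral window, with entropy evaluated at zero and nonzero quadratic phase error. The window width and error coefficients are illustrative choices, not the slide's values:

```python
import numpy as np

def entropy(img):
    p = np.abs(img)**2
    p = p / p.sum()
    p = p[p > 1e-300]                 # drop underflowed tail pixels
    return -(p * np.log(p)).sum()

N = 512
k = np.arange(N) - N // 2
window = np.exp(-k**2 / (2 * (N / 8)**2))     # Gaussian spectral window (illustrative width)

def image(alpha):
    """Point-scatterer image with quadratic phase error alpha * k^2 across the window."""
    return np.fft.ifft(window * np.exp(1j * alpha * k**2))

S0 = entropy(image(0.0))      # focused
S1 = entropy(image(5e-4))     # defocused
S2 = entropy(image(-5e-4))    # defocused the other way
```

As the slide's curve shows, the entropy is symmetric in the quadratic error and dips to its minimum at zero.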
Height-of-Focus Example

- Use image from CARABAS VHF SAR
- Apply height-of-focus correction filter based on known aircraft motion
- Plot Shannon entropy as a function of terrain height parameter

[Figure: focused CARABAS image, and Shannon entropy vs. terrain height parameter.]
Outline
- Autofocus algorithms
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks
[Figure: original image, minimum-entropy autofocus result, and PGA result, side by side.]
[Figure: estimated phase error (degrees) across the aperture; original image vs. minimum-entropy autofocus result.]
Concluding Remarks

- Minimum entropy can outperform PGA at the expense of greater computation
- Minimum entropy has the most benefit over PGA under certain circumstances:
  - Low-contrast clutter
  - Well-modeled phase errors
  - Low-dimensional parameter space for phase errors
  - Severe phase errors