Minimum Entropy SAR Autofocus


Ali F. Yegulalp
MIT Lincoln Laboratory
10 March 1999

Presented at ASAP 99

Outline

- Introduction to SAR and the autofocus problem
- Autofocus algorithms
  - Minimum entropy autofocus
  - Phase gradient autofocus (PGA)
- Properties of entropy
- Examples of the entropy curve
- Applications and comparison with PGA
- Concluding remarks

Introduction to SAR

[Figure: SAR data collection geometry. The radar flies along an aperture, recording phase history data versus range; a matched filter bank converts the phase history data into a SAR image with range and cross-range axes.]
The SAR Autofocus Problem

Error sources:
- Off-track motion
- Terrain height variation
- Index of refraction
- Antenna pattern

Signal model: Ideal Image $\rightarrow$ Blurring Filter $B(\phi_1, \phi_2, \dots, \phi_p)$ $\rightarrow$ Blurred Image.

The autofocus algorithm estimates $\hat\phi_1, \hat\phi_2, \dots, \hat\phi_p$ from the blurred image and applies an autofocus filter

  $H(\hat\phi_1, \hat\phi_2, \dots, \hat\phi_p) \approx B^{-1}$

to produce the autofocused image.
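To make the filter model concrete, here is a minimal NumPy sketch, assuming a polynomial phase-error model applied along the cross-range frequency (aperture) axis; the function names and the basis choice $f_p(k) = k^{p+1}$ are illustrative, not from the original presentation.

import numpy as np

def apply_phase_error(image, coeffs):
    """Blur (or correct) a complex SAR image by applying a polynomial
    phase error across the cross-range frequency (aperture) axis.

    image  : 2-D complex array (range x cross-range)
    coeffs : phase coefficients phi_1..phi_p in radians, for basis
             functions f_p(k) = k**(p+1) over normalized aperture k
    """
    n = image.shape[1]
    k = np.linspace(-0.5, 0.5, n, endpoint=False)       # normalized aperture position
    phase = sum(c * k**(p + 1) for p, c in enumerate(coeffs))
    spec = np.fft.fft(image, axis=1)                    # image -> aperture domain
    spec = spec * np.exp(1j * np.fft.ifftshift(phase))  # phase-only transfer function
    return np.fft.ifft(spec, axis=1)                    # aperture -> image domain

def autofocus_filter(image, coeffs_hat):
    """H(phi_hat) = B(phi_hat)^(-1): a phase-only filter is inverted
    by negating (conjugating) its phase."""
    return apply_phase_error(image, [-c for c in coeffs_hat])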


Reasons for Using Entropy

- Sensitive measure of image focus quality
- Smooth dependence on autofocus parameters facilitates numerical minimization
- No specific target or clutter model required
- Extensive literature on blind deconvolution using entropy:
  - Wiggins: minimum entropy deconvolution (MED) for seismic reflection data (1977)
  - Shalvi & Weinstein: entropy-based deconvolution converges to the correct solution under fairly general conditions (1990), provided the data are non-Gaussian and the transfer function of the blurring filter has no zeros
  - Cafforio, Prati, and Rocca: minimum entropy for Seasat SAR autofocus (1991)

Minimum Entropy Algorithm

[Diagram: the blurred image is the input to a numerical minimizer, which adjusts the phase parameters $\phi_i$ to minimize the image entropy $S$, testing for a stationary point $\partial S / \partial \phi_i = 0$; the output is the focused image.]
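A minimal sketch of this outer loop in 1-D (names are illustrative); it uses a derivative-free minimizer for simplicity, whereas the presentation's minimizer uses the analytic gradient given two slides below.

import numpy as np
from scipy.optimize import minimize

def entropy(img):
    """Shannon entropy S(X) (defined on the next slide)."""
    p = np.abs(img)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def min_entropy_autofocus(blurred, basis):
    """Minimize image entropy over phase parameters phi.

    blurred : 1-D complex image
    basis   : basis matrix, basis[p, k] = f_p(k)
    """
    spec = np.fft.fft(blurred)
    def cost(phi):
        # Apply the candidate correction filter (conjugate phase) and score it.
        return entropy(np.fft.ifft(spec * np.exp(-1j * (phi @ basis))))
    res = minimize(cost, np.zeros(basis.shape[0]), method="Nelder-Mead")
    return np.fft.ifft(spec * np.exp(-1j * (res.x @ basis))), res.x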

Definition of Image Entropy

For an image X with complex-valued pixels $x_{nm}$, the Shannon entropy is

  $S(X) = -\sum_{nm} p_{nm} \log p_{nm}$

where

  $p_{nm} = |x_{nm}|^2 / P$  and  $P$ = total power = $\sum_{nm} |x_{nm}|^2$.

The Renyi entropy generalizes the Shannon entropy to a family of entropy functions smoothly parameterized by $r > 0$:

  $S_r(X) = \frac{1}{1 - r} \log \sum_{nm} p_{nm}^r$

As $r \to 1$, Renyi $\to$ Shannon. For $r \neq 1$ there is no obvious information-theoretic interpretation.
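A direct transcription of these definitions as a sketch (the test image here is arbitrary):

import numpy as np

def shannon_entropy(image):
    """S(X) = -sum_nm p_nm log p_nm, with p_nm = |x_nm|^2 / total power."""
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]                    # convention: 0 log 0 = 0
    return -(p * np.log(p)).sum()

def renyi_entropy(image, r):
    """S_r(X) = log(sum_nm p_nm^r) / (1 - r), for r > 0, r != 1."""
    p = np.abs(image)**2
    p = p / p.sum()
    return np.log((p**r).sum()) / (1.0 - r)

# As r -> 1, Renyi -> Shannon:
x = np.random.randn(64, 64) + 1j * np.random.randn(64, 64)
print(shannon_entropy(x))
print(renyi_entropy(x, 1.001))      # nearly identical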

Entropy and Gradient Calculation

Processing flow: $x_n \xrightarrow{\mathrm{FFT}} \tilde{x}_k$, multiply by the trial phase filter $\tilde{h}_k = e^{i \sum_p \phi_p f_p(k)}$ to get $\tilde{y}_k$, then $\tilde{y}_k \xrightarrow{\mathrm{IFFT}} y_n$.

With $P = \sum_n |y_n|^2$ (invariant under the phase-only filter), $z_n = y_n \log(|y_n|^2 / P)$, and $\tilde{z}_k$ the FFT of $z_n$, the entropy gradient is

  $\dfrac{\partial S}{\partial \phi_p} = \dfrac{2}{NP} \sum_k f_p(k) \, \mathrm{Im}\{\tilde{y}_k \tilde{z}_k^*\}$
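A 1-D sketch of the combined entropy/gradient evaluation following the flow above (names are illustrative; the formula can be sanity-checked by perturbing one $\phi_p$ by ~1e-6 and comparing against a finite difference):

import numpy as np

def entropy_and_gradient(x_tilde, phi, f):
    """Return S and dS/dphi_p for the phase-corrected image.

    x_tilde : spectrum of the blurred 1-D image, length N
    phi     : phase parameters, shape (p,)
    f       : basis matrix with f[p, k] = f_p(k), shape (p, N)
    """
    N = x_tilde.size
    y_tilde = np.exp(1j * (phi @ f)) * x_tilde      # apply trial phase filter
    y = np.fft.ifft(y_tilde)                        # back to image domain
    I = np.abs(y)**2
    P = I.sum()                                     # invariant since |h_k| = 1
    logterm = np.log(np.maximum(I, 1e-300) / P)     # floor avoids log(0)
    S = -((I / P) * logterm).sum()
    z = y * logterm                                 # z_n = y_n log(|y_n|^2 / P)
    z_tilde = np.fft.fft(z)
    grad = (2.0 / (N * P)) * (f * np.imag(y_tilde * np.conj(z_tilde))).sum(axis=1)
    return S, grad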

Phase Gradient Autofocus

[Diagram: input image -> find the brightest point in each range bin -> circularly shift it to center -> FFT to the aperture domain to form data matrix Z -> form $R = Z Z^H$ -> take the top eigenvector $v$ of $R$ -> transfer-function estimate $h_i = v_i / |v_i|$ -> apply phase correction -> IFFT -> output image.]
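A minimal sketch of one PGA pass (the eigenvector variant), with rows as range bins; the windowing step around each centered scatterer that full PGA implementations include is omitted for brevity, and whether the covariance reads $Z Z^H$ or $Z^H Z$ depends on how Z is laid out.

import numpy as np

def pga_iteration(image):
    """One phase gradient autofocus pass on a complex image
    (rows = range bins, columns = cross-range)."""
    n_rng, n_cr = image.shape
    # Find the brightest point in each range bin and shift it to center.
    centered = np.empty_like(image)
    for r in range(n_rng):
        peak = np.argmax(np.abs(image[r]))
        centered[r] = np.roll(image[r], n_cr // 2 - peak)
    # FFT each row to the aperture domain to form the data matrix Z.
    Z = np.fft.fft(centered, axis=1)
    # Sample covariance over range bins; the top eigenvector carries
    # the common phase error.
    R = Z.conj().T @ Z
    _, V = np.linalg.eigh(R)            # eigenvalues in ascending order
    v = V[:, -1]
    h = v / np.abs(v)                   # constant-modulus estimate of the blur
    # Phase correction: multiply the aperture data by conj(h), IFFT back.
    return np.fft.ifft(np.fft.fft(image, axis=1) * np.conj(h), axis=1)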

Philosophical Comparison of PGA and Minimum Entropy

PGA:
- Makes strong assumptions about clutter (point scatterers)
- Makes weak assumptions about the filter (constant-modulus transfer function)

Minimum entropy:
- Makes weak assumptions about clutter (non-Gaussian)
- Makes strong assumptions about the filter (parameterized filter based on a phase error model)


Examples of Image Entropy

[Figure: six example images and their Shannon entropies: S = 0, S = log(3), S = log(7), S = 10.585, S = 11.245, S = 11.625.]
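The first three values are what one gets when all image power sits in 1, 3, or 7 equal-strength pixels (my reading of the figure); a quick check:

import numpy as np

def shannon_entropy(image):
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

img = np.zeros((16, 16), dtype=complex)
img[0, 0] = 1.0
print(shannon_entropy(img), 0.0)            # one pixel: S = 0
img[0, 1] = img[0, 2] = 1.0
print(shannon_entropy(img), np.log(3))      # three equal pixels: S = log 3
img[0, 3:7] = 1.0
print(shannon_entropy(img), np.log(7))      # seven equal pixels: S = log 7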

Invariance Properties of Entropy

- Scale invariance: multiplying every pixel by a constant leaves the entropy unchanged, since the $p_{nm}$ are normalized by total power.
- Permutation invariance: rearranging the pixels (the figure reorders pixels 1 4 2 5 3 6 as 2 1 6 4 5 3) leaves the entropy unchanged, since $S$ depends only on the set of $p_{nm}$ values, not on their positions.
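Both properties follow directly from the definition and are easy to verify numerically (sketch):

import numpy as np

def shannon_entropy(image):
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))

print(shannon_entropy(x))
print(shannon_entropy(7.3 * x))                     # scale invariance
print(shannon_entropy(rng.permutation(x.ravel())))  # permutation invariance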

Subadditivity of Entropy

Partition the image into regions indexed by k. Then

  $S = \sum_k \alpha_k S_k - \sum_k \alpha_k \log \alpha_k$

where $S_k$ = entropy of region k and $\alpha_k$ = fraction of total power in region k.

- The first term is the weighted average of the subimage entropies
- The second term is the entropy among subimages
- The Shannon entropy is the only image function with subadditivity, scale invariance, and permutation invariance
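A numerical check of the decomposition, splitting an image into quadrants (sketch):

import numpy as np

def shannon_entropy(image):
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(1)
img = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

quads = [img[:32, :32], img[:32, 32:], img[32:, :32], img[32:, 32:]]
P = (np.abs(img)**2).sum()
alpha = np.array([(np.abs(q)**2).sum() / P for q in quads])  # power fractions
S_k = np.array([shannon_entropy(q) for q in quads])          # region entropies

print(shannon_entropy(img))                                  # total entropy
print((alpha * S_k).sum() - (alpha * np.log(alpha)).sum())   # same value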

Entropy of Noise

Assume the pixels are i.i.d. random variables, normalized to unit mean power. The expected entropy of a pure noise image with N pixels is

  $E\{S\} \approx \log N - E\{|x|^2 \log |x|^2\}$

For zero-mean complex Gaussian noise, this gives

  $E\{S\} \approx \log N - (1 - \gamma) = \log N - 0.422784$

where $\gamma$ is the Euler-Mascheroni constant. The expected entropy of Gaussian noise is invariant under image filtering.
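A Monte Carlo check of the complex Gaussian case (sketch; $1 - \gamma \approx 0.422784$):

import numpy as np

def shannon_entropy(image):
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(2)
N = 256 * 256
samples = [shannon_entropy(rng.standard_normal(N) + 1j * rng.standard_normal(N))
           for _ in range(20)]
print(np.mean(samples))            # empirical E{S}
print(np.log(N) - 0.422784)        # predicted log N - (1 - gamma)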


Entropy of Point Scatterers in Gaussian Noise

- Simulate one-dimensional data with randomly located point scattering centers and complex Gaussian noise
- Plot Shannon entropy as a function of quadratic phase error

[Figure: left, simulated scene power (dB) vs. pixel number; right, Shannon entropy (roughly 3.6 to 5.2) vs. quadratic phase error (±3000 degrees), with the minimum at zero error.]
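A sketch of such a simulation (all parameters are illustrative): entropy rises as the quadratic phase error spreads the point responses.

import numpy as np

def shannon_entropy(image):
    p = np.abs(image)**2
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(3)
N = 256
scene = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))  # noise floor
idx = rng.choice(N, size=10, replace=False)                            # scatterer locations
scene[idx] = np.exp(2j * np.pi * rng.random(10))                       # unit-power scatterers

k = np.linspace(-0.5, 0.5, N, endpoint=False)
spec = np.fft.fft(scene)
for qpe_deg in (0, 500, 1000, 2000, 3000):
    phase = np.deg2rad(qpe_deg) * (2 * k)**2      # peak phase qpe_deg at aperture edge
    blurred = np.fft.ifft(spec * np.exp(1j * np.fft.ifftshift(phase)))
    print(qpe_deg, shannon_entropy(blurred))      # entropy grows with the error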

Closed Form Solution for Point Scatterers

Assume a point scatterer centered at $t_0$, with quadratic phase error $\alpha$ and a Gaussian spectral window of width $\Omega$:

  $x(t) = \int_{-\infty}^{\infty} e^{i\omega(t - t_0)} \, e^{-i\alpha\omega^2} \, e^{-\omega^2 / 2\Omega^2} \, d\omega$

The resulting point response is Gaussian:

  $|x(t)|^2 \propto \exp\!\left( -\frac{(t - t_0)^2}{2\tau^2} \right)$, with $\tau^2 = \frac{1}{2\Omega^2} + 2\alpha^2\Omega^2$

and the (continuous) entropy is

  $S = \frac{1}{2} + \frac{1}{2}\log(2\pi\tau^2)$

Entropy is minimized at $\alpha = 0$, as expected. Multiple point scatterers can be approximated using subadditivity.
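Since this reconstruction of the closed form involves some notational guesswork (the window parameterization is my choice), a numerical cross-check helps: integrate the windowed spectrum directly and compare the measured width of $|x(t)|^2$ against the predicted $\tau^2$.

import numpy as np

Omega, alpha, t0 = 1.0, 0.7, 0.0        # illustrative parameters
w = np.linspace(-8, 8, 2048)            # window negligible beyond |w| ~ 6*Omega
dw = w[1] - w[0]
spectrum = np.exp(-1j * alpha * w**2 - w**2 / (2 * Omega**2))

t = np.linspace(-15, 15, 601)
x = (np.exp(1j * np.outer(t - t0, w)) * spectrum).sum(axis=1) * dw
I = np.abs(x)**2

tau2_measured = (I * (t - t0)**2).sum() / I.sum()    # variance of |x(t)|^2 profile
tau2_predicted = 1 / (2 * Omega**2) + 2 * alpha**2 * Omega**2
print(tau2_measured, tau2_predicted)                 # agree closely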

Height-of-Focus Example

- Use an image from the CARABAS VHF SAR
- Apply a height-of-focus correction filter based on known aircraft motion
- Plot Shannon entropy as a function of the terrain height parameter

[Figure: the focused image, and Shannon entropy (roughly 10.9 to 11.02) vs. height-of-focus error (±300 m).]

Entropy of Random Clutter

- Generate a random clutter image where each pixel is drawn independently from a log-normal distribution
- Plot entropy as a function of quadratic phase error

[Figure: the random clutter image, and Shannon entropy (roughly 4 to 7.5) vs. quadratic phase error (±3000 degrees).]


Autofocus on Low-Contrast Image

ADTS Stockbridge data.

[Figure: four panels - the original image, the image blurred with cross-range quadratic phase, the minimum-entropy autofocus result, and the PGA result.]

Autofocus with High-Order Phase Errors

CARABAS Keystone data (Mission 2, Pass 2).

[Figure: four panels - the original image, the image with high-order phase error, the minimum-entropy result, and the PGA result.]

High-Order Phase Error Function

[Figure: phase error (0 to 1200 degrees) vs. frequency bin (0 to 600).]

Minimum-Entropy on 2D Phase Errors

ADTS Stockbridge data.

[Figure: three panels - the original image, the image blurred with a 2D quadratic phase error, and the minimum-entropy autofocus result.]

Concluding Remarks

New components developed for the minimum-entropy method:
- Parameterized filter design
  - Exploits knowledge of the blurring filter structure
  - Drastically reduces the dimension of the space for the minimization procedure
- Gradient formula for numerical minimization

Minimum entropy can outperform PGA at the expense of greater computation. Minimum entropy has the most benefit over PGA under certain circumstances:
- Low-contrast clutter
- Well-modeled phase errors
- Low-dimensional parameter space for phase errors
- Severe phase errors
