ECNG 6703 - Principles of Communications
Sean Rocke
Outline
Digital Communication Preliminaries
Models for Information Sources
Measures of Information
Source Coding
Conclusion
[Figure: Block diagram of a digital communication system; labelled elements include the channel, digital demodulator, channel decoder, source decoder, and output transducer.]
Elements not specifically included in the illustration:
Carrier and symbol synchronization
A/D interface
Channel interfaces (e.g., RF front end (RFFE), fiber optic front end (FOFE), BAN front end (BANFE), . . . )
[Figure: Classification of information sources by output value — discrete vs. continuous.]
Source coding: the process of efficiently converting the output of either an analog or digital source into a sequence of bits.
Source coding challenge: how can the source output (digital or analog) be represented in as few bits as possible?
Key performance metrics: coding efficiency, redundancy, rate distortion, implementation complexity.
To answer the above, it is essential to model the signal sources. . .
An information source produces a random output.
The analog output at any time $t$ is a random variable $X(t)$ with CDF $F_X(x_1, t_1) = P[X(t_1) \le x_1]$.
The joint CDF is defined as
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P[X(t_1) \le x_1, \ldots, X(t_n) \le x_n]$$
We consider statistically stationary outputs, for which
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = F_X(x_1, \ldots, x_n; t_1 + \tau, \ldots, t_n + \tau) \quad \text{for all } \tau$$
Analog-to-Digital Conversion
[Figure: Amplitude vs. time plot showing an original signal together with its sampled and quantized versions and the resulting quantization error.]
For band-limited $X(t)$ with bandwidth $W$, we can take samples $\{X(\frac{n}{2W})\}$ to obtain a discrete-time output.
Precision is lost when discrete-time analog signals are quantized (unavoidable!); see the sketch below.
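A minimal MATLAB sketch of this sampling-and-quantization chain; the tone frequency, bandwidth, and 3-bit uniform quantizer are illustrative assumptions, not values from the slides.

```matlab
% Sample a band-limited signal at the Nyquist rate, then quantize it.
W  = 1;                   % assumed bandwidth, Hz
f0 = 0.8;                 % illustrative tone frequency (below W)
fs = 2*W;                 % Nyquist rate: samples X(n/(2W))
n  = 0:11;                % sample indices
ts = n/fs;                % sampling instants
t  = linspace(0, 6, 600); % dense grid standing in for continuous time

x  = sin(2*pi*f0*t);      % "analog" original signal
xs = sin(2*pi*f0*ts);     % discrete-time samples

b     = 3;                               % bits per sample
Delta = 2/2^b;                           % step of a uniform quantizer on [-1,1]
xq    = Delta*(floor(xs/Delta) + 0.5);   % midrise quantized samples
fprintf('Mean-square quantization error: %.4g\n', mean((xs - xq).^2));

plot(t, x, 'k'); hold on;
stem(ts, xs, 'b');
stairs(ts, xq, 'r'); hold off;
xlabel('Time, t'); ylabel('Amplitude');
legend('Original signal', 'Sampled signal', 'Quantized signal');
```

Increasing b shrinks the quantization error but costs more bits per sample; that trade-off is exactly what rate-distortion theory formalizes later in the lecture.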
[Figure: Classification of source models — discrete vs. continuous outputs, and sources with memory vs. memoryless sources.]
Measures of Information
So, given a particular source model, how do we quantify the information content?
Consider two discrete RVs, $X$ and $Y$, and assume that an outcome $Y = y$ is observed. Can we determine quantitatively the amount of information the occurrence of $Y = y$ provides about the event $X = x$?
Mutual information between outcomes $x$ and $y$ — a measure of the information provided by the occurrence of $Y = y$ about $X = x$:
$$I(x; y) = \log \frac{P_{X|Y}(x|y)}{P_X(x)}$$
Averaging over all outcome pairs gives the average mutual information between $X$ and $Y$:
$$I(X; Y) = \sum_{(x,y) \in \mathcal{X} \times \mathcal{Y}} P_{XY}(x, y)\, I(x; y)$$
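A small MATLAB sketch of both definitions, using a made-up 2x2 joint PMF (the probabilities are assumptions for illustration); it also checks the independent case, where the mutual information vanishes.

```matlab
% Pointwise and average mutual information from a joint PMF (in bits).
Pxy = [0.4 0.1;
       0.1 0.4];              % illustrative joint PMF (rows: x, cols: y)
Px  = sum(Pxy, 2);            % marginal of X (column vector)
Py  = sum(Pxy, 1);            % marginal of Y (row vector)

% I(x;y) = log2( P_{X|Y}(x|y) / P_X(x) ) = log2( P_XY(x,y) / (P_X(x)P_Y(y)) )
Ixy = log2(Pxy ./ (Px * Py)); % pointwise mutual information, each pair

% Average mutual information: I(X;Y) = sum over (x,y) of P_XY(x,y)*I(x;y)
IXY = sum(Pxy(:) .* Ixy(:));
fprintf('I(X;Y) = %.4f bits\n', IXY);

% Independent case: the joint PMF factors, so every I(x;y) = log2(1) = 0
PP   = Px * Py;                            % product of the marginals
Iind = sum(PP(:) .* log2(PP(:) ./ PP(:))); % every ratio is 1, so I = 0
fprintf('Independent case: I(X;Y) = %.4f bits\n', Iind);
```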
Properties of Mutual Information:
$I(X; Y) = I(Y; X)$
$I(X; Y) \ge 0$
$I(X; Y) = 0$ if and only if $X$ and $Y$ are independent
$I(X; Y) \le \log \min\{|\mathcal{X}|, |\mathcal{Y}|\}$

Questions: Consider $I(x; y)$ for the following cases:
1. $X$ and $Y$ are statistically independent
2. $X$ and $Y$ are fully dependent
3. $X$ and $Y$ are partially dependent
Entropy of a discrete source — the average amount of information per source output:
$$H(X) = -\sum_{x \in \mathcal{X}} P_X(x) \log P_X(x)$$
Questions:
1. What is the entropy of a deterministic information source?
2. When is the entropy of a DMS with alphabet size $|\mathcal{X}|$ maximized? What is $H(X)$ in this case?
Calculate the entropy for a binary source with probability $p$ that a 1 occurs. Plot the entropy function (i.e., $H(X)$ vs. $p$) using MATLAB. A sketch of the plot follows below.
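One possible MATLAB answer to the plotting part of the exercise: the binary entropy function $H(X) = -p\log_2 p - (1-p)\log_2(1-p)$, plotted against $p$.

```matlab
% Binary entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
p = linspace(0.001, 0.999, 500);    % open interval avoids log2(0) at endpoints
H = -p.*log2(p) - (1-p).*log2(1-p);

plot(p, H, 'LineWidth', 1.5);
xlabel('Probability of a 1, p');
ylabel('Entropy, H(X) (bits)');
title('Entropy of a binary source');
grid on;
% The maximum H(X) = 1 bit occurs at p = 0.5, i.e., equally likely symbols.
```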
Entropy of $Y$ conditioned on a particular outcome $X = x$:
$$H(Y|X = x) = -\sum_{y \in \mathcal{Y}} P_{Y|X}(y|x) \log P_{Y|X}(y|x)$$
We obtain the conditional entropy if this quantity is averaged over all values of $X$:
$$H(Y|X) = E\left[-\log P_{Y|X}(y|x)\right] = -\sum_{(x,y) \in \mathcal{X} \times \mathcal{Y}} P_{XY}(x, y) \log P_{Y|X}(y|x)$$
Question: Show that $H(X, Y) = H(X) + H(Y|X)$. (A numerical check follows below.)
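A quick MATLAB sanity check of the chain rule in the question, using an arbitrary made-up joint PMF; this is a numerical check, not a proof.

```matlab
% Numerical check of H(X,Y) = H(X) + H(Y|X) for an arbitrary joint PMF.
Pxy = [0.30 0.20;
       0.15 0.35];                       % illustrative joint PMF (rows: x)
Px  = sum(Pxy, 2);                       % marginal of X

Hxy = -sum(Pxy(:) .* log2(Pxy(:)));      % joint entropy H(X,Y)
Hx  = -sum(Px .* log2(Px));              % marginal entropy H(X)

Pcond = Pxy ./ Px;                       % P_{Y|X}(y|x): each row sums to 1
HYgX  = -sum(Pxy(:) .* log2(Pcond(:)));  % conditional entropy H(Y|X)

fprintf('H(X,Y) = %.4f bits; H(X) + H(Y|X) = %.4f bits\n', Hxy, Hx + HYgX);
```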
Properties of Joint and Conditional Entropy:
$0 \le H(X|Y) \le H(X)$
$H(X|Y) = H(X)$ if and only if $X$ and $Y$ are independent
$H(X, Y) = H(X) + H(Y)$ if and only if $X$ and $Y$ are independent
$H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) \le H(X) + H(Y)$
$I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X, Y)$
Note: $I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$.
Source Coding
You are required to know how to use both the Huffman and Lempel-Ziv algorithms to determine source codes!
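A short Huffman-coding sketch in MATLAB, assuming the Communications Toolbox functions huffmandict, huffmanenco, and huffmandeco are available; the five-symbol alphabet and its probabilities are made up for illustration.

```matlab
% Huffman source coding for a 5-symbol DMS (illustrative probabilities).
symbols = 1:5;
prob    = [0.40 0.20 0.20 0.15 0.05];

dict = huffmandict(symbols, prob);       % build the Huffman code
L = 0;
for k = 1:numel(prob)                    % average codeword length
    L = L + prob(k)*numel(dict{k, 2});
end

H = -sum(prob .* log2(prob));            % source entropy (lower bound on L)
fprintf('H = %.3f bits, average length L = %.3f bits/symbol\n', H, L);

% Encode and decode a short test sequence -- Huffman coding is lossless.
seq  = [1 1 2 4 3 1 5 2];
code = huffmanenco(seq, dict);
dec  = huffmandeco(code, dict);
assert(isequal(dec(:), seq(:)));         % decoder recovers the input exactly
```

The coding efficiency is H/L: the closer the average codeword length sits to the source entropy, the less redundancy the code leaves behind.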
Questions:
1. What is the trade-off compared to the first code for single letters?
Questions:
1. Classify the following codes.
2. Which is the most efficient?
3. Which would you choose?
Per-letter distortion between a source output $x_k$ and its reconstruction $\hat{x}_k$: $d(x_k, \hat{x}_k)$
Distortion, $D$, averaged over a block of $n$ source outputs:
$$D = E\left[\frac{1}{n} \sum_{k=1}^{n} d(X_k, \hat{X}_k)\right] = \frac{1}{n} \sum_{k=1}^{n} E\left[d(X_k, \hat{X}_k)\right]$$
Rate distortion function: the minimum number of bits per source output symbol required to represent the source output, $X$, with distortion less than or equal to $D$:
$$R(D) = \min_{P_{\hat{X}|X}:\, E[d(X, \hat{X})] \le D} I(X; \hat{X})$$
Note: Evaluation of code performance using rate distortion applies to lossy coding, where the data is compressed subject to a maximum tolerable distortion (i.e., some of the information is lost during coding and hence cannot be regained via reconstruction).
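As a concrete illustration (not taken from the slides), the standard textbook result for a Bernoulli($p$) source under Hamming distortion is $R(D) = H_b(p) - H_b(D)$ for $0 \le D \le \min(p, 1-p)$ and $R(D) = 0$ beyond, where $H_b$ is the binary entropy function. A MATLAB sketch that plots it:

```matlab
% Rate-distortion function of a Bernoulli(p) source with Hamming distortion:
% R(D) = Hb(p) - Hb(D) for 0 <= D <= min(p, 1-p), and R(D) = 0 otherwise.
p  = 0.3;                                  % illustrative source P(X = 1)
Hb = @(q) -q.*log2(q) - (1-q).*log2(1-q);  % binary entropy function

D = linspace(1e-4, 0.5, 500);
R = max(Hb(p) - Hb(D), 0);                 % clamps to 0 once D >= min(p, 1-p)

plot(D, R, 'LineWidth', 1.5);
xlabel('Distortion, D'); ylabel('R(D) (bits/symbol)');
title('Rate-distortion function, Bernoulli(0.3) source');
grid on;
% R(0) = Hb(p): lossless representation costs the full source entropy.
% R(D) = 0 for D >= min(p, 1-p): always guessing the likelier symbol
% already meets the distortion target, so no bits are needed.
```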
Conclusion
We covered:
Elements of a digital communications system
Mathematical models for information sources
Measures of information
Lossless & lossy source coding
Source code evaluation
More MATLAB

Your goals for next class:
Continue ramping up your MATLAB skills
Make sure you can apply the Huffman and Lempel-Ziv algorithms!
Complete HW 2
Review notes on Channel Coding in prep for next class
Q&A
Thank You
Questions????