
Lecture # 2

(a) Review of Probability, Random Variables and Stochastic Processes
(b) Noise and Bandwidth
Sheraz Alam Khan
Asst. Professor, Department of Engineering
Sheraz.alam@gmail.com
National University of Modern Language

1
1.2 Classification of signals
Deterministic and random signals
Deterministic signal: No uncertainty with respect to the signal
value at any time.
Example: x(t) = 5 cos(10t)
Random signal: Some degree of uncertainty in the signal value
before it actually occurs.
Examples:
Thermal noise in electronic circuits due to the random movement of
electrons
Reflection of radio waves from different layers of the ionosphere
Interference

When a random waveform is observed over a long period of time it is referred
to as a random process, which may exhibit certain regularities that can be
described by probabilities and statistical averages.

2
Classification of signals.

3
Classification of signals
Energy and power signals
A signal is an energy signal if, and only if, it has nonzero but finite energy for all
time:

    0 < E_x < \infty,   where   E_x = \lim_{T \to \infty} \int_{-T/2}^{T/2} x^2(t)\, dt

A signal is a power signal if, and only if, it has finite but nonzero power for all time:

    0 < P_x < \infty,   where   P_x = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt
General rule: Periodic and random signals are power signals. Signals that are both
deterministic and non-periodic are energy signals.
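To make the definitions concrete, here is a minimal numerical sketch; the signal values and sample spacing below are assumed for illustration and are not part of the slides.

```python
import numpy as np

# Assumed example signals: numerically approximate the energy of a
# rectangular pulse and the average power of a sinusoid.
dt = 1e-4
t = np.arange(-5, 5, dt)

pulse = np.where(np.abs(t) <= 0.5, 2.0, 0.0)       # energy signal: amplitude 2, width 1 s
energy = np.sum(pulse**2) * dt                     # ~ integral of x^2(t) dt
print("pulse energy ~", energy)                    # expect about 4 joules

sinusoid = 5 * np.cos(10 * t)                      # power signal from the earlier example
power = np.sum(sinusoid**2) * dt / (t[-1] - t[0])  # (1/T) * integral of x^2(t) dt over the window
print("sinusoid average power ~", power)           # expect about A^2/2 = 12.5 watts
```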
Classification of signals
The performance of a communication system depends upon the
received signal energy; higher-energy signals are detected more
reliably (with fewer errors) than lower-energy signals.
An energy signal has finite energy but zero average power.
Power, on the other hand, is the rate at which energy is delivered.
Power determines the voltages that must be applied to a transmitter
and the intensities of the EM fields that one must contend with in a radio
system.
A power signal has finite average power but infinite energy.

5
Classification of signals

The Unit Impulse Function


The Dirac delta function δ(t), or impulse function, is an
abstraction: an infinitely large amplitude pulse, with zero
pulse width and unity weight (area under the pulse),
concentrated at the point where its argument is zero.

6
1.3 Spectral Density
The spectral density of a signal characterizes the
distribution of the signal's energy or power in the
frequency domain.
This concept is particularly important in
communication systems: we need to be able to
evaluate the signal and noise at the filter output.
The energy spectral density (ESD) and power spectral
density (PSD) are used in this evaluation.

7
Energy Spectral Density (ESD)

The energy spectral density of an energy signal x(t) is defined as

    \psi_x(f) = |X(f)|^2

where X(f) is the Fourier transform of the non-periodic signal x(t).

The ESD describes the signal energy per unit bandwidth, measured
in joules/Hz.
There are equal energy contributions from both positive and negative
frequency components, since for a real signal x(t), |X(f)| is an
even function of frequency. Therefore, the ESD is symmetric
in frequency about the origin, and the total energy can be written as:

    E_x = \int_{-\infty}^{\infty} \psi_x(f)\, df = 2 \int_{0}^{\infty} \psi_x(f)\, df
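As a hedged illustration (the pulse and sample spacing are assumed, not taken from the slides), the ESD of a sampled energy signal can be approximated with an FFT and checked against Parseval's relation:

```python
import numpy as np

# Sketch with an assumed rectangular pulse: approximate psi_x(f) = |X(f)|^2
# and verify that its integral equals the time-domain energy.
dt = 1e-3
t = np.arange(-2, 2, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)          # unit-amplitude, 1-s pulse

X = np.fft.fftshift(np.fft.fft(x)) * dt           # approximate X(f)
f = np.fft.fftshift(np.fft.fftfreq(len(t), dt))
esd = np.abs(X)**2                                # psi_x(f) = |X(f)|^2

energy_time = np.sum(x**2) * dt                   # integral of x^2(t) dt
energy_freq = np.sum(esd) * (f[1] - f[0])         # integral of psi_x(f) df
print(energy_time, energy_freq)                   # both close to 1 joule
```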
Power Spectral Density (PSD)

The power spectral density of a periodic signal x(t) is defined as

    G_x(f) = \sum_{n=-\infty}^{\infty} |c_n|^2 \, \delta(f - n f_0)

where the c_n terms are the complex Fourier series coefficients of
the periodic signal.
The average normalized power of a real-valued signal is given by:

    P_x = \int_{-\infty}^{\infty} G_x(f)\, df = 2 \int_{0}^{\infty} G_x(f)\, df

9
PSD of a non-periodic signal
If x(t) is a non-periodic signal, it cannot be expressed by a
Fourier series, and if it is a non-periodic power signal (having
infinite energy) it may not have a Fourier transform.
However, we may still express the PSD of such signals in a
limiting sense. If we form a truncated version xT(t) of the non-
periodic power signal x(t) by observing it only in the
interval (-T/2, T/2), then xT(t) has finite energy and a
proper Fourier transform XT(f). The PSD is then defined as:

    G_x(f) = \lim_{T \to \infty} \frac{1}{T} |X_T(f)|^2
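A rough numerical sketch of this limiting definition (the sample rate, window length, and example process are assumptions for illustration): average (1/T)|X_T(f)|^2 over many truncated realizations, i.e. an averaged periodogram.

```python
import numpy as np

# Assumed parameters: estimate the PSD of a power signal by truncating it to
# T seconds and averaging (1/T)|X_T(f)|^2 over many realizations.
rng = np.random.default_rng(0)
fs, T, trials = 1000.0, 2.0, 200          # sample rate, observation window, averages
N = int(fs * T)
f = np.fft.rfftfreq(N, 1 / fs)

psd = np.zeros(len(f))
for _ in range(trials):
    x_T = rng.normal(size=N)              # one truncated realization of the process
    X_T = np.fft.rfft(x_T) / fs           # approximate X_T(f)
    psd += np.abs(X_T)**2 / T             # (1/T)|X_T(f)|^2
psd /= trials
print(psd.mean())                         # roughly flat, ~variance/fs for this example
```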

10
1.4 AUTOCORRELATION

11
AUTOCORRELATION.
Autocorrelation of an Energy Signal
Correlation is a matching process; autocorrelation refers to the matching of a
signal with a delayed version of itself.
The autocorrelation function of a real-valued energy signal x(t) is defined as:

    R_x(\tau) = \int_{-\infty}^{\infty} x(t)\, x(t + \tau)\, dt

The autocorrelation function R_x(τ) provides a measure of how closely the
signal matches a copy of itself as the copy is shifted τ units in time.
R_x(τ) is not a function of time; it is only a function of the time difference τ
between the waveform and its shifted copy.
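A small sketch of the discrete-time counterpart of this definition (the rectangular pulse is an assumed example, not from the slides):

```python
import numpy as np

# Approximate R_x(tau) of a sampled energy signal with np.correlate.
dt = 1e-3
t = np.arange(-2, 2, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)          # rectangular energy signal

R = np.correlate(x, x, mode="full") * dt          # R_x(tau) on a discrete lag grid
tau = np.arange(-(len(x) - 1), len(x)) * dt       # corresponding lag axis in seconds

print(R.max(), R[len(x) - 1])                     # maximum occurs at tau = 0 ...
print(np.sum(x**2) * dt)                          # ... and equals the signal energy
```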

12
Properties of an autocorrelation function for
Energy Signals
For real-valued (and WSS in case of random signals):
1. Autocorrelation and spectral density form a Fourier
   transform pair:    R_x(\tau) \leftrightarrow \psi_x(f)

2. Autocorrelation is symmetric around zero:    R_x(-\tau) = R_x(\tau)

3. Its maximum value occurs at the origin:    |R_x(\tau)| \le R_x(0)

4. Its value at the origin is equal to the average power
   or energy:    R_x(0) = \int_{-\infty}^{\infty} x^2(t)\, dt = E_x   (for an energy signal)
13
AUTOCORRELATION.
Autocorrelation of a Periodic (Power) Signal
The autocorrelation function of a real-valued power signal x(t) is
defined as:

    R_x(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\, x(t + \tau)\, dt

When the power signal x(t) is periodic with period T0, the
autocorrelation function can be expressed as

    R_x(\tau) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x(t)\, x(t + \tau)\, dt

14
Properties of an autocorrelation function for
Power Signals
The autocorrelation function of a real-valued periodic (power) signal has
the following properties, similar to those of an energy signal:

    1. R_x(\tau) \leftrightarrow G_x(f)   (Fourier transform pair with the PSD)
    2. R_x(-\tau) = R_x(\tau)             (symmetric about the origin)
    3. |R_x(\tau)| \le R_x(0)             (maximum value at the origin)
    4. R_x(0) = P_x                       (value at the origin equals the average power)

15
1.5 Random Signals

All useful message signals appear random; that is, the receiver
does not know, a priori, which of the possible waveforms has
been sent.
Also, the noise that accompanies the message signal is due to
random electrical signals.
Let a random variable X(A) represent the functional
relationship between a random event A and a real number.
Notation - Capital letters, usually X or Y, are used to denote
random variables.
Corresponding lower case letters, x or y, are used to denote
particular values of the random variables X or Y
16
Random Signals.
Cumulative Distribution Function (CDF)
The cumulative distribution function FX(x) of the random variable X is given by:

    F_X(x) = P(X \le x)

That is, FX(x) is the probability that the value taken by the random variable X
is less than or equal to the real number x.
FX(x) has the following properties:

    1. 0 \le F_X(x) \le 1
    2. F_X(x_1) \le F_X(x_2)  if  x_1 \le x_2   (non-decreasing)
    3. F_X(-\infty) = 0  and  F_X(+\infty) = 1

17
Random Signals.
Probability Density Function (PDF)
The probability density function is the derivative of the CDF:

    p_X(x) = \frac{dF_X(x)}{dx}

Like the CDF, the PDF is a function of a real number x. The name density function
arises from the fact that the probability of the event x_1 \le X \le x_2 is:

    P(x_1 \le X \le x_2) = \int_{x_1}^{x_2} p_X(x)\, dx

The PDF has the following important properties:

    1. p_X(x) \ge 0
    2. \int_{-\infty}^{\infty} p_X(x)\, dx = 1
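For instance, here is a hedged numerical check of the density-function property above, using an assumed Gaussian random variable:

```python
import numpy as np
from scipy import stats

# Assumed example: for a zero-mean, unit-variance Gaussian, verify that
# P(x1 <= X <= x2) equals both F(x2) - F(x1) and the integral of the PDF.
X = stats.norm(loc=0.0, scale=1.0)
x1, x2 = -1.0, 1.0

prob_from_cdf = X.cdf(x2) - X.cdf(x1)
x = np.linspace(x1, x2, 10_001)
prob_from_pdf = np.trapz(X.pdf(x), x)     # numerical integral of the PDF

print(prob_from_cdf, prob_from_pdf)       # both ~0.6827
```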

18
Ensemble Averages

The mean (first moment) of a random variable X:

    m_X = E\{X\} = \int_{-\infty}^{\infty} x\, p_X(x)\, dx

The mean-square value (second moment):

    E\{X^2\} = \int_{-\infty}^{\infty} x^2\, p_X(x)\, dx

The variance (second central moment):

    \sigma_X^2 = E\{(X - m_X)^2\} = E\{X^2\} - m_X^2
19
Random Processes
A random process X(A, t) can be viewed as a function of two variables:
an event A and time.
In the figure, we have N sample
functions of time, {Xj(t)}.
The totality of all sample
functions is called an ensemble.
For a specific event A = Aj and a
specific time t = tk, X(Aj, tk) is
simply a number.
For notational convenience we
shall designate the random
process by X(t), and let the
functional dependence upon A
be implicit

20
Statistical Averages of a
Random Process
Because the value of a random process at any future time is unknown (since
the identity of the event A is unknown), a random process whose distribution
functions are continuous can be described statistically with a probability
density function (pdf).
A partial description consisting of the mean and autocorrelation function is
often adequate for the needs of communication systems.
Mean of the random process X(t):

    m_X(t_k) = E\{X(t_k)\}

Autocorrelation function of the random process X(t):

    R_X(t_1, t_2) = E\{X(t_1)\, X(t_2)\}
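A sketch of estimating these two statistics from an ensemble of sample functions (the random-phase sinusoid and the sample times are assumptions for illustration):

```python
import numpy as np

# Estimate the ensemble mean and E{X(t1)X(t2)} from N sample functions.
rng = np.random.default_rng(1)
N, n_samples = 5000, 200
t = np.linspace(0, 1, n_samples)

# Example process: random-phase sinusoid (each row is one sample function).
phase = rng.uniform(0, 2 * np.pi, size=(N, 1))
X = np.cos(2 * np.pi * 5 * t + phase)

mean_est = X.mean(axis=0)                   # ensemble mean m_X(t), ~0 for all t
R_est = (X[:, 50] * X[:, 90]).mean()        # E{X(t1)X(t2)} at two chosen times
R_theory = 0.5 * np.cos(2 * np.pi * 5 * (t[90] - t[50]))
print(mean_est[:3], R_est, R_theory)
```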

21
Stationarity
A random process X(t) is said to be stationary in the strict sense if none
of its statistics are affected by a shift in the time origin.
A random process is said to be wide-sense stationary (WSS) if two of its
statistics, its mean and autocorrelation function, do not vary with a shift
in the time origin.

Strict-sense stationarity implies wide-sense stationarity, but not vice versa.

Most of the useful results in communication theory are predicated on
random information signals and noise being WSS.
If the mean and autocorrelation functions are periodic in time, the
random process is said to be cyclostationary.
22
Autocorrelation of a Wide-Sense
Stationary Random Process
Just as the Variance provides a measure of randomness for random variables, the
autocorrelation function provides a similar measure for random processes.
For a wide-sense stationary process, the autocorrelation function is only a
function of the time difference τ = t1 - t2:

    R_X(\tau) = E\{X(t)\, X(t + \tau)\}

Properties of the autocorrelation function of a real-valued wide-sense stationary
process:

    1. R_X(\tau) \leftrightarrow G_X(f)   (Fourier transform pair with the PSD)
    2. R_X(-\tau) = R_X(\tau)             (symmetric about the origin)
    3. |R_X(\tau)| \le R_X(0)             (maximum value at the origin)
    4. R_X(0) = E\{X^2(t)\}               (value at the origin equals the average power)

23
Time Averaging and Ergodicity
A random process is ergodic in the mean and autocorrelation if

    m_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t)\, dt

    R_X(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t)\, X(t + \tau)\, dt

Since time averages equal ensemble averages for ergodic
processes, fundamental electrical engineering parameters,
such as dc value, rms value, and average power, can be related
to the moments of an ergodic random process.
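A hedged sketch (the random-phase sinusoid is an assumed example): for an ergodic process, time averages over one long realization match the ensemble values.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, T = 1000.0, 200.0
t = np.arange(0, T, 1 / fs)

# Random-phase sinusoid: a classic ergodic (and WSS) process.
theta = rng.uniform(0, 2 * np.pi)
x = np.cos(2 * np.pi * 5 * t + theta)              # one long realization

dc_value = x.mean()                                # time-average mean, ~0
avg_power = np.mean(x**2)                          # time-average power, ~0.5
print(dc_value, avg_power)                         # ensemble values: m_X = 0, R_X(0) = 0.5
```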

24
Power Spectral Density and
Autocorrelation
A random process X(t) can generally be classified as a power signal having a
power spectral density (PSD) GX(f).
GX(f) is particularly useful in communication systems, since it describes the
distribution of a signal's power in the frequency domain.
The PSD enables us to evaluate the signal power that will pass through a network
having known frequency characteristics.
The principal features of the PSD are:

    1. G_X(f) \ge 0 and is always real valued
    2. G_X(-f) = G_X(f)   for X(t) real valued
    3. G_X(f) \leftrightarrow R_X(\tau)   (Fourier transform pair)
    4. P_X = \int_{-\infty}^{\infty} G_X(f)\, df   (relationship between average power and PSD)

25
26
27
PSD and Autocorrelation
The PSD and the autocorrelation function form a Fourier
transform pair.
The relative shape of the autocorrelation function gives us
information in the frequency domain, i.e., information
about the bandwidth of the underlying signal:
If it ramps down gently, we are dealing with a low-bandwidth
signal.
If it is steep, we are dealing with a high-bandwidth signal.
The PSD, of course, shows the opposite behavior.
This can be observed in the previous examples in Figs. 1.5 and 1.6.
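A hedged sketch of this relationship (the filter and its cutoff are assumptions, not the figures from the slides): low-pass filtering white noise stretches its autocorrelation and narrows its PSD.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 1000.0
white = rng.normal(size=50_000)

b, a = signal.butter(4, 50, fs=fs)               # assumed 50 Hz low-pass filter
x = signal.lfilter(b, a, white)

# Biased autocorrelation estimate for a range of lags.
max_lag = 200
acf = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag)])

f, psd = signal.welch(x, fs=fs, nperseg=2048)    # PSD estimate
print(acf[:5] / acf[0])                          # decays slowly -> narrow bandwidth
print(f[psd > psd.max() / 2][-1])                # power concentrated below ~50 Hz
```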

28
Noise in communication systems
Noise refers to the unwanted electrical signals that are
always present in electrical systems.
Noise superimposed on a signal tends to obscure or mask the
signal.
Noise limits the rate of information transmission
Noise arises from a variety of sources both man-made and
natural.
The man-made noise includes such sources as spark-plug
ignition noise, switching transients and other radiating
electromagnetic signals.
Natural noise includes such elements as the atmosphere, the
sun and other galactic sources.
29
Elimination of Noise
Noise, or its undesirable effects, can be reduced or eliminated
through:
Filtering

Shielding

The choice of modulation


The selection of an optimal receiver site

30
Thermal or Johnson Noise

One natural source of noise, called thermal noise,
cannot be eliminated.
Thermal noise can be described as a zero-mean
Gaussian random process.
A Gaussian process n(t) is a random function whose
value n at any arbitrary time t is statistically
characterized by the Gaussian probability density
function:

    p(n) = \frac{1}{\sigma \sqrt{2\pi}} \exp\!\left[ -\frac{1}{2} \left( \frac{n}{\sigma} \right)^2 \right]

31
Gaussian density function
The normalized or standardized Gaussian density function of a
zero-mean process is obtained by assuming σ = 1.
We often represent a random signal as the sum of a Gaussian
noise random variable and a dc signal, i.e.
z = a + n
where
z is the random signal,
a is the dc component, and
n is the Gaussian noise random variable.
The pdf p(z) is then expressed as

    p(z) = \frac{1}{\sigma \sqrt{2\pi}} \exp\!\left[ -\frac{1}{2} \left( \frac{z - a}{\sigma} \right)^2 \right]
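A hedged check of this model (the dc level and noise standard deviation are assumed values): z = a + n has a Gaussian pdf centered at the dc component a.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
a, sigma = 2.0, 0.5                          # assumed dc level and noise std dev

n = rng.normal(0.0, sigma, size=100_000)     # zero-mean Gaussian noise samples
z = a + n                                    # noisy dc signal

hist, edges = np.histogram(z, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
theory = stats.norm(loc=a, scale=sigma).pdf(centers)
print(np.max(np.abs(hist - theory)))         # small -> histogram matches p(z)
```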

32
Gaussian Distribution
The Gaussian distribution is often used as the system
noise model because of the central limit theorem.
The central limit theorem states that, under very
general conditions, the probability distribution of the
sum of j statistically independent random variables
approaches the Gaussian distribution as j → ∞, no
matter what the individual distribution functions may
be.
Therefore, even though individual noise mechanisms may have
non-Gaussian distributions, the aggregate of many such
mechanisms will tend toward the Gaussian distribution.
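A quick simulation of the central limit theorem (the number of variables and their distribution are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
j, trials = 30, 100_000

u = rng.uniform(-0.5, 0.5, size=(trials, j))     # decidedly non-Gaussian inputs
s = u.sum(axis=1)
s_norm = (s - s.mean()) / s.std()                # normalize to zero mean, unit variance

# Kolmogorov-Smirnov statistic against the standard normal: a small value
# means the distribution of the sum is already close to Gaussian.
print(stats.kstest(s_norm, "norm").statistic)
```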

33
White Noise
A spectral characteristic of thermal noise is that its power
spectral density is the same for all frequencies of interest in
most communication systems.
A thermal noise source emanates an equal amount of noise
power per unit bandwidth at all frequencies.
Therefore, a simple model for thermal noise assumes that its
power spectral density Gn(f) is flat for all frequencies:

    G_n(f) = \frac{N_0}{2}   watts/hertz

When noise power has such a uniform spectral density we


refer to it as white noise.

34
White Noise
The autocorrelation function of white noise is the inverse Fourier
transform of its PSD:

    R_n(\tau) = \frac{N_0}{2}\, \delta(\tau)

The average power of white noise is infinite, because its
bandwidth is infinite.
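A hedged discrete-time sketch (the sample rate and N0 are assumed values): sampled white Gaussian noise has an essentially flat PSD estimate and an autocorrelation that is nearly zero at nonzero lags.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
fs, N0 = 1000.0, 2e-3                          # assumed sample rate and N0
sigma = np.sqrt((N0 / 2) * fs)                 # per-sample standard deviation
n = rng.normal(0.0, sigma, size=100_000)

f, psd = signal.welch(n, fs=fs, nperseg=1024)
print(psd.mean())                              # ~N0 (one-sided estimate = 2 * N0/2)

acf0 = np.mean(n * n)                          # lag 0: the noise power
acf1 = np.mean(n[:-1] * n[1:])                 # lag 1: ~0, delta-like behaviour
print(acf0, acf1)
```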

35
Additive White Gaussian Noise(AWGN)
The effect on the detection process of a channel with AWGN is that the
noise affects each transmitted symbol independently. Such a channel is
called a memoryless channel.
The term "additive" means that the noise is simply superimposed or added to
the signal; no multiplicative mechanism is at work.
Since thermal noise is present in all communication systems and is the
predominant noise source for most systems, the thermal noise
characteristics (additive, white, and Gaussian) are most often used to
model the noise in communication systems.
Since zero-mean Gaussian noise is completely characterized by its
variance, this model is particularly simple to use in the detection of signals
and in the design of optimum receivers
Throughout this book, unless otherwise stated, we assume that the system
is corrupted by zero-mean AWGN, even though this is sometimes an
oversimplification
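A minimal AWGN channel sketch (the SNR, symbols, and function name are assumptions, not a standard API): independent zero-mean Gaussian noise is simply added to each transmitted symbol.

```python
import numpy as np

def awgn_channel(symbols, snr_db, rng=np.random.default_rng(7)):
    """Add zero-mean white Gaussian noise to 'symbols' at the given SNR in dB."""
    signal_power = np.mean(np.abs(symbols) ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=symbols.shape)
    return symbols + noise                      # additive: the noise is simply summed

tx = np.sign(np.random.default_rng(8).normal(size=10))   # example +/-1 symbols
rx = awgn_channel(tx, snr_db=10)
print(tx)
print(np.round(rx, 2))
```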
36
1.6 SIGNAL TRANSMISSION THROUGH LINEAR
SYSTEMS
A system can be characterized equally well in the time domain
or the frequency domain; techniques will be developed in both
domains to analyze the response of a linear system to an
arbitrary input signal.

    Input x(t), X(f)  -->  Linear Network h(t), H(f)  -->  Output y(t), Y(f)

37
Impulse Response
A linear time-invariant system is characterized in the time
domain by its impulse response h(t), which is the response
when the input is equal to a unit impulse δ(t):

    h(t) = y(t)  when  x(t) = \delta(t)

The response of the network to an arbitrary input signal x(t) is
found by the convolution of x(t) with h(t):

    y(t) = x(t) * h(t) = \int_{-\infty}^{\infty} x(\tau)\, h(t - \tau)\, d\tau

The system is assumed to be causal, so there can be no output
prior to the time when the input is applied.
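A hedged sketch of the convolution relationship (the exponential impulse response and pulse input are assumed examples):

```python
import numpy as np

dt = 1e-3
t = np.arange(0, 1, dt)

h = (1 / 0.05) * np.exp(-t / 0.05) * dt   # assumed causal, RC-like impulse response (scaled by dt)
x = (t < 0.3).astype(float)               # input: a 0.3-s rectangular pulse

y = np.convolve(x, h)[:len(t)]            # y(t) = x(t) * h(t), truncated to the time axis
print(y[:5], y[200])                      # output rises toward the pulse amplitude of 1
```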

38
FREQUENCY TRANSFER FUNCTION
Convolution in the time domain transforms to multiplication
in the frequency domain:

    Y(f) = X(f)\, H(f)

Transfer function:

    H(f) = \frac{Y(f)}{X(f)}

Frequency response:

    H(f) = |H(f)|\, e^{j\theta(f)}

Phase response:

    \theta(f) = \tan^{-1} \frac{\mathrm{Im}\{H(f)\}}{\mathrm{Re}\{H(f)\}}

39
Random Processes and Linear Systems
If a random process forms the input to an LTI system, then
the output will also be a random process.
The input PSD and the output PSD are related by:

    G_Y(f) = |H(f)|^2 \, G_X(f)

where G_X(f) is the PSD of the input random process and
G_Y(f) is the PSD of the output random process.
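A numerical check of this relation under assumed conditions (the filter type and cutoff are arbitrary choices): pass white noise through a filter and compare the output PSD with |H(f)|^2 times the input PSD.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(9)
fs = 1000.0
x = rng.normal(size=200_000)                        # white input process

b, a = signal.butter(4, 100, fs=fs)                 # assumed 100 Hz low-pass filter
y = signal.lfilter(b, a, x)

f, Gx = signal.welch(x, fs=fs, nperseg=2048)
_, Gy = signal.welch(y, fs=fs, nperseg=2048)
_, H = signal.freqz(b, a, worN=f, fs=fs)            # H(f) on the same frequency grid

ratio = Gy[1:] / (np.abs(H[1:])**2 * Gx[1:])        # should be ~1 across the band
print(ratio.mean())
```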

40
DISTORTIONLESS TRANSMISSION
For an ideal transmission line, the output may have some
delay and a different amplitude compared with the input, but NO DISTORTION.
Thus, for ideal distortionless transmission, we can
describe the output signal as:
y(t) = K x(t - t0),   where K and t0 are constants
Taking the Fourier transform:

    Y(f) = K\, X(f)\, e^{-j 2\pi f t_0}

The transfer function is therefore:

    H(f) = K\, e^{-j 2\pi f t_0}

41
DISTORTIONLESS TRANSMISSION.
To achieve ideal distortionless transmission the overall system
response must have a constant magnitude response and its
phase shift must be linear with frequency.
All of the signal's frequency components must also arrive with
identical time delay in order to add up correctly.
The time delay t0 is related to the phase shift θ(f) by:

    t_0 = -\frac{\theta(f)}{2\pi f}   seconds

The phase shift must be proportional to frequency in order for
the time delay of all components to be identical.
Envelope delay or group delay is often used to measure the delay
distortion of a signal; it is defined as:

    \tau(f) = -\frac{1}{2\pi} \frac{d\theta(f)}{df}

42
DISTORTIONLESS TRANSMISSION.
For distortionless transmission, an equivalent to requiring
the phase to be a linear function of
frequency is to require that the envelope delay be a
constant.
In practice, a signal will be distorted in passing
through some parts of a system.
Phase or amplitude correction (equalization)
networks may be introduced elsewhere in the system
to correct for this distortion
It is the overall input-output characteristic of the
system that determines its performance.
43
Ideal Filters
The distortionless transfer function describes an ideal network
that cannot be built.
Such a transfer function implies infinite bandwidth, but in practice one has to
use a band-limited approximation with upper and lower cutoff frequencies fu
and fl.
Passband: fl < f < fu
Filter bandwidth: Wf = fu - fl

BPF: when fl ≠ 0 and fu has a finite value

LPF: when fl = 0 and fu has a finite value

HPF: when fl has a non-zero value and fu → ∞


44
Ideal Filters..
The ideal low-pass filter transfer function with bandwidth Wf = fu
hertz can be written as:

    H(f) = |H(f)|\, e^{j\theta(f)}

where

    |H(f)| = 1   for |f| < f_u
    |H(f)| = 0   for |f| \ge f_u

and

    e^{j\theta(f)} = e^{-j 2\pi f t_0}
Figure 1.11(b): Ideal low-pass filter
45
Ideal Filters
The impulse response of the ideal low-pass filter:
    h(t) = \mathcal{F}^{-1}\{H(f)\}
         = \int_{-\infty}^{\infty} H(f)\, e^{j 2\pi f t}\, df
         = \int_{-f_u}^{f_u} e^{-j 2\pi f t_0}\, e^{j 2\pi f t}\, df
         = \int_{-f_u}^{f_u} e^{j 2\pi f (t - t_0)}\, df
         = 2 f_u \, \frac{\sin 2\pi f_u (t - t_0)}{2\pi f_u (t - t_0)}
         = 2 f_u \, \mathrm{sinc}\, 2 f_u (t - t_0)
46
Ideal Filters..
The ideal band-pass filter and ideal high-pass filter transfer
functions are defined in the same way (see Figure 1.11).

Figure 1.11(a): Ideal band-pass filter.   Figure 1.11(c): Ideal high-pass filter.

47
Realizable Filters
The simplest example of a realizable low-pass filter is the RC filter:

    H(f) = \frac{1}{1 + j 2\pi f RC}
         = \frac{1}{\sqrt{1 + (2\pi f RC)^2}}\; e^{j\theta(f)},
    where \theta(f) = -\tan^{-1}(2\pi f RC)

(Figure: RC filter circuit, magnitude response, and phase response.)
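A small numerical sketch of this response (the component values are assumed, not from the slides):

```python
import numpy as np

R, C = 1e3, 1e-6                      # assumed 1 kOhm, 1 uF -> f_3dB ~ 159 Hz
f = np.logspace(0, 5, 500)            # 1 Hz to 100 kHz

H = 1.0 / (1.0 + 1j * 2 * np.pi * f * R * C)
magnitude_db = 20 * np.log10(np.abs(H))
phase_deg = np.degrees(np.angle(H))

f_3db = 1.0 / (2 * np.pi * R * C)
print(f_3db)                                          # half-power frequency, ~159 Hz
print(magnitude_db[np.argmin(np.abs(f - f_3db))])     # ~ -3 dB there
```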

48
Realizable Filters
There are several useful approximations to the ideal low-pass
filter characteristic; one of these is the Butterworth filter:

    |H_n(f)| = \frac{1}{\sqrt{1 + (f / f_u)^{2n}}},    n \ge 1        (1.65)

Butterworth filters are


popular because they are
the best approximation to
the ideal, in the sense of
maximal flatness in the filter
passband.
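A hedged check of the Butterworth magnitude formula above (the order and cutoff are assumed values), using SciPy's analog Butterworth design:

```python
import numpy as np
from scipy import signal

n, fu = 4, 100.0                                  # assumed filter order and cutoff (Hz)

b, a = signal.butter(n, 2 * np.pi * fu, analog=True)   # analog prototype
f = np.linspace(1, 500, 500)
_, H = signal.freqs(b, a, worN=2 * np.pi * f)

formula = 1 / np.sqrt(1 + (f / fu) ** (2 * n))
print(np.max(np.abs(np.abs(H) - formula)))        # ~0: the two magnitudes agree
```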

49
1.7 Bandwidth Of Digital Data
Baseband versus Bandpass

An easy way to translate the spectrum


of a low-pass or baseband signal x(t)
to a higher frequency is to multiply or
heterodyne the baseband signal with
a carrier wave cos 2πfct.
xc(t) is called a double-sideband (DSB)
modulated signal
xc(t) = x(t) cos 2πfct          (1.70)
From the frequency shifting theorem
Xc(f) = 1/2 [X(f-fc) + X(f+fc) ] (1.71)
Generally, the carrier wave frequency
is much higher than the bandwidth of
the baseband signal:
fc >> fm, and therefore WDSB = 2fm
50
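A hedged numerical sketch of heterodyning (the baseband and carrier frequencies are assumed values): the spectrum of the DSB signal appears around ±fc with width 2fm.

```python
import numpy as np

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
fm, fc = 50.0, 1000.0                      # assumed baseband and carrier frequencies

x = np.cos(2 * np.pi * fm * t)             # baseband signal, bandwidth ~fm
xc = x * np.cos(2 * np.pi * fc * t)        # DSB modulated signal

f = np.fft.rfftfreq(len(t), 1 / fs)
Xc = np.abs(np.fft.rfft(xc))
print(f[np.argsort(Xc)[-2:]])              # spectral peaks at fc - fm and fc + fm
```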


Bandwidth Of Digital Data
Bandwidth Dilemma
Theorems of communication
and information theory are
based on the assumption of
strictly bandlimited channels
For all bandlimited spectra,
the waveforms are not
realizable, and for realizable
waveforms, the absolute
bandwidth is infinite.
The mathematical description
of a real signal does not
permit the signal to be strictly
duration limited and strictly
bandlimited.

51
Bandwidth Of Digital Data
All bandwidth criteria have in common the attempt to specify a
measure of the width, W, of a nonnegative real-valued spectral
density defined for all frequencies |f| < ∞.

The single-sided power spectral density for a single heterodyned
pulse xc(t) takes the analytical form:

    G_x(f) = T \left[ \frac{\sin \pi (f - f_c) T}{\pi (f - f_c) T} \right]^2
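As a hedged illustration of the bandwidth criteria listed next (the pulse duration and carrier frequency are assumed values), two of the measures can be read directly from this sinc-squared spectrum:

```python
import numpy as np

T, fc = 1e-3, 10_000.0                      # assumed pulse duration and carrier
f = np.linspace(fc - 5000, fc + 5000, 200_001)

Gx = T * (np.sinc((f - fc) * T)) ** 2       # np.sinc(x) = sin(pi*x)/(pi*x)

half_power = f[Gx >= Gx.max() / 2]
print(half_power[-1] - half_power[0])       # half-power bandwidth, ~0.88/T
print(2 / T)                                # null-to-null bandwidth = 2/T
```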

52
Different Bandwidth Criteria
(a) Half-power
bandwidth.
(b) Equivalent
rectangular or noise
equivalent bandwidth.
(c) Null-to-null
bandwidth.
(d) Fractional power
containment
bandwidth.
(e) Bounded power
spectral density.
(f) Absolute bandwidth.
53
Questions?

Questions are guaranteed in
life. Answers are not!

NUML Engineering & IT 54


Thank You!

NUML Engineering & IT 55
