
Probability and Random Processes
Sanjit Kaul

Asymptotic Approximations of the Binomial Random Variable

Why Approximations?
For a RV X that is Binomial(n,p) we know that

P[X = k] = (n choose k) p^k (1-p)^(n-k), k = 0, 1, ..., n

Note that the (n choose k) term increases very rapidly with n

We want approximations that make computation
easier and result in acceptable error
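As a rough illustration, here is a minimal Python sketch (the values of n are arbitrary) of how quickly the (n choose k) term grows:

```python
from math import comb

# The binomial coefficient grows extremely fast with n, which makes
# direct evaluation of the Binomial PMF expensive and numerically fragile.
for n in (10, 100, 1000):
    print(n, comb(n, n // 2))
# comb(1000, 500) has roughly 300 digits, while p^k (1-p)^(n-k)
# shrinks toward the floating-point underflow threshold, so the
# exact product is exactly the computation we want to avoid.
```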

The Normal Approximation (De Moivre-Laplace Theorem) Sec 4-5 PP

Paraphrasing From Wiki:


It is believed that de Moivre's attempt to approximate the
coefficients of (a+b)^n is where the normal distribution
first appeared
Also, Gauss first showed that the error in
measurements can be described by a probability law,
which is the normal law of errors

The Normal Approximation


We can write (see Eq 4-96 in PP)

P[X = k] = (n choose k) p^k q^(n-k) ≈ (1 / sqrt(2πnpq)) e^(-(k - np)^2 / (2npq))

where q = 1 - p, n is large, and k lies in the vicinity of np
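A minimal numerical check of the approximation (the values n = 100, p = 0.5 and the sample k's are assumed purely for illustration):

```python
from math import comb, exp, sqrt, pi

def binom_pmf(n, p, k):
    """Exact Binomial(n, p) PMF."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def normal_approx(n, p, k):
    """De Moivre-Laplace approximation, with q = 1 - p."""
    q = 1 - p
    return exp(-(k - n*p)**2 / (2*n*p*q)) / sqrt(2*pi*n*p*q)

n, p = 100, 0.5   # assumed example values
for k in (40, 50, 60):
    print(k, binom_pmf(n, p, k), normal_approx(n, p, k))
```

The two columns agree to about three decimal places near k = np, as the theorem predicts.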

Animation From Wiki

http://en.wikipedia.org/wiki/De_Moivre%E2%80%93Laplace_theorem
(Also, see the proof there. It uses Stirling's approximation for n! when n is large.)

The Poisson Approximation (The Law of Rare Events)
We are interested in the following case

Suppose I ask you to stand beside a stall and count
the number of customers that arrive at the stall.
Or I ask you to count (manually?) the number of
packets that arrive at the IIITD internet gateway

The Poisson Approximation


Let's divide our observation interval T into smaller
intervals of length Δ
We can come up with a model where we perform a
Bernoulli trial in every Δ
A customer/packet arrives in a given Δ with probability p = λΔ/T,
where λ is the average number of arrivals in T

As Δ becomes smaller, the number of trials n = T/Δ increases
and p decreases, that is n -> ∞ and p -> 0.
The average np = λ stays fixed!
Under the condition that np = λ remains constant,
the probability that k arrivals take place in n trials can be
approximated by a Poisson distribution

The Poisson Approximation


The Poisson approximation is useful for cases
where p is very small and n is very large
We have, for our Binomial RV K_n (the number of arrivals in n trials),

P[K_n = k] = (n choose k) p^k (1-p)^(n-k), with p = λ/n

The Poisson Approximation


We have

P[K_n = k] = (n choose k) (λ/n)^k (1 - λ/n)^(n-k)
           = (n(n-1)···(n-k+1) / n^k) (λ^k / k!) (1 - λ/n)^n (1 - λ/n)^(-k)

As n -> ∞ with k fixed, the first and last factors tend to 1 and (1 - λ/n)^n -> e^(-λ). Therefore

P[K_n = k] -> (λ^k / k!) e^(-λ), k = 0, 1, 2, ...

which is the Poisson distribution


The law of rare events states that for a large number of
trials (n) and a small probability of occurrence of the event
during a trial, the number of event occurrences follows
(approximately) a Poisson distribution
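
A small sketch comparing the exact Binomial PMF with its Poisson limit (the values n = 1000 and p = 0.005, so λ = np = 5, are assumed for illustration):

```python
from math import comb, exp, factorial

n, p = 1000, 0.005   # assumed: many trials, rare event
lam = n * p          # the average number of arrivals, lambda = np = 5

for k in range(5):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = lam**k * exp(-lam) / factorial(k)
    print(k, binom, poisson)
```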

Inequalities!

Markov's Inequality (See 5.5.1 RN)

An inequality on the survivor function P[X >= t] of a RV X
The expectation of a function h(X) of RV X is given by

E[h(X)] = ∫ h(x) f_X(x) dx, integrating over all x

Assume h(z) >= 0 for all z and that h(z) is a non-decreasing function
For any t we have

Markov's Inequality (See 5.5.1 RN)

For any t we have

E[h(X)] = ∫ h(x) f_X(x) dx >= ∫_{x >= t} h(x) f_X(x) dx   (since h >= 0)
        >= h(t) ∫_{x >= t} f_X(x) dx = h(t) P[X >= t]     (since h is non-decreasing)

Therefore

P[X >= t] <= E[h(X)] / h(t)

We get Markov's Inequality

Markov's Inequality
An example of h(x) that is non-negative and non-decreasing is h(x) = x^+, where

x^+ = x for x >= 0, and x^+ = 0 for x < 0

We have, for t > 0,

P[X >= t] <= E[X^+] / t

The above is also called the Simple Markov Inequality

Markov's Inequality
The inequality is useful for making observations about
the tail of a distribution
Let RV X be the service time at a restaurant or the
time it takes to load a webpage
Clearly, X >= 0 and E[X^+] = E[X]
Using the inequality, since X >= 0, we get

P[X >= t] <= E[X] / t

If the expected time is 1 sec,

P[X >= 10] <= 1/10 = 0.1
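
A simulation of this example, assuming (purely for illustration) that the service time is Exponential with mean 1 second; the observed tail is far below the Markov bound, which shows how conservative the bound can be:

```python
import random

# Assumed model: service time X ~ Exponential with E[X] = 1 second.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]

t = 10.0
empirical = sum(x >= t for x in samples) / len(samples)
markov_bound = 1.0 / t   # P[X >= t] <= E[X] / t with E[X] = 1
print(empirical, markov_bound)   # roughly 5e-05 versus the bound 0.1
```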

Chebyshev's Inequality
We will use the Markov Inequality
Define Y = (X - E[X])^2
We have (using the Markov Inequality, and since Y >= 0)

P[Y >= t] <= E[Y] / t = Var[X] / t

Thus

P[|X - E[X]| >= sqrt(t)] <= Var[X] / t
Chebyshev's Inequality
We have

P[|X - E[X]| >= sqrt(t)] <= Var[X] / t

Let t = c^2 σ^2, where σ^2 = Var[X]
Substituting,

P[|X - E[X]| >= cσ] <= 1/c^2

For any RV X, the probability that the RV is more
than c standard deviations away from the mean is
<= 1/c^2

Chebyshev's Inequality
We have

P[|X - E[X]| >= cσ] <= 1/c^2

If Var[X] = 0, then the probability that X = E[X] is 1.

The bound may not be a tight upper bound
For X that is Gaussian, for example, the exact tail probability
P[|X - E[X]| >= 2σ] ≈ 0.0455
Chebyshev's Inequality
The Chebyshev bound for c = 2 is 1/c^2 = 0.25

Clearly a very loose bound in our example!
Note that the calculation of the bound does not
require knowledge of the distribution
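
A short sketch contrasting the exact Gaussian tail with the Chebyshev bound (the values of c are arbitrary):

```python
from math import erf, sqrt

def gaussian_two_sided_tail(c):
    """Exact P[|X - E[X]| >= c*sigma] for a Gaussian X."""
    return 1 - erf(c / sqrt(2))

for c in (1, 2, 3):
    # exact tail versus the distribution-free Chebyshev bound 1/c^2
    print(c, gaussian_two_sided_tail(c), 1 / c**2)
```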

Lyapunov Inequality (Eq 5-92 PP)

Define RV Y as

Y = a|X|^((n-1)/2) + |X|^((n+1)/2), for an arbitrary real constant a

We know that E[Y^2] >= 0

That is, writing β_n = E[|X|^n],

a^2 β_(n-1) + 2a β_n + β_(n+1) >= 0 for every a
Lyapunov Inequality (5-92 PP)

The quadratic a^2 β_(n-1) + 2a β_n + β_(n+1) is non-negative for all real a

Its discriminant must be non-positive

4 β_n^2 - 4 β_(n-1) β_(n+1) <= 0

We have β_n^2 <= β_(n-1) β_(n+1)
That is β_n / β_(n-1) <= β_(n+1) / β_n
Via Wikipedia

We get this chain of inequalities for every n = 1, 2, 3, ...
Lyapunov Inequality (5-92 PP)

Note that β_0 = 1
Substituting n = 1 in the first inequality, n = 2 in the
second inequality and so on, and chaining the results, we get

β_1 <= β_2^(1/2), β_2^(1/2) <= β_3^(1/3), ...

Thus we have

β_1 <= β_2^(1/2) <= β_3^(1/3) <= ... <= β_n^(1/n) <= ...
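
A quick numerical sanity check of this non-decreasing property (the test RV, Uniform(0, 2), is an arbitrary choice):

```python
import random

# Assumed test RV: X ~ Uniform(0, 2), with beta_n = E[|X|^n].
random.seed(0)
xs = [random.uniform(0, 2) for _ in range(100_000)]

for n in (1, 2, 3, 4):
    beta_n = sum(abs(x)**n for x in xs) / len(xs)
    print(n, beta_n ** (1 / n))   # beta_n^(1/n) should be non-decreasing in n
```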

Jensen's Inequality
For any convex function f(.) and RV X,

f(E[X]) <= E[f(X)]
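
A minimal illustration with the convex function f(x) = x^2 (the Gaussian test RV and its parameters are arbitrary choices):

```python
import random

# Jensen's inequality with f(x) = x^2: E[f(X)] >= f(E[X]) for any RV X.
random.seed(0)
xs = [random.gauss(1.0, 2.0) for _ in range(100_000)]

mean = sum(xs) / len(xs)
e_f = sum(x**2 for x in xs) / len(xs)
print(e_f, mean**2)   # about 5.0 versus about 1.0; the gap is Var[X]
```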

Calculating Moments

Prove the identity

∫ from -∞ to ∞ of e^(-αx^2) dx = sqrt(π/α), α > 0

Think Gaussian pdf

Use the fact that the area under it is 1

Calculating Moments of a Gaussian RV


E[X^n] = 0 for odd n (X is a zero-mean Gaussian)
For even n: n = 2k, k = 0, 1, 2, ...
Start with

∫ from -∞ to ∞ of e^(-αx^2) dx = sqrt(π) α^(-1/2)

Differentiate k times with respect to α. We get

∫ from -∞ to ∞ of x^(2k) e^(-αx^2) dx = (1·3·5···(2k-1) / 2^k) sqrt(π) α^(-(2k+1)/2)

Set α = 1/(2σ^2) and we can calculate the even moments

E[X^(2k)] = 1·3·5···(2k-1) σ^(2k), k = 0, 1, 2, ...
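
A Monte Carlo check of the even-moment formula (σ = 2 and the sample size are arbitrary choices):

```python
import random

# Check E[X^(2k)] = 1*3*...*(2k-1) * sigma^(2k) for a zero-mean Gaussian X.
random.seed(0)
sigma = 2.0
xs = [random.gauss(0.0, sigma) for _ in range(1_000_000)]

for k in (1, 2, 3):
    empirical = sum(x**(2*k) for x in xs) / len(xs)
    odd_product = 1
    for j in range(1, 2*k, 2):   # 1 * 3 * ... * (2k-1)
        odd_product *= j
    print(k, empirical, odd_product * sigma**(2*k))
```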

Moments of a Gaussian RV
What about E[|X|^n]?
Same as E[X^n] for even n

For odd n = 2k+1, k = 0, 1, 2, ...,

E[|X|^(2k+1)] = (2 / (σ sqrt(2π))) ∫ from 0 to ∞ of x^(2k+1) e^(-x^2/(2σ^2)) dx

Let y = x^2/(2σ^2); we get

E[|X|^(2k+1)] = 2^k k! σ^(2k+1) sqrt(2/π)
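
A similar Monte Carlo check of the odd absolute-moment formula (σ = 1.5 is an arbitrary choice):

```python
import random
from math import sqrt, pi, factorial

# Check E[|X|^(2k+1)] = 2^k * k! * sigma^(2k+1) * sqrt(2/pi)
# for a zero-mean Gaussian X.
random.seed(0)
sigma = 1.5
xs = [random.gauss(0.0, sigma) for _ in range(1_000_000)]

for k in (0, 1, 2):
    n = 2*k + 1
    empirical = sum(abs(x)**n for x in xs) / len(xs)
    closed = 2**k * factorial(k) * sigma**n * sqrt(2 / pi)
    print(n, empirical, closed)
```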

Moments of a Gaussian RV
Finally, note that, for integer k,

1·3·5···(2k-1) = (2k)! / (2^k k!)
HW: A Rayleigh density is given by

f_X(x) = (x/σ^2) e^(-x^2/(2σ^2)), x >= 0

Find E[X^n]. Start with the definition and use the E[|Y|^n]
we calculated for a Gaussian Y. You must get

E[X^n] = σ^n 2^(n/2) Γ(1 + n/2)
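
A sketch you can use to sanity-check your answer, assuming the stated closed form; a Rayleigh sample is drawn as the magnitude of two independent zero-mean Gaussians with standard deviation σ:

```python
import random
from math import gamma, sqrt

# Check E[X^n] = sigma^n * 2^(n/2) * Gamma(1 + n/2) for a Rayleigh X.
random.seed(0)
sigma = 1.0
xs = [sqrt(random.gauss(0, sigma)**2 + random.gauss(0, sigma)**2)
      for _ in range(200_000)]

for n in (1, 2, 3):
    empirical = sum(x**n for x in xs) / len(xs)
    closed = sigma**n * 2**(n/2) * gamma(1 + n/2)
    print(n, empirical, closed)
```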

Problem 5-23 from PP

X has a Rayleigh density

f_X(x) = (x/σ^2) e^(-x^2/(2σ^2)), x >= 0

Let Y = b + c X^2
Show that σ_Y^2 = 4c^2 σ^4

HW: Problem 4-21 PP


The probability of heads of a random coin is a RV P
uniform in the interval (0,1). (a) Find P[0.3 <= P <= 0.7].
(b) The coin is tossed 10 times and heads show 6 times.
Find the a posteriori probability P[0.3 <= P <= 0.7 | 6 heads in 10 tosses];
a numerical sketch follows below.
This is similar to a problem we solved in class
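
As referenced above, a minimal numerical sketch for part (b): with a uniform prior the posterior density of P is proportional to p^6 (1-p)^4 (a Beta(7,5) density), and the integral is evaluated on a simple grid (the helper name posterior_prob and the grid resolution are arbitrary):

```python
def posterior_prob(lo, hi, heads=6, tails=4, steps=100_000):
    """P[lo <= P <= hi | data] for a Uniform(0,1) prior on P."""
    like = lambda p: p**heads * (1 - p)**tails   # unnormalized posterior
    dp = 1.0 / steps
    grid = [(i + 0.5) * dp for i in range(steps)]
    norm = sum(like(p) for p in grid) * dp
    mass = sum(like(p) for p in grid if lo <= p <= hi) * dp
    return mass / norm

print(posterior_prob(0.3, 0.7))   # part (b)
print(0.7 - 0.3)                  # part (a): the prior itself is uniform
```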

Some More Interesting Problems


Problem 5-23 PP
Problem 5-14 PP
Problem 5-12 PP
Problem 5-2 PP
Problems 5-51 and 5-52 PP

Summary For a RV X
CDF is given by F_X(x) = P[X <= x]

Non-decreasing
Starts at 0 and ends at 1
Is continuous for a continuous RV
Contains steps for a discrete RV. Step size at x is the
PMF at x, P_X(x) = P[X = x]

PDF f_X(x) is the derivative of F_X(x)

Always >= 0
It is the slope of the CDF at x
Area under the PDF is 1

Summary For a RV X
Moments of RV X

E[X^n] = ∫ from -∞ to ∞ of x^n f_X(x) dx

When conditioning on an event E,

E[X^n | E] = ∫ from -∞ to ∞ of x^n f_(X|E)(x | E) dx