
Discrete Probability Distributions

In statistics and probability theory, a discrete probability distribution is a distribution
characterized by a probability mass function. Such distributions are commonly used in
computer programs that make equal-probability random selections between a number of
choices. The most common examples of discrete probability distributions are the binomial
distribution, the Poisson distribution, the geometric distribution and the Bernoulli
distribution.

A random variable that is part of a discrete distribution is called a discrete random
variable. A random variable can take two types of values: fixed (discrete) values, or
values over a continuous range. In the continuous case, the value can lie anywhere within
the specified range. For example, the number of apples in a basket is discrete, while the
time needed to drive from school to home is continuous.

So a probability distribution over a random variable X, where X takes only discrete values,
is called a discrete probability distribution.

For example: consider the experiment of tossing two coins, with sample space
S = {HH, HT, TH, TT}. Let Y be the number of tails that occur. Clearly Y = 0, 1, or 2 only,
that is, discrete values only.

For Y = 0, that is HH, P(Y) = 1/4

For Y = 1, that is TH, HT, P(Y) = 2/4

For Y = 2, that is TT, P(Y) = 1/4

On adding all three we get 1/4 + 2/4 + 1/4 = 1.

Thus we have verified, with a very common example, that the probabilities sum to 1, as they must for any probability distribution.

The discrete probability distribution can always be represented in the form of a table as below:

Y    P(Y)
0    1/4 = 0.25
1    2/4 = 0.50
2    1/4 = 0.25
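
As a quick illustration (this sketch is not part of the original example, and the variable names are chosen freely), the same probabilities can be recovered in Python by enumerating the sample space:

from itertools import product
from collections import Counter

# Sample space S = {HH, HT, TH, TT}: all ordered outcomes of two coin tosses
sample_space = list(product("HT", repeat=2))

# Y = number of tails in each outcome
tail_counts = Counter(outcome.count("T") for outcome in sample_space)

# Each outcome is equally likely, so P(Y = y) = (favourable outcomes) / |S|
for y in sorted(tail_counts):
    print(f"P(Y = {y}) = {tail_counts[y]}/{len(sample_space)}"
          f" = {tail_counts[y] / len(sample_space):.2f}")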

For any discrete probability distribution we can always find the mean or the expected value by:

E(X) = Σ e · P(X = e)   (summed over all values e that X can take)

In the above example, the expected value = 0 + 2/4 + 2/4 = 1. But the expected value need not
equal 1; it can take any value, as the next example shows.

Example 1: Find the expected value of the following discrete distribution.

Y    P(Y)
0 0.30
1 0.20
2 0.25
3 0.15
4 0.10

Solution:

Y    P(Y)    Y·P(Y)
0 0.30 0
1 0.20 0.20
2 0.25 0.50
3 0.15 0.45
4 0.10 0.40
So expected value = 0 + 0.20 + 0.50 + 0.45 + 0.40 = 1.55
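
The same arithmetic can be checked with a short Python sketch (the dictionary simply restates the table from Example 1; the names are illustrative):

# Discrete distribution from Example 1: value -> probability
distribution = {0: 0.30, 1: 0.20, 2: 0.25, 3: 0.15, 4: 0.10}

# Expected value: sum of y * P(Y = y) over all possible values y
expected_value = sum(y * p for y, p in distribution.items())
print(round(expected_value, 2))  # 1.55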

Example 2: We flip a coin 10 times. Find the probability that 6 heads are obtained.

Solution:
We solve this using binomial distribution.

A binomial distribution is written B(n, p), where n is the number of trials made and p is the
probability of a success in each trial; k denotes the number of successes out of the n trials.
So (1 - p) is the probability of failure in each trial. Then the binomial probability is
calculated as below.

P(X = k) = C(n, k) · p^k · (1 - p)^(n - k)

The term C(n, k) is known as the binomial coefficient and is calculated as:

C(n, k) = n! / (k! (n - k)!)

Here, n = 10, k = 6, p = 1/2 = 0.5. So, 1 - p = 0.5.

Using this we get, P(X = 6) = 0.2051
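
For reference, the calculation can be reproduced in Python (binomial_pmf is just a hypothetical helper name, not a library function):

from math import comb

def binomial_pmf(n, k, p):
    """P(X = k) for a binomial B(n, p) distribution."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example 2: n = 10 coin flips, probability of exactly k = 6 heads with p = 0.5
print(round(binomial_pmf(10, 6, 0.5), 4))  # 0.2051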

Probability distribution

In probability and statistics, a probability distribution is a mathematical function that, stated in
simple terms, can be thought of as providing the probability of occurrence of different possible
outcomes in an experiment. For instance, if the random variable X is used to denote the
outcome of a coin toss ('the experiment'), then the probability distribution of X would take the
value 0.5 for X = heads, and 0.5 for X = tails.

In more technical terms, the probability distribution is a description of a random phenomenon in
terms of the probabilities of events. Examples of random phenomena can include the results of
an experiment or survey. A probability distribution is defined in terms of an underlying sample
space, which is the set of all possible outcomes of the random phenomenon being observed.
The sample space may be the set of real numbers or a higher-dimensional vector space, or it
may be a list of non-numerical values; for example, the sample space of a coin flip would be
{heads, tails}.

Probability distributions are generally divided into two classes. A discrete probability
distribution (applicable to scenarios where the set of possible outcomes is discrete, such as
a coin toss or a roll of a die) can be encoded by a discrete list of the probabilities of the
outcomes, known as a probability mass function. On the other hand, a continuous probability
distribution (applicable to scenarios where the set of possible outcomes can take on values
in a continuous range, e.g. the real numbers, such as the temperature on a given day) is typically
described by probability density functions (with the probability of any individual outcome actually
being 0). The normal distribution represents a commonly encountered continuous probability
distribution. More complex experiments, such as those involving stochastic processes defined
in continuous time, may demand the use of more general probability measures.
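
A minimal Python sketch of that distinction (the coin and temperature models below are made-up illustrations, not taken from the text):

import random

# Discrete: a fair coin toss, fully described by a probability mass function
pmf = {"heads": 0.5, "tails": 0.5}
toss = random.choices(list(pmf), weights=list(pmf.values()))[0]

# Continuous: a temperature reading modelled with a normal density;
# any single exact value has probability 0, so we work with densities
temperature = random.gauss(mu=20.0, sigma=5.0)

print(toss, round(temperature, 1))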
A probability distribution whose sample space is the set of real numbers is called univariate,
while a distribution whose sample space is a vector space is called multivariate. A univariate
distribution gives the probabilities of a single random variable taking on various alternative
values; a multivariate distribution (a joint probability distribution) gives the probabilities of
a random vector (a list of two or more random variables) taking on various combinations of
values. Important and commonly encountered univariate probability distributions include
the binomial distribution, the hypergeometric distribution, and the normal distribution.
The multivariate normal distribution is a commonly encountered multivariate distribution.
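
And a small sketch of the univariate/multivariate distinction, assuming NumPy is available (the parameters are arbitrary illustrations):

import numpy as np

rng = np.random.default_rng(seed=0)

# Univariate: one random variable, here a single binomial B(n=10, p=0.5) draw
single_value = rng.binomial(n=10, p=0.5)

# Multivariate: a random vector, here a 2-dimensional multivariate normal
mean = [0.0, 0.0]
cov = [[1.0, 0.3],
       [0.3, 1.0]]  # illustrative covariance matrix
random_vector = rng.multivariate_normal(mean, cov)

print(single_value, random_vector)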
