
Name Barnali Chakraborty

Roll No. 520924936

Subject Statistics for Management

Assignment No. MB0024 – Set 1

Study Centre Cybertech Info (1626)

Date of Submission 30 November, 2009

MB0024 SET 1
1. Explain the limitations of statistics in your own words.

Ans. Statistics, with all its wide applications in every sphere of human activity, has
its own limitations. Some of them are given below:

1) Statistics is not suited to the study of qualitative phenomena: Since
statistics is basically a science dealing with sets of numerical data, it applies
only to those subjects of enquiry that can be expressed in terms of quantitative
measurement. Qualitative phenomena such as honesty, poverty, beauty and
intelligence cannot be expressed numerically, so statistical analysis cannot be
applied to them directly. Nevertheless, statistical techniques may be applied
indirectly by first reducing the qualitative expressions to accurate quantitative
terms. For example, the intelligence of a group of students can be studied on the
basis of their marks in a particular examination.

2) Statistics does not study individuals: Statistics does not give any specific
importance to individual items; it deals with aggregates of objects. Individual
items, taken on their own, do not constitute statistical data and serve no
purpose in a statistical enquiry.

3) Statistical laws are not exact: It is well known that the mathematical and
physical sciences are exact, but statistical laws are only approximations.
Statistical conclusions are not universally true; they hold only on the average,
as they are probabilistic statements.

4) Statistics may be misused: Statistics should be used only by experts;
otherwise, statistical methods become dangerous tools in the hands of the
inexpert. Their use by inexperienced and untrained persons may lead to wrong
conclusions, and statistics can easily be misused by quoting wrong figures or
data.

5) Only statisticians can handle statistics: Common people cannot handle
statistics properly; only trained statisticians can apply statistical methods
correctly.

2. Briefly explain relative frequency of occurrence in your own words.

Ans. The relative frequency of occurrence is the proportion of times that an event
occurs in the long run when conditions are stable, or equivalently the observed
relative frequency of an event in a very large number of trials.

This method uses the relative frequencies of past occurrences as probabilities.
We determine how often something has happened in the past and use that figure to
predict the probability that it will happen again in the future.

For example, suppose that an accounts receivable manager knows from past data
that about 70 of 1000 accounts usually become uncollectible after 120 days. The
manager would estimate the probability of bad debts as 70/1000 = 0.07, or 7%.
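
To make the arithmetic concrete, here is a minimal Python sketch of the relative-frequency estimate; the function name relative_frequency is purely illustrative, not a standard library routine.

    def relative_frequency(occurrences, trials):
        # Estimate a probability as observed occurrences divided by trials.
        return occurrences / trials

    # Accounts receivable example: 70 of 1000 accounts became uncollectible.
    p_bad_debt = relative_frequency(70, 1000)
    print(p_bad_debt)  # 0.07, i.e. an estimated 7% probability of bad debt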

A second characteristic of probabilities established by the relative frequency of
occurrence method can be shown by tossing one of our fair coins 300 times. In such
an experiment, although the proportion of heads may be far from 0.5 in the first
100 tosses, it tends to stabilize and approach 0.5 as the number of tosses
increases.

In statistical language, we would say that the relative frequency becomes stable
as the number of tosses becomes large (if we are tossing the coin under uniform
conditions). Thus, when we use the relative frequency approach to establish
probabilities, our probability figure gains accuracy as we increase the number of
observations. Of course, this improved accuracy is not free: although more tosses
of our coin will produce a more accurate probability of heads occurring, we must
bear the time and the cost of the additional observations.
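
The stabilization described above can be illustrated with a short Python simulation sketch; the sample sizes checked (10, 50, 100, 200 and 300 tosses) and the fixed random seed are arbitrary choices made here for reproducibility.

    import random

    random.seed(1)  # fixed seed so the run is reproducible
    tosses = [random.randint(0, 1) for _ in range(300)]  # 1 = head, 0 = tail

    for n in (10, 50, 100, 200, 300):
        proportion_heads = sum(tosses[:n]) / n
        print(n, round(proportion_heads, 3))

    # The proportion typically fluctuates for small n and settles near 0.5
    # as n grows, at the cost of making more observations.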

One difficulty with the relative frequency approach is that people often use it
without evaluating a sufficient number of outcomes. If you heard someone say, “My
aunt and uncle got the flu this year, and they are both over 65, so everyone in
that age bracket will probably get the flu,” you would know that the speaker had
not based that conclusion on enough evidence; the observations are insufficient
data for establishing a relative frequency of occurrence probability.

3. Write a short note on the Bernoulli distribution. Also write about the use of
the Bernoulli process.

Ans. Individuals and corporations generate a great deal of data that resembles
certain theoretical distributions. Since many characteristics of these
theoretical distributions have already been derived mathematically, we can make
use of them for a quick analysis of the observed distributions. These theoretical
distributions are divided into two groups:

a) Discrete probability distributions and
b) Continuous probability distributions

The above theoretical distributions are formed under certain assumptions:

i) Bernoulli process
ii) Application of the Binomial, Poisson and Normal distributions

Bernoulli Distribution:

A variable which assumes the values 1 and 0 with probabilities p and q = 1 - p is
called a Bernoulli variable. It has only one parameter, p; for different values
of p (0 ≤ p ≤ 1), we get different Bernoulli distributions. Here,

1 represents the occurrence of success

0 represents the occurrence of failure

In other words, the assumption underlying the distribution is that the outcome of
an experiment is dichotomous in nature, i.e. success/failure, present/absent,
defective/non-defective, yes/no, etc.

For example, when a fair coin is tossed, the outcome is either a head or a tail,
and the variable X assumes the value 1 or 0.
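
A minimal Python sketch of a single Bernoulli trial for the coin example; the helper name bernoulli_trial is illustrative, and p = 0.5 corresponds to the fair-coin case.

    import random

    def bernoulli_trial(p=0.5):
        # Return 1 ("success", e.g. a head) with probability p, else 0.
        return 1 if random.random() < p else 0

    x = bernoulli_trial()  # one toss of a fair coin
    print(x)               # prints 1 or 0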

Repetition of a Bernoulli experiment:

An experiment which results in two mutually exclusive and exhaustive outcomes is
called a Bernoulli experiment. Let a Bernoulli experiment be repeated n times
under identical conditions, and let Xi, for i = 1 to n, assume the value 1 or 0
in the i-th repetition. Each Xi is then a Bernoulli variate with probability of
success p, mean p and variance pq. If X = X1 + X2 + ... + Xn denotes the number
of successes in the n repetitions, X follows the binomial distribution with mean
np and variance npq.
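
The repetition can be sketched in Python as follows; n = 20 and p = 0.3 are arbitrary illustrative values, and the run simply compares the simulated count of successes with the theoretical binomial mean np and variance npq.

    import random

    random.seed(2)
    n, p = 20, 0.3
    q = 1 - p

    trials = [1 if random.random() < p else 0 for _ in range(n)]  # X1, ..., Xn
    x = sum(trials)  # number of successes in the n repetitions

    print("observed successes:", x)
    print("theoretical mean np:", n * p)            # 6.0
    print("theoretical variance npq:", n * p * q)   # 4.2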

4. Discuss briefly the continuous probability distribution.

Ans. A probability distribution is a statement about the possible values of a random
variable along with their respective probabilities. A continuous probability
distribution is simply a probability distribution for a continuous random
variable.

A continuous random variable X has zero probability of assuming exactly any one
of its values. At first sight this seems a surprising statement. Let us try to
understand it by considering a random variable, say weight. Weight is obviously a
continuous random variable, since it can vary continuously. Suppose we do not
know the weight of a person exactly but have a rough idea that her weight falls
between 60 kg and 61 kg. There are an infinite number of possible weights between
these two limits. As a result, by definition, the probability of the person
having one particular weight, say 60.3 kg, is negligibly small, almost equal to
zero. But we can definitely attach some probability to the person's weight being
between 60 kg and 61 kg. Thus, for a continuous random variable X, one assigns a
probability to an interval and not to a particular value. Here we look for a
function p(x), called the probability density function, such that with its help
we can compute the probability

P(a < X < b), where a and b are the limits of an interval (a, b) with a < b.

A probability density function is defined in such a manner that the area under
its curve, bounded by the x-axis, is equal to one when computed over the domain
of X for which p(x) is defined. The probability density function for a continuous
random variable X defined over the entire set of real numbers R should satisfy
the following conditions:

1) p(x) ≥ 0 for all x ∈ R
2) ∫ p(x) dx = 1, where the integral is taken over the whole of R
3) P(a < X < b) = ∫ p(x) dx, where the integral is taken from a to b
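
These conditions can be checked numerically with a short Python sketch; the normal density with mean 60.5 kg and standard deviation 0.5 kg used for the weight example is an assumption made purely for illustration, and the midpoint-rule integrator is a simple stand-in for exact integration.

    import math

    def p(x, mu=60.5, sigma=0.5):
        # Assumed probability density function for weight in kilograms.
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    def integrate(f, a, b, steps=100_000):
        # Midpoint-rule approximation of the integral of f over [a, b].
        width = (b - a) / steps
        return sum(f(a + (i + 0.5) * width) for i in range(steps)) * width

    print(p(60.3) >= 0)           # True: the density is never negative
    print(integrate(p, 55, 66))   # close to 1: total area under the curve
    print(integrate(p, 60, 61))   # P(60 < X < 61), a probability attached to an interval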

Although the probability distribution of a continuous random variable cannot be
presented in the form of a table like that of a discrete random variable, it can
nevertheless be expressed by a specific form of the probability density function
p(x). We shall study some of these forms in the next unit on theoretical
distributions for continuous random variables.

5. Write short notes on simple random sampling.

Ans. There are two methods of studying the characteristics of a population: census
and sampling. There are, in turn, two main methods of sampling, namely:

i) Probability sampling and
ii) Non-probability sampling

Simple random sampling is one of the important sampling designs of probability
sampling. Under this technique, sample units are drawn in such a way that each
and every unit in the population has an equal and independent chance of being
included in the sample. If a sample unit is replaced before the next unit is
drawn, the scheme is known as simple random sampling with replacement; if it is
not replaced, it is called simple random sampling without replacement. In the
first case, the probability of drawing any particular unit at each draw is 1/N,
where N is the population size; in the second case, the probability of selecting
any particular sample of n units is 1/NCn, that is, one divided by the number of
ways of choosing n units out of N.
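
A minimal sketch of the two schemes using the Python standard library; the population of N = 10 numbered units and the sample size n = 4 are illustrative assumptions.

    import random

    random.seed(3)
    population = list(range(1, 11))   # units numbered 1..N, with N = 10
    n = 4

    with_replacement = random.choices(population, k=n)    # a unit may repeat
    without_replacement = random.sample(population, k=n)  # each unit appears at most once

    print("with replacement:   ", with_replacement)
    print("without replacement:", without_replacement)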

The selection of a simple random sample can be done by the following methods:

a) Lottery Method
b) The use of table of random numbers

a) Lottery method: In the lottery method we identify each and every unit with a
distinct number by allotting it a numbered card. The cards are put in a drum
and thoroughly shuffled before each unit is drawn.

b) The use of a table of random numbers: There are several random number tables,
such as Tippett's random number table, Fisher and Yates' tables, Kendall and
Babington Smith's random number tables, and the Rand Corporation's random
numbers.

6. State Central Limit Theorem. Explain it in your own words.

Ans. In probability theory, the central limit theorem (CLT) states conditions
under which the mean of a sufficiently large number of independent random
variables, each with finite mean and variance, will be approximately
normally distributed (Rice 1995). The central limit theorem also requires
the random variables to be identically distributed, unless certain conditions
are met. Since real-world quantities are often the balanced sum of many
unobserved random events, this theorem provides a partial explanation for
the prevalence of the normal probability distribution. The CLT also justifies
the approximation of large-sample statistics to the normal distribution in
controlled experiments.

In more general probability theory, a central limit theorem is any of a set of
weak-convergence theorems. They all express the fact that a sum of many
independent random variables will tend to be distributed according to one of a
small set of "attractor" (i.e. stable) distributions. There are also
generalizations for finite variance that do not require identical distributions.

The central limit theorem is also known as the second fundamental theorem of
probability. If X1, X2, ..., Xn is a random sample of size n from any population
with mean μ and variance σ², then the sample mean X̄ is approximately normally
distributed with mean μ and variance σ²/n, provided n is sufficiently large.
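
A simulation sketch of this statement: sample means of size n drawn from a decidedly non-normal population (an exponential distribution with mean 1 and variance 1, chosen here only for illustration) cluster around the population mean with variance close to σ²/n.

    import random
    import statistics

    random.seed(4)
    n, repetitions = 50, 5000

    sample_means = [
        statistics.mean(random.expovariate(1.0) for _ in range(n))
        for _ in range(repetitions)
    ]

    print("mean of sample means:", round(statistics.mean(sample_means), 3))          # near 1.0, the population mean
    print("variance of sample means:", round(statistics.variance(sample_means), 4))  # near 0.02 = sigma^2 / n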

From the theorem we can conclude the following:

i) The mean of the sampling distribution of the mean will be equal to the
population mean.
ii) The sampling distribution of the mean approaches the normal distribution as
the sample size increases.
iii) It permits us to use sample statistics to make inferences about population
parameters, irrespective of the shape of the frequency distribution of the
population.

The Central Limit Theorem tells us, quite generally, what happens when we
have the sum of a large number of independent random variables each of
which contributes a small amount to the total.

***END***
