Lecture X

Definition 2.3.3. Let X be a random variable with cdf F_X. The moment generating function (mgf) of X (or F_X), denoted M_X(t), is

    M_X(t) = E[e^{tX}],

provided that the expectation exists for t in some neighborhood of 0. That is, there is an h > 0 such that, for all t in -h < t < h, E[e^{tX}] exists.

If the expectation does not exist in a neighborhood of 0, we say that the moment generating function does not exist.
More explicitly, the moment generating function can be defined as:

    M_X(t) = ∫ e^{tx} f_X(x) dx        for continuous random variables, and

    M_X(t) = Σ_x e^{tx} P(X = x)       for discrete random variables.
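The continuous-case formula can be checked symbolically. A minimal sketch using sympy; the exponential distribution with rate 2 is a hypothetical choice made only for illustration:

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Illustrative choice: X ~ Exponential(rate = 2), pdf f(x) = 2 e^{-2x} for x >= 0
f = 2 * sp.exp(-2 * x)

# M_X(t) = E[e^{tX}] = integral of e^{tx} f(x) dx over the support
M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')

print(sp.simplify(M))  # equivalent to 2/(2 - t), valid for t < 2
```

Note the mgf only exists for t < 2 here, which is a neighborhood of 0 as the definition requires.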
Theorem 2.3.2: If X has mgf M_X(t), then

    E[X^n] = M_X^{(n)}(0),

where we define

    M_X^{(n)}(0) = (d^n/dt^n) M_X(t) |_{t=0}

First note that e^{tx} can be approximated around zero using a Taylor series expansion:

    M_X(t) = E[e^{tx}] = E[ e^0 + t e^0 x + (t^2/2) e^0 x^2 + (t^3/6) e^0 x^3 + ... ]
           = 1 + t E[x] + (t^2/2) E[x^2] + (t^3/6) E[x^3] + ...

Note for any moment n:

    M_X^{(n)}(t) = (d^n/dt^n) M_X(t) = E[x^n] + t E[x^{n+1}] + (t^2/2) E[x^{n+2}] + ...

Thus, as t → 0,

    M_X^{(n)}(0) = E[x^n]
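The theorem is easy to verify symbolically. A sketch assuming the Exponential(rate = 2) mgf M_X(t) = 2/(2 - t), whose nth moment is n!/2^n:

```python
import sympy as sp

t = sp.Symbol('t', real=True)

# mgf of an Exponential(rate = 2) random variable, used only as an example
M = 2 / (2 - t)

# nth moment = nth derivative of the mgf evaluated at t = 0
m1 = sp.diff(M, t, 1).subs(t, 0)  # E[X]   = 1/2
m2 = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = 1/2
m3 = sp.diff(M, t, 3).subs(t, 0)  # E[X^3] = 3/4

print(m1, m2, m3)
```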
Leibniz's Rule: If f(x,θ), a(θ), and b(θ) are differentiable with respect to θ, then

    (d/dθ) ∫_{a(θ)}^{b(θ)} f(x,θ) dx
        = f(b(θ),θ) (d/dθ) b(θ) − f(a(θ),θ) (d/dθ) a(θ) + ∫_{a(θ)}^{b(θ)} (∂/∂θ) f(x,θ) dx
Berger and Casella proof: Assuming that we can differentiate under the integral sign using Leibniz's rule, we have

    (d/dt) M_X(t) = (d/dt) ∫ e^{tx} f(x) dx = ∫ (d/dt) e^{tx} f(x) dx = ∫ x e^{tx} f(x) dx
Letting t → 0, this integral simply becomes

    ∫ x f(x) dx = E[x]

This proof can be extended to any moment of the distribution function.
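Leibniz's rule itself can be illustrated on a small hypothetical example, chosen so the integral is elementary: differentiating ∫_0^θ (x + θ) dx directly and via the rule gives the same answer.

```python
import sympy as sp

x, th = sp.symbols('x theta', positive=True)
f = x + th                 # integrand f(x, theta)
a, b = sp.Integer(0), th   # limits a(theta) = 0, b(theta) = theta

# Direct route: integrate first, then differentiate
F = sp.integrate(f, (x, a, b))   # 3*theta**2/2
direct = sp.diff(F, th)          # 3*theta

# Leibniz route: boundary terms plus the integral of the partial derivative
leibniz = (f.subs(x, b) * sp.diff(b, th)
           - f.subs(x, a) * sp.diff(a, th)
           + sp.integrate(sp.diff(f, th), (x, a, b)))

print(sp.simplify(direct - leibniz))  # 0
```

In the mgf proof, the limits do not depend on t, so only the last term of the rule survives.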
Moment Generating Functions for
Specific Distributions
Application to the Uniform Distribution:

    M_X(t) = ∫_a^b e^{tx} (1/(b − a)) dx = (e^{tb} − e^{ta}) / (t(b − a))
Following the expansion developed earlier, we have:

    M_X(t) = [ (1 + tb + (tb)^2/2 + (tb)^3/6 + ...) − (1 + ta + (ta)^2/2 + (ta)^3/6 + ...) ] / (t(b − a))
           = [ t(b − a) + (t^2/2)(b^2 − a^2) + (t^3/6)(b^3 − a^3) + ... ] / (t(b − a))
           = 1 + (1/2)(a + b) t + (1/6)(a^2 + ab + b^2) t^2 + ...
Letting b = 1 and a = 0, the last expression becomes:

    M_X(t) = 1 + (1/2) t + (1/6) t^2 + (1/24) t^3 + ...

The first three moments of the uniform distribution are then:

    M_X^{(1)}(0) = 1/2
    M_X^{(2)}(0) = 2 (1/6)  = 1/3
    M_X^{(3)}(0) = 6 (1/24) = 1/4
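These uniform moments can be confirmed from the closed-form mgf M_X(t) = (e^t − 1)/t. A sketch using sympy; since the mgf has a removable singularity at t = 0, limits are used instead of direct substitution:

```python
import sympy as sp

t = sp.Symbol('t', real=True)
M = (sp.exp(t) - 1) / t  # mgf of Uniform(0, 1)

# Series about t = 0 reproduces 1 + t/2 + t**2/6 + t**3/24 + ...
print(sp.series(M, t, 0, 4))

# Moments via derivatives; limits handle the removable singularity at 0
m1 = sp.limit(sp.diff(M, t, 1), t, 0)  # 1/2
m2 = sp.limit(sp.diff(M, t, 2), t, 0)  # 1/3
m3 = sp.limit(sp.diff(M, t, 3), t, 0)  # 1/4
```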
Application to the Univariate Normal Distribution

    M_X(t) = ∫ e^{tx} (1/√(2πσ^2)) e^{−(x−μ)^2/(2σ^2)} dx
           = ∫ (1/√(2πσ^2)) exp( tx − (x − μ)^2/(2σ^2) ) dx
Focusing on the term in the exponent, we have

    tx − (1/(2σ^2))(x − μ)^2
        = [2σ^2 tx − (x^2 − 2μx + μ^2)] / (2σ^2)
        = −[x^2 − 2μx − 2σ^2 tx + μ^2] / (2σ^2)
        = −[x^2 − 2(μ + σ^2 t) x + μ^2] / (2σ^2)
The next step is to complete the square in the numerator. We seek c such that

    x^2 − 2(μ + σ^2 t) x + μ^2 + c = (x − (μ + σ^2 t))^2

Since

    (x − (μ + σ^2 t))^2 = x^2 − 2(μ + σ^2 t) x + μ^2 + 2μσ^2 t + σ^4 t^2,

we have

    c = 2μσ^2 t + σ^4 t^2
The complete expression then becomes:

    tx − (1/(2σ^2))(x − μ)^2 = −(1/(2σ^2)) (x − (μ + σ^2 t))^2 + μt + (1/2)σ^2 t^2
The moment generating function then becomes:

    M_X(t) = exp( μt + (1/2)σ^2 t^2 ) ∫ (1/√(2πσ^2)) exp( −(1/(2σ^2)) (x − (μ + σ^2 t))^2 ) dx
           = exp( μt + (1/2)σ^2 t^2 ),

since the remaining integral is that of a normal density with mean μ + σ^2 t and variance σ^2, which equals one.
Taking the first derivative with respect to t, we get:

    M_X^{(1)}(t) = (μ + σ^2 t) exp( μt + (1/2)σ^2 t^2 )

Letting t → 0, this becomes:

    M_X^{(1)}(0) = μ
The second derivative of the moment generating function with respect to t yields:

    M_X^{(2)}(t) = σ^2 exp( μt + (1/2)σ^2 t^2 ) + (μ + σ^2 t)^2 exp( μt + (1/2)σ^2 t^2 )

Again, letting t → 0 yields

    M_X^{(2)}(0) = μ^2 + σ^2
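Both derivatives can be verified symbolically from the normal mgf just derived, a minimal sketch in sympy:

```python
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.Symbol('sigma', positive=True)

# mgf of the normal distribution: exp(mu*t + sigma^2 t^2 / 2)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

m1 = sp.diff(M, t, 1).subs(t, 0)  # mu
m2 = sp.diff(M, t, 2).subs(t, 0)  # mu**2 + sigma**2

print(sp.simplify(m1), sp.expand(m2))
```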
Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Consider their sum Z = X + Y and its moment generating function:

    M_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}]
           = E[e^{tX}] E[e^{tY}] = M_X(t) M_Y(t),

where the factorization of the expectation follows from independence.
We conclude that the moment generating function of the sum of two independent random variables is equal to the product of the moment generating functions of each variable.
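As an example, a sketch using two independent normals: multiplying their mgfs yields the mgf of a normal whose mean and variance are the respective sums, confirming that the sum of independent normals is normal.

```python
import sympy as sp

t = sp.Symbol('t', real=True)
mu1, mu2 = sp.symbols('mu1 mu2', real=True)
s1, s2 = sp.symbols('sigma1 sigma2', positive=True)

# mgfs of X ~ N(mu1, sigma1^2) and Y ~ N(mu2, sigma2^2)
MX = sp.exp(mu1 * t + s1**2 * t**2 / 2)
MY = sp.exp(mu2 * t + s2**2 * t**2 / 2)

# mgf of Z = X + Y, and the mgf of N(mu1 + mu2, sigma1^2 + sigma2^2)
MZ = MX * MY
MN = sp.exp((mu1 + mu2) * t + (s1**2 + s2**2) * t**2 / 2)

print(sp.simplify(MZ - MN))  # 0
```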
Skipping ahead slightly, the multivariate normal
distribution function can be written as:



    f(x) = (2π)^{−n/2} |Σ|^{−1/2} exp( −(1/2) (x − μ)' Σ^{−1} (x − μ) )

where Σ is the variance matrix and μ is a vector of means.
In order to derive the moment generating function, we now need a vector t. The moment generating function can then be defined as:

    M_X(t) = exp( μ't + (1/2) t'Σt )
Normal variables are independent if the variance
matrix is a diagonal matrix.
Note that if the variance matrix is diagonal, the
moment generating function for the normal can be
written as:

    M_X(t) = exp( μ't + (1/2) t'Σt ),    Σ = diag(σ_1^2, σ_2^2, σ_3^2)
           = exp( μ_1 t_1 + μ_2 t_2 + μ_3 t_3 + (1/2)(σ_1^2 t_1^2 + σ_2^2 t_2^2 + σ_3^2 t_3^2) )
           = exp( μ_1 t_1 + (1/2)σ_1^2 t_1^2 ) exp( μ_2 t_2 + (1/2)σ_2^2 t_2^2 ) exp( μ_3 t_3 + (1/2)σ_3^2 t_3^2 )
           = M_{X_1}(t_1) M_{X_2}(t_2) M_{X_3}(t_3)
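The three-variable factorization can be checked symbolically; a sketch with a diagonal Σ:

```python
import sympy as sp

t1, t2, t3 = sp.symbols('t1 t2 t3', real=True)
mu1, mu2, mu3 = sp.symbols('mu1 mu2 mu3', real=True)
s1, s2, s3 = sp.symbols('sigma1 sigma2 sigma3', positive=True)

tv = sp.Matrix([t1, t2, t3])
mu = sp.Matrix([mu1, mu2, mu3])
Sigma = sp.diag(s1**2, s2**2, s3**2)  # diagonal variance matrix

# Joint mgf: exp(mu' t + (1/2) t' Sigma t)
M_joint = sp.exp((mu.T * tv)[0] + sp.Rational(1, 2) * (tv.T * Sigma * tv)[0])

# Product of the three univariate normal mgfs
M_prod = (sp.exp(mu1 * t1 + s1**2 * t1**2 / 2)
          * sp.exp(mu2 * t2 + s2**2 * t2**2 / 2)
          * sp.exp(mu3 * t3 + s3**2 * t3**2 / 2))

print(sp.simplify(M_joint - M_prod))  # 0
```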
